Hadoop and NoSQL Data Ingestion using Oracle Data Integrator 12c and Hadoop Technologies

Track

-

Date and time

Wednesday, 15 October 2014, 16:00

Room

Hall 5 Apartments

Duration

60'

There are many ways to ingest (load) data into a Hadoop cluster, from file copying using the Hadoop Filesystem (FS) shell through to real-time streaming using technologies such as Flume and Hadoop Streaming.
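As a minimal sketch of the simplest of these options, the FS shell copy: the snippet below stages a small sample file locally and shows the `hadoop fs` commands that would land it in HDFS. The HDFS path `/landing/sales` is illustrative, and the `hadoop` commands are commented out since they assume a running Hadoop cluster.

```shell
# Create a small sample file locally to stand in for source data
echo "id,amount" > /tmp/sales_sample.csv
echo "1,100"    >> /tmp/sales_sample.csv

# Copy it into HDFS with the Hadoop FS shell
# (requires a running cluster; the target path is illustrative)
# hadoop fs -mkdir -p /landing/sales
# hadoop fs -put /tmp/sales_sample.csv /landing/sales/
# hadoop fs -ls /landing/sales
```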

In this workshop we'll take a high-level look at the data ingestion options for Hadoop, and then show how Oracle Data Integrator and Oracle GoldenGate leverage these technologies to load and process data within your Hadoop cluster. We'll also consider the updated Oracle Information Management Reference Architecture and look at the best places to land and process your enterprise data, using Hadoop's schema-on-read approach to hold low-value, low-density raw data, and then use the concept of a "data factory" to load and process your data into more traditional Oracle relational storage, where we hold high-density, high-value data.

Lecture details

Type: Workshop
Level of difficulty: Expert
Experience Level: Advanced
Desirable listeners function: Developers, Designers
Group of activity: Middleware

About speaker

HrOUG.hr

The conference is organized by the Croatian Association of Oracle users. More about the association can be found at Hroug.hr.
