Manipulating data with the extract, transform, and load (ETL) framework
During the extract, transform, and load (ETL) process, data is extracted from various sources, transformed and normalized through encoded business rules, and then loaded into the fact and dimension tables of a data mart.
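The following is a minimal sketch of that flow in plain Java with JDBC. The connection URLs, the table and column names (SRC_DEFECTS, D_PROJECT, F_DEFECT), and the status-normalization rule are hypothetical illustrations, not part of the Rational Insight schema or of Cognos Data Manager itself.

import java.sql.*;

/** Minimal ETL sketch: extract rows from a source table, apply a
 *  normalization rule, and load them into hypothetical dimension and
 *  fact tables. All names here are illustrative only. */
public class EtlSketch {
    public static void main(String[] args) throws SQLException {
        try (Connection src = DriverManager.getConnection("jdbc:source-url");   // placeholder source URL
             Connection mart = DriverManager.getConnection("jdbc:mart-url")) {  // placeholder data mart URL

            // Prepared statements for the load phase (hypothetical star-schema tables).
            PreparedStatement dimLookup = mart.prepareStatement(
                "SELECT PROJECT_KEY FROM D_PROJECT WHERE PROJECT_NAME = ?");
            PreparedStatement factInsert = mart.prepareStatement(
                "INSERT INTO F_DEFECT (PROJECT_KEY, STATUS, DEFECT_COUNT) VALUES (?, ?, ?)");

            // Extract: read raw rows from the source system.
            try (Statement stmt = src.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT PROJECT_NAME, STATUS, DEFECT_COUNT FROM SRC_DEFECTS")) {
                while (rs.next()) {
                    // Transform: an encoded business rule that normalizes status codes.
                    String status = normalizeStatus(rs.getString("STATUS"));

                    // Load: resolve the surrogate key from the dimension table,
                    // then insert the measure row into the fact table.
                    dimLookup.setString(1, rs.getString("PROJECT_NAME"));
                    try (ResultSet key = dimLookup.executeQuery()) {
                        if (key.next()) {
                            factInsert.setInt(1, key.getInt("PROJECT_KEY"));
                            factInsert.setString(2, status);
                            factInsert.setInt(3, rs.getInt("DEFECT_COUNT"));
                            factInsert.executeUpdate();
                        }
                    }
                }
            }
        }
    }

    // Example business rule: map source-specific status codes to a conformed set.
    private static String normalizeStatus(String raw) {
        switch (raw == null ? "" : raw.trim().toUpperCase()) {
            case "OPEN": case "NEW":     return "Open";
            case "FIXED": case "CLOSED": return "Closed";
            default:                     return "Other";
        }
    }
}

In Rational Insight these steps are modeled in the ETL catalog rather than hand-coded; the sketch only shows the shape of the work that the catalog automates.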
Sample catalog for the extract, transform, and load (ETL) process
To extract data from a data source, you need to know the structure of the data, create an extract, transform, and load (ETL) catalog that reproduces this structure in IBM® Cognos® Data Manager, and then set up the processes required to transform the data into the star schema or metadata format required by the data mart or IBM Cognos Framework Manager.
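If the source is reachable over JDBC, one quick way to confirm the structure that the catalog must mirror is the standard java.sql.DatabaseMetaData API, as in the sketch below. The connection URL and the table name SRC_DEFECTS are placeholders.

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

/** Sketch: list the columns of a source table through JDBC metadata
 *  before modeling the corresponding structure in the ETL catalog. */
public class DescribeSource {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:source-url")) {  // placeholder URL
            DatabaseMetaData meta = conn.getMetaData();
            // Null catalog and schema patterns match any; "SRC_DEFECTS" is a hypothetical table.
            try (ResultSet cols = meta.getColumns(null, null, "SRC_DEFECTS", "%")) {
                while (cols.next()) {
                    System.out.printf("%-30s %s(%d)%n",
                        cols.getString("COLUMN_NAME"),
                        cols.getString("TYPE_NAME"),
                        cols.getInt("COLUMN_SIZE"));
                }
            }
        }
    }
}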
To simplify this process, IBM Rational® Insight provides sample catalogs for extracting data from the data services, loading the data into the operational data store in the IBM Rational Insight data warehouse, and building the conformed data marts. This section provides an overview of the sample catalog.
Getting started with the extract, transform, and load (ETL) process
To get started with the extract, transform, and load (ETL) process of IBM Rational Insight, you need to prepare the data warehouse, the data source connections, and the ETL catalog, and then run the ETL jobs. You can also configure the ODBC and JDBC drivers, which provide data in relational form to the ETL framework.
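As a rough illustration of the JDBC side only, the sketch below loads a driver class and opens a connection from supplied properties, then runs a trivial connectivity check. The driver class name, URL, credentials, and query are placeholders for whatever your data source actually requires; they are not Rational Insight settings.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

/** Sketch: verify that a JDBC driver is configured and that the source
 *  can be read as relational rows. All values are placeholders. */
public class JdbcSourceCheck {
    public static void main(String[] args) throws Exception {
        // Modern drivers self-register, but loading the class explicitly
        // makes a missing driver JAR fail fast with a clear error.
        Class.forName("com.example.Driver");  // placeholder driver class

        Properties props = new Properties();
        props.setProperty("user", "etl_user");        // placeholder credentials
        props.setProperty("password", "change-me");

        try (Connection conn = DriverManager.getConnection("jdbc:example://host:1234/source", props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {  // trivial connectivity check
            if (rs.next()) {
                System.out.println("Driver configured; source reachable.");
            }
        }
    }
}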
Customizing ETL builds
When you create or customize an extract, transform, and load (ETL) catalog, you define the specifications for builds, reference structures, connections, and other IBM Cognos Data Manager components.