Building a Data Mart with Pentaho Data Integration – Packt Publishing


Building a Data Mart with Pentaho Data Integration

A step-by-step tutorial that takes you through the creation of an ETL process to populate a Kimball-style star schema

About This Video

Learn how to create ETL transformations to populate a star schema in a short span of time

Create a fully-functional ETL process using a practical approach

Follow the step-by-step instructions for creating an ETL based on a fictional company – get your hands dirty and learn fast

In Detail

Companies store a lot of data, but in most cases, it is not available in a format that makes it easily accessible for analysis and reporting tools. Ralph Kimball realized this a long time ago, so he paved the way for the star schema.

Building a Data Mart with Pentaho Data Integration walks you through the creation of an ETL process to build a data mart based on a fictional company. This course shows you, step by step, how to source the raw data and prepare it for the star schema. The practical approach of this course will get you up and running quickly, and explains the key concepts in an easy-to-understand manner.

Building a Data Mart with Pentaho Data Integration teaches you how to source raw data with Pentaho Kettle and transform it so that the output can be a Kimball-style star schema. After sourcing the raw data with our ETL process, you will quality-check the data using an agile approach. Next, you will learn how to load slowly changing dimensions and the fact table. The star schema will reside in a column-oriented database, so you will learn about bulk-loading the data whenever possible. You will also learn how to create an OLAP schema and easily analyze the output of your ETL process.

By covering all the essential topics in a hands-on approach, you will be in a position to create your own ETL processes within a short span of time.

Course Curriculum

Getting Started

  • The Second-hand Lens Store (6:49)
  • The Derived Star Schema (4:29)
  • Setting up Our Development Environment (7:07)

Agile BI – Creating ETLs to Prepare Joined Data Set

  • Importing Raw Data (3:22)
  • Exporting Data Using the Standard Table Output (4:33)
  • Exporting Data Using the Dedicated Bulk Loading (4:32)

Agile BI – Building OLAP Schema, Analyzing Data, and Implementing Required ETL Improvements

  • Creating a Pentaho Analysis Model (3:25)
  • Analyzing Data Using Pentaho Analyzer (3:49)
  • Improving Your ETL for Better Data Quality (4:15)

Slowly Changing Dimensions

  • Creating a Slowly Changing Dimension of Type 1 Using Insert/Update (6:47)
  • Creating a Slowly Changing Dimension of Type 1 Using Dimension Lookup/Update (4:58)
  • Creating a Slowly Changing Dimension Type 2 (5:18)
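In the course these steps are built with Kettle's graphical transformation steps; as a language-agnostic illustration of what a Type 2 slowly changing dimension does, here is a minimal Python sketch (all function and field names here are hypothetical, not taken from the course):

```python
from datetime import date

def scd2_upsert(dimension, natural_key, attrs, today):
    """Slowly Changing Dimension, Type 2: when a tracked attribute changes,
    expire the current version of the row and append a new version,
    so history is preserved instead of overwritten (as Type 1 would do)."""
    current = next((r for r in dimension
                    if r["key"] == natural_key and r["valid_to"] is None), None)
    if current is None:
        # first time we see this natural key: insert an open-ended version
        dimension.append({"key": natural_key, **attrs,
                          "valid_from": today, "valid_to": None})
    elif any(current[k] != v for k, v in attrs.items()):
        current["valid_to"] = today  # close the old version
        dimension.append({"key": natural_key, **attrs,
                          "valid_from": today, "valid_to": None})
    # unchanged rows are left untouched
    return dimension
```

Calling it twice with a changed attribute leaves two rows for the same natural key: the old version with a closed validity interval, and the new current one.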

Populating the Date Dimension

  • Defining Start and End Date Parameters (5:17)
  • Auto-generating Daily Rows for a Given Period (4:26)
  • Auto-generating Year, Month, and Day (6:27)

Creating the Fact Transformation

  • Sourcing Raw Data for Fact Table (3:52)
  • Lookup Slowly Changing Dimension of the Type 1 Key (4:28)
  • Lookup Slowly Changing Dimension of the Type 2 Key (6:08)

Orchestration

  • Loading Dimensions in Parallel (6:20)
  • Creating Master Jobs (4:09)

ID-based Change Data Capture

  • Implementing Change Data Capture (CDC) (4:58)
  • Creating a CDC Job Flow (4:48)
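ID-based change data capture rests on a simple idea: remember the highest ID loaded so far (the high-water mark) and, on the next run, pick up only rows above it. A minimal sketch of that logic in Python (the course implements it as a Kettle job flow; these names are hypothetical):

```python
def extract_new_rows(source_rows, last_max_id):
    """ID-based CDC: return only rows whose id exceeds the previous
    high-water mark, plus the new high-water mark to persist for the next run."""
    new_rows = [r for r in source_rows if r["id"] > last_max_id]
    new_max_id = max((r["id"] for r in new_rows), default=last_max_id)
    return new_rows, new_max_id
```

A second run with the updated high-water mark then returns nothing until new rows arrive, which is what makes incremental loads cheap.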

Final Touches: Logging and Scheduling

  • Setting up a Dedicated DB Schema (1:22)
  • Setting up Built-in Logging (4:22)
  • Scheduling on the Command Line (5:30)
