Building a Data Mart with Pentaho Data Integration – Packt Publishing


Original price: $85.00. Current price: $21.00.



Achieve more with the Building a Data Mart with Pentaho Data Integration – Packt Publishing course, priced at just $21.00 (originally $85.00) on GBESY.biz! Explore our extensive collection of over 60,000 downloadable courses in SEO and Social. We offer professional, self-paced digital education at up to 80% off original rates. Start transforming your expertise now!

Sales page archive: http://archive.is/wip/FzMso


Building a Data Mart with Pentaho Data Integration

A step-by-step tutorial that takes you through the creation of an ETL process to populate a Kimball-style star schema


About This Video

  • Learn how to create ETL transformations to populate a star schema in a short span of time
  • Create a fully-functional ETL process using a practical approach
  • Follow the step-by-step instructions for creating an ETL based on a fictional company – get your hands dirty and learn fast

In Detail

Companies store a lot of data, but in most cases, it is not available in a format that makes it easily accessible for analysis and reporting tools. Ralph Kimball realized this a long time ago, so he paved the way for the star schema.

Building a Data Mart with Pentaho Data Integration walks you through the creation of an ETL process to build a data mart for a fictional company. This course shows you, step by step, how to source the raw data and prepare it for the star schema. The practical approach of this course will get you up and running quickly, and the key concepts are explained in an easy-to-understand manner.

Building a Data Mart with Pentaho Data Integration teaches you how to source raw data with Pentaho Kettle and transform it so that the output can be a Kimball-style star schema. After sourcing the raw data with our ETL process, you will quality-check the data using an agile approach. Next, you will learn how to load slowly changing dimensions and the fact table. The star schema will reside in a column-oriented database, so you will learn about bulk-loading the data whenever possible. You will also learn how to create an OLAP schema and analyze the output of your ETL process easily.
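To make the end goal concrete, here is a minimal sketch of the shape a Kimball-style star schema takes: a central fact table that references the surrogate keys of its dimension tables. The table and column names below are illustrative only, loosely inspired by the course's fictional store, and are not taken from the course material:

```python
import sqlite3

# A minimal Kimball-style star schema: one central fact table whose
# foreign keys point at the surrogate keys of the dimension tables.
# All names here are illustrative, not from the course material.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_tk INTEGER PRIMARY KEY,   -- surrogate (technical) key
    customer_id INTEGER,               -- business key from the source system
    name        TEXT,
    city        TEXT
);
CREATE TABLE dim_date (
    date_tk  INTEGER PRIMARY KEY,      -- smart key, e.g. 20240131
    the_date TEXT,
    year     INTEGER,
    month    INTEGER,
    day      INTEGER
);
CREATE TABLE fact_sales (
    date_tk     INTEGER REFERENCES dim_date(date_tk),
    customer_tk INTEGER REFERENCES dim_customer(customer_tk),
    quantity    INTEGER,
    amount      REAL
);
""")
```

Queries then join the fact table to each dimension on the surrogate keys — exactly the structure the course's ETL process populates.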

By covering all the essential topics in a hands-on approach, you will be in a position to create your own ETL processes within a short span of time.

Course Curriculum

Getting Started

  • The Second-hand Lens Store (6:49)
  • The Derived Star Schema (4:29)
  • Setting up Our Development Environment (7:07)

Agile BI – Creating ETLs to Prepare Joined Data Set

  • Importing Raw Data (3:22)
  • Exporting Data Using the Standard Table Output (4:33)
  • Exporting Data Using the Dedicated Bulk Loading (4:32)

Agile BI – Building OLAP Schema, Analyzing Data, and Implementing Required ETL Improvements

  • Creating a Pentaho Analysis Model (3:25)
  • Analyzing Data Using Pentaho Analyzer (3:49)
  • Improving Your ETL for Better Data Quality (4:15)

Slowly Changing Dimensions

  • Creating a Slowly Changing Dimension of Type 1 Using Insert/Update (6:47)
  • Creating a Slowly Changing Dimension of Type 1 Using Dimension Lookup Update (4:58)
  • Creating a Slowly Changing Dimension Type 2 (5:18)
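To preview the difference between the two dimension types covered above: Type 1 overwrites the row in place and loses history, while Type 2 closes the current version and inserts a new one with a validity range. A small Python sketch of both patterns — not course code; the `customer_id` and `city` fields are made-up examples:

```python
from datetime import date

def scd_type1_upsert(dim, row):
    """Type 1: overwrite the row in place -- history is lost."""
    dim[row["customer_id"]] = {**row}

def scd_type2_upsert(dim, row, today=None):
    """Type 2: close the current version and append a new one,
    keeping full history via valid_from/valid_to ranges."""
    today = today or date.today()
    versions = dim.setdefault(row["customer_id"], [])
    current = next((v for v in versions if v["valid_to"] is None), None)
    if current and current["city"] != row["city"]:
        current["valid_to"] = today          # close the old version
        current = None
    if current is None:
        versions.append({**row, "valid_from": today, "valid_to": None})
```

Note that an unchanged row leaves a Type 2 dimension untouched; only a real attribute change creates a new version.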

Populating the Date Dimension

  • Defining Start and End Date Parameters (5:17)
  • Auto-generating Daily Rows for a Given Period (4:26)
  • Auto-generating Year, Month, and Day (6:27)
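The three steps above — start/end date parameters, one generated row per day, and splitting out year, month, and day — can be sketched in plain Python like this. It only illustrates the idea, not the actual Kettle transformation:

```python
from datetime import date, timedelta

def date_dimension(start, end):
    """Yield one row per calendar day between start and end (inclusive),
    pre-splitting year/month/day the way a date dimension usually does."""
    d = start
    while d <= end:
        yield {
            "date_tk": d.year * 10000 + d.month * 100 + d.day,  # smart key, e.g. 20240131
            "the_date": d.isoformat(),
            "year": d.year,
            "month": d.month,
            "day": d.day,
        }
        d += timedelta(days=1)

rows = list(date_dimension(date(2024, 1, 1), date(2024, 1, 31)))
```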

Creating the Fact Transformation

  • Sourcing Raw Data for Fact Table (3:52)
  • Lookup Slowly Changing Dimension of the Type 1 Key (4:28)
  • Lookup Slowly Changing Dimension of the Type 2 Key (6:08)
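The Type 2 key lookup is the subtle one: the fact row's transaction date must fall inside the validity range of a dimension version. A hedged Python sketch of that lookup — the field names `tk`, `valid_from`, and `valid_to` are assumptions, not course code:

```python
from datetime import date

def lookup_type2_key(versions, business_key, txn_date):
    """Return the surrogate key of the dimension version that was
    valid on txn_date (Type 2 lookup); None if nothing matches."""
    for v in versions.get(business_key, []):
        if v["valid_from"] <= txn_date and (v["valid_to"] is None or txn_date < v["valid_to"]):
            return v["tk"]
    return None

# Customer 42 changed on 2024-01-01, so it has two versions.
versions = {
    42: [
        {"tk": 1, "valid_from": date(2023, 1, 1), "valid_to": date(2024, 1, 1)},
        {"tk": 2, "valid_from": date(2024, 1, 1), "valid_to": None},
    ]
}
```

A Type 1 lookup is simpler: since there is only one row per business key, no date comparison is needed.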

Orchestration

  • Loading Dimensions in Parallel (6:20)
  • Creating Master Jobs (4:09)

ID-based Change Data Capture

  • Implementing Change Data Capture (CDC) (4:58)
  • Creating a CDC Job Flow (4:48)
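ID-based change data capture boils down to a high-water mark: remember the largest id loaded so far and fetch only rows above it on the next run. A minimal illustration of the idea — not the course's actual Kettle implementation:

```python
def extract_new_rows(source_rows, last_max_id):
    """ID-based CDC: pick up only rows whose id exceeds the highest
    id loaded so far, then return the new high-water mark."""
    new_rows = [r for r in source_rows if r["id"] > last_max_id]
    new_max_id = max((r["id"] for r in new_rows), default=last_max_id)
    return new_rows, new_max_id

source = [{"id": i, "amount": i * 10} for i in range(1, 6)]
batch1, hwm = extract_new_rows(source, 0)    # initial full load
source.append({"id": 6, "amount": 60})       # a new row arrives
batch2, hwm = extract_new_rows(source, hwm)  # only the new row is extracted
```

In a real CDC job flow the high-water mark would be persisted (e.g. in a control table) between job runs.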

Final Touches: Logging and Scheduling

  • Setting up a Dedicated DB Schema (1:22)
  • Setting up Built-in Logging (4:22)
  • Scheduling on the Command Line (5:30)


Invest in endless knowledge with the Building a Data Mart with Pentaho Data Integration – Packt Publishing course at GBESY.biz! Gain lifetime access to premium digital content designed to fuel your professional and personal growth.

  • Lifetime Access: Unrestricted, permanent access to your purchased courses.
  • Unbeatable Value: Save significantly with prices up to 80% less than direct purchases.
  • Protected Payments: Complete your transactions securely.
  • Empowering Skills: Learn practical, in-demand skills for immediate application.
  • Immediate Download: Access your course content instantly after purchase.
  • Any Device, Anywhere: Study on your preferred device with full flexibility.

Discover your next opportunity with GBESY.biz!
