Introduction to Industry 4.0 is a program designed to introduce participants to the key design principles and components of the new industrial revolution, its key challenges, and the potential opportunities it creates across the entire value chain. In the current industrial environment, delivering high-quality products or services at the lowest cost is the key to success, and factories strive for the highest possible performance to increase both their profit and their reputation. Many data sources are available to provide worthwhile information about different aspects of the factory, and using that data to understand current operating conditions and to detect faults and failures is an important research topic. In an Industry 4.0 factory, however, components and systems go beyond condition monitoring and fault diagnosis: they become self-aware and self-predictive, giving management deeper insight into the status of the factory.


By the end of the course, participants will be able to understand how system integration works across the various aspects of Industry 4.0, including Supply Chain, Cloud Computing, Cyber Security, Additive Manufacturing, Autonomous Robots, Augmented Reality, Industrial Internet of Things, Big Data Analytics, and Horizontal & Vertical Integration. Participants will also gain a basic understanding of each pillar, including its working principles, benefits, pros and cons, and key challenges.


  • National Strategic Road Map on IoT
  • Defining Industry 4.0
  • The Enablers of the New Revolution
  • Industry 4.0 Framework and Blueprint
  • Introduction to 9 Pillars of Industry 4.0
  • Pillar 1 – 4 (Supply Chain, IIoT, Cloud, Big Data Analytics)
  • Pillar 5 – 9 (Cyber Security, Augmented Reality, Additive Manufacturing, Horizontal & Vertical Integration, Autonomous Robots)
  • Key Benefits to the Industry
  • Sharing of Best Practices from Industry 4.0 Pioneers
  • Convergence of Operational Technology and Information Technology



Across all lines of business, sharp and timely data insights are required to keep an organization competitive in this digital era. Big Data is a change agent that challenges the ways in which organizational leaders have traditionally made decisions. Used effectively, it provides accurate business models and forecasts to support better decision-making across all facets of an organization.

This course provides participants with the data literacy they need to remain efficient, effective, and ahead of the curve. Participants will learn why, where, and how to deploy technologies and methodologies ranging from Big Data and Hadoop to data analytics and data science.


By the end of the course, participants will be able to:

  • Illustrate the benefits, functionality, and ecosystem of Big Data
  • Launch a Big Data initiative within their organization and generate organizational value by adopting data analytics
  • Leverage free, open-source applications and open data to deliver insights that generate an organizational competitive advantage.


Module 1: The Big Data Landscape Overview

  • What is Big Data?
  • Big Data vs its predecessors
  • How Big Data relates to data analytics and data science
  • The Big Data paradigm
  • Big Data professional roles
  • How Big Data projects benefit businesses and industries
  • The Hadoop ecosystem and architecture
  • Other technologies in the Big Data paradigm
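The Hadoop ecosystem topics above center on the MapReduce paradigm. The classic word-count example, sketched here in plain single-process Python (not actual Hadoop code, which would run the same two phases distributed across a cluster), shows the map and reduce steps the course builds on.

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: emit (word, 1) pairs, as a Hadoop mapper would."""
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce step: sum the counts grouped by key."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big insights", "data beats opinion"]
print(reduce_phase(map_phase(docs)))
# → {'big': 2, 'data': 2, 'insights': 1, 'beats': 1, 'opinion': 1}
```

In Hadoop proper, the shuffle between the two phases routes all pairs with the same key to the same reducer; the `defaultdict` grouping here plays that role in miniature.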

Module 2: Big Data Project Planning

  • Beyond the Hadoop ecosystem
  • Other popular projects by MapR
  • Commercial distributions of Hadoop
  • Security within Hadoop
  • Data engineering
  • Useful programming languages
  • The 4-step Big Data planning process
  • Staying competitive as a Big Data professional

Module 3: Free Resources to Analyse Data and Communicate Findings

  • Free applications for data science and analytics
  • Context and benchmarking using free and open data
  • Scraping the web for market data
  • The different types of data visualization
  • Three simple steps to design for your audience
  • Data graphics
  • Design styles to convey powerful messages
  • Design data analytics dashboards
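Topics such as "scraping the web for market data" can be illustrated with the standard library alone. The sketch below parses a hypothetical snippet of price-list HTML with Python's built-in `html.parser`; the markup, class names, and prices are invented for illustration, and a real scraper would first fetch the page over HTTP.

```python
from html.parser import HTMLParser

class PriceScraper(HTMLParser):
    """Collects the text of elements tagged class="price" (hypothetical markup)."""
    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if ("class", "price") in attrs:
            self._in_price = True

    def handle_endtag(self, tag):
        self._in_price = False

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())

# In practice this HTML would come from an HTTP request; here it is inlined.
html = '<ul><li class="price">RM 4.20</li><li class="price">RM 3.90</li></ul>'
scraper = PriceScraper()
scraper.feed(html)
print(scraper.prices)   # → ['RM 4.20', 'RM 3.90']
```

Once scraped, data like this feeds directly into the benchmarking and visualization topics listed above.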



This course describes how to implement a data warehouse platform to support a BI solution. Participants will learn how to create a data warehouse with Microsoft SQL Server 2014, implement ETL with SQL Server Integration Services, and validate and cleanse data with SQL Server Data Quality Services and SQL Server Master Data Services. IMPORTANT NOTE: This course is designed for customers who are interested in learning SQL Server 2012 OR SQL Server 2014; it covers the new features of SQL Server 2014 as well as the important capabilities found across the SQL Server data platform.


By the end of the course, participants will be able to:

  • Describe data warehouse concepts and architecture considerations
  • Select an appropriate hardware platform for a data warehouse
  • Design and implement a data warehouse
  • Implement Data Flow in an SSIS Package
  • Implement Control Flow in an SSIS Package
  • Debug and Troubleshoot SSIS packages
  • Implement an ETL solution that supports incremental data extraction
  • Implement an ETL solution that supports incremental data loading
  • Implement data cleansing by using Microsoft Data Quality Services
  • Implement Master Data Services to enforce data integrity
  • Extend SSIS with custom scripts and components
  • Deploy and Configure SSIS packages
  • Describe how BI solutions can consume data from the data warehouse


Module 1: Introduction to Data Warehousing

  • Overview of Data Warehousing
  • Considerations for a Data Warehouse Solution
  • Lab: Exploring a Data Warehousing Solution
  • Exploring Data Sources
  • Exploring an ETL Process
  • Exploring a Data Warehouse

Module 2: Planning Data Warehouse Infrastructure

  • Considerations for Data Warehouse Infrastructure
  • Planning Data Warehouse Hardware
  • Lab: Planning Data Warehouse Infrastructure
  • Planning Data Warehouse Hardware

Module 3: Designing and Implementing a Data Warehouse

  • Data Warehouse Design Overview
  • Designing Dimension Tables
  • Designing Fact Tables
  • Physical Design for a Data Warehouse
  • Lab: Implementing a Data Warehouse
  • Implement a Star Schema
  • Implement a Snowflake Schema
  • Implement a Time Dimension

Module 4: Creating an ETL Solution with SSIS

  • Introduction to ETL with SSIS
  • Exploring Data Sources
  • Implementing Data Flow
  • Lab: Implementing Data Flow in an SSIS Package
  • Exploring Data Sources
  • Transferring Data by Using a Data Flow Task
  • Using Transformations in a Data Flow

Module 5: Implementing Control Flow in an SSIS Package

  • Introduction to Control Flow
  • Creating Dynamic Packages
  • Using Containers
  • Managing Consistency
  • Lab: Implementing Control Flow in an SSIS Package
  • Using Tasks and Precedence in a Control Flow
  • Using Variables and Parameters
  • Using Containers
  • Lab: Using Transactions and Checkpoints
  • Using Transactions
  • Using Checkpoints

Module 6: Debugging and Troubleshooting SSIS Packages

  • Debugging an SSIS Package
  • Logging SSIS Package Events
  • Handling Errors in an SSIS Package
  • Lab: Debugging and Troubleshooting an SSIS Package
  • Debugging an SSIS Package
  • Logging SSIS Package Execution
  • Implementing an Event Handler
  • Handling Errors in a Data Flow

Module 7: Implementing a Data Extraction Solution

  • Planning Data Extraction
  • Extracting Modified Data
  • Lab: Extracting Modified Data
  • Using a Datetime Column
  • Using Change Data Capture
  • Using the CDC Control Task
  • Using Change Tracking
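The datetime-column technique listed above can be sketched outside SSIS: keep a watermark of the last extraction time and pull only rows modified since. The example below uses Python's `sqlite3` with invented table and column names; in the course labs, the watermark would live in a control table, and Change Data Capture or Change Tracking would replace the manual comparison where available.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE SourceOrders (OrderID INTEGER, ModifiedDate TEXT)")
cur.executemany("INSERT INTO SourceOrders VALUES (?, ?)", [
    (1, "2024-01-01 09:00"),
    (2, "2024-01-02 10:00"),
    (3, "2024-01-03 11:00"),
])

# Watermark recorded by the previous ETL run (normally persisted in a control table).
last_extracted = "2024-01-01 09:00"

# Incremental extraction: fetch only rows modified after the watermark,
# then advance the watermark for the next run.
cur.execute(
    "SELECT OrderID, ModifiedDate FROM SourceOrders WHERE ModifiedDate > ?",
    (last_extracted,),
)
changed = cur.fetchall()
last_extracted = max(d for _, d in changed) if changed else last_extracted

print(changed)          # → [(2, '2024-01-02 10:00'), (3, '2024-01-03 11:00')]
print(last_extracted)   # → 2024-01-03 11:00
```

The trade-off the module explores: a datetime column is simple but misses deletes and depends on the source reliably stamping every change, which is what CDC and Change Tracking address.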

Module 8: Loading Data into a Data Warehouse

  • Planning Data Loads
  • Using SSIS for Incremental Loads
  • Using Transact-SQL Loading Techniques
  • Lab: Loading a Data Warehouse
  • Loading Data from CDC Output Tables
  • Using a Lookup Transformation to Insert or Update Dimension Data
  • Implementing a Slowly Changing Dimension
  • Using the MERGE Statement
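The insert-or-update pattern behind the Lookup transformation and the MERGE statement can be sketched with SQLite's `INSERT ... ON CONFLICT` upsert, standing in here for T-SQL MERGE, which SQLite lacks. This corresponds to a Type 1 slowly changing dimension: existing rows are overwritten in place. Names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE DimCustomer (
    CustomerID INTEGER PRIMARY KEY,
    City       TEXT
)""")
cur.execute("INSERT INTO DimCustomer VALUES (1, 'Penang')")

# Staged rows from the latest extract: one changed customer, one brand-new one.
staged = [(1, "Kuala Lumpur"), (2, "Ipoh")]

# Type 1 upsert: update in place when the key exists, insert otherwise.
cur.executemany("""
INSERT INTO DimCustomer (CustomerID, City) VALUES (?, ?)
ON CONFLICT(CustomerID) DO UPDATE SET City = excluded.City
""", staged)

cur.execute("SELECT CustomerID, City FROM DimCustomer ORDER BY CustomerID")
print(cur.fetchall())   # → [(1, 'Kuala Lumpur'), (2, 'Ipoh')]
```

A Type 2 slowly changing dimension, also covered in the lab, would instead expire the old row (e.g. with validity dates) and insert a new one, preserving history.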

Module 9: Enforcing Data Quality

  • Introduction to Data Quality
  • Using Data Quality Services to Cleanse Data
  • Lab: Cleansing Data
  • Creating a DQS Knowledge Base
  • Using a DQS Project to Cleanse Data
  • Using DQS in an SSIS Package

Module 10: Master Data Services

  • Introduction to Master Data Services
  • Implementing a Master Data Services Model
  • Managing Master Data
  • Creating a Master Data Hub
  • Lab: Implementing Master Data Services
  • Creating a Master Data Services Model
  • Using the Master Data Services Add-in for Excel
  • Enforcing Business Rules
  • Loading Data into a Model
  • Consuming Master Data Services Data

Module 11: Extending SQL Server Integration Services

  • Using Scripts in SSIS
  • Using Custom Components in SSIS
  • Lab: Using Custom Scripts
  • Using a Script Task

Module 12: Deploying and Configuring SSIS Packages

  • Overview of SSIS Deployment
  • Deploying SSIS Projects
  • Planning SSIS Package Execution
  • Lab: Deploying and Configuring SSIS Packages
  • Creating an SSIS Catalogue
  • Deploying an SSIS Project
  • Running an SSIS Package in SQL Server Management Studio
  • Scheduling SSIS Packages with SQL Server Agent

Module 13: Consuming Data in a Data Warehouse

  • Introduction to Business Intelligence
  • Enterprise Business Intelligence
  • Self-Service BI and Big Data
  • Lab: Using a Data Warehouse
  • Exploring an Enterprise BI Solution
  • Exploring a Self-Service BI Solution



This course is intended for database professionals who need to create and support a data warehousing solution. Primary responsibilities include: Implementing a data warehouse, developing SSIS packages for data extraction, transformation, and loading, enforcing data integrity by using Master Data Services, and cleansing data by using Data Quality Services.



  • 20461 Querying Microsoft SQL Server®
  • 20462 Administering Microsoft® SQL Server® Databases



The 5-day training is instructor-led, with access to remote training labs. Labs within a course can be accessed on the Microsoft Labs Online (MLO) platform via Microsoft Official Courses (MOC) On-Demand.

MOC On-Demand is an integrated combination of video, text, practical tasks, and knowledge tests designed to help IT professionals and developers expand their knowledge of Microsoft technologies. It can also be used as part of a blended class alongside instructor-led training courses, or as the basis for training solutions with mentoring and other learning programmes.

Note: Broadband Internet connection (Recommended: Network bandwidth of over 4 Mbps)



8 days (mandatory to attend all training days)

Introduction to Industry 4.0: 2 Days

Big Data Analytics Overview: 1 Day

Implementing a Data Warehouse with Microsoft SQL Server 2012/2014: 5 Days



19, 20, 25, 26 May
9, 10, 16, 17 June



Funded by HRDF under the National Empowerment In Certification And Training For Next Generation Workers (NECT-Gen – Industry 4.0) Program



  • Participants must be Malaysian
  • Participants must be an employee of an HRDF levy-contributing company
  • Applicants must fulfill the track prerequisites
  • Participants must attend all training sessions and complete all relevant attendance and evaluation forms
  • Participants are not allowed to withdraw from the Program once their registration is approved
  • Participants must sit for all relevant examinations after attending the training
  • Any revision or re-examination attempt by participants will be at their own cost
  • Participants must complete the Tracer Study forms (after 6 months) and submit a copy of their Certificate to PSDC


For further information, please contact Marie Ngan (ext 577/ or Yuki Lee (ext 517/