Data Engineering on Google Cloud Platform
Duration: 4 Days (32 Hours)
Data Engineering on Google Cloud Platform Course Overview:
The Data Engineering on Google Cloud Platform class offers a hands-on introduction to designing and constructing data processing systems on the Google Cloud Platform. Through a combination of presentations, live demonstrations, and practical labs, participants will acquire the skills needed to design efficient data processing systems, build end-to-end data pipelines, perform data analysis, and implement machine learning solutions. The course covers a wide range of data types, including structured, unstructured, and streaming data. By the end of the course, participants will have a solid understanding of how to work with diverse data sets and create robust data processing systems using the Google Cloud Platform.
This class is intended for developers who are responsible for:
- Extracting, loading, transforming, cleaning, and validating data
- Designing pipelines and architectures for data processing
- Integrating analytics and machine learning capabilities into data pipelines
- Querying datasets, visualizing query results, and creating reports
Participants will learn to:
- Design and build data processing systems on Google Cloud
- Process batch and streaming data by implementing autoscaling data pipelines on Dataflow
- Derive business insights from extremely large datasets using BigQuery
- Leverage unstructured data using Spark and ML APIs on Dataproc
- Enable instant insights from streaming data
- Use ML APIs, BigQuery ML, and Cloud AutoML to create powerful models without coding
Module 1: Introduction to Data Engineering
- Explore the role of a data engineer.
- Analyze data engineering challenges.
- Intro to BigQuery.
- Data Lakes and Data Warehouses.
- Demo: Federated Queries with BigQuery.
- Transactional Databases vs Data Warehouses.
- Website Demo: Finding PII in your dataset with DLP API.
- Partner effectively with other data teams.
- Manage data access and governance.
- Build production-ready pipelines.
- Review GCP customer case study.
- Lab: Analyzing Data with BigQuery.
Module 2: Building a Data Lake
- Introduction to Data Lakes.
- Data Storage and ETL options on GCP.
- Building a Data Lake using Cloud Storage.
- Optional Demo: Optimizing cost with Google Cloud Storage classes and Cloud Functions.
- Securing Cloud Storage.
- Storing All Sorts of Data Types.
- Video Demo: Running federated queries on Parquet and ORC files in BigQuery.
- Cloud SQL as a relational Data Lake.
- Lab: Loading Taxi Data into Cloud SQL.
Module 3: Building a Data Warehouse
- The modern data warehouse.
- Intro to BigQuery.
- Demo: Query TB+ of data in seconds.
- Getting Started.
- Loading Data.
- Video Demo: Querying Cloud SQL from BigQuery.
- Lab: Loading Data into BigQuery.
- Exploring Schemas.
- Demo: Exploring BigQuery Public Datasets with SQL using INFORMATION_SCHEMA.
- Schema Design.
- Nested and Repeated Fields.
- Demo: Nested and repeated fields in BigQuery.
- Lab: Working with JSON and Array data in BigQuery.
- Optimizing with Partitioning and Clustering.
- Demo: Partitioned and Clustered Tables in BigQuery.
- Preview: Transforming Batch and Streaming Data.
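The nested and repeated fields covered in this module can be pictured with plain Python data structures. A minimal sketch of how BigQuery's `UNNEST` flattens a repeated field into one row per array element (the field and table values here are illustrative, not taken from the course labs):

```python
# Simulate a BigQuery row with a repeated, nested field:
# "orders" plays the role of an ARRAY<STRUCT<item STRING, qty INT64>>.
customer_row = {
    "customer": "alice",
    "orders": [
        {"item": "book", "qty": 2},
        {"item": "pen", "qty": 5},
    ],
}

# UNNEST(orders) produces one output row per array element,
# joined back to the parent row's scalar columns.
def unnest(row, repeated_field):
    for element in row[repeated_field]:
        flat = {k: v for k, v in row.items() if k != repeated_field}
        flat.update(element)
        yield flat

for flat_row in unnest(customer_row, "orders"):
    print(flat_row)
# {'customer': 'alice', 'item': 'book', 'qty': 2}
# {'customer': 'alice', 'item': 'pen', 'qty': 5}
```

Storing the array inline is what lets BigQuery avoid a join: the child records live inside the parent row until a query unnests them.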
Module 4: Introduction to Building Batch Data Pipelines
- EL, ELT, ETL.
- Quality considerations.
- How to carry out operations in BigQuery.
- Demo: ELT to improve data quality in BigQuery.
- ETL to solve data quality issues.
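The ETL pattern this module contrasts with EL and ELT can be sketched end to end with the standard library. Here `sqlite3` stands in for the warehouse (in the course the target would be BigQuery), and the sample records are invented to show typical quality fixes:

```python
import sqlite3

# Toy ETL run: extract raw records, transform to fix quality
# issues (whitespace, bad types, empty fields), load into SQL.
raw = ["  Alice ,30", "Bob,abc", "Carol,25", ","]  # messy source data

def transform(lines):
    for line in lines:
        name, _, age = line.partition(",")
        name = name.strip()
        if name and age.strip().isdigit():   # validate before loading
            yield name, int(age)             # clean and cast

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE people (name TEXT, age INTEGER)")
db.executemany("INSERT INTO people VALUES (?, ?)", transform(raw))
print(db.execute("SELECT name, age FROM people ORDER BY name").fetchall())
# [('Alice', 30), ('Carol', 25)]
```

The invalid rows never reach the table; in ELT, by contrast, everything would be loaded first and the same cleanup would run as SQL inside the warehouse.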
Module 5: Executing Spark on Cloud Dataproc
- The Hadoop ecosystem.
- Running Hadoop on Cloud Dataproc.
- GCS instead of HDFS.
- Optimizing Dataproc.
- Lab: Running Apache Spark jobs on Cloud Dataproc.
Module 6: Serverless Data Processing with Cloud Dataflow
- Cloud Dataflow.
- Why customers value Dataflow.
- Dataflow Pipelines.
- Lab: A Simple Dataflow Pipeline (Python/Java).
- Lab: MapReduce in Dataflow (Python/Java).
- Lab: Side Inputs (Python/Java).
- Dataflow Templates.
- Dataflow SQL.
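The MapReduce lab above follows a pattern that can be sketched with the standard library alone: map each element to key/value pairs, group by key, then reduce each group. In Beam (the SDK behind Dataflow) the same three steps are `ParDo`, `GroupByKey`, and a combine transform; this word count only illustrates the shape of the computation:

```python
from collections import defaultdict

# Word count as map -> group-by-key -> reduce.
lines = ["the quick brown fox", "the lazy dog", "the fox"]

# Map: emit a (word, 1) pair per word.
pairs = [(word, 1) for line in lines for word in line.split()]

# Group by key: collect all counts for each word.
groups = defaultdict(list)
for word, count in pairs:
    groups[word].append(count)

# Reduce: sum each group.
word_counts = {word: sum(counts) for word, counts in groups.items()}
print(word_counts["the"], word_counts["fox"])  # 3 2
```

Dataflow runs the same logic, but distributes the map and reduce steps across workers and autoscales them.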
Module 7: Manage Data Pipelines with Cloud Data Fusion and Cloud Composer
- Building Batch Data Pipelines visually with Cloud Data Fusion.
- UI Overview.
- Building a Pipeline.
- Exploring Data using Wrangler.
- Lab: Building and executing a pipeline graph in Cloud Data Fusion.
- Orchestrating work between GCP services with Cloud Composer.
- Apache Airflow Environment.
- DAGs and Operators.
- Workflow Scheduling.
- Optional Long Demo: Event-triggered Loading of data with Cloud Composer, Cloud Functions, Cloud Storage, and BigQuery.
- Monitoring and Logging.
- Lab: An Introduction to Cloud Composer.
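The DAGs that Cloud Composer (managed Apache Airflow) schedules are just dependency graphs: a task runs only after all of its upstream tasks finish. The ordering logic can be sketched with the standard library's `graphlib`; the task names below are illustrative, not from the course labs:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on,
# like the dependencies declared between Airflow operators.
dag = {
    "transform": {"extract_gcs"},
    "load_to_bq": {"extract_gcs", "transform"},
    "notify": {"load_to_bq"},
}

# A topological order: every task appears after its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Airflow adds scheduling, retries, and operators on top, but the execution order it computes for a DAG respects exactly this constraint.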
Module 8: Introduction to Processing Streaming Data
- Processing Streaming Data.
Module 9: Serverless Messaging with Cloud Pub/Sub
- Introduction to Pub/Sub.
- Lab: Publish Streaming Data into Pub/Sub.
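The core model behind Pub/Sub is a topic that fans each published message out to every subscription. A minimal in-process sketch of that model (without Pub/Sub's delivery guarantees, acknowledgements, or persistence; the topic and message fields are invented):

```python
# A topic delivers every published message to each subscriber.
class Topic:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, message):
        for callback in self.subscribers:
            callback(message)

received_a, received_b = [], []
taxi_rides = Topic()                     # topic name is illustrative
taxi_rides.subscribe(received_a.append)  # e.g. a Dataflow pipeline
taxi_rides.subscribe(received_b.append)  # e.g. an archival sink

taxi_rides.publish({"ride_id": 1, "fare": 12.5})
print(received_a == received_b == [{"ride_id": 1, "fare": 12.5}])  # True
```

Decoupling publisher from subscribers this way is what lets new consumers attach to a stream without touching the producer.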
Module 10: Cloud Dataflow Streaming Features
- Cloud Dataflow Streaming Features.
- Lab: Streaming Data Pipelines.
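One of the streaming features this module covers is windowing. Fixed (tumbling) windows bucket events by event time into non-overlapping intervals and aggregate each bucket; a stdlib sketch with invented timestamps, using 60-second windows:

```python
from collections import defaultdict

# Events as (event_time_seconds, count) pairs.
events = [(5, 1), (42, 1), (61, 1), (125, 1), (130, 1)]

WINDOW = 60  # fixed window size in seconds
windows = defaultdict(int)
for ts, count in events:
    window_start = ts // WINDOW * WINDOW  # bucket by event time
    windows[window_start] += count

print(sorted(windows.items()))
# [(0, 2), (60, 1), (120, 2)]
```

Dataflow layers triggers and watermarks on top of this idea to decide when a window's result can be emitted despite late-arriving data.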
Module 11: High-Throughput BigQuery and Bigtable Streaming Features
- BigQuery Streaming Features.
- Lab: Streaming Analytics and Dashboards.
- Cloud Bigtable.
- Lab: Streaming Data Pipelines into Bigtable.
Module 12: Advanced BigQuery Functionality and Performance
- Analytic Window Functions.
- Using WITH Clauses.
- GIS Functions.
- Demo: Mapping Fastest Growing Zip Codes with BigQuery GeoViz.
- Performance Considerations.
- Lab: Optimizing your BigQuery Queries for Performance.
- Optional Lab: Creating Date-Partitioned Tables in BigQuery.
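Analytic window functions compute a value over a partition of rows without collapsing them. What a running sum like `SUM(fare) OVER (PARTITION BY driver ORDER BY ts)` produces can be sketched in plain Python (column names and values are illustrative):

```python
from itertools import groupby

# (driver, ts, fare) rows, already sorted by (driver, ts),
# as the ORDER BY inside the window would require.
rows = [("a", 1, 10.0), ("a", 2, 5.0), ("b", 1, 7.0), ("b", 2, 3.0)]

result = []
for driver, group in groupby(rows, key=lambda r: r[0]):
    running = 0.0                      # the sum resets per partition
    for _, ts, fare in group:
        running += fare
        result.append((driver, ts, running))

print(result)
# [('a', 1, 10.0), ('a', 2, 15.0), ('b', 1, 7.0), ('b', 2, 10.0)]
```

Unlike a `GROUP BY`, every input row survives with its cumulative value attached, which is what makes window functions useful for rankings and running totals.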
Module 13: Introduction to Analytics and AI
- What is AI?
- From Ad-hoc Data Analysis to Data Driven Decisions.
- Options for ML models on GCP.
Module 14: Prebuilt ML model APIs for Unstructured Data
- Unstructured Data is Hard.
- ML APIs for Enriching Data.
- Lab: Using the Natural Language API to Classify Unstructured Text.
Module 15: Big Data Analytics with Cloud AI Platform Notebooks
- What’s a Notebook.
- BigQuery Magic and Ties to Pandas.
- Lab: BigQuery in Jupyter Labs on AI Platform.
Module 16: Production ML Pipelines with Kubeflow
- Ways to do ML on GCP.
- AI Hub.
- Lab: Running AI models on Kubeflow.
Module 17: Custom Model Building with SQL in BigQuery ML
- BigQuery ML for Quick Model Building.
- Demo: Train a model with BigQuery ML to predict NYC taxi fares.
- Supported Models.
- Lab Option 1: Predict Bike Trip Duration with a Regression Model in BQML.
- Lab Option 2: Movie Recommendations in BigQuery ML.
Module 18: Custom Model Building with Cloud AutoML
- Why AutoML?
- AutoML Vision.
- AutoML NLP.
- AutoML Tables.
To get the most out of this Data Engineering on Google Cloud Platform course, participants should have:
- Completed the Google Cloud Fundamentals: Big Data & Machine Learning course OR have equivalent experience
- Basic proficiency with a common query language such as SQL
- Experience with data modeling and extract, transform, load (ETL) activities
- Experience developing applications using a common programming language such as Python
- Familiarity with Machine Learning and/or statistics
Q: What is the “Data Engineering on Google Cloud Platform” course?
A: The “Data Engineering on Google Cloud Platform” course is a comprehensive training program that focuses on teaching participants how to design, build, and maintain scalable data processing systems on Google Cloud Platform (GCP). The course covers various aspects of data engineering, including data ingestion, storage, processing, and analysis using GCP tools and services.
Q: Who is this course suitable for?
A: This course is suitable for data engineers, data analysts, software engineers, and IT professionals who want to gain expertise in data engineering on Google Cloud Platform. It is designed for individuals who have a basic understanding of data concepts and programming languages and want to learn how to leverage GCP tools for data processing and analysis.
Q: What topics are covered in this course?
A: The “Data Engineering on Google Cloud Platform” course covers a wide range of topics, including designing data processing architectures, ingesting data into GCP, storing and analyzing data using BigQuery, data preprocessing and transformation with Dataflow, creating data pipelines with Apache Beam, managing data workflows with Composer, and implementing data monitoring and logging.
Q: Are there any prerequisites for this course?
A: Participants should have a fundamental understanding of data concepts and programming languages such as Python or Java. Familiarity with cloud computing concepts and GCP services, such as BigQuery and Dataflow, is beneficial but not mandatory.
Q: What are the key skills and knowledge gained from this course?
A: By completing this course, participants will gain the skills and knowledge to design and build scalable data processing systems on Google Cloud Platform. They will learn how to ingest data into GCP, store and analyze data using BigQuery, preprocess and transform data with Dataflow, create data pipelines with Apache Beam, manage workflows with Composer, and implement data monitoring and logging for efficient data engineering.
Q: Is there any certification associated with this course?
A: This course is not directly associated with a specific certification. However, the knowledge and skills acquired through this training can be valuable for individuals pursuing various Google Cloud certifications, such as the “Google Cloud Certified – Professional Data Engineer” or “Google Cloud Certified – Associate Cloud Engineer” certifications.
This course comes with the following benefits:
- Practice Labs.
- Get Trained by Certified Trainers.
- Access to the recordings of your class sessions for 90 days.
- Digital courseware
- 24x7 learner support.
Got more questions? We’re all ears and ready to assist!