Training Introduction


Snowflake is currently the cloud data warehouse everyone is talking about.

It is a 100% cloud-based platform with several advantages: you can access your data flexibly and scale compute resources up or down as needed. The Snowflake Data Cloud can also be used to unite data silos, discover and share data, and run a wide range of analytical workloads.

For these reasons, Snowflake is used by many companies, and it is correspondingly in high demand as a skill in Data Engineering and Data Analytics roles.

When working with Snowflake, you need to know how to prepare data for the platform, how to integrate and manage it there, and how to connect other systems to it. Snowflake is therefore relevant not only for data scientists and analysts, but also for data engineers.

Snowflake offers a free trial, so you can follow along with the roughly two hours of video in this training.

Get started and expand your skills!

What will I learn?

In this practical training, you will learn everything you need to get started with Snowflake right away.

We start with some Snowflake and data warehouse fundamentals: what Snowflake is, who uses it, and how it fits into modern data platforms. Afterwards, we dive straight into the hands-on part.

First, you get to know the e-commerce dataset you will work with, which I have used in previous trainings because it lends itself well to hands-on exercises. Then you learn how to set up Snowflake and SnowSQL.

During this training, you will create tables and file formats for the data you are going to load, and work with internal stages, to which you upload CSV files from your PC. You will also create a dashboard and visualization worksheets, and connect Power BI to Snowflake to learn how to hook other tools up to the platform.
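To give you a flavor of the internal-stage workflow, here is a minimal sketch (all object and file names are hypothetical placeholders, not the ones used in the training):

```sql
-- Define how the CSV files should be parsed.
CREATE OR REPLACE FILE FORMAT csv_format
  TYPE = CSV
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';

-- Create an internal stage that uses this file format.
CREATE OR REPLACE STAGE orders_stage
  FILE_FORMAT = csv_format;

-- Upload a local file to the stage (run from SnowSQL on your PC).
PUT file://C:/data/orders.csv @orders_stage;

-- Load the staged file into a target table.
COPY INTO orders
  FROM @orders_stage
  FILE_FORMAT = (FORMAT_NAME = csv_format);
```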

Afterwards, you query data with Python, learn how to create and execute tasks, which let you automate work in Snowflake, and test a full pipeline.
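As a preview of what a task looks like, here is a minimal, hypothetical example of a scheduled task that refreshes a summary table (warehouse and table names are placeholders):

```sql
-- Create a task that rebuilds a daily sales summary every hour.
CREATE OR REPLACE TASK refresh_daily_sales
  WAREHOUSE = compute_wh
  SCHEDULE = '60 MINUTE'
AS
  INSERT OVERWRITE INTO daily_sales
  SELECT order_date, SUM(amount)
  FROM orders
  GROUP BY order_date;

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK refresh_daily_sales RESUME;
```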

Last but not least, you dive into Snowflake and AWS. You manually import data from AWS S3 through external stages, and I also show you how to do this automatically by setting up auto-ingest with Snowpipe.
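In rough outline, the S3 integration looks like this (bucket, integration, and object names are hypothetical, and the sketch assumes a storage integration to your bucket has already been configured):

```sql
-- External stage pointing at an S3 bucket via a storage integration.
CREATE OR REPLACE STAGE s3_stage
  URL = 's3://my-bucket/orders/'
  STORAGE_INTEGRATION = my_s3_integration
  FILE_FORMAT = csv_format;

-- Manual load from the external stage.
COPY INTO orders FROM @s3_stage;

-- Snowpipe with auto-ingest: S3 event notifications trigger the load
-- automatically whenever new files land in the bucket.
CREATE OR REPLACE PIPE orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO orders FROM @s3_stage;
```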

What's included?
  • Source Codes
  • Web Links
  • Prepared Data

What do I need?
  • Relational database basics
  • An AWS account for the S3 data integration