Document Streaming Project
Project Introduction
Introduction (1:12)
Project overview (5:33)
Docker Fundamentals (1:43)
Preparing the Data
The Dataset We Use (2:48)
Transform CSV to JSONs (10:51)
Data API
API Schema (3:42)
Creating the API with FastAPI (9:41)
Testing the API with Postman (6:10)
Apache Kafka & API as Docker
Apache Kafka Goals (2:33)
Kafka Docker Compose Explained (3:35)
Startup Kafka Compose File (2:46)
Kafka Topics Setup (7:11)
Preparing the API Docker build (4:13)
Build the API (3:25)
Deploy the API (2:48)
Test the API Container with Kafka (2:06)
Recap API & Kafka (1:37)
Apache Spark Structured Streaming into Kafka
Apache Spark Compose Config (4:38)
Startup Spark with Kafka & API (2:26)
Spark Ingest Kafka & Produce Kafka (6:34)
Setup Test Configuration (3:01)
Test Spark Streaming Kafka (5:42)
Spark UI Monitoring (2:30)
MongoDB
MongoDB Goals (4:22)
MongoDB Docker Compose Config (3:58)
MongoDB Startup (2:44)
Prepare MongoDB Database & Collection (1:45)
Spark Streaming into MongoDB
Spark Code Streaming To MongoDB (6:31)
Transformations 1: Writing Kafka Message as String to MongoDB (3:25)
Transformations 2: Writing Complete Kafka Message to MongoDB (2:34)
Transformations 3: Writing Nested Document to MongoDB (4:28)
Transformations 4: Writing Messages as Document (2:13)
Spark Streaming Conclusion (2:51)
The API Client Writing JSONs
Writing the API Client (4:04)
Create Test Data & Run Client (5:36)
Streamlit Dashboard
Streamlit Intro & Goals (6:18)
Query Customer Invoices (4:07)
Query Invoice Documents (4:15)
Conclusion
Project Summary (3:25)
Outlook (6:23)