Welcome to the Piper Data Concepts (PDC) team! You’re a member of its development team, and a Fortune 1000 client has asked you to modernize its workflow process, which just happens to be PDC’s specialty. In this liveProject series, you’ll review the client’s 15-year-old batch-based system, identify issues and bottlenecks, and determine what’s needed to transform its workflow into a more reactive, extensible, and dynamic system. To create observability, you’ll build an event-driven data pipeline with Kafka, use Python Poetry to package the project, write Python code using the Faust library to communicate with Kafka, and store the consumed data in a PostgreSQL database.
Your final goal will be to enable the client’s staff to gather workflow process information in real time. You’ll write Python code that consumes messages from Kafka and prepares them for storage in the database, create Postgres queries to retrieve the aggregated data, and build reports as CSV files to be read by visualization tools. When you’re done with these projects, your client’s workflow will be more resilient, responsive, and plugin-ready, and you’ll have a solid understanding of event-driven architecture.
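To make the event-driven idea concrete, the sketch below publishes a single workflow status event to Kafka. It assumes the kafka-python client and a broker on localhost:9092; the topic name and event fields are placeholders, not the client’s actual schema.

```python
import json

from kafka import KafkaProducer

# Producer that JSON-encodes each event before sending it to the broker.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish one workflow status event; downstream consumers react to it
# immediately instead of waiting for the next batch run.
producer.send("workflow-events", {"order_id": "A-1001", "status": "SHIPPED"})
producer.flush()
```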
Step into the role of a developer at Piper Data Concepts (PDC), a company that specializes in helping Fortune 1000 companies improve their workflows. Your task is to review the 15-year-old workflow architecture of one of your clients, Trade Data Systems. You’ll identify issues and bottlenecks, then determine what’s needed to transform its workflow into a more modern, responsive architecture. To accomplish this, you’ll set up a development environment with Docker using Kafka, Python, and Postgres. As you go, you’ll deploy a Kafka cluster and write Python code using the Faust library to seamlessly process pre-defined business events.
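As a rough illustration of that last step, here is a minimal Faust worker that consumes one kind of pre-defined business event from a Kafka topic. The app name, broker address, topic, and event fields are assumptions made for this sketch, not the project’s actual definitions.

```python
import faust


# Illustrative event schema; the real business events define their own fields.
class OrderEvent(faust.Record, serializer="json"):
    order_id: str
    status: str


app = faust.App(
    "pdc-workflow",                   # app/consumer-group id (assumed)
    broker="kafka://localhost:9092",  # broker from the Docker environment (assumed)
)

orders_topic = app.topic("orders", value_type=OrderEvent)


@app.agent(orders_topic)
async def process_order(events):
    # Each message is deserialized into an OrderEvent before it reaches the agent.
    async for event in events:
        print(f"order {event.order_id} moved to status {event.status}")


if __name__ == "__main__":
    app.main()  # start with: python app.py worker
```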
Put on your platform architect hat! You’re a member of the development team at Piper Data Concepts (PDC), and your client is looking to modernize its workflow. An existing benchmarked development environment, made up of Kafka, Python, and Postgres, is at your disposal. Now it’s time to start conceptualizing the new and improved workflow. You’ll use Kafka to create an event-driven data pipeline, review and understand business requirements, use Python Poetry to package the project, write Python code using the Faust library to communicate with Kafka, and store the consumed data in a PostgreSQL database.
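A sketch of that Kafka-to-Postgres leg might look like the following. It assumes psycopg2, a workflow_events table, and local connection details; all of the names are placeholders, and a production worker would use an async driver or a connection pool rather than one blocking connection.

```python
import faust
import psycopg2


class WorkflowEvent(faust.Record, serializer="json"):
    order_id: str
    status: str
    occurred_at: str


app = faust.App("pdc-pipeline", broker="kafka://localhost:9092")
events_topic = app.topic("workflow-events", value_type=WorkflowEvent)

# One shared connection keeps the sketch short; see the caveat above.
conn = psycopg2.connect(
    host="localhost", dbname="pdc", user="postgres", password="postgres"
)


@app.agent(events_topic)
async def store_event(events):
    async for event in events:
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO workflow_events (order_id, status, occurred_at) "
                "VALUES (%s, %s, %s)",
                (event.order_id, event.status, event.occurred_at),
            )
        conn.commit()
```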
As a member of the development team at Piper Data Concepts, you’ll carry out the final steps of a workflow-improvement project: enabling your client’s staff to gather workflow process information in real time. Several prototypes have been built, and the client’s workflow is more resilient than ever. You’ll write Python code that consumes messages from Kafka and prepares them for storage in the database, create Postgres queries to access the aggregated data, and build reports as CSV files to be read by visualization tools and, ultimately, your client’s staff. When you’re done, your client’s modern system will provide a feedback loop, enable external API access to status updates, and be ready for more specialized services to be plugged in later with no code changes.
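The reporting step could be as simple as the sketch below: one aggregation query against Postgres, written out as a CSV file for the visualization tools. The table, query, and file name are illustrative assumptions.

```python
import csv

import psycopg2

conn = psycopg2.connect(
    host="localhost", dbname="pdc", user="postgres", password="postgres"
)

with conn.cursor() as cur:
    # Count how many events reached each workflow status.
    cur.execute(
        "SELECT status, COUNT(*) AS event_count "
        "FROM workflow_events GROUP BY status ORDER BY status"
    )
    rows = cur.fetchall()

# Write the aggregated counts to a CSV report for the visualization tools.
with open("workflow_status_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["status", "event_count"])
    writer.writerows(rows)
```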
This liveProject series is for programmers interested in learning the concepts and skills used in event-driven development and its implementation. To begin these liveProjects, you’ll need to be familiar with the following:
TOOLS