
Getting Started with Workflow Orchestration

Build, run, and monitor data pipelines at scale

Prepared for O'Reilly Media

Instructors:

About this course:

Data engineers and data scientists spend much of their time on negative, or defensive, engineering: writing code to handle unpredictable failures such as resources going down, APIs failing intermittently, or malformed data corrupting pipelines. Workflow orchestration tools help eliminate this negative engineering, allowing engineers and scientists to focus on the problems they are actually solving. Modern data applications have evolved, and orchestrators such as Prefect now provide more runtime flexibility and the ability to leverage distributed compute through Dask.

Discover how workflow orchestration can free you up to build solutions, not just avert failures. You’ll learn about basic orchestration features such as retries, scheduling, parameterization, caching, and secret management, and you’ll construct real data pipelines.
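To make those features concrete, here is a minimal sketch of a Prefect 2 flow with a retrying task and a flow parameter. The task names, retry settings, and failure simulation are illustrative only, and the exact decorator arguments may differ slightly in the 2.0b2 beta pinned in requirements.txt:

import random

from prefect import flow, task


@task(retries=3, retry_delay_seconds=5)
def flaky_extract(n):
    # Simulate the unreliable work (flaky APIs, resources going down) that retries absorb
    if random.random() < 0.3:
        raise RuntimeError("intermittent failure")
    return list(range(n))


@task
def load(rows):
    # A downstream task; Prefect tracks the dependency between the two tasks
    return sum(rows)


@flow
def etl(n: int = 10):
    # n is a flow parameter, so the same pipeline can be re-run with different inputs
    return load(flaky_extract(n))


if __name__ == "__main__":
    etl(n=25)

The same flow can later be pointed at a distributed task runner (for example, Dask) without changing the task code, which is how Prefect leverages the distributed compute mentioned above.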

Let's get our development environment set up! 🚀

For this course you will need:

Python

Python 3.7 or later is required (Python 3.6 is reaching end of life).

  • Packages in the requirements.txt file
    • prefect==2.0b2 - workflow orchestration
    • beautifulsoup4 - web scraping
    • jupyter - interactive notebooks

Ideally, create a virtual environment (conda, pipenv, or poetry) in which to install the dependencies.

To install the requirements with pip:

pip install -r requirements.txt
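
With the requirements installed, here is a hedged sketch of how prefect and beautifulsoup4 can fit together in a tiny scraping flow. The URL is a placeholder and the link-extraction logic is purely illustrative, not taken from the course notebooks:

import urllib.request

from bs4 import BeautifulSoup
from prefect import flow, task


@task(retries=2, retry_delay_seconds=5)
def fetch_html(url):
    # Network calls fail intermittently, so this task is a natural candidate for retries
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8", errors="replace")


@task
def extract_links(html):
    # beautifulsoup4 parses the page; here we simply collect anchor hrefs
    soup = BeautifulSoup(html, "html.parser")
    return [a.get("href") for a in soup.find_all("a") if a.get("href")]


@flow
def scrape(url="https://example.com"):  # placeholder URL
    return extract_links(fetch_html(url))


if __name__ == "__main__":
    scrape()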

Docker

Docker is a great entrypoint (pun somewhat intended) into the world of engineering! We'll use it to provide reproducible environments in which to execute our workflows. There is also a section of the course devoted to Docker.

Optional Dependencies

These dependencies are optional, but they are included in requirements.txt for convenience.

For the advanced section of this course, we will use a couple of common data engineering tools:

Cloning the repo

To clone the repo and run it locally:

git clone https://github.com/zzstoatzz/oreilly-workflow-orchestration.git

Each notebook can then be viewed and executed. Some of the code extends beyond the notebooks, because data workflows glue together other tools (sometimes non-Python).

Contact Us

For any questions, feel free to reach out to us!

The Prefect Slack is also a good resource for questions about Prefect and workflow orchestration.

Further Resources

Listed below are the documentation pages for the tools used:

Data Movement

Distributed Computing