Hacker News

We've established a rule that all "custom" code (anything that isn't a preexisting Airflow operator) must be packaged in a Docker image and run through the KubernetesPodOperator. The result is that most folks do exactly what you said: they create a repo with a simple CLI that runs a script, and the only thing that goes into our Airflow repo is the dependency graph/configuration for the k8s jobs.
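A minimal sketch of the "simple CLI that runs a script" pattern described above. The `jobs`/`export` command names, the argument flags, and the `run_export` function are all invented for illustration; the point is a thin argparse entrypoint that the container image invokes, keeping job logic out of the Airflow repo.

```python
# Hypothetical CLI entrypoint for a job container image.
# All real logic lives in this repo; Airflow only passes arguments.
import argparse
import sys


def run_export(table: str, date: str) -> str:
    # Placeholder for the actual job logic.
    return f"exported {table} for {date}"


def main(argv=None) -> int:
    parser = argparse.ArgumentParser(prog="jobs")
    sub = parser.add_subparsers(dest="command", required=True)

    export = sub.add_parser("export", help="export a table snapshot")
    export.add_argument("--table", required=True)
    export.add_argument("--date", required=True)

    args = parser.parse_args(argv)
    if args.command == "export":
        print(run_export(args.table, args.date))
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

The container's entrypoint points at this script, so the Airflow side only needs to know the image name and the argument list.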


AFAICT this is now the recommended way to use Airflow: as a k8s task orchestrator. Even the Astronomer team (major commercial maintainers of Airflow) will tell you to do it this way.
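A minimal sketch of what the Airflow side looks like under this arrangement: the DAG file holds only the dependency graph and per-job configuration, while all logic lives in the image. The DAG id, image name, task names, and argument lists are invented, and the import path assumes a recent `apache-airflow-providers-cncf-kubernetes` release (older releases used `operators.kubernetes_pod` instead of `operators.pod`).

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.pod import (
    KubernetesPodOperator,
)

with DAG(
    dag_id="nightly_export",              # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Each task just runs the job image's CLI; no business logic here.
    extract = KubernetesPodOperator(
        task_id="extract",
        name="extract",
        image="registry.example.com/jobs:latest",   # hypothetical image
        cmds=["jobs"],
        arguments=["export", "--table", "users", "--date", "{{ ds }}"],
    )
    load = KubernetesPodOperator(
        task_id="load",
        name="load",
        image="registry.example.com/jobs:latest",
        cmds=["jobs"],
        arguments=["load", "--table", "users", "--date", "{{ ds }}"],
    )

    extract >> load                       # the only thing Airflow owns: the graph
```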



