Know Your Customer (KYC) pipeline in Apache Airflow

For a successful pipeline run, take these steps:
- Start your Airflow server
- Admin -> Connections -> Create
- Trigger the input_dag
- Before triggering the File_process DAG, move one of the JSON files (for example request_1411.json) into the tmp folder. The program must be given the exact name of the JSON file to load.
- Trigger the File_process DAG
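The manual file-staging step above can be sketched in Python as follows. The folder paths and the helper name `stage_request_file` are assumptions for illustration; adjust them to wherever your JSON files and tmp folder actually live.

```python
import shutil
from pathlib import Path

# Hypothetical locations; change these to match your setup.
SOURCE_DIR = Path("data")
TMP_DIR = Path("tmp")

def stage_request_file(filename: str) -> Path:
    """Move one request JSON (e.g. request_1411.json) into the tmp
    folder so the File_process DAG can find it by name."""
    TMP_DIR.mkdir(parents=True, exist_ok=True)
    src = SOURCE_DIR / filename
    dest = TMP_DIR / filename
    shutil.move(str(src), str(dest))
    return dest
```

After staging, the File_process DAG only needs the bare filename (here `request_1411.json`) to locate the file in the tmp folder.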
Input:
For the input data, my reading of the assignment was that I had to create a set of dummy JSON files with random dates. This is done by main.py (FourthlinePython codes, Generating data), which I turned into an Airflow DAG as Input_dag.py.
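The generation step in Input_dag.py can be sketched as below. The field names (`request_id`, `created_at`), the output folder, and the date range are assumptions, not the actual schema from main.py; inside the DAG this function would typically be wrapped in a `PythonOperator`.

```python
import json
import random
from datetime import datetime, timedelta
from pathlib import Path

OUTPUT_DIR = Path("requests")  # assumed output folder

def random_date(start_year: int = 2020, end_year: int = 2023) -> str:
    """Pick a random ISO date between the given years."""
    start = datetime(start_year, 1, 1)
    span = datetime(end_year, 12, 31) - start
    return (start + timedelta(days=random.randrange(span.days))).date().isoformat()

def generate_requests(n: int = 5) -> list:
    """Write n dummy request files named like request_1411.json
    (assumed schema: a request id plus a random date)."""
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    paths = []
    for _ in range(n):
        rid = random.randrange(1000, 10000)
        record = {"request_id": rid, "created_at": random_date()}
        path = OUTPUT_DIR / f"request_{rid}.json"
        path.write_text(json.dumps(record, indent=2))
        paths.append(path)
    return paths
```

Running `generate_requests()` once produces the pool of JSON files from which one (e.g. request_1411.json) is later moved into the tmp folder for the File_process DAG.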