Processing Data with Google Cloud Dataflow
Cloud Shell didn't load, which made the lab unusable.
When executing "python df06.py -p $PROJECT_ID -b $BUCKET", I get the error "Unable to get the Filesystem for path gs://qwiklabs-gcp-d0b29b9c1ddf916d-ml/flights/airports/airports.csv.gz", even though the file is uploaded correctly.
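A likely (but unconfirmed) cause of this error is that apache-beam was installed without the [gcp] extra, so Beam's gs:// filesystem never gets registered. A minimal check, assuming the standard apache-beam package layout:

```python
# Sketch of a diagnostic, assuming the error comes from a missing
# apache-beam[gcp] install (an assumption, not stated in the lab).
try:
    # This module registers Beam's gs:// filesystem when the GCP extra
    # is installed; if it imports, gs:// paths should resolve.
    from apache_beam.io.gcp import gcsfilesystem  # noqa: F401
    gcs_available = True
except ImportError:
    gcs_available = False

if not gcs_available:
    # Reinstalling with the extra usually fixes "Unable to get the
    # Filesystem for path gs://..." errors.
    print("Missing GCS support; try: pip install 'apache-beam[gcp]'")
```

If the import succeeds and the error persists, the problem is elsewhere (e.g. credentials or bucket permissions).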
Enabling Dataflow jobs didn't always trigger the lab's checkpoints; I had to re-run jobs multiple times to get a pass. Environment variables such as PROJECT_ID and BUCKET were not exported properly, so I had to hardcode them.
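Instead of hardcoding, exporting the variables in the same shell session that runs the script should work. A minimal sketch, with a hypothetical project ID substituted for the real lab value:

```shell
# Export in the SAME session that runs df06.py; values here are
# hypothetical placeholders -- use your own lab project ID.
export PROJECT_ID="qwiklabs-gcp-example"
export BUCKET="${PROJECT_ID}-ml"

# Sanity-check before running the pipeline:
echo "project=$PROJECT_ID bucket=$BUCKET"
# then: python df06.py -p $PROJECT_ID -b $BUCKET
```

Note that variables set in one Cloud Shell tab are not visible in another, which may explain why they seemed "not exported".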
The lab code is a black box; it is run without any explanation of what it does.