Hello and thanks for your time and consideration.
I am developing a Jupyter Notebook on Google Cloud Platform, using Datalab.
I have created a Pandas DataFrame and would like to write it to both Google Cloud Storage (GCS) and BigQuery. I have a bucket in GCS and, via the following code, have created these objects:
import gcp
import gcp.storage as storage

# default project for this Datalab session
project = gcp.Context.default().project_id

# reference my existing bucket and check that it is reachable
bucket_name = 'steve-temp'
bucket_path = bucket_name
bucket = storage.Bucket(bucket_path)
bucket.exists()
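For the GCS half, what I want to do next is roughly the following. This is only a sketch of my understanding of the gcp.storage API: item() and write_to() are my guesses at the right calls, and df / 'data.csv' are placeholders for my real DataFrame and object name.

import pandas as pd

# placeholder DataFrame standing in for my real data
df = pd.DataFrame({'a': [1, 2], 'b': [3, 4]})

# write the DataFrame out as CSV text to an object in the bucket
# (item() and write_to() are my assumption from the gcp.storage docs)
bucket.item('data.csv').write_to(df.to_csv(index=False), 'text/csv')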
I have tried various approaches based on the Google Datalab documentation, but none of them has worked.
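For the BigQuery half, I imagine something like pandas' to_gbq (from the pandas-gbq package) might work, but I am not sure it is the right approach inside Datalab. 'my_dataset.my_table' is just a placeholder, and the if_exists setting is my assumption:

# sketch only: load the same DataFrame into a BigQuery table with pandas-gbq
# 'my_dataset.my_table' is a placeholder dataset.table name in my project
df.to_gbq('my_dataset.my_table', project_id=project, if_exists='replace')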
Thanks