Automating Data Ingestion from Snowflake to a GCS Bucket

  1. Snowflake Account
  2. GCP Account
  • Create a Snowflake account if you don’t have one.
  • Note your Snowflake account URL, username, and password.
  • Create a Snowflake database for your data.
  • Set up a Snowflake warehouse for processing data.
  • Create a Snowflake user with the necessary privileges.
  • Grant the user access to the database and warehouse.
  • Build Airflow DAGs whose tasks execute Snowflake SQL.
  • Establish dependencies between tasks and schedule their execution to match your ETL methodology.
  • In the GCS console, select Create Bucket, then create a folder named “dags/” inside the bucket.
  • Upload the DAG Python file into the newly created “dags/” folder.



