- May 27, 2025
- Posted by: Indumathi G
- Category: Snowflake
Introduction:
The wait is finally over! Snowflake has announced the preview of Openflow – its native data integration tool for a seamless, self-managed experience. In this blog, we will demonstrate how to set up Openflow in your Snowflake account.

In today’s data-driven world, organizations are under increasing pressure to move faster, automate more, and reduce the complexity of their data infrastructure. Snowflake, a leader in cloud data platforms, has answered that call with the preview release of Openflow: a native, self-managed data integration tool built directly into the Snowflake ecosystem.
What is Snowflake Openflow?
Openflow is a powerful new addition to the Snowflake platform that allows users to design and manage data pipelines without relying on third-party tools. It enables teams to build declarative workflows using simple YAML configurations and execute them directly within the Snowflake environment, with the same performance and scalability Snowflake is known for. It supports both structured and unstructured data, including text, images, audio, video, and sensor data. Users can ingest data from unstructured sources such as Google Drive and Box for AI applications, or replicate change data capture (CDC) streams from databases like SQL Server for centralized reporting.
Benefits of using Openflow:
- Simplified Data Integration: Openflow provides a unified platform for connecting various data sources to Snowflake and other destinations.
- Data Transformation Capabilities: NiFi processors and controllers allow users to transform and prepare data for various applications.
- Centralized Reporting: Openflow can replicate data from different sources to Snowflake, enabling centralized reporting and analytics.
- AI Integration: Near real-time unstructured data ingested through Openflow can be used in AI applications like Snowflake Cortex (see the sketch after this list).
- Declarative Workflows: Define complex data workflows using straightforward YAML-based configurations.
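To illustrate the AI integration point, here is a minimal sketch, assuming documents ingested by Openflow land in a hypothetical table OPENFLOW_DEMO.DOCS.SUPPORT_TICKETS with TICKET_ID and TICKET_TEXT columns; it calls the Snowflake Cortex COMPLETE function to summarize each row. The table, column, and model names are illustrative, so adjust them to your environment.
-- Hypothetical example: summarize unstructured text ingested by Openflow using Snowflake Cortex.
-- The table, columns, and model name below are placeholders, not objects created by Openflow itself.
SELECT
    TICKET_ID,
    SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        'Summarize this support ticket in one sentence: ' || TICKET_TEXT
    ) AS TICKET_SUMMARY
FROM OPENFLOW_DEMO.DOCS.SUPPORT_TICKETS
LIMIT 10;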
Steps to setup Openflow:
Openflow is currently available in AWS Commercial Regions.
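If you are unsure which region your account runs in, a quick check from any worksheet:
-- Returns the region of the current account, for example AWS_US_EAST_1
SELECT CURRENT_REGION();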
Step 1: In a new SQL worksheet, execute the script below to create a new database, schema, image repository, warehouse, and role, and to grant the required privileges to the PUBLIC role and the new role.
USE ROLE ACCOUNTADMIN;
CREATE DATABASE IF NOT EXISTS OPENFLOW;
USE OPENFLOW;
CREATE SCHEMA IF NOT EXISTS OPENFLOW;
USE SCHEMA OPENFLOW;
CREATE IMAGE REPOSITORY IF NOT EXISTS OPENFLOW;
CREATE WAREHOUSE OPENFLOW_WH;
CREATE ROLE OPENFLOW_ADMIN_ROLE;
GRANT USAGE ON DATABASE OPENFLOW TO ROLE public;
GRANT USAGE ON SCHEMA OPENFLOW TO ROLE public;
GRANT READ ON IMAGE REPOSITORY OPENFLOW.OPENFLOW.OPENFLOW TO ROLE public;
GRANT USAGE ON DATABASE OPENFLOW TO ROLE OPENFLOW_ADMIN_ROLE;
GRANT ALL ON SCHEMA OPENFLOW TO ROLE OPENFLOW_ADMIN_ROLE;
GRANT READ ON IMAGE REPOSITORY OPENFLOW.OPENFLOW.OPENFLOW TO ROLE OPENFLOW_ADMIN_ROLE;
GRANT USAGE ON WAREHOUSE OPENFLOW_WH TO ROLE OPENFLOW_ADMIN_ROLE;
GRANT CREATE OPENFLOW DATA PLANE INTEGRATION ON ACCOUNT TO ROLE OPENFLOW_ADMIN_ROLE;
GRANT CREATE OPENFLOW RUNTIME INTEGRATION ON ACCOUNT TO ROLE OPENFLOW_ADMIN_ROLE;
ALTER USER <user_name> SET DEFAULT_SECONDARY_ROLES = ('ALL');
GRANT ROLE OPENFLOW_ADMIN_ROLE TO USER <user_name>;
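After running the script, an optional sanity check (assuming you kept the object names above) is to confirm the grants and the image repository:
-- List the privileges granted to the new role
SHOW GRANTS TO ROLE OPENFLOW_ADMIN_ROLE;
-- Confirm the image repository exists
SHOW IMAGE REPOSITORIES IN SCHEMA OPENFLOW.OPENFLOW;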
Step 2: Users cannot sign in to Openflow if their default role is ACCOUNTADMIN, ORGADMIN, GLOBALORGADMIN, or SECURITYADMIN, so change your user's default role to any other role before logging in. Once your default role is switched, go to Data -> Openflow to launch Openflow and key in your Snowflake user credentials.
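One way to do this in SQL is to make the role created in Step 1 your default role and verify the change (replace <user_name> with your own user):
-- Set a non-administrative default role so the user can sign in to Openflow
ALTER USER <user_name> SET DEFAULT_ROLE = OPENFLOW_ADMIN_ROLE;
-- Confirm the DEFAULT_ROLE property was updated
DESCRIBE USER <user_name>;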


Step 3: Create a new deployment by clicking Create Deployment, then click Next to select the deployment location. Snowflake offers two deployment paths:
- Snowflake deployment (in Preview) – Snowflake-managed infrastructure
- Bring your own cloud (BYOC) – deploy in your own AWS account and take control of the full infrastructure
Until the Snowflake-managed deployment option is released, only BYOC is available to try.


Select the Snowflake Managed VPC configuration option; Snowflake will then provide a CloudFormation template that you can download and use to complete the setup. Use the OPENFLOW_ADMIN_ROLE you created earlier as the owner of this deployment.


Step 4: Log in to your AWS account and create a new CloudFormation stack using the Snowflake-provided template. Specify a stack name, keep the remaining parameters at their defaults, and submit to create the stack. The stack completes the rest of the installation using infrastructure-as-code scripts; this process takes about 45 minutes.



Step 5: Once the AWS stack creation is complete, the deployment will be marked as Active.

Step 6: Create a runtime using the deployment you just created. Select the minimum and maximum number of nodes your workload requires. The runtime takes about 2-3 minutes to create.


You can click on the Runtime to open the Openflow Canvas.
Final Thoughts:
With Openflow, Snowflake takes a significant step toward unifying the data stack. For organizations looking to simplify their architecture, increase efficiency, and gain more control over their data pipelines, Openflow offers a compelling, cloud-native solution. As the preview continues to evolve, it’s clear that the future of data integration is not just about powerful tools; it’s about powerful tools that are built in.
Stay tuned for deeper dives into different Openflow connectors, best practices, and real-world applications in upcoming posts.
Reference: https://docs.snowflake.com/en/user-guide/data-integration/openflow/setup-openflow
Please feel free to reach out to us for your Snowflake solution needs. Cittabase is a Premier partner with Snowflake.