Continuous Data Ingestion from Azure Blob Storage to Snowflake using Snowpipe

In this blog, you will learn how to load data automatically from an Azure storage account into Snowflake using Snowpipe.

Process Flow

1. Prerequisite

Assume we have a table in our Snowflake database, and we want to copy data from Azure Blob Storage into that table as soon as new files are uploaded to the Blob Storage.
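
For illustration, here is a minimal sketch of such a target table, created with the Snowflake Python connector. The connection details and the table name SNOWPIPE_DEMO_TABLE are hypothetical placeholders, not part of the original walkthrough; substitute your own.

```python
import snowflake.connector

# Hypothetical connection details; replace with your own account values.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="COMPUTE_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)

# A simple target table for the files Snowpipe will load.
conn.cursor().execute("""
    CREATE TABLE IF NOT EXISTS SNOWPIPE_DEMO_TABLE (
        ID NUMBER,
        NAME STRING,
        CREATED_AT TIMESTAMP_NTZ
    )
""")
```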

2. Configure Azure services and Snowflake to build a data pipeline that auto-ingests files from Azure Blob Storage into a Snowflake table.
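
Before walking through the Azure steps, it helps to see the destination. The sketch below, reusing the `conn` object from the prerequisite sketch, shows the three Snowflake-side objects the finished pipeline rests on: a notification integration listening on an Azure storage queue, an external stage pointing at the blob container, and a pipe with auto-ingest enabled. Every object name, URI, and credential here is a hypothetical placeholder, and the Event Grid subscription that feeds the queue is configured separately on the Azure side.

```python
# Reuses `conn` from the prerequisite sketch above; all names, URIs, and
# the SAS token are hypothetical placeholders.
for stmt in [
    # Notification integration: Snowflake reads an Azure storage queue that
    # an Event Grid subscription feeds with blob-created events.
    """CREATE NOTIFICATION INTEGRATION IF NOT EXISTS azure_snowpipe_int
         ENABLED = TRUE
         TYPE = QUEUE
         NOTIFICATION_PROVIDER = AZURE_STORAGE_QUEUE
         AZURE_STORAGE_QUEUE_PRIMARY_URI = 'https://testportal1.queue.core.windows.net/<queue_name>'
         AZURE_TENANT_ID = '<tenant_id>'""",
    # External stage pointing at the blob container created in step 2.1.
    """CREATE STAGE IF NOT EXISTS azure_stage
         URL = 'azure://testportal1.blob.core.windows.net/snowpipe-demo-container'
         CREDENTIALS = (AZURE_SAS_TOKEN = '<sas_token>')""",
    # The pipe itself: AUTO_INGEST = TRUE tells Snowpipe to load each new
    # file as soon as its event arrives on the queue.
    """CREATE PIPE IF NOT EXISTS demo_pipe
         AUTO_INGEST = TRUE
         INTEGRATION = 'AZURE_SNOWPIPE_INT'
         AS COPY INTO SNOWPIPE_DEMO_TABLE
            FROM @azure_stage
            FILE_FORMAT = (TYPE = 'CSV')""",
]:
    conn.cursor().execute(stmt)
```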

Create a storage account under your resource group. Example: “testportal1”
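
If you prefer to script this step rather than use the Azure portal, the sketch below uses the azure-mgmt-storage Python SDK. The subscription ID, resource group name, region, and SKU are all assumptions; adjust them to your environment.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

# Hypothetical subscription and resource group; replace with your own.
subscription_id = "<subscription_id>"
client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

# Create the storage account (the name must be globally unique and lowercase).
poller = client.storage_accounts.begin_create(
    "my-resource-group",
    "testportal1",
    {
        "location": "eastus",             # assumed region
        "kind": "StorageV2",              # general-purpose v2 account
        "sku": {"name": "Standard_LRS"},  # assumed SKU
    },
)
print(poller.result().name)
```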

2.1 Create a container (Azure)

Example: “snowpipe-demo-container”
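
The container can likewise be created programmatically with the azure-storage-blob package. The connection string comes from the storage account's Access keys blade and is read from an environment variable here (an assumption, not part of the original walkthrough); the sample upload simply demonstrates a new file arriving for Snowpipe to pick up.

```python
import os

from azure.storage.blob import BlobServiceClient

# Connection string from the storage account's Access keys blade (assumed env var).
service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)

# Create the container named in the example above.
container = service.create_container("snowpipe-demo-container")

# Uploading a file like this is what will eventually trigger Snowpipe.
container.upload_blob(name="sample.csv", data=b"1,Alice,2024-01-01 00:00:00\n")
```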


Shahnewaz Khan

10 years of experience with BI and Analytics delivery.

Shahnewaz is a technically minded and accomplished data management and technology leader with over 19 years’ experience in data and analytics.

His expertise includes:

  • Data Science
  • Strategic transformation
  • Delivery management
  • Data strategy
  • Artificial intelligence
  • Machine learning
  • Big data
  • Cloud transformation
  • Data governance


Highly skilled in developing and executing effective data strategies, conducting operational analysis, revamping technical systems, maintaining smooth workflows, designing operating models, and introducing change to organisational programmes. A proven leader with remarkable efficiency in building and leading cross-functional, cross-region teams and implementing training programmes for performance optimisation.


Thiru Ps

Solution / Data / Technical / Cloud Architect

Thiru has 15+ years’ experience in the business intelligence community and has worked in a number of roles and environments that have positioned him to speak confidently about advancements in corporate strategy, analytics, data warehousing, and master data management. Thiru loves taking a leadership role in technology architecture, always seeking to design solutions that meet operational requirements, leverage existing operations, and innovate data integration and extraction solutions.

Thiru’s experience covers:

  • Database integration architecture
  • Big data
  • Hadoop
  • Software solutions
  • Data analysis, analytics, and quality
  • Global markets

In addition, having worked in the US, Australia, and India, Thiru is particularly equipped to handle the global market shifts and technology advancements that often limit or paralyse corporations.