Organizations often rely on Excel files or SharePoint lists to support business processes. When the way an organization manages a business process with these tools fundamentally changes, the IT solution supporting that process must change as well.
For example, consider a Consumer Packaged Goods (CPG) company. The company runs promotions, either in-store or online, that can change from quarter to quarter, and the business must approve a budget before a promotion can run. For the budget approval process to run smoothly, there needs to be a way to capture the projects and a systematic way for key leaders to grant approval. A budget approval process like this may not be natively supported within an ERP, so IT may have to support a point solution to enable the business process.
In these scenarios, tools such as SharePoint or Power Apps can be leveraged to assist with the data collection, flow, and approval process.
However, point solutions have limitations, such as managing large data sets in SharePoint lists. In those cases, there is a need to capture and calculate the data on an ACID-compliant relational database system.
In this blog post, I discuss what to do when there isn't a modern ETL tool (or another tool that fits the bill) in place for loading data from Azure Blob Storage into a database.
The diagram below illustrates a flat file being loaded into blob storage. For example, Finance receives a list of approved projects and wants to retain that list and calculate accruals from the data.
Once a month, a flat file is generated from another system and lands in Azure blob storage.
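To make the example concrete, assume the monthly file contains the approved projects with an ID, name, budget, and approval date. The column names and types below are hypothetical, but a target table might look like this:

```sql
/* Hypothetical target table for the approved-projects file.
   Column names and data types are assumptions for illustration. */
CREATE TABLE YourTable (
    ProjectId      INT            NOT NULL,
    ProjectName    NVARCHAR(100)  NOT NULL,
    ApprovedBudget DECIMAL(18, 2) NOT NULL,
    ApprovalDate   DATE           NOT NULL
);
```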
Reference: Use BULK INSERT or OPENROWSET(BULK...) to import data to SQL Server - SQL Server | Microsoft Docs
Below, I'll demonstrate how to create a connection to the flat file in SQL Server and build a process to load that flat file from blob storage into a table in the database.
The following example shows how to use the BULK INSERT command to load data from a .csv file in an Azure Blob storage location for which you have created an SAS key. The Azure Blob storage location is configured as an external data source, which requires a database scoped credential using a shared access signature that is encrypted with a master key in the user database.
The first step is to create a master key for encryption using a strong password. The password itself is only used for encryption, so any strong value will do.
/* For bulk insert with SAS token, create master encryption key */
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'YourStrongPassword';
-- Optional: a DATABASE SCOPED CREDENTIAL is not required if the blob container is configured for public (anonymous) access
/* For bulk insert with an SAS token, create database-scoped credential */
CREATE DATABASE SCOPED CREDENTIAL YourCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sp=r&st=2021-05-14T15:17:35Z&se=2022-05-14T23:17:35Z&spr=https&sv=2020-02-10&sr=c&sig=sjhdfjlshdfjhdsjf2F62151hnn8eBvYjuGt67dyI76veEwyHHKk%3D';
-- NOTE: do not include the leading '?' character of the SAS token in the SECRET!
/* For bulk insert with SAS token, create data source referencing credential created above */
CREATE EXTERNAL DATA SOURCE YourDataSource
WITH ( TYPE = BLOB_STORAGE,
LOCATION = 'https://YourStorageAccount.blob.core.windows.net',
CREDENTIAL = YourCredential );
/* For bulk insert without SAS token, create data source without credential */
CREATE EXTERNAL DATA SOURCE YourDataSource
WITH ( TYPE = BLOB_STORAGE, LOCATION = 'https://YourStorageAccount.blob.core.windows.net');
After creating the master key, credential, and external data source, you are ready to do a bulk insert. The next piece of code uses the external data source we defined to load the data from the storage account into SQL Server.
The following example executes the bulk insert with additional parameters. FIRSTROW = 2 skips the first row of the input .csv file and is used to ignore the column headers. Full documentation of the BULK INSERT command can be found here: BULK INSERT (Transact-SQL) - SQL Server | Microsoft Docs
BULK INSERT YourTable
FROM 'YourContainer/YourFile.csv' -- placeholder path within the storage account
WITH ( DATA_SOURCE = 'YourDataSource',
       FORMAT = 'CSV',
       CODEPAGE = '65001', -- UTF-8 encoding
       FIRSTROW = 2 );
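Once the load completes, a quick count confirms the rows arrived. And because the external data source is already in place, OPENROWSET(BULK ...) can also read the file directly, for example to inspect the raw contents without loading a table. The container and file names below are placeholders:

```sql
/* Sanity check: confirm rows were loaded into the table */
SELECT COUNT(*) AS RowsLoaded FROM YourTable;

/* Inspect the raw file straight from blob storage; SINGLE_CLOB returns
   the entire file as a single character value. Path is a placeholder. */
SELECT BulkColumn
FROM OPENROWSET(
    BULK 'YourContainer/YourFile.csv',
    DATA_SOURCE = 'YourDataSource',
    SINGLE_CLOB
) AS FileContents;
```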
In summary, I've shown how SQL Server can be used to power point solutions for business processes like this one. This approach can save development time and capital expenditure because the solution doesn't require additional tools; it simply uses SQL to solve a common business problem.