
Displaying and Automating Legacy Data using PowerBI

By John Adali

One of the more common requests in business intelligence projects is surfacing data from legacy line-of-business systems in an automated fashion. I was recently involved with a project where the client made this exact request. This post explains how I tackled the problem and combined a number of Microsoft technologies into a solution.

The Problem

The client had a large set of data from their AS/400 mainframe system that they needed to present on a weekly basis to their executives. This data is used to make critical business decisions, so the accuracy and timeliness of presenting this data is crucial to the client’s business.

A manual extraction process, performed by an employee in Microsoft Excel 2013, was run on a weekly basis. The employee used PowerPivot to create tabular list representations of the necessary data, building a PowerPivot data model as part of this manual process.

Two main issues made this process painful. First, the data used to build the PowerPivot data model was very large and required multiple passes of extraction from the AS/400 system into temporary Excel spreadsheets; Excel's in-memory limits were being reached, causing application crashes. Second, the resulting tabular output was not visual, and the client wanted a graphical, interactive experience when presenting the data.

The Solution

To solve these problems, and to automate the process, I used the PowerBI technology stack along with SQL Server 2012 and SharePoint 2013. The PowerBI tools (PowerPivot and PowerView) provide the data model and interactive graphical dashboards. SharePoint provides a central area for the PowerView dashboards. SQL Server Integration Services (SSIS) and SQL Server Analysis Services (SSAS) provide the ability to extract, transform and load the data from the AS/400 system for consumption by the PowerBI tools. Finally, SQL Server Agent jobs automate the data refresh, so that any data changes on the AS/400 side are propagated to the SSAS data model.

The first step in the solution was to surface the data from the AS/400 system in SQL Server, so that other SQL Server technologies could be leveraged to manipulate it. I did this by creating a linked server in SQL Server; the detailed setup steps are beyond the scope of this post.
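As a rough sketch, the linked server can be registered with T-SQL. The server name, provider, data source, and credentials below are all placeholders — substitute your own AS/400 system details and the OLE DB provider installed in your environment (IBMDASQL ships with IBM i Access):

```sql
-- Register the AS/400 as a linked server (placeholder names throughout).
EXEC sp_addlinkedserver
    @server     = N'AS400LINK',
    @srvproduct = N'IBM DB2 for i',
    @provider   = N'IBMDASQL',
    @datasrc    = N'MYAS400';        -- host name of the AS/400 system

-- Map a remote login for the linked server (placeholder credentials).
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'AS400LINK',
    @useself     = 'FALSE',
    @rmtuser     = N'QUSER',
    @rmtpassword = N'********';

-- Query the remote data with OPENQUERY so the AS/400 executes the query:
SELECT * FROM OPENQUERY(AS400LINK, 'SELECT * FROM MYLIB.SALES');
```

Using OPENQUERY pushes the query text to the remote system rather than pulling whole tables across the wire, which matters with data this large.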

Once the linked server to the AS/400 data was in place, I used SQL Server Integration Services (SSIS) to massage the incoming data. A number of fields (mostly dates) needed to be formatted properly, so I created a dedicated SQL Server database to hold the formatted data. SSIS lets you design a package that performs various actions, including the data formatting that was required.
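To illustrate the kind of date cleanup involved: AS/400 applications commonly store dates as numeric CYYMMDD values (e.g. 1240115 for 2024-01-15). The table and column names below are illustrative, not from the actual project, but the conversion pattern is the one such a transformation step would apply:

```sql
-- Convert a legacy 7-digit CYYMMDD numeric date into a proper DATE.
-- 1240115 -> 19000000 + 1240115 = 20240115 -> 2024-01-15 (style 112 = yyyymmdd).
SELECT
    ORDNUM,
    CASE
        WHEN ORDDAT > 0 THEN
            CONVERT(date, CAST(19000000 + ORDDAT AS varchar(8)), 112)
        ELSE NULL                  -- zero conventionally means "no date"
    END AS OrderDate
FROM Staging.dbo.Orders;
```

Guarding the zero case matters: legacy systems often use 0 as a sentinel, and converting it blindly produces an invalid date and fails the load.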


The SSIS package references the linked server, performs the data manipulation steps, and loads the results into tables in the dedicated SQL database. The package is published to the SQL Server environment and can be run on demand by an administrator. I then automated this extraction and transformation by creating a SQL Server Agent job that schedules execution of the SSIS package.
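A minimal sketch of such an Agent job, scripted in T-SQL rather than through the Management Studio UI. The job, package, schedule, and server names are placeholders for illustration:

```sql
-- Create a weekly Agent job that runs the published SSIS package.
USE msdb;

EXEC sp_add_job @job_name = N'Load AS400 Staging Data';

EXEC sp_add_jobstep
    @job_name  = N'Load AS400 Staging Data',
    @step_name = N'Run SSIS package',
    @subsystem = N'SSIS',
    @command   = N'/SQL "\AS400Extract" /SERVER "MYSQLSERVER"';

EXEC sp_add_schedule
    @schedule_name          = N'Weekly - Monday 5am',
    @freq_type              = 8,        -- weekly
    @freq_interval          = 2,        -- Monday
    @freq_recurrence_factor = 1,
    @active_start_time      = 050000;   -- HHMMSS

EXEC sp_attach_schedule
    @job_name      = N'Load AS400 Staging Data',
    @schedule_name = N'Weekly - Monday 5am';

EXEC sp_add_jobserver @job_name = N'Load AS400 Staging Data';
```

The `@command` string uses dtexec-style arguments, which is how the SSIS subsystem invokes a package stored on the server.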


Now that I had the formatted data in my dedicated SQL Server database, I was ready to process it for use with the PowerBI tools. To do this, I needed to create a data model using SQL Server Analysis Services (SSAS) that the PowerBI tools could consume.

Using SQL Server Data Tools (SSDT), I created a tabular data model that references the dedicated SQL Server database as its source. Using Data Analysis Expressions (DAX) to define measures, I designed the model to perform the calculations needed to present the data in its final form in the PowerView dashboards. Once the data model was compiled and published to SQL Server, I automated the processing of these calculations on a schedule with a SQL Server Agent job (very similar to the agent job created for the SSIS package).
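One way to automate that processing is to add a step to the Agent job that uses the Analysis Services command subsystem to issue an XMLA `Process` command against the model. The job, server, and database names below are placeholders:

```sql
-- Sketch: an Agent job step that reprocesses the tabular model after each load.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Load AS400 Staging Data',
    @step_name = N'Process tabular model',
    @subsystem = N'ANALYSISCOMMAND',
    @server    = N'MYSSASSERVER\TABULAR',
    @command   = N'<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessFull</Type>
  <Object>
    <DatabaseID>AS400Model</DatabaseID>
  </Object>
</Process>';
```

Adding the processing step to the same job as the SSIS load guarantees it runs only after the staging tables are refreshed, so the model never processes half-loaded data.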


Finally, I used SharePoint Server 2013 as a central location to access the PowerView reports displaying the data; this required installing the PowerPivot add-in for SharePoint.

Before I could create a PowerView report in SharePoint, I needed to create a data connection to the SSAS data model built in the previous step. Since I'm hosting these reports in SharePoint, I used the Microsoft BI Semantic Model for Power View connection type when creating the data connection.


Once the data connection was created, I was able to create the PowerView report by clicking 'Create Power View Report' in the menu for the data connection.


This creates a blank PowerView report and opens the report editor, ready for designing the report. At this point, I was able to build the different views I needed using the PowerView designer.


As you can see, this process brought together a number of Microsoft technologies to provide a scalable, automated way of displaying legacy data in SharePoint using PowerBI. SQL Server Integration Services and Analysis Services provide the ETL capabilities and calculation power for preparing the data for consumption by PowerBI. Users viewing the PowerView reports can be confident the displayed data is fresh, since the reports query the Analysis Services model, which is kept up to date by SQL Server Agent jobs.



John Adali

Senior Software Developer - Modern Applications