
T-SQL Tuesday Retrospective #005: Reporting


A few weeks ago, I began answering every single T-SQL Tuesday from the beginning. This week it’s the fifth entry, and on April 5th, 2010, Aaron Nelson invited us to write about reporting.

You can visit the previous entries here:

I used to work at a bank in South Africa in the Corporate Business Intelligence department, previously known as Management Reporting. Our job was to build and maintain a Business Intelligence (BI) environment for senior management, as well as create online tools for business banking employees. Charts, graphs, cubes, you name it, we did it. I even managed to automate a weekly PowerPoint slide deck using Visual Basic for Applications.

Here’s a little secret: our reporting numbers were allowed a variance of 3% due to rounding errors. You see, while we innovated with KPIs (key performance indicators) and web dashboards and Kimball data warehousing and Analysis Services cubes, our source data mostly came from the workhorse tools of corporate environments everywhere: the mainframe, Microsoft Excel, and Microsoft Access. And while we used bankers’ rounding and four decimal places, a lot of our data was recorded as floating-point values or truncated to two decimal places, so there was a margin for error.
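The mismatch is easy to demonstrate. Here’s a small Python sketch (purely illustrative — we weren’t using Python at the bank) showing bankers’ rounding at four decimal places, and the drift you get when source figures live in binary floating point:

```python
from decimal import Decimal, ROUND_HALF_EVEN

# Bankers' rounding ("round half to even"): a trailing 5 rounds towards
# the even neighbour, so upward and downward roundings tend to cancel
# out over many records instead of accumulating in one direction.
assert Decimal("2.34565").quantize(Decimal("0.0001"),
                                   rounding=ROUND_HALF_EVEN) == Decimal("2.3456")
assert Decimal("2.34575").quantize(Decimal("0.0001"),
                                   rounding=ROUND_HALF_EVEN) == Decimal("2.3458")

# Source data held as binary floats drifts before any rounding happens:
print(sum([0.1] * 10))  # 0.9999999999999999, not 1.0
```

Aggregate enough records recorded that way and a small per-row error becomes a visible variance in the totals — hence the 3% allowance.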

But let’s get back to that bank job, and one of my favourite projects in the two years I worked in one of the largest banks in South Africa. The year was 2007, and SQL Server 2005 had been released with lots of fanfare in the previous year. Our department was tasked with migrating or replacing the entire SQL Server 2000 infrastructure — as well as the management reporting tools — with SQL Server 2005 including Analysis Services, Reporting Services, and ASP.NET.

One of my colleagues had an extremely complicated process that needed automation, and she was happy for the help to get it off her plate. In the first few days of the month she would receive Excel spreadsheets by email, and would import those manually into an Access database she put together. Then, after running a series of queries to aggregate the data, she was able to produce reports, which also had to be in Excel. (Anyone reading this and wondering why they had to be in Excel has never worked at a bank.) The whole process would take as much as two weeks every month, with a lot of waiting in between.

For about three months she and I sat together on this project. We literally had our desks next to each other, and through luck (and talking to the right people in IT) we each got high-performance desktop workstations for running SQL Server 2005, and on more than one occasion (I’m very hard on my computers) I had to replace a drive.

With her institutional knowledge and several years’ experience in creating these Excel documents, my colleague was a walking functional specification document, so at least I didn’t have to do any requirements gathering. She knew the business, she knew the format of the reporting files, the Access database was her design, and I knew databases. This was the project on which I brushed up my Visual Basic for Applications knowledge, and discovered that if you turn off screen updating, Excel populates cells much faster. Yes, dear reader, we were updating spreadsheets cell by cell, and then using some pretty nifty dynamic formulas to produce charts and graphs for a number of managers.
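For anyone who hasn’t seen the trick, it boils down to suspending repainting (and, usually, recalculation) while you write. A minimal VBA sketch — the workbook, sheet name, and loop are hypothetical, not our actual report code:

```vba
Sub FastFill()
    ' Suspend repainting and recalculation while writing cell by cell,
    ' then restore both. Without this, Excel redraws and recalculates
    ' after every single cell write.
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual

    Dim r As Long
    For r = 1 To 10000
        Worksheets("Report").Cells(r, 1).Value = r
    Next r

    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub
```

The difference is dramatic precisely because cell-by-cell writes are so chatty; every write otherwise triggers a screen refresh.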

We experimented with some fancy SQL Server tricks as well, and this is where the bulk of my early experience with performance tuning comes from. When you work this closely with queries, you can tell how fast something is, and can feel the difference between 100 and 200 milliseconds.
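To actually measure differences that small, SQL Server can report timings for you. A hedged T-SQL example — `SET STATISTICS TIME` and `SET STATISTICS IO` are real commands (available in SQL Server 2005), but the table and columns here are made up for illustration:

```sql
-- Report CPU time, elapsed time, and I/O for the query that follows.
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

-- Hypothetical monthly aggregation; names are illustrative only.
SELECT BranchCode, SUM(Amount) AS MonthTotal
FROM dbo.MonthlyFigures
GROUP BY BranchCode;

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```

The timings appear in the Messages tab in Management Studio, which makes before-and-after comparisons of a tuning change straightforward.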

The automation project was completed in about three months, at which point we had managed to reduce the runtime of the process from two weeks to four hours, roughly 20 times faster in working hours. Considering that we could produce dynamic Excel reports using VBA and then put them in SharePoint for the managers to pick up, this was a significant win.

With accurate reports, even if the numbers are slightly stale, organizations have a much better idea of whether they’re moving in the right direction. We data professionals may roll our eyes at some of the phrases we hear from senior management, but there’s no substitute for the unmitigated joy of our colleagues when we produce value from raw data in a matter of hours, maybe even minutes, in a consistent and reliable manner. While this post is an homage to reporting, it is also a good reminder that automation plays a key role in making reporting timely and efficient.

Share your thoughts in the comments below.

Photo by Markus Winkler on Unsplash.