Back in the late seventies, the very first company I worked for, a leading cosmetics company, noticed that when they raised the retail price of their perfume, sales increased and they made more profit.  This appeared counter-intuitive, but what seemed to be happening was that the product appealed to the well off and carried a brand image that only the ‘well to do’ could afford.  So as the price rose, the perfume became more attractive to those customers.  Back then we didn’t have the analytical software that is available today; instead companies relied on employees understanding their business, or on simply stumbling upon these kinds of trends in their data.  So what is my point?  Well, most companies would love to be in a position to find valuable insights into their business performance.  Today we have the capability to store huge volumes of historical data, and the software to analyse and process it.  Back in the seventies, 20 MB of disk space was expensive and considered a huge investment.  How the world has changed.

So what is this journey to data visibility?

Very small businesses are intimately in touch with the day to day running of their operations.  Most of the data associated with running the business can be held in a simple spreadsheet, or even in the owner’s head.  Life is relatively painless and the company’s day to day operations can be easily managed.

However, as businesses grow and expand, taking on new employees and doing more business, a new approach to managing the day to day operation is needed.  Applications are bought to manage various aspects of the business.  Maybe you have implemented a simple accounting package and a payroll package.  Then comes an HR application and a CRM system, and before you know it the company’s data starts to become disjointed and resides in data silos.  The visibility of your data is pretty poor, and as for deriving any insights from it, forget it: it’s become too difficult.

So you need a tool to make your data more visible.

This is often where we engage with our customers.  We listen to the business pains and look for ways of using business intelligence tools to alleviate them.  Typically the business applications have some kind of limited reporting capability, but few allow you to bring in other data sources without exporting data to spreadsheets, which is itself a major pain and error prone.  We usually find our customers’ requirements are relatively simple at this stage in their journey: provide a reporting tool that is easy for the end user, has the ability to run ad hoc queries over the underlying data, and allows access to the data in a timely manner.  Oh, and by the way, can I use this tool over more than one of my applications?  At this stage it’s all about visibility of the application data within its silos.  We’re not really talking about proper Business Intelligence yet, just the ability to report on what is there so you have a clear view of what the business has achieved in the past.

So your company has invested in a new reporting tool and life is good.  Users are happy and see the benefits of a user-friendly and powerful reporting tool.  If your company has an IT department then they are happy too: users are able to write their own reports instead of bombarding IT for new reports every day, and expensive IT resources have been freed up to tackle the more complex matters within the business.

More and more users within your company are directed to the new reporting tool and adopt the technology to create their reports.  Then it happens… IT starts to receive complaints that the line of business systems are slowing down and investigates the cause.  They discover that the proliferation of database queries caused by the adoption of the reporting tool is sucking the performance out of the servers.  IT’s initial reaction is to stop the users from using the reporting tools so frequently, but the users push back, saying they can’t do without the reporting tools they have enjoyed over recent months and IT must find another solution to the issue.

The immediate response is to throw more hardware at the problem: a new server is purchased and the databases are copied to it daily.  Normal service is resumed.

This approach is generally a stopgap, a bit like taking a painkiller without understanding what is causing the underlying pain.  The line of business systems are no longer competing for valuable server power and return to their previous performance levels.  The users are happy too, because they can run their reports on the new server and their performance improves as well, since they are now only competing against themselves and not the line of business systems.

But is this really the answer?

Indeed, the issue of performance-hungry reporting applications goes away, but the underlying problem remains.  The databases supporting online transaction-style line of business systems such as ERPs are not designed with Business Intelligence reporting in mind.  They are designed for rapid retrieval of a small number of records to support a particular transaction or enquiry.  This has been fine up to now, as you have only been interested in operational reporting.  But the business is starting to ask for more.  Can I have reports that show trends over time?  Can I see who my top and bottom ten customers are by sales value and margin?  Can I see who my best-performing salesperson is and which customers they are servicing?  Can I compare my CRM data with my accounts data, and maybe bring in some HR data as well?  True Business Intelligence reporting involves joining data from many applications and can mean reading many thousands of records to produce a report over the whole company’s performance.  Not only are the line of business databases the wrong design for reporting, but because we have copied them to a separate server they are now static, with data only up to the point they were copied.  This isn’t normally a problem for most business intelligence reports, but those users who need real-time reporting now have an issue.  Our data is visible, but the information it contains isn’t.
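To make that difference concrete, here is a minimal sketch contrasting the two query patterns, using an in-memory SQLite table with invented names: the single-record lookup a transactional system is built for, versus the whole-table aggregation a BI report needs.

```python
import sqlite3

# Hypothetical illustration: the same table serving two very different
# query patterns. Table and column names are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer TEXT,
    amount   REAL)""")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "Acme", 100.0), (2, "Bray", 250.0),
     (3, "Acme", 175.0), (4, "Cole", 50.0)])

# Transactional (OLTP-style) query: fetch one record by its key --
# a fast, indexed lookup touching a single row.
single = conn.execute(
    "SELECT customer, amount FROM orders WHERE order_id = 2").fetchone()

# BI-style query: scan and aggregate every row to rank customers by sales.
top_customers = conn.execute(
    """SELECT customer, SUM(amount) AS total
       FROM orders GROUP BY customer
       ORDER BY total DESC""").fetchall()
print(single)
print(top_customers)
```

On four rows both are instant, but the second query must read the whole table; on a live system with millions of order rows, many users running queries of that shape are what starve the transactional workload.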

So where do you go from here?

The next logical step is to build a reporting database containing the data required for reporting, in a format that supports rapid retrieval.  You may have heard of the term Data Warehousing and thought it sounds too expensive and too complex for you.  It needn’t be either.  If you have already been on the journey above, then you are in a good place because you understand the underlying data.  What is needed now is to work out which data is actually required to support the data warehouse, so you can extract the relevant parts.  Simple?

Well, yes and no.  Data warehouse design is a specialist area and something that requires care during development to avoid those unforeseen “bear pits”.  The accepted design of a data warehouse is based on a dimensionally modelled approach involving Facts (the transaction values) and Dimensions (the structures you wish to report by).  I’m not going to go into depth here as to how it’s designed, other than to say a good design allows the data warehouse to support the timely production of all manner of reports, including those that are difficult to produce, such as trends, key performance indicators and scorecards.
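As a purely illustrative sketch, with invented table and column names, a tiny dimensional model might look like this: one fact table holding the transaction values, joined to a dimension table holding the structure you want to report by.

```python
import sqlite3

# A minimal dimensional-model sketch. All names here are invented for
# illustration; a real warehouse would have many dimensions (date,
# product, region...) around each fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    name   TEXT,
    region TEXT);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    sale_date    TEXT,
    amount       REAL);
""")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                 [(1, "Acme", "North"), (2, "Bray", "South")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, "2024-01-05", 100.0), (1, "2024-02-10", 150.0),
                  (2, "2024-01-20", 200.0)])

# Reporting query: slice the facts by a dimension attribute (region).
rows = conn.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.region ORDER BY d.region
""").fetchall()
print(rows)
```

The point of the shape is that any attribute on a dimension (region here, but equally salesperson, month or product group) becomes something you can group and filter the facts by without restructuring anything.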

So now you have your design, you need to populate the data warehouse.  This is normally achieved using an ETL (Extract, Transform and Load) tool such as SQL Server Integration Services (SSIS) from Microsoft or Data Manager from IBM (other brands are available on request).
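To show the idea without either product, a toy ETL pass can be sketched in a few lines of Python; the source records, field names and transformations here are all invented for the example.

```python
import sqlite3

# Toy Extract-Transform-Load pass, sketched in plain Python rather than
# SSIS or Data Manager. Pretend these dicts were extracted from a
# line of business application.
source_rows = [
    {"order_id": 1, "customer": " acme ", "amount": "100.0"},
    {"order_id": 2, "customer": "Bray",   "amount": "250.0"},
]

warehouse = sqlite3.connect(":memory:")
warehouse.execute(
    "CREATE TABLE fact_sales (order_id INTEGER, customer TEXT, amount REAL)")

# Extract each record, transform it, then load it into the warehouse.
for row in source_rows:
    cleaned = (row["order_id"],
               row["customer"].strip().title(),  # transform: tidy the name
               float(row["amount"]))             # transform: cast to number
    warehouse.execute("INSERT INTO fact_sales VALUES (?, ?, ?)", cleaned)

loaded = warehouse.execute(
    "SELECT customer, amount FROM fact_sales ORDER BY order_id").fetchall()
print(loaded)
```

Real ETL tools add the parts this sketch ignores, such as scheduling, incremental loads, error handling and lookups against the dimension tables, but the extract, clean, load rhythm is the same.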

The data warehouse is now being populated, but our key message is that you need to make the data visible to the end user.  It must be easy to understand, accurate and timely.  So how can you make the contents visible to the end user without making things technical and complicated?  Most users don’t want to code SQL statements or learn complicated scripting languages to produce a report.  This is where a tool like Cognos comes into its own.

I have resisted beating the Cognos drum until now, but I make no apology for doing so at this point.  At Triangle we are passionate that Cognos is the best tool on the market today, and I have built the business around its use.  The key to a successful BI implementation is to make the software work for the user, not the user work for the software.  It’s what Triangle is about; it’s our ethos.  Cognos provides a metadata tool that eradicates the need for users to write SQL statements or complicated scripts.  Once the metadata has been defined, the user’s reporting experience is one of drag and drop.  Underlying database names are replaced by relevant, easy-to-understand names, and the way the tables are joined is all taken care of in the model.  The reporting engine generates the necessary SQL code so you don’t have to.

You now have a data warehouse which is your trustworthy single version of the truth. Users are running their reports and analysing business performance but the business is moving on and needs more.

You can now see what you have done, and you can probably explain why it happened, but how do you measure this against what you planned to do?  Well, you need to compare your actuals against your budget or forecast.  So where is this information held?  Typically in a myriad of complex, linked spreadsheets maintained by the Finance department and available only to a small percentage of the user community.  Making the forecast data visible may already have formed an earlier part of your journey: typically business reporting starts either in Finance or in Sales, but eventually both areas of the business get involved, along with others such as HR and Production.

You may have imported the relevant budget and forecast values from those spreadsheets into the data warehouse, or the data may already have been input into your ERP system and extracted to the data warehouse as part of the ETL process.  Whichever way it is made available, the issue is maintaining these forecasts and the spreadsheet hell that normally surrounds their upkeep.

Forecasting, budgeting and modelling are a well-recognised part of the Business Performance Circle, and within the Cognos portfolio there is a tool to address this business pain.  Cognos TM1 provides finance users with a structured way to collect and model budget and forecast data.  The resultant TM1 data is easily accessible to the Cognos reporting tools and can be used for forecast and budget comparisons.

Your journey to complete data visibility is moving on.  You have good visibility of the past performance of your business.  You are able to plan based on past performance, create forecasts and budgets for the company’s future financial periods, and report against these, giving the company real insight into its performance against plan.  The business has moved significantly along its journey to complete data visibility and is now considering what its Key Performance Indicators are.  This is something the business has to understand: which metrics are important to the efficient and profitable running of the business?  Most companies turn to management consultants for help in this area, and it can be a long process.  But because you have established an accurate source for your business reporting, i.e. the data warehouse, you are in good shape to respond to the business requirements.  The Cognos tools you have invested in provide all the functionality required for this stage of the journey.

So have you achieved the “Holy Grail” and reached the end of your journey for complete visibility?

At the beginning of this blog I mentioned my experience back in the seventies, and that today we have the tools to discover those important nuggets of information in our sea of data.  Of course this level of visibility may have been combined with some of the earlier levels, but typically companies concentrate on what I like to call “rear view mirror” reporting, along with planning and forecasting, to learn useful insights from the past.  However, history does repeat itself, so it’s worth taking note.

Once you understand what has happened, you can ask why it happened and what you can learn from the experience.  So you have analysed the view in the mirror, you have created your forecasts, and from these you are able to measure your success, or otherwise, against your plans.

But what if you could analyse all this data and predict the future?  Wouldn’t that be useful?  Well, the final segment of the business performance circle is exactly that: predicting the future based on statistical analysis of the past.  If you know what has happened, and why, then you have a good chance, via clever algorithms, of predicting what may happen should you execute your plans.  IBM Cognos uses its SPSS tools to provide this functionality, and because these tools are in the same brand family you are able to use the reporting tools to expose the results and distribute them around the business.
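As a deliberately simplified stand-in for the statistical models a tool like SPSS provides, the following sketch fits a straight-line trend to some invented monthly sales figures and projects the next period.

```python
# A toy predictive model: ordinary least-squares trend fitted to past
# monthly sales, projected one month ahead. The figures are invented,
# and real predictive tools use far richer models than a straight line.
months = [1, 2, 3, 4, 5, 6]
sales  = [100.0, 110.0, 125.0, 130.0, 145.0, 150.0]

n = len(months)
mean_x = sum(months) / n
mean_y = sum(sales) / n

# Least-squares slope (sales growth per month) and intercept.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(months, sales))
         / sum((x - mean_x) ** 2 for x in months))
intercept = mean_y - slope * mean_x

# Project the fitted line forward to month 7.
forecast_month_7 = intercept + slope * 7
print(round(forecast_month_7, 1))
```

The principle is the same at any scale: the model is only as good as the history it is fitted to, which is why the accurate, consolidated data warehouse built earlier in the journey matters so much here.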

The journey’s end.  You have full visibility of your data.  There is a single version of the truth, stored in a way that is easy to access, accurate and timely.  From that data you are able to create business plans and predict how those plans may turn out, based on past performance.  The data is accessible across the whole enterprise in a secure way and is available wherever it is required, on any device, around the world.  Your enterprise is truly empowered to make informed decisions.

At Triangle we are passionate about Business Intelligence and the Cognos family of tools.

Let the Cognos software give your business control and let Triangle help you on the journey to true data visibility.

Author Bio

Chris Lewis

Chris is on a mission to further expand the business, consolidating Triangle’s position as a key player in the market.  One way he sees to achieve this is by offering clients ways to do more for less.