Siemens builds out a single source of truth through quality data via Tableau

source link: https://diginomica.com/siemens-builds-out-single-source-truth-through-quality-data-tableau


By Stuart Lauchlan

November 22, 2021


One of the main goals for us is just to trigger a mental switch in our users, from old-school BI to a modern one.

It’s an interesting mission statement articulated by Ilya Kovalenko, Data Analyst and Visualisation Team Lead at Siemens IT, and one that perhaps points to an enduring organizational problem - data hoarding and data greed. People want to own their data and they want more and more of it, which can become an issue if they then end up ‘drowning’ in data, unable to use it for effective decision-making.

Or as Kovalenko puts it:

Still the majority of people only trust the data they have at hand - ‘It’s my data, it's my calculations, it's my logic’. We would like them to start to trust published and validated data.

Tableau transformation

To achieve that, Siemens has been on a transformation journey towards advanced analytics and self-service reporting with Tableau, a journey that for Kovalenko began accidentally in 2015 when he was on the hunt for a reporting tool that fitted his needs better than Excel did:

I wanted to find something more flexible and scalable. I found several tools of course, but I liked Tableau from basically the first glance because I was able to deliver the first report in 15 minutes, which was really fast, without any training. Since that day, I started to work on different projects in Siemens, helping our colleagues to deliver reports.

Siemens IT is an internal service provider to the organization and as such there’s an ongoing need to provide its customers - who are also colleagues - with reliable data about the available IT services they consume and their resulting costs.  That’s not always been the case, admits Kovalenko:

In the past there were a lot of frequent issues, misunderstandings and discrepancies about the performance of the services - how much they are used, how much they cost - questions about KPIs, about SLAs. Basically every person you ask will have a different number. You may ask one person about one KPI, he will name you one number; the other will name you a second number.

To tackle this, the firm introduced its IRAM service - Integrated Reporting and Analytic Metrics - which uses Tableau as a visualization platform. IRAM services are used at various management levels, mainly by financial and technical controllers, across all Siemens operations around the globe, he explains:

We deliver quality-proven reports. Our reports include not only financial data, but service performance data, service level agreements, volumes, consumptions, and much more. Our users are not developers, just typical users who like to go to the report, filter something and get to know the information they're looking for. Their goal is usually to check IT service processes and understand their stated costs, see trends, do housekeeping, budgeting, improving the services and more. We provide basically full transparency of the cost flow, as well as the performance data.

You can find different topics in our reports for IT, starting with clients and mobile devices, network bandwidth, printers, incident management, demand management topics and many more. And because we have data which is validated and approved by the service owners, we deliver fact-based KPIs so people can make data-driven decisions.

This all requires trust in the data, of course, and to create and reinforce that trust, a lot of effort goes into data quality checks, he adds:

We validate the data in our data lake against what is available from our providers and services. We try to compare it and check it's okay. If some delivery fails from the provider or if something is wrong, we try to fix it, of course, and tell the provider or fix it on our side. The use of comprehensive analysis brings new insights into IT service businesses and enables fact-based decisions, with the possibility of continuously optimizing processes. We are constantly improving and learning. We deliver state-of-the-art reports. We are trying to embed some analytics there to help our users make data-driven decisions.
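The kind of check Kovalenko describes - comparing what a provider delivered against what landed in the data lake and flagging discrepancies for follow-up - can be sketched in a few lines. This is purely illustrative: the metric names, tolerance value and function are assumptions for the example, not Siemens' actual validation logic.

```python
def find_discrepancies(provider_data, lake_data, tolerance=0.01):
    """Return (key, provider_value, lake_value) rows that disagree beyond
    the tolerance, plus keys missing on either side."""
    issues = []
    for key, provider_value in provider_data.items():
        if key not in lake_data:
            issues.append((key, provider_value, None))            # missing in lake
        elif abs(provider_value - lake_data[key]) > tolerance:
            issues.append((key, provider_value, lake_data[key]))  # value mismatch
    for key in lake_data:
        if key not in provider_data:
            issues.append((key, None, lake_data[key]))            # missing from provider
    return issues

# Hypothetical monthly delivery vs. what the data lake currently holds.
provider = {"network_gb": 1200.0, "printer_pages": 53000.0, "incidents": 42.0}
lake = {"network_gb": 1200.0, "printer_pages": 52100.0}

for key, p, l in find_discrepancies(provider, lake):
    print(f"check {key}: provider={p} lake={l}")
```

Each flagged row is then a concrete item to raise with the provider or to correct on the consuming side, rather than a vague sense that "the numbers don't match".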

The journey

All of this started back in October 2016 with the setting up of the centralized data lake to which Kovalenko refers, which delivered the first commercial reports. By 2017, several infrastructure-related reports were being produced, before in 2018 the entire project was deployed as a complete service. By the end of 2019, the potential capabilities offered by Tableau were being further explored. Kovalenko says:

We were very intrigued by the new analytical possibilities of Tableau - natural language processing and queries - and started our internal evaluation of what might help us in the future. We had several proofs of concept. The output of this research was the introduction of Ask Data in IRAM, for several special users at first, in order to get their feedback about the usability and value.

A really important date in this data journey came in October 2020 when Siemens was split into Siemens itself and Siemens Energy. This brought some fresh challenges, notes Kovalenko:

We had to continuously deliver the reports to both companies. However, we had to implement data separation, so the employees from Siemens Energy cannot see the Siemens data and vice versa.

This was achieved through a combination of ServiceNow as a platform, the data lake and Tableau's built-in capabilities for row level security, he adds:

We could deliver one report for one topic, and each company is allowed to see only its own data. It was very important for us that this solution be scalable, so if we have future acquisitions, or a due diligence process, we can easily adjust who can see which data.
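The idea behind row-level security - one shared report, with each user seeing only the rows their company is entitled to - can be sketched as below. Tableau implements this with user filters and entitlement tables rather than application code; the user names, entitlement mapping and rows here are illustrative assumptions only.

```python
# Hypothetical entitlement table: which company's data each user may see.
USER_ENTITLEMENTS = {
    "alice@siemens.example": {"Siemens"},
    "bob@siemens-energy.example": {"Siemens Energy"},
}

# One shared report dataset, with every row tagged by owning company.
REPORT_ROWS = [
    {"company": "Siemens", "service": "network", "cost": 120.0},
    {"company": "Siemens Energy", "service": "network", "cost": 80.0},
    {"company": "Siemens", "service": "printing", "cost": 15.0},
]

def visible_rows(user, rows):
    """Filter the report down to rows the user's company entitles them to."""
    allowed = USER_ENTITLEMENTS.get(user, set())
    return [row for row in rows if row["company"] in allowed]

# Same report, two audiences: each company sees only its own data.
print(visible_rows("alice@siemens.example", REPORT_ROWS))
print(visible_rows("bob@siemens-energy.example", REPORT_ROWS))
```

The scalability Kovalenko mentions falls out of the design: adding a company after an acquisition means adding entitlement entries, not building a second copy of every report.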

As of today, Kovalenko’s team has produced more than 300 certified workbooks and boasts more than 2,500 active users, who consume more than 50,000 report views each month. Servicing this broad constituency means there’s a constant drive to keep reports simple yet understandable for all users, with high data quality.

Then there are those ongoing challenges around data ownership. Kovalenko explains:

The majority of people still like to play with data themselves. So what a lot of people are still doing is, they start with a report, maybe filter something, and end up downloading the data to CSV; then they load it into Excel and start to pivot everything. As a team and as a service, we do not see much of a problem there, unless they need a custom report with up-to-date data and have to repeat this work every day, week or month, depending on their goal. If someone requires a custom report, it definitely takes time for preparation, compilation, confirmation of logic and documentation if we need to release it for thousands of users. This is one of the biggest obstacles we have.

But the journey continues and there’s clearly a lot of success that can be pointed at to date, as Kovalenko notes of his users:

They see that they can trust the numbers they see in our reports and that's why we're becoming the single source of truth for them.
