Those of us who have had first-hand experience with any form of data management know that the prospect of close-to-perfect data is a myth.
With the disparity in sources, media, size, format and so on, building a "perfect" database is not only a thankless task, but also one whose time and resource demands can overshadow its purpose.
Nonetheless, UNIPER has taken up the gauntlet and is actively looking to rein in the deadweight caused by sub-optimal data management.
This will be covered extensively at ETOT 2019, but ahead of the event we had the chance to speak to Volodymyr Sorokoumov, Senior Data Manager at UNIPER, about where their data journey stands right now.
Can you outline what your main responsibilities are at UNIPER right now?
I’m part of the data management team and I’m responsible for the UNIPER data analytics platform, in particular for its design and architecture, but also for the implementation of major advanced analytics and Big Data use cases. One of the largest use cases that we brought onto the platform this year was the Market Data Analytics Platform. It serves more than 150 users and contains 15+ billion data points across 270K+ time series and 250+ data sources. We can hardly find an area within Uniper that doesn’t leverage the data.
There were quite a few innovations, and quite a few challenges that we solved along this journey, that we would like to share with the industry, so it makes sense to present this as a separate case study.
"We didn’t have a holistic view on our data assets."
On the platform itself, my colleague Surya Ayyagari will also be presenting at the same venue.
Right now, across UNIPER, do you have any specific targets in terms of working with data?
Yes. We identified that the major obstacles for us were the number of data silos, a lot of redundancy, and the high cost of data ingestion and data integration. We were thinking about systems, applications and interfaces, and not about data. We didn’t have a holistic view of our data assets. That was 2.5 years back.
We took a step back and decided to build a Uniper-wide platform dedicated to data analytics that can host all of our data, scale to petabytes, and serve hundreds of users simultaneously - a place where we have a holistic view of all our data assets and can easily combine data and put it into the needed context. Basically, a place where we focus on the data and its value.
Often we need data close to real time - this happens mainly in Trading, Sales and Risk Management, but also with our assets, where we observe growth in IoT use cases. However, sometimes it’s not about speed but rather about quality and trust - think of Finance and Accounting. The platform is scalable and can support all our needs with respect to data quality, velocity, veracity and volume.
Another challenge we had to tackle was the cost of data ingestion and integration. The platform allows us to connect to new data sources and systems in a matter of hours, not days or months, and we are able to seamlessly integrate structured and semi-structured data. Our slogan is rapid data, any data. Regardless of where it comes from - internal or external - and in what format, we can bring it to the platform and make it available to all of our users. This is data democratisation at Uniper.
The platform allows us to build different use cases as part of the digital agenda, but also as part of efficiency increases in various areas and of application modernisation. For example, the Market Data Analytics Platform is part of the digitalisation roadmap, but there is also a strong business case for UNIPER behind it: it brings significant savings in IT costs for the business area as well.
How is it going in terms of getting rid of the silos? Has it worked so far?
Well, it has worked very well so far, because we had a clear vision, the right tools and the right approach. We have not tried to put all possible data on the platform - that would be quite ambitious and demanding in terms of cost. Instead, we have been pragmatic about which use cases and projects produce the most value for the company, and we focused on those first. This is truly think big, start small, scale fast!
"...the most important thing for us was to make people part of the journey through active inclusion."
We went to different business departments of the company, such as Sales, Generation and Trading, and really tried to get a holistic view of how the data was used and where value remained locked. What were the main challenges, and what were the potential synergies? Which silos could we break? We then put this on a roadmap, connected the dots, and identified the cases with the most redundancy and the highest potential for synergies. We started with those use cases. It was a really remarkable journey, because sometimes we came across six different departments or areas that were using the same data and doing the same exercise on their own, simultaneously. Driving this centrally for the whole of Uniper allowed us to finally get a holistic view of all our data assets, create a data roadmap and start delivering along it, focusing on cost versus information value.
The effort involved in bringing everything under the platform and standardising could cause a lot of disruption in some areas. Did you find that? If so, how did you manage people’s expectations and motivate them to tolerate that disruption?
Yes. The most important thing for us was to make people part of the journey through active inclusion. It was about empowering people. At the start, of course, we put significant effort into awareness and showcasing the platform’s benefits. The second stage was empowering people for the data transformation: involving them, helping them adopt the new approach to data and the new tools, and making them an active part of the community. We explained the concept to people, showed the potential benefits, and asked them to be an active part of it - it’s a win-win situation. Relatively quickly, people’s mindset changed.
Because we think about data, and not about systems and applications, we have a completely different governance concept than in a traditional set-up. We made the rules publicly available, and everyone who wants to use the platform has to consent to the terms and conditions. We made the roles and responsibilities for the platform clear, explained the governance concept, and actively drove community meetings, meetups, showcases and hosting days for the platform - a lot of internal marketing and promotion of the platform and the approach. This keeps it visible and accessible to people and reduces the hurdle to getting onboarded onto the platform.
See more about UNIPER's Data Lake project at ETOT 2019 - see the programme here.