Espresso 4.0 by Wizata
You have experience working at Shell, with a background in their IT department and in data analytics, so you certainly have some input that I find very interesting. So why don't we jump into it right now?
Road to more sustainable operations
Something that a lot of companies in oil and gas, mining, and steel manufacturing are working towards is more sustainable operations and production processes. Their focus is mainly on reducing CO2 emissions.
However, there are many other things they could be working on. In your experience, what other initiatives can manufacturers in these industries work on to be more sustainable?
I worked with Shell for a long time; my last role there was in data analytics. For the last couple of years, I managed a team of over 100 people, and we built all kinds of data analytics solutions. Inventory optimization is one of the more interesting ones, which we built over a couple of years with the data science team.
Sustainability through storage optimization
We started as a pilot, taking a small portion of the operational scope upstream. We looked at how we could ensure we have the right parts in stock. Stuff breaks, so you need to have parts available to replace it, but keeping a costly spare part at every plant might not be the best thing to do.
If the plants are very close together, we could keep just one for a couple of locations and then ship it over when necessary. There was a lot of thinking behind that, a lot of calculations, and some AI involvement, and it turned out very well: they were actually saving a couple of million dollars a year in inventory.
That's also good because it means you have to hold less inventory. You throw away less stuff. If you look at manufacturing sites worldwide, they have many spare parts everywhere. Sometimes you have to throw them away in the end because they're obsolete, you're not using them anymore, or they deteriorated while in storage.
From a sustainability perspective, that is not very good. It's much better to optimize that. And it makes a lot of economic sense, as I said: just a couple of million a year from that operation in one big area of Shell's upstream business.
Later, we expanded that to other areas in Shell using the same method. After that, I was less involved and started working on predictive maintenance.
Sustainability through predictive maintenance
So how do you predict when certain things break down so that you have enough spare parts close by? If a particular spare part breaks roughly every 1,000 hours, then by hour 900 you should make sure you have a spare part close by. There's a lot of waste in terms of inventory, so how can you optimize that? And not just in manufacturing; this is something that works everywhere in logistics.
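To make that threshold rule concrete, here is a minimal sketch in Python. It is purely illustrative and not Shell's actual model: the function name, the 1,000-hour expected life, and the 10% safety margin are assumptions standing in for whatever a real failure model and buffer would provide.

```python
# Minimal sketch of the reorder-point rule described above (illustrative only).
# The expected life and safety margin are hypothetical placeholder values.

def should_stock_spare(hours_in_service: float,
                       expected_life_hours: float = 1000.0,
                       safety_margin: float = 0.1) -> bool:
    """Return True once a part is within the safety margin of its expected life,
    i.e. the point by which a replacement should already be close by."""
    reorder_point = expected_life_hours * (1.0 - safety_margin)  # e.g. hour 900
    return hours_in_service >= reorder_point

# Example: a part expected to last roughly 1,000 hours
print(should_stock_spare(850))  # False - still comfortably within its expected life
print(should_stock_spare(920))  # True  - position a spare nearby now
```

In practice, the expected life would come from a failure model fitted on historical data rather than a fixed number, but the logic stays the same: act before the expected failure point, not at it.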
I had a conversation with the head of maintenance at a manufacturing company, and they were having a lot of problems with their inventory. They had problems with maintenance and a lot of spare parts, and the supply chain was clogged up.
They were getting the spare parts late, with a lag. And that was a huge cost because it extended the downtime, and the availability of their assets was sub-optimal.
The supply chain is a challenge in a lot of areas now. I saw that the sunflower oil shelf in a supermarket was not completely empty, but relatively empty. These are very trivial, small things you barely even notice as a consumer, but the impact for some companies is much wider.
Inventory optimization route
Of all the value you got from the project you launched at Shell, what were the value drivers that pushed you, and which ones did you only discover once you had actually launched and implemented the solution?
This project started before I joined the team, so I don't know exactly how it came about, but I think the value driver was interesting: there are a lot of expensive spare parts, and if you can reduce the amount you have to keep in stock, it improves your cash flow.
One thing that came out of this whole discussion is the element of predictive maintenance, which they actually discovered through the inventory optimization route by collecting all the data together.
A lot of that was already in SAP systems and the like, but by analyzing it, you find out where the bottlenecks are. And they can be trivial in terms of process. Sometimes things just get stuck because somebody changed a rule, but nobody changed the process to match.
You can figure that out with a different technique, process mining, which is related but different; in inventory optimization, you find the same kind of thing. And the other interesting thing is that predicting when things break works very well, much better than I would have thought.
I would have thought there would be much more spread in how long parts last, but it can be predicted quite accurately. You have to keep a bit of a buffer, but it works very well.
Power of data-driven process
While working at Wizata with our clients, one of the biggest value drivers in their digitalization, in terms of first initiatives, is predicting when their machines will fail. Not only do they save on inventory, but they also save on asset health and lifespan. Production keeps going, and they can plan the maintenance, plan to order the spare parts ahead, and so on.
And all of that really depends on the data. There has undoubtedly been a lot of data involved in the projects you've worked on. What is the value of that data, and what are the best practices for managing it so that you get value out of it?
Yeah, there is a lot of data, and you generally find that it sits in many different systems. If you look at companies like Shell and many companies in a similar position, there's no single SAP system or single ERP system. Many, many companies have a lot of different ones, and that happens for a variety of reasons.
In the past, every country did its own thing. You have mergers, divestments, and different business divisions taking different paths from time to time. At a certain moment, we had various flavors of ERPs in Shell for various reasons. That is becoming more standardized now.
Data governance
Putting it all together and standardizing the data requires people who understand both the application and the business perspective of data. It doesn't help to have just IT folk. You need business folk to help understand the data.
In the last few years, there has been a lot more emphasis on data governance at the group level in various companies, Shell being one of them: how to make sure that you all talk the same language, ideally as a company, or, if that is a step too far, then at least within a division.
Upstream should talk the upstream language. This is not a trivial exercise; there's a lot of work to get it right, but you have to get it right to maximize the value of the data. You can sometimes get some value without getting the data governance fully right, but you may have to correct that later on.
The problem of data governance and data scattered across different sources seems to be a consequence of the fact that we're still in the very early days of data literacy.
Data standardization
Do you feel that there's going to be a shift, so that it will just be a given that whatever tech solution you get, your data should feed into some kind of centralized source of “truth”? Or do you think companies will be stumbling over this for years?
No, I think it will get better. I think there are a lot of initiatives going on in various industries. I know at least that, for instance, in the oil and gas industry, there has been a lot of standardization and agreements on how to name everything, how the data should be structured, et cetera.
Not everybody has adopted it yet, since it is very expensive to migrate all this data to new solutions. There will be a transition period. But I believe standardization will continue to grow and play an important role.
New solutions and business models are also continuously being developed and deployed. This means that there will always be some disparity in the data. So, if you want to pump data into a central place where you can analyze it and combine different things, translation will always be needed. And that's not necessarily bad.
We're not going to expect everybody to speak the same language, right? We'll always have interpreters, and that's fine as long as we have good-quality interpreters.
Something really helpful in that regard is API standardization and the way applications and data sources talk to each other. This is being adopted more and more by solution providers.
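To illustrate the "interpreter" idea from this answer, here is a minimal sketch of translating records from two differently shaped data sources into one canonical schema before central analysis. The source names, field names, and mappings are invented for illustration; this is not a real integration.

```python
# Minimal sketch of field-level "translation" between data sources (illustrative only).
# Source systems, field names, and mappings below are hypothetical.

FIELD_MAPS = {
    # source field -> canonical field
    "erp_a": {"MATNR": "material_id", "WERKS": "plant", "LABST": "quantity_on_hand"},
    "erp_b": {"part_no": "material_id", "site": "plant", "stock_qty": "quantity_on_hand"},
}

def translate(record: dict, source: str) -> dict:
    """Rename a source record's fields to the canonical schema, dropping anything unmapped."""
    mapping = FIELD_MAPS[source]
    return {canonical: record[field] for field, canonical in mapping.items() if field in record}

# Example: two differently structured records become directly comparable
print(translate({"MATNR": "PUMP-001", "WERKS": "NL01", "LABST": 3}, "erp_a"))
print(translate({"part_no": "PUMP-001", "site": "Plant-7", "stock_qty": 5}, "erp_b"))
```

The point is not the code itself but the design choice it reflects: rather than forcing every system onto one model, a well-defined mapping layer, ideally behind standardized APIs, acts as the interpreter.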
Technology innovation
Speaking of technology and innovation in the world, what is one of the cooler things you've seen lately in the field of new and innovative technology that really struck you?
It's not even that new or that innovative, but the speed at which mobile has conquered the world continues to amaze me, and how many companies, and also government organizations in the Netherlands, hook into this and make things suitable for mobile.
Digitalization within the government continues to amaze me, or, one level higher, the speed of digitalization in general continues to amaze me.
Filip Popov