In an increasingly automated world where decisions are based on the results of models and algorithms, the quality of the data fed into them is of paramount importance.
Data is a valuable asset
The realization that data is a valuable asset in itself is integral to effective digitalization. Good-quality data allows people and organizations to trust their data and analyses, apply robust machine-learning models, and improve revenue and business performance.
Data produced by algorithms and models can be a valuable resource, providing insight that our human limitations prevent us from seeing. But does this mean that such data is inherently reliable? Not quite.
Consider a gas generator on an oil rig which produces electricity for use on board. The generator is on a strict yearly or monthly servicing regime. Unfortunately, deterioration doesn't always coincide with these servicing intervals. This is where sensors to detect deterioration, so that it can be repaired, come in. Sensor technology can capture and respond to real-time physical effects such as deformation, vibration or temperature. Effects such as these are ongoing and can worsen considerably if they go undetected until a scheduled check.
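As a minimal sketch of how such a system might respond to real-time effects, the snippet below screens a single sensor reading against alarm limits. The field names and threshold values are illustrative assumptions, not actual generator limits:

```python
# Hypothetical sketch: flag deterioration between scheduled services by
# screening real-time sensor readings against engineering alarm limits.
# The limits and field names below are illustrative assumptions.

VIBRATION_LIMIT_MM_S = 7.1   # assumed vibration alarm level (mm/s)
TEMPERATURE_LIMIT_C = 120.0  # assumed temperature alarm level (deg C)

def screen_reading(reading: dict) -> list[str]:
    """Return the list of alerts raised by one sensor reading."""
    alerts = []
    if reading.get("vibration_mm_s", 0.0) > VIBRATION_LIMIT_MM_S:
        alerts.append("vibration above alarm limit")
    if reading.get("temperature_c", 0.0) > TEMPERATURE_LIMIT_C:
        alerts.append("temperature above alarm limit")
    return alerts

print(screen_reading({"vibration_mm_s": 8.4, "temperature_c": 95.0}))
# ['vibration above alarm limit']
```

In practice such checks would run continuously on streaming data, so a worsening trend is caught long before the next scheduled service.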
However, to rely on this sensor system, we must know that the data it produces is trustworthy. Untrustworthy data could mean a negative economic impact at best and a safety hazard at worst.
Therefore, we have developed a framework for data quality assurance.
The road to data quality for sensor-driven condition-based maintenance
For data to be valuable, you need to design for data quality, manage your data to meet business needs, and condition monitor your data just as you condition monitor any other asset. When we do this, we ensure the data we receive is dependable and robust, and we reduce costs because problems are identified early.
Earlier this year, DNV piloted a sensor-based hull condition monitoring methodology which enables cargo owners to monitor their assets in real time. Since the data provided must be trusted before being used in any decision-making process, we set up requirements for data quality, which were then converted into rules applied to the data sent from the sensors to determine whether it is fit for use.
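The idea of converting requirements into rules can be sketched as follows. This is a minimal illustration, assuming a time-series record with `timestamp` and `value` fields; the rule names and limits are hypothetical, not the actual rules used in the hull-monitoring pilot:

```python
# Sketch: turn data-quality requirements into executable rules that decide
# whether a sensor record is fit for use. Limits are illustrative assumptions.
from datetime import datetime, timedelta, timezone

def check_quality(record: dict, now: datetime) -> dict:
    results = {
        # Completeness: a value must be present.
        "complete": record.get("value") is not None,
        # Validity: the value must lie in a physically plausible range.
        "in_range": record.get("value") is not None
                    and -50.0 <= record["value"] <= 50.0,
        # Timeliness: the reading must be fresher than five minutes.
        "timely": now - record["timestamp"] <= timedelta(minutes=5),
    }
    # The record is fit for use only if every rule passes.
    results["fit_for_use"] = all(results.values())
    return results

now = datetime(2019, 10, 17, 12, 0, tzinfo=timezone.utc)
record = {"timestamp": now - timedelta(minutes=2), "value": 12.3}
print(check_quality(record, now))
# {'complete': True, 'in_range': True, 'timely': True, 'fit_for_use': True}
```

Keeping each requirement as a named rule makes the verdict explainable: when a record is rejected, the result shows exactly which requirement it failed.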
We focused on the essential practice of data quality assurance across industries in a webinar held on 17th October 2019.
In this webinar we discussed how a data management framework can help in clearly defining, assessing and continuously monitoring data quality for all critical systems and services. The topics covered apply across industries, but we focused on sensor-based time-series data with examples from the oil and gas and maritime sectors.
The framework is designed to help companies gauge whether they have the capabilities in place to manage their data so that it is of the highest possible quality.
The road to data quality for sensor-driven condition-based maintenance: watch the webinar recording