High-quality data is essential when analysing the performance of a vessel or a fleet of vessels. The more accurate the data, the more accurate the analyses and the better the decision support for the vessel's crew, the owner and the operator. Even though automatic data logging is available on many ships, the vast majority of shipping companies base their performance systems on manually logged data, or noon data. This way of logging is independent of the equipment integration on board, it includes a human "sanity check" of the data, and it is an efficient and cost-effective way of getting data from ship to shore.
If the process of acquiring data is well defined for the vessel's crew, if they receive proper training in acquiring a good data set, and if they get feedback from the system after entering data, manually logged data can add real value and quality to the data analysis.
By including the vessel's crew in the logging process, a better understanding of the term "high performance", and of which factors lead to it, is shared among the stakeholders in the operation of the vessel, i.e. the crew, the owner and the operator. All are included in the process, and results and experiences are shared within the shipping company, i.e. in the office and across the fleet.
Since noon reporting is a well-defined process in most companies, it is also important not to repeat the data logging in different reporting systems, but instead to build a well-consolidated logging platform that can distribute the relevant data to the relevant reporting system(s). This would greatly ease the administrative burden on the seafarer and limit the repetitive daily paperwork.
It could also be relevant to include automatic imports of data, e.g. from AIS or VDR systems, to increase the reporting frequency of certain parameters, and in addition to validate the reported data against external sources such as weather data providers. This could be done without adding workload for the seafarer and would add value to the analysis of the data.
The training of the seafarer is very important for data quality. The purpose of each reported parameter should be evident, and the processes should be well defined. This ensures a uniform data logging procedure across the fleet, and therefore a performance system that can serve as the foundation of a benchmarking tool and is not dependent on the person(s) who log the data.
In the data logging process and implemented in the software that defines the logging platform, a number of rules and validation checks on the data sets can be defined.
For each parameter, boundaries on the values should be defined, e.g. set by the vessel or the vessel model, and alerts should be raised when values outside the bounds are entered.
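As an illustration, a per-parameter bounds check of this kind could be sketched as follows; the parameter names and limits are hypothetical examples, not taken from an actual vessel model:

```python
# Hypothetical bounds per reported parameter: (min, max).
# In practice these would come from the vessel or its model.
BOUNDS = {
    "speed_kn": (0.0, 30.0),
    "rpm": (0.0, 120.0),
    "fuel_oil_consumption_mt": (0.0, 120.0),
}

def check_bounds(report):
    """Return a list of alerts for values outside their defined bounds."""
    alerts = []
    for param, value in report.items():
        if param not in BOUNDS:
            continue  # no rule defined for this parameter
        lo, hi = BOUNDS[param]
        if not (lo <= value <= hi):
            alerts.append(f"{param}={value} outside [{lo}, {hi}]")
    return alerts

# A speed of 35.2 knots exceeds the defined upper bound and triggers an alert.
print(check_bounds({"speed_kn": 35.2, "rpm": 90.0}))
```

The same rule library can then be applied uniformly to every incoming report, so each vessel is checked against the same definitions.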
External data can be used to validate manual input e.g. on weather parameters and on positions/speed.
The relations between the different parameters are defined by the model, e.g. the relation between speed/RPM/power and the fuel oil consumption, and out-of-bounds values are easily detected by comparing against model values.
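A minimal sketch of such a model-based check, assuming a simple propeller-law (cubic) relation between speed and fuel consumption; the coefficient and tolerance are illustrative placeholders for a calibrated vessel model, not an actual VPS model:

```python
def expected_foc(speed_kn: float, k: float = 0.005) -> float:
    """Expected daily fuel oil consumption (mt/day), cubic in speed: k * v^3.
    The coefficient k is an illustrative placeholder."""
    return k * speed_kn ** 3

def check_against_model(speed_kn: float, reported_foc: float,
                        tolerance: float = 0.15) -> bool:
    """Return True if the reported consumption is within `tolerance`
    (relative deviation) of the model's expectation."""
    expected = expected_foc(speed_kn)
    return abs(reported_foc - expected) / expected <= tolerance

# At 14 knots this toy model expects about 13.7 mt/day, so a report of
# 25 mt/day deviates far beyond the tolerance and is flagged.
print(check_against_model(14.0, 13.5))  # consistent with the model
print(check_against_model(14.0, 25.0))  # flagged as inconsistent
```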
Expected values (calculated, or trended for prediction) can be defined by setting up mathematical models and compared to the values calculated from the actual input.
If all the rules are defined in a library and all data sets are validated against the rules, a systematic overview of all the errors triggered across the fleet will be in place, with the error types ranked in the system. The overview shows the most common errors in the fleet and where to make an effort to improve the logging process. The system also ranks the vessels in the fleet with respect to which vessels have the most and the fewest errors in their reports. A feedback scheme should be set up for the vessels, including suggestions for improvements and possible updates of the logging procedures.
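The ranking of triggered errors per rule and per vessel can be sketched with a simple aggregation; the vessel names, rule names and error records below are hypothetical:

```python
from collections import Counter

# Hypothetical validation output: one (vessel, rule) pair per triggered error.
errors = [
    ("Vessel A", "speed_out_of_bounds"),
    ("Vessel A", "foc_vs_model"),
    ("Vessel B", "speed_out_of_bounds"),
    ("Vessel A", "speed_out_of_bounds"),
]

# Most common error types across the fleet -> where to improve procedures.
per_rule = Counter(rule for _, rule in errors)
# Vessels ranked by number of triggered errors -> where to target feedback.
per_vessel = Counter(vessel for vessel, _ in errors)

print(per_rule.most_common())
print(per_vessel.most_common())
```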
Figure 1 Warnings and Errors per rule and per vessel
To monitor the data quality, a KPI system should be set up based on the overall data quality in the fleet, with the possibility to drill down to vessel classes, trade routes and single vessels. Trends should be visible over user-defined time frames and customised for the purpose at hand. The system offered by Vessel Performance Solutions includes the previously mentioned features, and a case study from one of the users of the system is described in the following.
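As a rough sketch, such a KPI could be computed as 100 × (1 − error rate) and aggregated at fleet and vessel level; the formula, figures and names below are illustrative assumptions, not the actual VPS KPI definition:

```python
# Hypothetical report summaries: (vessel, vessel_class, checks_run, checks_failed)
reports = [
    ("Vessel A", "Panamax", 200, 30),
    ("Vessel B", "Panamax", 200, 10),
    ("Vessel C", "Capesize", 200, 40),
]

def kpi(checks: int, failures: int) -> float:
    """Data-quality score on a 0-100 scale, 100 being error-free."""
    return 100.0 * (1.0 - failures / checks)

# Fleet-level KPI over all validation checks.
total = sum(checks for _, _, checks, _ in reports)
failed = sum(failures for _, _, _, failures in reports)
print(f"Fleet KPI: {kpi(total, failed):.1f}")

# Drill down to single vessels.
for vessel, _, checks, failures in reports:
    print(f"{vessel}: {kpi(checks, failures):.1f}")
```

The same aggregation could be grouped by vessel class or trade route to support the drill-down described above.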
A fleet of 47 vessels is reporting to the VPS system. As of 1 November, the error rate in the logging at fleet level is around 13% and the overall KPI level is around 81 (where 100 is the optimum value).
Figure 2 Total number of errors (fleet) per month
Figure 3 Overall Data Quality for the fleet
These figures were noted by the Operational Fleet Management. In an effort to improve the data quality across the fleet, the management sent a message to the vessels highlighting the importance of proper data logging and of good data quality in the decision support processes for the operational management.
The message was sent out in November 2017 and followed up in February 2018. Already in December the data quality improved considerably, and by January 2018 the KPI reached a level of 97. The level has been kept high, and the focus on data quality has been maintained since then.
A similar experience has recently been observed for another large fleet of vessels using the validation engine of Vessel Performance Solutions. By keeping a focus on reported data quality and by including the vessel's crew in the performance monitoring process, it is possible to create a robust performance system based on manual reports. By including a feedback loop between all stakeholders in the process, the system becomes a powerful and dynamic tool for constantly improving the performance of the vessels in a fleet.
Vessel Performance Solutions joins SMM in Hamburg, September 4-7 2018. Our stand will be part of the Danish Pavilion in Hall B (Stand B1.EG.107). Please drop by if you plan to visit SMM.