PipelineML Tour

Problem Statement

Midstream pipeline operators are tasked with managing large volumes of constantly changing asset information. This data flows through the hands of numerous departments and service providers, all of whom use different software applications and systems over the life of those assets. These disparate systems allow each party to manage its workload and make decisions relevant to its core mission based on the best available data. Those decisions are only as good as that information is available, correct, and current.

Despite the frequency and complexity of information shared between parties, the pipeline industry lacks open standards for fully interoperable data interchange: the ability for information to be exchanged across disparate systems without conversion or data loss. Considerable time, effort, and money are spent converting and transforming this data to make it consumable by various data management systems. This conversion process also tends to introduce errors and to degrade the density and resolution of the data. The end result is that time and money are wasted constantly transforming data instead of being invested in open industry standards that allow data to flow between parties more easily, more quickly, and at lower cost.

Perhaps more importantly, these barriers to information sharing make it unnecessarily difficult for operators to maintain asset management systems that are Traceable, Verifiable, and Complete (TVC). Following major pipeline industry incidents such as the one in San Bruno in 2010, regulatory agencies have mandated that operators maintain TVC compliance in their asset data management practices and systems. Tracking every decision and activity performed on every asset throughout its life, across all involved parties, requires that their software and systems share data freely and easily. Not only must all decisions and activities performed on the assets themselves be shared; the same applies to all decisions and activities performed on the data about those assets. Operators need to make all such data transparent, discoverable, and deliverable throughout the organization. This cannot be fully achieved without open, vendor-neutral data interchange standards.
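To make the idea concrete, an open interchange standard expresses asset records in a vendor-neutral encoding that every system can read and write directly (PipelineML itself is GML/XML-based). The snippet below is a purely illustrative sketch: the element names and attributes are hypothetical placeholders for this discussion, not the actual PipelineML schema.

```xml
<!-- Hypothetical, simplified sketch of a vendor-neutral pipeline asset
     record. Element names are illustrative only, not the PipelineML schema. -->
<PipeSegment id="seg-1042">
  <material>Carbon Steel API 5L X52</material>
  <outsideDiameter uom="in">12.75</outsideDiameter>
  <wallThickness uom="in">0.375</wallThickness>
  <installedDate>1998-06-14</installedDate>
  <!-- Provenance fields travel with the asset data, supporting
       Traceable, Verifiable, Complete (TVC) record-keeping. -->
  <sourceDocument>mill-test-report-8841.pdf</sourceDocument>
</PipeSegment>
```

Because every party would read and write the same schema, a record like this could move between disparate systems without lossy conversion, and its provenance information would remain attached to the asset data throughout its life.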
