Governing Design Principles
The following are the governing design principles established by the PipelineML Standards Working Group (SWG) in the initial months of its formation:
- Practical – A governing principle for the design and development of PipelineML is the need to solve practical business problems. Rather than pursue the perfect architectural design, we want to develop and deliver a solution that can be understood and implemented across the broad spectrum of stakeholders in the pipeline industry.
- Flexible – The standard shall be flexible enough to adapt to change. This flexibility is not achieved at the cost of sacrificing the other design principles. Instead, flexibility shall be designed within the context of all these governing design principles.
- Fit-for-purpose – Each element of PipelineML shall be focused on solving real-world business challenges facing those on the front lines of managing pipeline asset data. To fulfill this principle, a specific set of use cases shall be identified that are considered to be most fundamental to the pipeline industry. Our scope shall be focused on achieving these use cases and establishing fulfillment of those use cases with predefined test cases.
- Interoperability – Providing interoperability is essential to the success of this data interchange standard. A packet of PipelineML-compliant data shall remain consistent regardless of the data storage model or software application that produces or consumes it. Any PipelineML data packet can be consumed by any other software application using any backend storage model, without concern for the platform or technologies used to produce or consume it.
- Unambiguous – To achieve vendor-neutral interoperability, any ambiguity must be removed from the data standard. If there is any room for interpretation of the meaning of any aspect of the interchange standard, variations may emerge in how vendors implement this standard. This would significantly dilute the value of this interoperable data interchange standard. As such, vigilance shall be applied to the process of removing ambiguity from the interchange standard.
- Spatial Accuracy – Spatial accuracy of the location of pipeline assets is essential to safe, efficient operations. Increasingly accurate GPS technologies are ubiquitous enough to facilitate a high level of resolution and spatial accuracy. The ability to accurately describe the location of assets via GPS is a critical governing design principle of this standards effort.
- Vendor Neutral – Standards are most effective when they do not favor any vendor solutions. By developing standards on a level playing field free of vendor favoritism, we create an ecosystem in which the needs of all stakeholders can be served. This fosters an environment of innovation. The users and managers of data can freely pick and choose the right software solutions to suit their needs with the assurance they can interchange data in and out of those systems in a seamless and integrated manner because of these open standards. Such an ecosystem rewards innovative solutions to business needs.
- Contextual Integrity – This data interchange standard endeavors to provide enough granularity and specificity to provide contextual integrity. The standard shall ensure that when a PipelineML-compliant data packet is received by any party, the integrity of the original context of that data is preserved.
- Extend & Embrace – We do not want to reinvent the wheel; where existing data interchange standards are adequate to address the defined business needs, we shall adopt them rather than define new ones. We shall utilize existing technologies such as UML, XML, XSD, GML, WMS, and WFS. By complying with these standards and extending them to serve the needs of the pipeline industry, we are building on a solid, mature foundation—thereby increasing our likelihood of success. We plan to utilize as many existing domain reference code lists as possible from such sources as ANSI, API, ASME, ASCE, etc.
- Rapid ROI – As much as possible, PipelineML shall be designed to provide rapid return on investment for adopters. This is one of the key reasons for the decision to develop this data interchange standard under the umbrella of the OGC. By building PipelineML on top of the existing GML standards suite, we immediately gain the benefits of compatibility with all software applications that support these well-adopted standards.
- Agile Lifecycles – Although we will not follow a true agile software development methodology, we want to work toward small incremental deliverables that solve practical needs. We will start with a small scope that focuses on solving the most common data interchange needs. Once we solve the simplest and most urgent set of needs, then we will cycle through the process again with a larger scope that considers the next most urgent set of needs within the pipeline industry. With each cycle, we expect to capitalize on lessons learned from previous iterations.
- Controlled Vocabularies – Removing ambiguity also requires the use of common domain reference code lists (also known as controlled vocabularies). This ensures that all parties mean the same thing when they define features, types and attributes in PipelineML-compliant data packages. In as much as possible, existing code list standards shall be leveraged for this purpose. The American Petroleum Institute (API) is recognized as one of the leading standards bodies in defining reference code lists. Other standards shall be assessed and considered for adoption based on the needs of invested stakeholders. Additional discussion is needed regarding the translation of controlled vocabularies into other languages and units of measure.
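The "Extend & Embrace" and "Controlled Vocabularies" principles above can be illustrated with a short GML application-schema fragment. This is a hypothetical sketch only: the element names, namespaces, and code-list URIs below are illustrative assumptions, not the actual PipelineML encoding.

```xml
<!-- Hypothetical sketch: names and URIs are illustrative, not the real PipelineML schema -->
<pml:Pipe xmlns:pml="http://www.example.org/pipelineml"
          xmlns:gml="http://www.opengis.net/gml/3.2"
          gml:id="pipe-001">
  <!-- Geometry reuses GML directly rather than inventing a new encoding -->
  <pml:centerline>
    <gml:LineString srsName="urn:ogc:def:crs:EPSG::4326">
      <gml:posList>35.4676 -97.5164 35.4689 -97.5121</gml:posList>
    </gml:LineString>
  </pml:centerline>
  <!-- A codeSpace ties the value to a controlled vocabulary (e.g. an API
       material code list), so producer and consumer mean the same thing -->
  <pml:material codeSpace="http://www.example.org/codelists/material">steel</pml:material>
  <pml:nominalDiameter uom="in">12</pml:nominalDiameter>
</pml:Pipe>
```

The pattern shown is the point, not the specific names: geometry and identifiers come from GML as-is ("embrace"), pipeline-specific properties live in a PipelineML namespace ("extend"), and coded values reference an external code list rather than free text, which is what removes ambiguity across vendors.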
The foundational technologies we have chosen to adopt are easy to understand and commonly accessible (in terms of the ability to acquire the needed skills affordably). Some of the foundational interchange technologies we evaluated in 2012 were found to be too academic and to involve obscure technology stacks; they were set aside to avoid an unnecessarily steep and expensive learning curve. Instead, we have selected the technologies that best satisfy these governing design principles.