Data Empowers Pipelines

In industrial environments, we are often told that information equals power. Recently, we’ve seen a dramatic rise in data collection along oil and gas pipelines. We’ve attached sensors to monitor flow, pressure, temperature, density and almost every other factor we can measure.

It’s now estimated that every 30,000 miles of natural gas pipeline generates 17 terabytes of data each day — more than the entire printed collection of the Library of Congress. With more than 305,000 miles of natural gas pipelines in the United States, we can just begin to understand the massive amount of data currently being generated. This will only continue to grow.
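To put those figures in perspective, here is a quick back-of-the-envelope calculation. It is only a rough illustration based on the numbers cited above, not a precise measurement.

# Rough estimate of daily U.S. pipeline data volume, using the figures cited above.
# Illustrative arithmetic only; both inputs are industry estimates.
TB_PER_DAY_PER_30K_MILES = 17            # estimated terabytes per day per 30,000 miles
US_NATURAL_GAS_PIPELINE_MILES = 305_000  # approximate U.S. natural gas pipeline mileage

total_tb_per_day = US_NATURAL_GAS_PIPELINE_MILES / 30_000 * TB_PER_DAY_PER_30K_MILES
print(f"Estimated U.S. total: {total_tb_per_day:,.0f} TB of pipeline data per day")
# Prints roughly 173 TB per day at the cited generation rate.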

As more sensors and assets become connected to the Industrial Internet of Things, business innovation for oil and gas companies will depend on deriving actionable intelligence from these data streams, to enable decisions that improve operations and processes. This is the only way we will ultimately be able to reduce operational cost and increase uptime.

Newly connected devices are coming online within our networks at an alarming rate. According to Cisco, 80 things per second are connecting to the Internet, and by 2020 there will be 50 billion things connected to the Internet of Things. All this, while 46 percent of businesses consider volume of information to be their primary data challenge. Indeed, many in the oil and gas sector collect so much data that we may in fact feel like we are drowning in it, primarily because most of the efforts thus far have been focused on data generation (and perhaps data warehousing for future analysis).


But what is the benefit of all that data? How much impact are those connected devices throughout the pipeline really making at present? The CIA World Factbook estimates that $40 billion is spent on pipeline networks and asset maintenance annually. The Pipeline and Hazardous Materials Safety Administration (PHMSA) noted that between 2002 and July 2012, remote sensors detected only 5 percent of the nation’s pipeline spills. Couldn’t the right information at the right time improve these figures, and address other pressing concerns that keep pipeline operators awake at night?

Although information may equal power, power doesn’t lie in the gathering of information; it lies in the analysis, the timing of that analysis and the actionable intelligence gained from it. Get all of that right, and you have the information not only to reduce the cost of maintenance in an oil and gas operation, but also to increase the reliability of critical safety functions and operational proficiency.

The Future Changes What We Must Expect from Data

What does this mean for pipelines with large data streams and millions of dollars invested in connected devices, sensors and assets? Well, not only are we drowning in data, but we are also converging operations with information technology (IT), as data begins to be funneled from the operational theater and the field into our IT infrastructure.

From there, for companies that are past the “produce and collect data” phase, the data will ostensibly be used to generate service calls, maintenance orders or action from a field operator somewhere along the line. But this is when we run into another challenge: will you have field operators to implement the action required? In 2013, PricewaterhouseCoopers research indicated that the oil industry needs 120,000 new workers in the next decade to cover the retirement of baby boomers. That projected shortfall in available manpower, together with the skills gap created by the loss of the “tribal knowledge” of these retiring workers, means that we must now elevate our expectations for that data.

Given these pressing conditions, we cannot stop at, or settle for, collecting and analyzing data, even if the analysis yields actionable plans. It is now a critical necessity for the maintenance and efficiency of our business to use the intelligence gained from data both to raise the operational IQ of everyone in the operations center and in the field, and to work toward eventually automating operations. This is the only way for us to mitigate the impending skills shortage in oil and gas and conquer other pressing industry concerns.

Big Data Parallels: Looking to the Utility Sector for Solutions

Intelligence and automation must be the ultimate goal of what we derive from our data. Turning information into automation is easier said than done, but it is possible, and it becomes more so every day. We know this because the utility sector is beginning to make that a reality today.

With the proliferation of smart meters, grid sensors, rooftop solar and other devices connected to smart grids, utilities have never had as much data coming from as many different devices and sensors as they do today. In fact, this volume of generated data is very similar to what pipelines now produce.

The key to a utility’s ultimate success is its capacity to manage the influx of data at scale; to analyze and interpret it; and to proactively provide instruction on best practices or automate processes, relieving the massive burden on the organization.

For example, PG&E was able to unify data from an array of systems and sensors into a common data model, giving it improved visualizations and a better understanding of what’s happening across all its networked grid devices. This improved situational awareness lets the utility better monitor, filter and manage alarms and alerts; see visual confirmation of network chokepoints; and gain early detection of anomalies and events across multiple systems in close to real time. It will also allow PG&E to meet specific business needs by building its own analytics applications in the future.
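As a rough illustration of what a common data model can look like, the sketch below normalizes events from a hypothetical source system into one record type. The field names, source systems and mappings are assumptions made for the example; they are not PG&E’s actual schema or any vendor’s product.

# Hypothetical common data model for events arriving from siloed systems.
# All field names, source systems and mappings are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class GridEvent:
    source_system: str   # e.g., "scada", "ami"
    asset_id: str        # unified asset identifier
    timestamp: datetime  # normalized to UTC
    event_type: str      # e.g., "alarm", "reading"
    severity: int        # 0 = informational ... 3 = critical
    payload: dict        # the original record, kept for drill-down

def from_scada_alarm(raw: dict) -> GridEvent:
    """Map a raw SCADA-style alarm record into the common model."""
    return GridEvent(
        source_system="scada",
        asset_id=raw["tag"].split(".")[0],
        timestamp=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
        event_type="alarm",
        severity={"LOW": 1, "HIGH": 2, "CRITICAL": 3}.get(raw["priority"], 0),
        payload=raw,
    )

event = from_scada_alarm({"tag": "PUMP7.PRESSURE", "ts": 1700000000, "priority": "HIGH"})
print(event.asset_id, event.severity)  # PUMP7 2

Once every source maps into one record type like this, alarms and readings from different systems can be filtered, correlated and visualized in a single view.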

Thus, the true potential of the Internet of Things is the genesis of Software Defined Operations. Software Defined Operations may sound like science fiction, but automation is happening in the Smart Grid, and soon, a process similar to this will change the way the entire oil and gas industry does business, whether it is at the extraction site, pipelines, or at the refineries.

By aggregating disparate and traditionally siloed data sets and systems, Software Defined Operations and machine intelligence can give greater visibility into real-time conditions in an operation and can provide a holistic view, or “single source of truth” for the operator.

Making the move to Software Defined Operations requires machine learning. This takes many forms in the industrial enterprise — a select number of parameters trigger a rule and a response based on specific processes. The more business rules an oil and gas company “teaches” its technology platform, the more sophisticated the virtual operator becomes.

With software using rule-based analysis of events and responding automatically to certain triggers, we can automate the manual workflows and inefficient models of the past to instantaneously unify information and derive intelligence from billions of data points.
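The sketch below shows what one such business rule might look like in code. The parameter names, thresholds and response action are hypothetical assumptions chosen for illustration, not field-proven values or any particular platform’s API.

# Hypothetical rule-based automation: each rule watches a few parameters
# and fires a response when its condition is met. Thresholds, tag names
# and actions are illustrative assumptions only.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]  # evaluates an incoming reading
    action: Callable[[dict], None]     # response when the rule triggers

def dispatch_work_order(reading: dict) -> None:
    # Stand-in for integration with a real work-order or dispatch system.
    print(f"Work order created for segment {reading['segment_id']}: "
          f"pressure drop of {reading['pressure_delta_psi']} psi")

RULES: List[Rule] = [
    Rule(
        name="sudden_pressure_drop",
        condition=lambda r: r["pressure_delta_psi"] > 50,  # assumed threshold
        action=dispatch_work_order,
    ),
]

def evaluate(reading: dict) -> None:
    """Run every rule against an incoming reading and fire matching actions."""
    for rule in RULES:
        if rule.condition(reading):
            rule.action(reading)

evaluate({"segment_id": "PL-042", "pressure_delta_psi": 63.0})

Each additional rule the platform is “taught” extends this table, which is how the virtual operator grows more sophisticated over time.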

Deriving action from these data streams, whether a decision or a process, lies at the very heart of the business innovation needed in the next generation of pipelines. These processes will first be handed to the next generation of oil industry operators, but eventually the actions will be automated based on algorithmic analysis of billions of data points.

For example, by uniting centrally managed data with connected devices across the pipeline, the platform automatically gathers information about physical assets to monitor status or behavior. Instead of waiting for the manufacturer’s recommendation for when to replace equipment, oil and gas companies can use performance data to understand when maintenance is needed or when an asset could potentially fail. This capability extends an asset’s life through better maintenance and reduces organizational expenditures.
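As a simplified sketch of this kind of condition-based maintenance, the example below flags an asset for inspection when its recent readings drift well above their historical baseline, rather than waiting for a fixed calendar interval. The vibration metric, sample values and threshold are assumptions made for the illustration.

# Hypothetical condition-based maintenance check: flag an asset for service
# when recent vibration readings drift well above the historical baseline.
# The metric, sample values and sigma threshold are illustrative assumptions.
from statistics import mean, stdev

def needs_maintenance(history, recent, sigma=3.0):
    """Return True if recent readings exceed baseline mean + sigma * stdev."""
    return mean(recent) > mean(history) + sigma * stdev(history)

baseline = [0.9, 1.1, 1.0, 0.95, 1.05, 1.0, 0.98, 1.02]  # mm/s vibration under normal load
latest = [1.6, 1.7, 1.65]                                # most recent readings

if needs_maintenance(baseline, latest):
    print("Schedule inspection: vibration trending above baseline")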

With intelligence derived from the data, an organization can maintain continuity and knowledge transfer through changes in staff, whether they are third-party maintenance suppliers or the next shift of workers.

By integrating data from all of a company’s assets, networks and systems, operators and engineers have ready access to all the information they need about asset performance, maintenance, network health and other issues. This improves process and response times, accelerates speed to resolution, maintains continuity through employee shift changes and ultimately increases uptime.

At optimized levels, Software Defined Operations then becomes the effective bridge between written and tribal knowledge in the workforce.

Thus, the lessons learned from the utility sector make their way into oil and gas to drive business innovation among pipelines. The Industrial Internet of Things and the intelligence derived from numerous data streams helps oil and gas companies optimize processes, resource use and decision-making. This can minimize the potential for leaks and provide area analysis and geospatial views of affected areas if a disaster occurs. It also presents a triage view into alarms across all systems so operators can prioritize their actions.

System awareness also can identify the most beneficial time for asset maintenance or replacement, eliminating untimely repairs that can result in downtime and large capital expenditure. This knowledge can mean the difference between normal operations and a disaster, with its attendant regulatory fines and safety risks.

At the end of the day, oil and gas companies care about production. Unlocking the Industrial Internet of Things and adopting Software Defined Operations delivers operational and process improvements and optimized asset performance that lead to greater system uptime and maximized production.

Alex Clark is chief software architect and founder of Bit Stew Systems.
