
December, Vol. 8, Issue 8

The Explosion of Edge Computing in Oil and Gas
By: Ian Eyberg, CEO, NanoVMs

If there is one industry that has quickly adopted and embraced edge computing, it's oil and gas. In fact, the industry is fast becoming a leader in the field. Conferences are even springing up on this very topic, such as the upcoming “Emerging Computing Technologies in Oil & Gas” in Houston in late January, which follows a prior event in October 2018.

Why? There are quite a few interesting applications. Newer maintenance technologies are being used to predict workovers, rod changes and cleanings. Problems such as parted rods or leaky tubing can leave pumps sitting idle for days or even weeks. Software that catches these failures early can drastically reduce downtime and prevent major repair costs, so it is obviously highly desirable.

However, it doesn't stop there: seismic interpretation is being enhanced with newer machine learning algorithms that can perform salt classification and lead identification on seismic volumes. Work that used to cost millions of dollars and rely on human interpreters can now be plowed through by software.

While there has always been a plethora of sensors on everything from catheads to mud buckets, most of those sensors were relatively 'dumb' in the sense that they did simple things like threshold alerting. Then some companies started feeding this data back to their datacenters. Analysis done at that level can tell you a lot about what might have caused downtime after the fact, but the newer breed of software can actually predict equipment malfunction before it happens. How? By running the computational workloads next to where the data is produced.
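To make that distinction concrete, here is a minimal sketch in Go of the two styles side by side. The pressure values and the 3,000 psi limit are made up for illustration; the predictive side simply tracks a rolling baseline and flags abnormal drift well before any hard limit is crossed:

package main

import (
	"fmt"
	"math"
)

// Dumb sensor behavior: fire an alert only after a fixed limit is crossed.
func thresholdAlert(psi float64) bool {
	return psi > 3000 // by then the damage may already be done
}

// Edge behavior: track a rolling mean/variance of the signal and flag
// readings that drift abnormally far from recent history, surfacing a
// failing component before any hard limit is reached.
type rollingStats struct {
	n        int
	mean, m2 float64
}

func (r *rollingStats) add(x float64) {
	// Welford's online algorithm: constant memory, suited to edge devices.
	r.n++
	d := x - r.mean
	r.mean += d / float64(r.n)
	r.m2 += d * (x - r.mean)
}

func (r *rollingStats) zScore(x float64) float64 {
	if r.n < 2 {
		return 0
	}
	sd := math.Sqrt(r.m2 / float64(r.n-1))
	if sd == 0 {
		return 0
	}
	return (x - r.mean) / sd
}

func main() {
	stats := &rollingStats{}
	// Simulated pump pressure feed: steady, then a slow upward drift.
	feed := []float64{2100, 2095, 2102, 2098, 2101, 2150, 2230, 2340}
	for _, psi := range feed {
		if z := stats.zScore(psi); z > 3 {
			fmt.Printf("predictive alert: %.0f psi (z=%.1f)\n", psi, z)
		} else if thresholdAlert(psi) {
			fmt.Println("threshold alert (too late)")
		}
		stats.add(psi)
	}
}

The threshold version stays silent through the entire drift, while the rolling baseline flags it at the first abnormal reading.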

Why not just feed it all back to the cloud? Well, it's no secret that oil rigs don't have great connectivity; they are often on 3G, 4G or VSAT links. And there are around 1.7 million active wells in the US alone, many of them in remote places. Moving all that data around doesn't really work. Even when there is a digital pipe leading back to your datacenter, sometimes it is up and sometimes it is not. When you have equipment generating terabytes daily, and that number is growing quickly, it becomes cost prohibitive to send it all out to some Amazon datacenter in Virginia.
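A quick back-of-the-envelope calculation shows why. The figures below (two terabytes per day, a 5 Mbps satellite link) are illustrative assumptions rather than measurements from any particular rig:

package main

import "fmt"

func main() {
	// Back-of-the-envelope: sustained bandwidth needed to backhaul
	// everything a rig produces (illustrative figure of 2 TB/day).
	const tbPerDay = 2.0
	bitsPerDay := tbPerDay * 1e12 * 8      // terabytes -> bits
	mbps := bitsPerDay / (24 * 3600) / 1e6 // sustained megabits/second
	fmt.Printf("need %.0f Mbps sustained, 24/7\n", mbps) // ~185 Mbps

	// A typical VSAT uplink might offer only a few Mbps, so even when
	// the link stays up it carries only a sliver of the data.
	const vsatMbps = 5.0
	fmt.Printf("a %.0f Mbps link covers %.1f%% of that\n",
		vsatMbps, 100*vsatMbps/mbps)
}

Even under those generous assumptions the rig would need roughly 185 Mbps around the clock, some 37 times what the satellite link provides.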

But let's revisit this notion of dumb sensors. Some of you might be wondering what the difference is between the IoT-type sensors you might get from Honeywell and the edge. Say you have a blowout preventer that measures the pressure behind a piston. A lot of the older sensors are analog devices that perform one function. The newer breed is digital, and the difference is that the edge-style equipment actually has applications running on it, much like the software on your phone. Those applications tie all these data feeds together and make them interesting and powerful.

Data by itself is uninteresting; it's only when you apply logic to it that you can do something with it. When the water tank can 'talk' to the mud shaker through this software, the business starts saving a lot of money and cutting a lot of downtime. Humans can be great operators with the data they have available, and experience brings a lot of good intuition, but we're now at the point where software can continuously make decisions with vastly more data than a human could process alone, and this software works 24/7/365 with no downtime.
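As an illustration, the logic that lets two feeds 'talk' can be as simple as a rule correlating readings that neither device could act on alone. The equipment pairing and the thresholds here are hypothetical:

package main

import "fmt"

// Hypothetical readings from two pieces of equipment that normally
// never "talk" to each other.
type tankReading struct{ levelPct float64 }
type shakerReading struct{ flowGPM float64 }

// Edge app logic: a raw number from either feed is uninteresting on
// its own, but correlating the two yields an actionable decision.
func decide(tank tankReading, shaker shakerReading) string {
	switch {
	case tank.levelPct < 10 && shaker.flowGPM > 0:
		return "tank nearly dry but shaker still flowing: throttle shaker, dispatch water truck"
	case tank.levelPct > 90 && shaker.flowGPM == 0:
		return "tank full and shaker idle: safe to resume circulation"
	default:
		return "nominal"
	}
}

func main() {
	fmt.Println(decide(tankReading{levelPct: 8}, shakerReading{flowGPM: 120}))
}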

There are a few challenges in adopting this newer edge-based software, however; security and manageability come to mind. The software has to live somewhere. Something we're seeing go hand in hand with edge deployments is the practice of utilizing newer unikernel-based infrastructure. Unikernels have many advantages over Linux, as their isolation primitives thwart hackers' attempts to break into systems, and they come with a four-point security model:

1) They don’t have shells.
2) They don’t have users with usernames and passwords.
3) They are single-process systems.
4) They have a dramatically reduced attack surface.

Essentially, they are stripped-down systems with only the functionality necessary to make a given application work and nothing else. Security implications aside, they simply run faster too: many applications can run 20% faster, since each application is isolated in its own virtual machine.
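To give a sense of what 'single process' means in practice, the sketch below is a complete Go service that, once packaged by a unikernel build tool, becomes the entire virtual machine. (NanoVMs' OPS tool, for instance, can boot a binary like this with a command along the lines of "ops run ./server"; treat the exact invocation as illustrative.)

// server.go - the entire "system": one statically compiled process.
// Built as a unikernel image, this binary boots directly on the
// hypervisor with no shell to pop, no user accounts to brute-force,
// and no other processes to pivot to (points 1-3 above).
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "ok")
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}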

There are other options as well, like Clear Containers from Intel, but since that still uses Linux as its base system, it has many of the same security and performance problems as the underlying Linux kernel. Not to mention that Linux is now north of 27 years old and predates both commercialized virtualization like VMware and the 'cloud'. There have been other attempts to deal with the wide adoption of container technology that some developers like to use, but concerns about the insecurity of containers like Docker and orchestrators like Kubernetes have led CISOs in general to be very wary of adopting them. So, if you are looking at leveraging some of the newer edge deployments and you want fast and secure systems, ask your vendors whether they are provisioning on Linux or on unikernels.

Edge computing is all over the oil & gas industry, and it's going to continue to spread simply because it massively reduces cost and increases value. The data that flows through this ecosystem is becoming as valuable as the oil that flows through it. Companies that are not shy and fully embrace the edge are going to out-compete those that don't.

For more information, visit www.nanovms.com.