How Microsoft is Extending Its Cloud to Chevron’s Oil Fields
A single fiber-optic cable at an oil well generates more than a terabyte of data daily. Chevron will now use Microsoft's cloud platform to put that data to use.
November 21, 2017
A single fiber-optic cable at an oil well generates more than a terabyte of data every day, and Chevron has "thousands of wells" from which it has been collecting and analyzing data for years. That data collection and analytics system is now getting an upgrade following the energy giant's deal with Microsoft to use the latter's cloud services, which are designed specifically to help clients make use of data coming in from a highly distributed network of sources.
In shifting to Azure, Chevron will use a combination of the cloud provider's Cortana Analytics, Azure IoT Hub, and the newly available Azure IoT Edge. Azure has more regions worldwide than any other public cloud, and Microsoft has been willing to work with Chevron to tailor its services to the client's needs, according to the energy company.
And Chevron can use all those services without having to manage the infrastructure they run on. “One of the key benefits of Azure cloud is that we can use very powerful technologies without having to support the hardware and software infrastructure,” Chevron CIO Bill Braun told Data Center Knowledge. “Now we can focus on the business use of the technology and leverage Azure IoT Hub” as a system that collects data from all the sensors and scales automatically to match the demand, charging Chevron only for the capacity it actually uses.
Chevron has been using analytics and operations research for planning refineries since the 1970s. Over the last 10 years the company’s been using AI for forecasting and building models of potential oil fields using seismic data. “Chevron has used many analytics and decision analysis tools for a long time,” Braun said. “And we are also a big data company. Consider one fiber-optic cable in a well generating over one terabyte of data a day -- and we have thousands of wells.”
The sensors in the oil wells collect performance, temperature, pressure, and equipment health data; drill ships and production facilities are instrumented with thousands more sensors, generating even more data. Azure IoT Hub will not only continue collecting all that data; it will help deploy, manage, and secure those devices.
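To make the data-collection side of that concrete, here is a minimal sketch of how a well-site gateway could publish sensor readings to Azure IoT Hub using Microsoft's azure-iot-device SDK for Python. The connection string, device name, and reading values are placeholders for illustration, not details Chevron has disclosed.

```python
# Minimal sketch: publishing well-site sensor readings to Azure IoT Hub.
# Requires the azure-iot-device SDK (pip install azure-iot-device).
# The connection string and sensor values below are placeholders.
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=<well-sensor-01>;SharedAccessKey=<key>"

def read_sensors():
    # Stand-in for reading the well's instrumentation
    # (temperature, pressure, equipment-health metrics).
    return {"temperature_c": 87.4, "pressure_kpa": 3120.0, "vibration_mm_s": 2.1}

def main():
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    client.connect()
    try:
        while True:
            msg = Message(json.dumps(read_sensors()))
            msg.content_type = "application/json"
            msg.content_encoding = "utf-8"
            client.send_message(msg)   # IoT Hub scales ingestion on its side
            time.sleep(10)
    finally:
        client.disconnect()

if __name__ == "__main__":
    main()
```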
One use case is predictive maintenance. A small temperature rise or increasing vibration could be an early sign of equipment about to fail. Chevron can use machine learning to detect those early signs and take action before the fault actually happens. Another use case is guiding engineers when they visit a site, letting them know exactly where to go and what to work on. Machine learning might also be able to process seismic data more quickly and consistently than human engineers can, helping build oil-field models that guide decisions about where to drill and which drilling techniques to use.
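As a simplified illustration of the predictive-maintenance idea (a toy sketch, not Chevron's actual models), the code below flags a vibration reading that drifts well outside a rolling baseline. The window size, threshold, and readings are made-up assumptions.

```python
# Toy anomaly check for predictive maintenance: flag a reading when it
# drifts far outside a rolling baseline of recent values.
from collections import deque
from statistics import mean, pstdev

class DriftDetector:
    def __init__(self, window=500, threshold_sigmas=4.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold_sigmas

    def check(self, value):
        """Return True if value drifts far from the recent baseline."""
        anomalous = False
        if len(self.history) >= 50:                  # wait for a baseline
            mu, sigma = mean(self.history), pstdev(self.history)
            anomalous = sigma > 0 and abs(value - mu) > self.threshold * sigma
        if not anomalous:                            # keep the baseline clean
            self.history.append(value)
        return anomalous

vibration = DriftDetector()
baseline = [2.0 + 0.05 * (i % 5) for i in range(200)]   # steady mm/s readings
for reading in baseline + [9.8]:                         # sudden spike at the end
    if vibration.check(reading):
        print(f"Possible early failure sign: vibration {reading} mm/s")
```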
Chevron is also interested in other Microsoft technologies, such as the HoloLens mixed reality headset, which could let senior engineers oversee equipment installation remotely, give on-site engineers an augmented view of the equipment they're working with, and let managers understand the site as if they were there without leaving their desk. "We see this technology's potential to provide hands-free visualization of IoT and business data in the field without having SMEs travel sometimes hundreds of miles," Braun explained.
Computing on the Edge
As it uses more cloud services, Chevron won't need as many data centers as it has today, and that may be one reason it recently sold a data center in San Antonio to Microsoft. The sale also frees up money the company can now spend on cloud computing that gives it more scale.
“We already obtain hundreds of millions of dollars in value from our analytical teams,” Braun said. “We’re aiming to more than double what we get through scaling it up, using the cloud to help us do this and applying it to areas that traditionally we have not.” He expects Azure to give Chevron new solutions in data-intensive areas like exploration, midstream logistics, retail operations, and the management of thousands of oil wells around the world.
In addition to compute in the cloud, edge computing will continue to be important for Chevron, and it might include Azure Stack in Chevron’s facilities or local compute inside IoT and SCADA devices. “For Chevron, the remoteness and harsh conditions of some of our locations will always require edge compute to manage the data and to perform local actions because of latency issues and local data residency laws,” he said.
The latency issue isn't just about being able to make decisions locally, in near real-time; it's also about not using bandwidth – which is scarce in many of these locations – for the full flood of data. "We have been doing edge processing in our facilities that require second or in some scenarios sub-second responses for decades. But the bigger driver for edge compute is to simplify the information from millisecond-level readings into higher-level business events to save on data transmission costs and address the limited bandwidth in remote locations. Local compute is also important for maintaining availability during natural disasters or accidental network disruption."
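A hypothetical sketch of that kind of edge-side simplification: rather than forwarding every millisecond-level sample, the code below rolls each one-second window of raw readings up into a single summary event for the uplink. The window length, event fields, and fake pressure values are assumptions for illustration only.

```python
# Edge-side roll-up: condense high-frequency sensor samples into one
# summary event per window to save bandwidth on the uplink.
import json
import random
import time

WINDOW_SECONDS = 1.0

def summarize(samples, window_start):
    """Collapse a window of raw readings into one business-level event."""
    return {
        "window_start": window_start,
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "avg": sum(samples) / len(samples),
    }

def run(read_sample, publish, duration_s=5.0):
    """read_sample() returns one raw reading; publish() sends the roll-up."""
    samples, window_start = [], time.time()
    end = window_start + duration_s
    while time.time() < end:
        samples.append(read_sample())
        if time.time() - window_start >= WINDOW_SECONDS:
            publish(json.dumps(summarize(samples, window_start)))
            samples, window_start = [], time.time()
        time.sleep(0.001)             # roughly millisecond-level sampling

if __name__ == "__main__":
    run(lambda: random.gauss(3100, 5),            # fake pressure readings
        lambda event: print("uplink:", event))    # stand-in for the real uplink
```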
That's a common need, Sam George, director of Azure IoT, told us. "In oil and gas, and in manufacturing and similar environments, there are high-value assets that are sometimes in remote locations or on potentially [poor] network connections." Customers wanting to make sure they can monitor an expensive asset at the edge, where network connections are poor, represent a common pattern Microsoft has been observing, George said.
“Azure IoT Edge is driving a tremendous amount of interest in Azure IoT generally for that reason,” he added. The platform gives them stream analytics, machine learning, Azure Functions, and their own custom business logic “right on or right next to those assets.”
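To show what "custom business logic right next to those assets" might look like in practice, here is a rough sketch of an Azure IoT Edge module written against the azure-iot-device Python SDK: it reads messages arriving on one module input, applies a local rule, and forwards alerts to an output for routing. The input/output names and the temperature rule are assumptions, not anything described by Microsoft or Chevron.

```python
# Sketch of a custom Azure IoT Edge module: receive messages from a module
# input, apply local business logic, and forward alerts to an output.
# Input/output names ("sensorInput", "alertOutput") and the rule are assumptions.
import json

from azure.iot.device import IoTHubModuleClient, Message

def main():
    client = IoTHubModuleClient.create_from_edge_environment()
    client.connect()
    try:
        while True:
            msg = client.receive_message_on_input("sensorInput")   # blocks
            reading = json.loads(msg.data)
            if reading.get("temperature_c", 0) > 95:               # local rule
                alert = Message(json.dumps({"alert": "high_temp", **reading}))
                client.send_message_to_output(alert, "alertOutput")
    finally:
        client.disconnect()

if __name__ == "__main__":
    main()
```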
Machine learning is the main driver for that. Companies like Chevron want to be able to build machine learning models in the cloud and then run them on the edge – including managing and updating the models by updating the input parameters rather than having to resend the whole model over a slow link. “Once companies get their head around how easy it is to manage, that’s one of the most compelling and important parts of IoT Edge for them,” George said.
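One way such parameter updates can work, sketched here under assumptions rather than as Chevron's actual setup, is through the module twin: the cloud writes new desired properties, and the edge module picks up the patch and reconfigures its model without the whole model being re-sent over the link. The property names below are hypothetical.

```python
# Sketch: receive new model parameters at the edge via twin desired
# properties instead of re-sending the whole model over a slow link.
# Property names ("model_threshold", "model_weights") are assumptions.
from azure.iot.device import IoTHubModuleClient

model_params = {"model_threshold": 0.8, "model_weights": None}

def apply_patch(patch):
    """Merge a desired-properties patch into the local model parameters."""
    for key in model_params:
        if key in patch:
            model_params[key] = patch[key]
    print("Model reconfigured:", model_params)

def main():
    client = IoTHubModuleClient.create_from_edge_environment()
    client.connect()
    client.on_twin_desired_properties_patch_received = apply_patch
    try:
        # Seed from the full twin, then wait for incremental patches.
        twin = client.get_twin()
        apply_patch(twin.get("desired", {}))
        input("Listening for parameter updates; press Enter to stop.\n")
    finally:
        client.disconnect()

if __name__ == "__main__":
    main()
```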