
JPT Reflection: Tearing Down the Walls Among Disciplines

Happy New Year to everyone. I took some time over the last couple of weeks to reflect on the decade behind us, as well as the decade ahead of us. As all of us found out on January 1st, our friends at the JPT took some time to do exactly that! In case you missed it, here's a link to the article.

A Changing Industry 

It is obvious that data science and analytics are front-of-mind for the SPE. The group has gone so far as to redefine the role of Management and Information Director as Data Science and Engineering Analytics Director. There cannot be a clearer signal that data science and analytics are critical to the next chapter of oil and gas. We are proud to have been at the forefront of delivering data science and petroleum analytics to the industry since 2013.

Common Vision 

Each member of the technical leadership of SPE shares a vision for the future of petroleum technology.  The technical directors unanimously declared that our “traditionally fragmented industry must become more integrated and collaborative.  A primary solution to breaking down those barriers: the continued evolution and adoption of digital technologies.” 


While there are many great quotes in the article (you'll find several below), this is the most striking. The group is acknowledging that the status quo isn't good enough and is issuing a call to action to the industry. All of us have experienced the pain of the last five years; we've made great strides to streamline and improve our processes, but the work isn't done yet. I am convinced our platform will help the industry achieve this goal.

Data Science and Engineering 

“Work flows will be more consolidated and integrated, a departure from the current status quo according to discipline, de-facto norms dictated by software, or the way things have always been done … Organizations will have to break down traditional work flow-deadline mandated “compartments” through a fundamental change in their culture …”

—Birol Dindoruk, Data Science and Engineering Analytics Director

This quote is incredibly exciting for me to read, since our team shares the same view. Today, data is stored and curated according to the OFS service line that collected it: drilling data in a drilling database, completions data in a completions database, and so on. In order to perform any meaningful data science or analytics at the well level (much less the reservoir level), a great deal of data cleansing, engineering, and normalization must be done first. Our platform eliminates these repetitive tasks and empowers engineers by delivering high-caliber data and analytics tools.
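To make the silo problem concrete, here is a minimal sketch of the kind of join an engineer must do today before any well-level analysis. The table names, columns, and API numbers are hypothetical, invented for illustration:

```python
import pandas as pd

# Hypothetical extracts from two siloed service-line databases,
# each keyed on a well identifier (API number).
drilling = pd.DataFrame({
    "api14": ["42-123-00001", "42-123-00002"],
    "lateral_length_ft": [9800, 10250],
})
completions = pd.DataFrame({
    "api14": ["42-123-00001", "42-123-00002"],
    "proppant_lbs_per_ft": [1800, 2100],
})

# A well-level view requires joining across the silos on the well identifier.
well_level = drilling.merge(completions, on="api14", how="inner")
print(well_level)
```

In practice the identifiers rarely line up this cleanly, which is exactly the cleansing and normalization burden described above.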


“Ultimately, the industry will need a better understanding of the production mechanism of unconventional wells. It’s not the same as in a conventional well where it’s just plain Darcy flow through a matrix [and the industry is] not going to solve these completions challenges with just completions engineers. This is a cross-discipline issue, and our biggest companion in this is reservoir engineers.” 

—Terry Palisch, Completions Director

The gap between completions engineers and reservoir engineers remains wide, even within single asset teams. During a recent training course, we asked completions engineers and reservoir engineers to list the five most important factors in delivering a highly productive well. The two groups did not share a single common factor within their top five. We believe that geomechanics is critical to bridging the gap between completions engineers and reservoir engineers. We have partnered with Dr. Mark Zoback to incorporate his expertise into our platform, delivering powerful geomechanical insights to engineers of all disciplines.


“When it comes to reservoir technologies, the industry has neglected [unconventionals] for quite some time because it was always about drilling and completions. Now that cash flow has shrunk and the treadmill of drilling and completing wells has slowed, the reservoir discipline is getting more attention. More emphasis is being placed on recovery factors as companies try to squeeze more out of each existing well … For this approach to be successful … the industry needs to further improve its understanding of the unconventional reservoir.”

—Erdal Ozkan, Reservoir Director

I absolutely agree with this quote; economically increasing recovery factor is the ultimate challenge in unconventionals. One of my colleagues calls this the Shale Operator's Dual Mandate: increase production while decreasing spend. More simply: do more with less. Engineers are learning every day which levers they can (and cannot) pull to achieve this goal. The challenge is disentangling the multiplicity of factors that can impact a well's productivity: lateral length, completion intensity, fluid system, landing zone, parent/child (horizontal spacing), parent/cousin (vertical spacing), etc. There are simply too many factors for a human brain to internalize and reason about. The good news? Machine learning is the perfect tool for interconnected, large-scale problems like this. Our platform delivers pre-built machine learning models that allow operators to identify which AFE dollars matter the most, so engineers can spend time (and money) on the things that matter and eliminate the things that don't.
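As a toy illustration of how machine learning can rank these levers, the sketch below fits a random forest to synthetic well data. The feature names and the "ground truth" relationship are entirely made up for the example; a real model would be trained on actual well histories:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500

# Synthetic wells: in this made-up world, lateral length and completion
# intensity drive production, while well spacing barely matters.
lateral_length = rng.uniform(5000, 12000, n)
completion_intensity = rng.uniform(1000, 3000, n)
well_spacing = rng.uniform(400, 1400, n)
production = (0.5 * lateral_length
              + 2.0 * completion_intensity
              + 0.05 * well_spacing
              + rng.normal(0, 500, n))

X = np.column_stack([lateral_length, completion_intensity, well_spacing])
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, production)

# Feature importances hint at which design levers move productivity most.
for name, imp in zip(["lateral_length", "completion_intensity", "well_spacing"],
                     model.feature_importances_):
    print(f"{name}: {imp:.2f}")
```

On this synthetic data the spacing term gets a near-zero importance, mirroring how such a model can help an engineer decide which AFE line items to scrutinize.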


Real-time Production in using Raspberry Pi

One of the most pressing questions for data administrators is "What can I do with my real-time production data?" With the advent of science pads and the move to digitization in the oilfield, streaming data has become one of the most valuable assets. But working with it takes some practice and getting used to.

I enjoy tinkering around with the platform, and while we have simulated wells, it's much more fun to have some real data. Our company doesn't own any wells, but when the office got a cold brew keg, I saw an opportunity.

We would connect a Raspberry Pi with a temperature sensor to the cold brew keg and pipe temperature readings directly into the database. The data would come in as "casing temperature," and we'd be able to watch our coffee in real time!

The Plan

The overall diagram would look like this:

The keg would be connected to the sensor, which would pass real-time readings to the Raspberry Pi. The Pi would then shape each reading into the real-time schema and publish it to the REST API endpoint.

Build out

The first step was to acquire the Raspberry Pi. I picked up a relatively inexpensive one on Amazon and separately purchased a pair of Adafruit DHT22 temperature sensors. The DHT22 reads both temperature and humidity, but for the moment we'd just use the former.

There's enough information online to confirm that these components would be compatible. After unpacking everything, I set up an Ubuntu image and booted the Pi.

The Script

The script was easy enough: the Adafruit sensor came with a code snippet, and for the endpoint it was just a matter of picking the right collection to POST to.

[code language="python"]
import time
import datetime

import Adafruit_DHT

# PETRO_URL (the base API URL) is assumed to be defined alongside the helper.
from functions import post_live_production, PETRO_URL

freq_seconds = 3
wellId = 'COFFEE 001'
endpoint = 'RealTimeProduction'
pwi = '5ce813c9f384f2057c983601'

while True:
    # Try to grab a sensor reading. read_retry will retry up to 15 times
    # (waiting 2 seconds between each retry). The DHT22 is on GPIO pin 4.
    humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT22, 4)

    if temperature is not None:
        # Convert the Celsius reading to Fahrenheit.
        casingtemp = temperature * 9 / 5.0 + 32
    else:
        casingtemp = 0

    # The zeros fill the other real-time tags we aren't measuring.
    post_live_production(endpoint, pwi, 0, casingtemp, 0, 0, 0, 0, 0, 0, PETRO_URL)

    timestamp = datetime.datetime.now().strftime('%Y-%m-%d %H:%M:%S')
    print(wellId + ' Tag sent to ' + PETRO_URL + endpoint + ' at ' + timestamp)

    time.sleep(freq_seconds)
[/code]


Once connected, we were extremely pleased with the results. With the reading frequency set to 3 seconds, we could watch the temperature inside the keg rise and fall. The well was affectionately named "COFFEE 001."
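The `post_live_production` helper imported from `functions` isn't shown above. A minimal sketch of what it might look like, assuming a JSON real-time schema; the field names (`wellId`, `casingTemperature`, `timestamp`) are hypothetical, and the zeros in the call above would map to additional tags omitted here:

```python
import datetime
import requests

def build_payload(pwi, casing_temp):
    """Shape one sensor reading into a (hypothetical) real-time schema document."""
    return {
        "wellId": pwi,
        "casingTemperature": casing_temp,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }

def post_live_production(endpoint, pwi, casing_temp, base_url):
    """POST a single reading to the real-time collection and fail loudly on errors."""
    payload = build_payload(pwi, casing_temp)
    resp = requests.post(base_url + endpoint, json=payload, timeout=10)
    resp.raise_for_status()
    return resp
```

Separating payload construction from the POST keeps the schema shaping testable without a live endpoint.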