Categories: Business Intelligence Tools, Data Science & Analytics

Connected Analytics

The world changes too fast to depend on top-down decision making.  Engineering teams must be empowered to generate, and act on, their own technical analysis. At the same time, modern technical challenges require cross-functional teams that may be distributed globally. How, then, do fragmented teams of domain experts collaborate to deliver high-quality decisions?

The answer is Connected Analytics. 

Step 1: Merge subsurface and operational data 

Petro.ai delivers an industry-first environment for blending and interrogating 100+ operational and geotechnical data types: the Petron.  Petrons allow operational data types (e.g. WITS streams, frac van files, production volumes) to be contextualized by subsurface data types (e.g. directional surveys, SEG-Y, fiber optics).  Data blending and co-visualization is now as easy as a drag-and-drop. 

Step 2: Apply your own expertise to the work product 

After loading the appropriate data types into a Petron, you can begin to perform your own analysis.  Use previously developed models (stored in other Petrons), or uncover new relationships using Petro.ai’s inbuilt machine learning.  No need to create a PDF or upload a shared drive; your new Petron is automatically stored and updated across your team. 

Step 3: Collaborate with your peers 

Petrons provide a social workspace for experts to annotate data, share newly created models, and collaborate like never before.  Each team member can independently perform their own knowledge work, all from the same source – a shared Petron.  As new analysis is performed, the Petron and all dependent workflows are updated to reflect the team’s latest insight. 

Step 4: Build new value and start the next wave of optimization 

With the power of the Petron, engineering teams can focus their energy where it matters: making the next well more efficient and more productive.  By iteratively and fluidly incorporating new learnings, the possibilities are endless. 

Connected analytics is a new assembly line.  Rather than passing physical products to the next station, modern engineers pass digital knowledge work to the next area of expertise.  This modern workforce must be able to interact with peers digitally, expects a high level of interactivity, and needs everyday tools that support remote work.  Incumbent software was not designed for the challenges of this century; Petro.ai, and the Petron, were born in it.

Categories: Database, Cloud, & IT; Production & Operations; Transfer

Real-time Production in Petro.ai using Raspberry Pi

One of the most pressing questions for data administrators is “what can I do with my real-time production data?” With the advent of science pads and the move toward digitization in the oilfield, streaming data has become one of the most valuable assets. But it can take some practice to get used to.

I enjoy tinkering with the Petro.ai platform, and while we have simulated wells, it’s much more fun to have some real data. Ruths.ai doesn’t own any wells, but when the office got a cold brew keg, I saw an opportunity.


We would connect a Raspberry Pi with a temperature sensor to the cold brew keg and pipe temperature readings directly into the Petro.ai database. The data would come in as “casing temperature,” and we’d be able to watch our coffee machine in real time using Petro.ai!

The Plan

The overall diagram would look like this:


The keg would be connected to the sensor, which would pass real-time readings to the Raspberry Pi. The Pi would then shape each reading into the real-time schema and publish it to the REST API endpoint.

Build out

The first step was to acquire the Raspberry Pi. I picked up a relatively inexpensive one off Amazon and separately purchased two DHT22 temperature sensors by Adafruit. They read both temperature and humidity, but for the moment we’d just use the former.


There’s enough information online to confirm that these would be compatible. After unpacking everything, I set up an Ubuntu image and booted it up.

The Script

The script was easy enough: the Adafruit sensor came with a code snippet, and for the Petro.ai endpoint it was just a matter of picking the right collection to POST to.

[code language="python"]
import datetime
import sys
import time

import Adafruit_DHT

from functions import post_live_production

PETRO_URL = 'https://<YOUR PETRO.AI SERVER>/api/'
freq_seconds = 3
wellId = 'COFFEE 001'
endpoint = 'RealTimeProduction'

# Petro.ai well identifier for COFFEE 001
pwi = '5ce813c9f384f2057c983601'

while True:
    # Try to grab a sensor reading.  read_retry will retry up to 15 times
    # (waiting 2 seconds between each retry) to get a sensor reading.
    humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT22, 4)

    if temperature is None:
        # No reading after all retries -- bail out.
        sys.exit(1)

    # Convert the temperature to Fahrenheit.
    casingtemp = temperature * 9 / 5.0 + 32

    try:
        post_live_production(endpoint, pwi, 0, casingtemp, 0, 0, 0, 0, 0, 0, PETRO_URL)
    except Exception:
        # Don't let a transient network error kill the logger.
        pass

    print(wellId + " Tag sent to " + PETRO_URL + endpoint + " at "
          + datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S"))
    time.sleep(freq_seconds)
[/code]
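The script leans on a `post_live_production` helper imported from a local `functions` module that isn’t shown here. A minimal sketch of what such a helper might look like, assuming the RealTimeProduction collection accepts JSON documents over POST — the field names (`wellId`, `casingTemp`, `timestamp`) and the function’s simplified signature are hypothetical, not the actual Petro.ai schema:

```python
import datetime
import json
import urllib.request


def build_reading(pwi, casing_temp):
    # Hypothetical document shape -- the real schema lives in Petro.ai's docs.
    return {
        "wellId": pwi,
        "casingTemp": casing_temp,
        "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
    }


def post_live_production_sketch(endpoint, pwi, casing_temp, base_url):
    """POST one reading to the real-time collection and return the HTTP status."""
    req = urllib.request.Request(
        base_url + endpoint,
        data=json.dumps(build_reading(pwi, casing_temp)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Keeping the payload builder separate from the HTTP call makes the schema easy to test without a live server.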

Results

Once connected, we were extremely pleased with the results. With the frequency of readings set to 3 seconds, we could watch the temperature inside the keg rise and fall. The well was affectionately named “COFFEE 001.”

Categories: Developers Corner, Transfer

Writing your First JavaScript Vue.js App for Petro.ai

Getting Petro.ai installed can be an exciting time, opening quite a few doors for development, especially when it comes to JavaScript apps. Custom applications become a cinch using the API. In the coming weeks I’ll be putting together some simple applications that you can build on top of the Petro.ai platform. We’ll be using an assortment of languages to communicate with the Petro.ai API, so feel free to ask for an example.

Here is the HTML:

[code language="html"]
<h1>Hello, Wells!</h1>

<div id="hello-wells" class="demo">
  <blog-post v-for="well in wells" v-bind:key="well.id" v-bind:title="well.name">
  </blog-post>
</div>
[/code]

And the JavaScript (Vue.js):

[code language="javascript"]
Vue.component('blog-post', {
  props: ['title'],
  template: '<p>{{ title }}</p>'
})

new Vue({
  el: '#hello-wells',
  data: {
    wells: []
  },
  created: function () {
    var vm = this
    // Fetch our array of documents from the Petro.ai wells collection
    fetch('http://<your-petro-ai-server>/api/Wells?Limit=10')
      .then(function (response) {
        return response.json()
      })
      .then(function (data) {
        vm.wells = data['data']
      })
  }
})
[/code]

And poof! We’ve called the first 10 wells from the Petro.ai wells collection:

Hello, Wells!

DEJOUR WOODRUSH B-B100-E/094-H-01
BLACK SWAN HZ NIG CREEK B-A007-G/094-H-04
BLACK SWAN HZ NIG CREEK B- 007-G/094-H-04
BLACK SWAN HZ NIG CREEK B-G007-G/094-H-04
BLACK SWAN HZ NIG CREEK B-E007-G/094-H-04
BLACK SWAN HZ NIG CREEK B-D007-G/094-H-04
BLACK SWAN HZ NIG CREEK B-C007-G/094-H-04
ZEAL 4-25-46-26
PEYTO WHHORSE 4-9-49-15
BLACK SWAN HZ NIG CREEK A- 096-C/094-H-04

What’s going on here is that the app pulls directly from the Petro.ai server asynchronously. In the coming weeks, I’ll show how we can create reactive JavaScript applications that update from the Petro.ai server, so that we can watch things like rig data or real-time production data. This data was provided by GeoLogic, and we’ll be setting up a public Petro.ai instance for everyone to develop against.
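The same query works from any HTTP client, not just the browser. A minimal Python sketch, assuming the same `/api/Wells?Limit=10` endpoint and the response shape from the JavaScript example (documents wrapped in a `data` array with a `name` field) — the server URL is a placeholder:

```python
import json
import urllib.request


def extract_names(payload):
    # The API wraps the documents in a 'data' array, as in the Vue example.
    return [well["name"] for well in payload["data"]]


def fetch_wells(base_url, limit=10):
    """Return the names of the first `limit` wells from the Wells collection."""
    url = f"{base_url}/api/Wells?Limit={limit}"
    with urllib.request.urlopen(url) as resp:
        return extract_names(json.load(resp))
```

Splitting the parsing out of the request makes the response handling testable without a live Petro.ai instance.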