Categories
Business Intelligence Tools Data Science & Analytics Drilling & Completions Geology & Geoscience Passion for Change Reservoir Engineering

Passion for Change: Bonanza Creek Energy

with Kyle Gorynski, Director Reservoir Characterization and Exploration at Bonanza Creek Energy

If you missed the first part in this series, you can find the start of our conversation here.

Why do you think there’s been so much hesitation around change?

There’s a strongly ingrained culture in oil and gas, with a generation of people who’ve been doing this for decades – although on completely different rocks, play types, and extraction techniques. There’s been pushback on adopting new geology or engineering software because people become so comfortable and familiar with the tools they use. I think a lot of it is simply the unique culture in oil and gas. Shale is still new, and our guesses are still evolving. I think we’re getting closer and closer to what that right answer is.

New organizations will have to adopt new technologies and adapt to new trends, because there’s no other way to make this business work.

As scientists, engineers, and managers in this space, we really have to be cognizant that the goal of a lot of vendors out there is not to help you get the right answer, it’s to make money. The onus is on us to vet everyone and make sure we’re getting the right answers.

Machine learning is simply another tool, like a physics-based model, aimed at helping us predict an outcome and increase the precision and accuracy of those predictions. People have made pretty fantastic inferences from these kinds of tools.

You can’t just pay a company to apply machine learning to a project. You need to help them utilize the correct inputs, the relationships, and ensure the predictions and outcomes match the other observations you have from other datasets.

I don’t think any organization should be cutting-edge for the sake of being cutting-edge. The goal is to solve these very specific technical challenges quicker and with more accuracy. Our job is to extract hydrocarbons in the safest, most environmentally friendly, and most economic fashion. Technologies like machine learning and AI are tools that can help us achieve these goals, but they need to be applied correctly.

Can you share any successes around data science or machine learning at your company?

The industry has been using these techniques for a long time. In their simplest form, people have been cross-plotting data since the early days of the oil and gas industry, trying to build relationships between things. At the beginning of my career, I remember using neural networks to predict log response.

Now we use predictive algorithms to help us predict log response where we don’t have certain logs. Let’s say we want to predict lithologies – carbonate, clay, quartz, and feldspar content in a well – we’ll build relationships between triple-combo logs and the more sophisticated, but scarce, elemental capture spectroscopy (ECS) logs. We don’t have ECS logs everywhere, but we have triple-combo everywhere, so if you can build a relationship between those, then you have a massive dataset you can use to map your asset. That’s a simple way we use this type of technology.
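The idea can be sketched in a few lines. This is purely illustrative – real workflows use multivariate, nonlinear models and proper log QC, and the curve names and numbers below are made up – but it shows the calibrate-then-propagate pattern: fit a relationship where both logs exist, then apply it where only the common log exists.

```python
# Hypothetical sketch: calibrate a simple linear model between a common
# triple-combo curve (gamma ray) and a clay-volume estimate from a scarce
# ECS log, then apply it to wells that only have triple-combo data.

def fit_linear(x, y):
    """Ordinary least squares for y = a*x + b (closed form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Training well: depths where both gamma ray (API units) and an
# ECS-derived clay fraction exist (illustrative values).
gamma_ray = [40.0, 60.0, 80.0, 100.0, 120.0]
clay_ecs  = [0.10, 0.20, 0.30, 0.40, 0.50]

a, b = fit_linear(gamma_ray, clay_ecs)

# Offset well: only triple-combo available -> predict clay fraction.
predicted = [a * gr + b for gr in [50.0, 110.0]]
print([round(v, 3) for v in predicted])  # -> [0.15, 0.45]
```

In practice this single-curve regression would be replaced by a model trained on the full triple-combo suite, but the mapping step – scarce log to abundant log to asset-wide coverage – is the same.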

Like almost every company now, we’re also predicting performance. That’s how we’re able to make live economic decisions. We have a tool where we can put in a set of geologic and engineering inputs and it’ll predict production rates through time that we can forecast, add costs to, and run economics on live. We’re running Monte Carlo simulations on variable rates, volumes, lateral lengths, spacing, commodity pricing, and costs – based on our best estimates – to generate tens of thousands of outcomes that help us better understand what the best decision could possibly be. I think that’s the most impactful place it’s being used, and, talking to my peers, I think that trend is being adopted more and more across the industry.
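The Monte Carlo loop described here is conceptually simple. The sketch below is not Bonanza Creek’s model – the distributions, decline behavior, and costs are placeholder assumptions – but it shows the shape of the workflow: sample uncertain inputs, run a cash-flow model per trial, and read off the P90/P50/P10 outcomes.

```python
import random

# Hedged sketch of a Monte Carlo well-economics loop. All distributions
# and parameters below are illustrative assumptions, not real asset data.

def npv(ip_rate, decline, price, capex, months=120, disc=0.10):
    """Simple exponential-decline cash flow, discounted monthly."""
    total = -capex
    rate = ip_rate
    for m in range(months):
        total += rate * 30.4 * price / (1 + disc / 12) ** m
        rate *= (1 - decline / 12)
    return total

def run_simulation(n=10_000, seed=42):
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        results.append(npv(
            ip_rate=rng.lognormvariate(6.0, 0.3),  # initial rate, bbl/d
            decline=rng.uniform(0.5, 0.9),         # annual decline fraction
            price=rng.normalvariate(55, 10),       # $/bbl
            capex=rng.uniform(5e6, 8e6),           # well cost, $
        ))
    results.sort()
    # Oil & gas convention: P90 is the low case (90% chance of exceeding).
    return results[n // 10], results[n // 2], results[9 * n // 10]

p90, p50, p10 = run_simulation()
print(f"P90 ${p90:,.0f}  P50 ${p50:,.0f}  P10 ${p10:,.0f}")
```

A real implementation would add a richer decline model, correlated inputs, and per-month cost schedules, but the sample-evaluate-rank structure stays the same.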

Type curve generation is no longer grabbing a set of wells, grouping them together, and fitting a curve to them – it’s trying to predict the infinite number of outcomes that lie between the extremes.

Have you seen any success among your competitors using technology, specifically data science and analytics tools?

There’s some great work out there across the board. I had a lot of fun at Encana (now Ovintiv) watching my peers – who are exceptionally smart – really trying to adopt new technology to solve problems. I’ve seen some amazing work getting people to adopt new ideas, new thoughts, new predictions. I like going to URTeC; I think that’s a fantastic conference, and I always find a number of great pieces of technical work coming out of it.

I think the industry is doing a great job. There are a ton of really smart people out there who know how to do this work. I think a lot of young people are really embracing coding and a bigger-picture approach to the subsurface, where it’s not just that you’re an engineer or a geoscientist – you really have to understand the fluid, the pore system, the stresses, and what you’re doing to them. There’s no way we can be impactful unless we understand the really big picture, and people are getting much better at that, using tools and developing skillsets that let them solve these problems a lot quicker.

How would you describe Petro.ai?

We see you guys as filling a gap we have. It’s the ability to pull data together – to apply big data to a dataset we quite frankly don’t have the time or the capability to tackle in-house. Petro.ai provides us with a very important service: it gets us in a couple of months to a point that would take us 12-18 months on our own. What we really like is that we’re developing something unique and new, with our input and involvement. We’re not just sending you a dataset and asking for an answer – we’re saying what we think drives the results, and we want your feedback too. You’re also a group of experts who not only have your own experiences but have seen other people’s assets and plays and how everyone else in industry is looking at them, so it’s nice to have a group of consultants with the same goal: address a problem and figure it out. We want to get to an answer as quickly as we possibly can and start applying those learnings just as fast.

Kyle Gorynski is currently Director of Reservoir Characterization and Exploration at Bonanza Creek Energy. Kyle previously worked at Ovintiv, where he spent 7 years in various technical and leadership roles, most recently as Manager of Reservoir Characterization for their Eagle Ford and Austin Chalk assets. Although he is heavily involved in the technical side of subsurface engineering and geoscience, his primary focus is on their practical applications in resource and business development. Kyle received his B.S. and M.S. in Geology from the University of Kansas in 2008 and 2011, respectively.

Connected Analytics

The world changes too fast to depend on top-down decision making. Engineering teams must be empowered to generate, and act on, their own technical analysis. At the same time, modern technical challenges require cross-functional teams that may be distributed globally. How, then, do fragmented teams of domain experts collaborate to deliver high-quality decisions?

The answer is Connected Analytics. 

Step 1: Merge subsurface and operational data 

Petro.ai delivers an industry-first environment for blending and interrogating 100+ operational and geotechnical data types: the Petron. Petrons allow operational data types (e.g. WITS streams, frac van files, production volumes) to be contextualized by subsurface data types (e.g. directional surveys, SEG-Y, fiber optics). Data blending and co-visualization are now as easy as drag-and-drop.

Step 2: Apply your own expertise to the work product 

After loading the appropriate data types into a Petron, you can begin to perform your own analysis. Use previously developed models (stored in other Petrons), or uncover new relationships using Petro.ai’s built-in machine learning. No need to create a PDF or upload to a shared drive; your new Petron is automatically stored and updated across your team.

Step 3: Collaborate with your peers 

Petrons provide a social workspace for experts to annotate data, share newly created models, and collaborate like never before.  Each team member can independently perform their own knowledge work, all from the same source – a shared Petron.  As new analysis is performed, the Petron and all dependent workflows are updated to reflect the team’s latest insight. 

Step 4: Build new value and start the next wave of optimization 

With the power of the Petron, engineering teams can focus their energy where it matters: making the next well more efficient and more productive.  By iteratively and fluidly incorporating new learnings, the possibilities are endless. 

Connected analytics is a new assembly line. Rather than passing physical products to the next station, modern engineers pass digital knowledge work to the next area of expertise. This modern workforce must be able to interact with peers digitally, expects a high level of interactivity, and needs everyday tools that support remote work. Incumbent software was not designed for the challenges of this century; Petro.ai, and the Petron, were born in it.


Hacking for Houston 2020: Improving Care in Our Community

As a proud member of the community, Petro.ai wanted to give back. We created the “Hacking for Houston” event to give the Petro.ai user base a voice for good in the communities in which we live and work. 

Bringing together O&G technical experts and public health professionals

Uche Arizor, Team Lead at PHI Lab commented that, “Our mission is to facilitate cross-sector collaboration, creativity, and innovation in public health practice. Partnering with Petro.ai for Hacking for Houston 2020 was a great opportunity to bring people together from the sectors of oil & gas, data science, and public health to work on real issues that HCPH would like to address.” 

All of us were surprised when, the night before the hackathon, a water main burst in downtown Houston. The 610 East Loop was closed for several hours due to flooding. Our team exchanged e-mails to decide whether we needed to cancel, but we were reluctant to do so after all the hard work put into organizing the event with our partner, the Public Health Innovations Lab (PHI Lab) at Harris County Public Health. Employees arrived early and decided to press on with the hackathon; we are so glad that we did!

We encouraged anyone with a passion for data science to attend, especially our clients and partners, as well as university students in data science and public health. We were unsure if attendees would still be able to join us in light of the water main break—but even the turnout for the optional two-hour morning workshop was fantastic. Shota Ota, Support Engineer, and Jason May, Data Scientist at Petro.ai covered tools and topics useful for the Hackathon. 

After lunch, the hackathon began with a high-intensity couple of hours in which participants worked in teams of 1-3 people to build and code projects. Teams were not restricted to any particular software, and participants deployed a variety of tools including Power BI, Spotfire, R, Python, ArcGIS, Excel, Jupyter notebooks, and even open-source 3D visualization software.

Three Challenges were laid out to participants, each with actual data provided by HCPH. Teams then chose one of the available challenges to work on during the event. 

Go Upstream for Downstream Costs

Objectives:   

  • Identify the rates of preventable hospitalization types and charges from zip codes with the highest rates of preventable visits.  
  • Create profiles of select zip codes that explore trends in socio-demographics, health outcomes, and issues in health care access.   

Increase Government Efficiency

Objectives:   

  • Model the overlap and gap of services provided by current facility locations based on community need (population density, poor health outcomes, etc.) 
  • Identify pilot sites for the co-location of public health, clinical health, and mental health services, while justifying community needs around the site. 
  • Explore the impact of other public and private facilities that may offer similar services in affected communities.   

Reducing West Nile virus (WNV) Disease Risk

Objectives:   

  • Use disease, mosquito, environmental, and population data from the past 4 years to develop a model that predicts which areas in Harris County are at higher risk for WNV disease transmission than others.
  • Identify the key factors that influence WNV disease risk in Harris County as a whole or in different clustered communities. 

At 5 pm, each team gave a 5-minute presentation or “pitch” to the panel of judges and other participants. Their projects were judged in four categories: communication, technical achievement, creativity, and aesthetics. Our 2020 judges included: 

The judges were impressed by all the teams and how much they were able to accomplish in just four hours. Each team presented their findings and their recommendations for HCPH. The winning team consisted of Callie Hall from the Houston Health Department, Elena Feofanova, a PhD candidate at UT Health, and Alex Lach, a reservoir engineer at Oxy. Their team chose Challenge 2, Increase Government Efficiency, and combined outstanding data analysis with a great pitch.  

Dr. Beckham, Director of the Office of Science, Surveillance, and Technology at HCPH, said, “The hackathon was a great way to network with future leaders and address public health issues in a creative and innovative way. Information taken back will be implemented to assist with making better business decisions to provide services to Harris County residents. It was a great opportunity for government (HCPH) and private industry (Petro.ai) to work together for equity and better health outcomes for the community.” 

The success of Hacking for Houston 2020 made it an easy decision for us to bring it back in the future. If you missed the event, join the Petro.ai Community to stay up to date and hear about our next hackathon. 


View Well Logs in Spotfire with Petro.ai

https://www.youtube.com/watch?v=NLbAUo38szs

Well logs are a critical input into many engineering and geoscience workflows. However, integrating well logs can be a challenge as many workflows move to tools like TIBCO Spotfire which cannot natively load LAS files or view logs on a vertical plot. This is especially true in unconventionals where engineers typically use a combination of Spotfire and/or Excel rather than more specialized tools like Petrel to design wells.

Petro.ai lets you:

  • Organize LAS files in one place
  • Dynamically load well logs into Spotfire
  • Use Spotfire to view and interact with well logs
  • Access well logs through a REST API
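To give a sense of the work a log loader has to do before curves can reach a tool like Spotfire, here is a minimal sketch of parsing the ~Curve and ~ASCII sections of an LAS 2.0 file. This is not Petro.ai’s implementation and the sample file is invented; real LAS files also require full header handling, wrapped data lines, and null-value substitution.

```python
# Illustrative only: a minimal parser for a simple, unwrapped LAS 2.0 string.

def parse_las(text):
    """Return {curve_mnemonic: [values]} from the ~Curve and ~ASCII sections."""
    curves, rows, section = [], [], None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                         # skip blanks and comments
        if line.startswith("~"):
            section = line[1].upper()        # C = curve info, A = data
            continue
        if section == "C":
            curves.append(line.split(".")[0].strip())  # mnemonic before the dot
        elif section == "A":
            rows.append([float(v) for v in line.split()])
    return {name: [row[i] for row in rows] for i, name in enumerate(curves)}

sample = """\
~Curve Information
DEPT.FT   : Depth
GR  .API  : Gamma Ray
~ASCII
7000.0  85.2
7000.5  90.1
"""
logs = parse_las(sample)
print(logs["GR"])  # -> [85.2, 90.1]
```

In production, a library such as lasio (or a service like the REST API described above) would handle these details; the point is simply that LAS is a text format whose curve metadata and data matrix must be joined before the logs can be plotted.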