
Passion for Change: Colorado School of Mines

with faculty from the Colorado School of Mines: Dr. Bill Eustes, Associate Professor, Petroleum Engineering, and Jim Crompton, Professor of Practice, Petroleum Engineering

Special thanks to Ronnie Arispe, Data and Analytics Specialist at Concho, and Anthony Bordonaro, Production Technologist at Chevron, from the SPE Permian Basin Section for helping to conduct this interview. The Permian Basin Section received the 2019 SPE Section Excellence Award in recognition of its hard work and strong programs in industry engagement, operations and planning, community involvement, professional development, and innovation.

Tell us about your background.

JC: I’m something called a Professor of Practice in the Petroleum Engineering Department at the Colorado School of Mines, somebody who got his lumps from a number of decades in the industry rather than a PhD. 

I am relatively new to the faculty, although I go way back to 1974 at the School of Mines, when I got my degree in geophysical engineering. After getting my Master’s, I joined Chevron Oil Company, where I spent the next 37 years. One company, one paycheck, but a number of different careers, from traditional seismic processing and seismic interpretation to the last third of my career in the area of digital oilfields, or integrated oilfields, as it was called at Chevron at the time.

I retired in 2013 and moved back to Colorado. Four years ago, I was asked to create a capstone course for a Data Analytics Minor within the Petroleum Engineering program.

BE: I’m Bill Eustes. I have spent 42 years in this business. I graduated from Louisiana Tech back in 1978 with a Bachelor of Science in mechanical engineering. I went to work at ARCO Oil and Gas working as a drilling engineer out in Hobbs, New Mexico. Then I did a stint in Midland, so I’ve had the experience of living in the Permian Basin. Then I worked as a drilling engineer out of the Midcontinent District in Tulsa as well as in the East Texas and North Louisiana area, and then finally went to Enid, Oklahoma where I was a production engineer until 1987. 

At that time, I recall ARCO getting a spreadsheet program called Lotus 1-2-3. We loaded the specs for all of our wells into it. When the market crashed in ‘85 and ‘86, we went through it and asked, “What is our break-even oil price for each well?” I remember it was just an awesome event to be able to go through 2,500 wells, sort them, and see which wells were making money. That was an amazing epiphany to be able to look at something like that.

Another thing that stuck with me: there was this really deep well in 1982 that I was involved with in Oklahoma while working for ARCO. I remember a company called ExLog that did mud logging, and they would print out all of the specifications of the drilling operations on one of those old tractor-feed printers. I remember looking at stacks of paper and wondering what I was going to do with it. I could see some value, but it wasn’t in any sort of format that we could use.

That’s always been in the back of my mind: how do I use this information to be able to do a better job?

And then I got laid off. 

In hindsight, that was the best thing, because I got to choose my own pathway forward. I decided I wanted to get more education. I went to the University of Colorado Boulder and earned a Master of Science degree in Mechanical Engineering. I thought I’d change the industry I worked in, but when you’ve been in this business long enough and you look at your bloodstream, it’s no longer blood; it’s oil.

It just so happens there was a school right down the road from CU-Boulder that had a Petroleum Engineering program. That’s how I wound up at the Colorado School of Mines as a graduate student. I spent six years as a graduate student in various areas of research including the Yucca Mountain project, the Hanford nuclear waste site, places like that.

I had my advisor retire right as I finished, so I put my name in the hat, and lo and behold, here I am 24 years later. It’s been a wild ride!

What do you do at the Colorado School of Mines and what makes your work unique?

JC: I think one of the things that Bill and I share is the passion to apply data to do something useful—drill a better well, have better production, artificial lift optimization, whatever it is. Through our individual four decades of experience, we’ve seen this data become more plentiful. We’ve seen this data become a little bit easier to use. We’ve seen better tools crop up. So, it’s getting closer and closer to being able to do decision-making analysis. 

It isn’t the company with the most data that wins. It’s a company that makes the best decisions from the data they have that wins. 

I think both of us share this idea of trying to instill in the next-generation workforce an understanding of the data and then what you can do with it. It’s not an overemphasis on sensors or IoT or cloud computing or whatever. It’s the idea of application.

We talk a lot about understanding data. We talk a lot about data visualization. Forty years ago, when I was on campus, a petroleum engineer wouldn’t go beyond Excel spreadsheets. Now we’ve got R and Python programming, and it’s a new world of capabilities, a new generation of digital engineers.

BE: We now have the tools, but you know the famous phrase, “All models are wrong, but some are useful.” We’re trying to build more useful models.

The machines are there to assist you, to augment you in being able to make decisions. They’re not there to make the decisions for you. 

We’re working on a certificate program for those at the postgraduate level, whether in a Master’s or PhD program, or somebody out in industry who wants a better understanding of how to be a digital engineer, actually working on projects in drilling, production, reservoir, and unconventional resources. At the end of the 12-credit-hour sequence, you would have a graduate-level certificate in Petroleum Data Analytics from the Colorado School of Mines.

We’re also looking at automation, developing really good, high-quality data and models that can tell the machine where things should be going.

That’s one of the things I personally am looking at: deriving insights to make our operations better, but also looking at the longer-term goal of seeing what areas we can automate to make things safer, more reliable, and more consistent.

I’m part of the Drilling Systems Automation Technology Section of the SPE. One of our drivers is developing methodologies to be able to automate our drilling rigs for consistency as well as safety. A well-trained crew can beat a machine right now, but they can only last so long before they wear out, and of course, finding a well-trained crew might be a challenge these days with the loss of experience that we’re unfortunately seeing. So perhaps this is a way to help us drill wells better and safer.

We need to start with what kind of problem you’re solving, and then understand what kind of data you’re using and tell a good story with the data, but at the same time, talk about what you could do with the data. It isn’t just data crunching. The model has to go beyond just telling you what’s happened. The challenge for petroleum is to figure out what’s going to happen in the future, not just what my production was today. Can you give me an accurate forecast for my production in the next three to six months so I can go to the shareholder meeting and tell them how much money we’re going to make?

JC: To help older graduates, we’ve developed a graduate certificate program for more mature engineers practicing in the industry to take in the evenings and on weekends. We think we can add value for a modest commitment to engineers at any level. Even if you just take it to learn the language, you get some hands-on experience with the tools. We’re not turning petroleum engineers into programmers, but students learn basic scripting languages like Python and R.

BE: Something that’s kind of unique is that we have a drill rig and we collect our own data. It’s actually a mining coring rig, and we have sensors all over it so that we can collect the core as we are drilling and record the data. The idea is that you collect and analyze your own data. I want to see how students handle this large volume of data, 20,000 Hertz over 10 minutes from two tri-axial sensors, and see the pitfalls and the promises of handling that information and what it tells you.

JC: There comes a moment in every young digital petroleum engineer’s career where they break Excel, and we want to give them that experience early so they can realize what’s on the other side of it, the new tools and new technologies that will help them build those models with that volume of data, variety of data, and velocity of data.
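To give a feel for those volumes: two tri-axial sensors sampled at 20,000 Hz for 10 minutes produce roughly 72 million readings, far beyond a spreadsheet’s roughly one-million-row limit. The minimal Python sketch below shows one common first step, decimating the signal to a lower rate with an anti-aliasing filter; the channel count, rates, and array layout are illustrative assumptions, not the format of the rig’s actual data.

```python
# Rough illustration only: decimating high-rate drilling-sensor data before analysis.
# Channel count, rates, and array layout are assumptions, not the actual rig data format.
import numpy as np
from scipy import signal

FS_IN, FS_OUT = 20_000, 1_000   # Hz: raw sampling rate and target rate
DURATION_S = 600                # 10 minutes of drilling
N_CHANNELS = 6                  # two tri-axial sensors

rng = np.random.default_rng(0)
raw = rng.standard_normal((N_CHANNELS, FS_IN * DURATION_S), dtype=np.float32)
print(f"raw samples: {raw.size:,}")   # ~72 million -- well past Excel's ~1.05M-row limit

# FIR decimation applies an anti-aliasing filter before keeping every 20th sample
q = FS_IN // FS_OUT
downsampled = signal.decimate(raw, q, ftype="fir", axis=1)
print(downsampled.shape)              # (6, 600000)
```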

Do you see any gaps in the tools being used today? What do you think the tools of the future could look like?

JC: We’re building billion-cell reservoir simulations instead of a few thousand cells. Streaming analytics and spatial analytics are two areas I think we’re moving into, and that has to do with the variety of data and velocity of data. Maybe we don’t know exactly what to do with 20,000 Hertz, but we could if we could just downsample that to a thousand Hertz. That’s a lot of data. Can we then have a feedback loop where the model is learning from data?

As we’re drilling a well, if that model gets updated, it could become a better predictor, and then we can find that potential stuck pipe problem, or we could detect that we’re going to break off a tooth on the drill bit and avoid an unnecessary trip to set another casing string. Right now, we’re trying to do the best we can, which means we’re probably an hour behind where the drill bit is. We have MWD units, LWD units. We’ve got wired pipe.

We’ve got some of the capacity to move the data, but I don’t think we really have the capacity to use the data in a proactive fashion, really incorporating the data coming back so we can think ahead of the drill bit.

We’re trying to upgrade our capability to manage higher and higher frequency multivariate data. If we’ve got six sensors, I don’t want to just use one. I’m going to use all six. There may be some sort of signal that comes not just from one but from a combination of several, so we want to do that.

We’ve gotten pretty good at producing more oil, no doubt about that. But as shale producers have found out, they haven’t been doing all that well in producing more money and profitability, and they’ve sometimes had environmental issues.

We need to manage the whole, not the parts. We’ve come a long way in the last 30 years managing the parts. I think one of the challenges now is managing the whole.

When it comes to production or the reservoir, the spatial analytics side becomes important. We have SCADA data. We’ve got individual well production history; we’ve got all that. Now put that together in a cube. We’re not just dealing with the well, we’re dealing with a cube of rock, we’re building spatial understanding of the subsurface, and even on the operational side, energy use and emissions detection. How can I put all that together so that I am producing the field to make the most money, not just producing the field to make the most fluid volumes?

BE: There are two other issues that I think need to be worked on. A lot of the sensors on a drilling rig are not that accurate or that precise. They’re not calibrated very often. You’ve got to have good information coming in to be able to come up with good insights, so improved sensors on drilling rigs are a factor, as is data transmission. There’s wired pipe, but it’s very expensive and it has challenges in and of itself.

Are there ways that we can get data from downhole back to us in a timely fashion at a rate we need right now? I don’t think we’re there. If we’re going to improve drilling operations, we need to have the information coming from the source, which is the drill bit, and the area around the drill bit, and we have to be able to deal with the velocity and the volume of data in real time so we can make decisions in real time. It doesn’t do you any good to know the well blew out and you’re on fire back there already. We need to know what’s happening now.

Have people been skeptical about incorporating data and analytics into the field? How have you dealt with it?

JC: The oil industry has been criticized, probably correctly, for being relatively slow adopters of some of this technology. My generation didn’t believe in the models enough. I think the new generation believes in them too much. We have to find somewhere in between. 

I don’t care if you are the slickest Python programmer in the world and you just built this reservoir model. You have to be able to explain it. 

Building trust is understanding your data and being able to explain it. It’s the physics as well as the data-driven analytical processes. It’s not one or the other, it’s both, and that’s a harder challenge.

BE: One of the things I like to tell our students in classes about the use of technology and information is you have to get buy-in from everybody, including in the field, because if the rig crew doesn’t want something to work, it won’t work. You’ve got to be able to sell your ideas, to explain what’s going on and why it’s going to make their job better and make their life easier. People are more willing to do stuff that helps them do their job better, and that’s how you have to sell it.

Are there any books, sites, or other resources you would recommend?

BE: Jim, this is a good time to talk about your two books!

JC: I have written two non-academic books: The Future Belongs to the Digital Engineer and A Digital Journey: The Transformation of the Oil and Gas Industry. I also blog on LinkedIn.

Automation will get rid of some jobs, probably jobs human beings don’t really want to deal with because they’re dangerous and dirty. The petroleum industry will certainly change, but it won’t go away. You’re going to have to become model masters and prediction wizards and future tellers and a whole bunch of funny things that you maybe didn’t get in your sophomore and junior classes in Petroleum Engineering. The role will change. I don’t think the role goes away, but if you don’t change with it, you might go away if your skill set isn’t competitive in the industry.

There’s going to be a greater emphasis on predicting what is going to happen and new ways of creating value, and with all of that, you need the data. I think it’s now inescapable that digital literacy is becoming a core competency of engineers, regardless of what specialty they go for, what industry they work in. AI is going to be a tool in the future. It’s going to be a co-worker and that’s something we have to wrap our heads around.

BE: I can add that other great resources include the different conferences, like the IADC Drilling Conference, which had a number of sessions on digital transformation, and I’d also recommend your local SPE section. That’s a really great place to get in on the ground floor of what’s going on and what your peers are doing in your region.

Dr. Bill Eustes is an associate professor in the Petroleum Engineering Department at the Colorado School of Mines. He has a B.S. degree in Mechanical Engineering from Louisiana Tech University (1978), an M.S. degree in Mechanical Engineering from the University of Colorado in Boulder (1989), and a Ph.D. in Petroleum Engineering from the Colorado School of Mines (1996). He specializes in drilling operations and in experimental and modeling research.
Jim Crompton is a Professor of Practice at the Colorado School of Mines. Jim retired from Chevron in 2013 after almost 37 years with the major international oil and gas company. After moving from Houston to Colorado Springs, Colorado, Jim established Reflections Data Consulting LLC to continue his work in the area of data management and analytics for the exploration and production industry.

Passion for Change: Bonanza Creek Energy

with Kyle Gorynski, Director Reservoir Characterization and Exploration at Bonanza Creek Energy

If you missed the first part in this series, you can find the start of our conversation here.

Why do you think there’s been so much hesitation around change?

There’s a strong ingrained culture in oil and gas, with a generation of people who’ve been doing this for decades – although on completely different rocks, play types, and extraction techniques. There’s been pushback on adopting new geology or engineering software because people become so comfortable and familiar with the tools they use. I think a lot of it is simply the unique culture in oil and gas. Shale is still new, and our guesses are still evolving. I think we’re getting closer and closer to what that right answer is.

New organizations will have to adopt new technologies and adapt to new trends, because there’s no other way to make this business work.

As scientists, engineers, and managers in this space, we really have to be cognizant that the goal of a lot of vendors out there is not to help you get the right answer; it’s to make money. The onus is on us to vet everyone and make sure we’re getting the answers we want.

Machine learning is simply another tool, like a physics-based model, aimed at helping us predict an outcome and increasing the precision and accuracy of those predictions. People have made pretty fantastic inferences from these kinds of tools.

You can’t just pay a company to apply machine learning to a project. You need to help them use the correct inputs and relationships, and ensure the predictions and outcomes match the observations you have from other datasets.

I don’t think any organization should be cutting-edge for the sake of being cutting-edge. The goal is to solve these very specific technical challenges quicker and with more accuracy. Our job is to extract hydrocarbons in the most safe, environmentally friendly, and economic fashion. Technologies like machine learning and AI are tools that can help us achieve these goals, but they need to be applied correctly.

Can you share any successes around data science or machine learning at your company?

The industry has been using these techniques for a long time. In their simplest form, people have been cross-plotting data since the early days of the oil and gas industry, trying to build relationships between things. At the beginning of my career, I remember using neural networks to predict log response.

Now we use predictive algorithms to help us predict log response where we don’t have certain logs. Let’s say we want to predict lithologies, the carbonate-clay-quartz-feldspar content in a well: we’ll build relationships between triple-combo logs and the more sophisticated, but scarce, elemental capture spectroscopy (ECS) logs. We don’t have ECS logs everywhere, but we have triple-combo everywhere, so if you can build a relationship between those, then you have a massive dataset you can use to map your asset. That’s a simple way we use this type of technology.
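To illustrate the kind of log-to-log prediction described here, the minimal Python sketch below trains on wells where ECS labels exist and applies the model to the much larger triple-combo dataset; the file name, column names, and model choice are assumptions for illustration, not the company’s actual workflow.

```python
# Illustrative sketch of log-to-log prediction; the DataFrame `logs`, its column names,
# and the model are hypothetical assumptions, not an operator's actual schema.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

TRIPLE_COMBO = ["GR", "RHOB", "NPHI", "RESD"]                   # widely available inputs
ECS_TARGETS = ["QUARTZ_FRAC", "CARBONATE_FRAC", "CLAY_FRAC"]    # scarce ECS-derived labels

logs = pd.read_csv("well_logs.csv")                 # assumed file containing both log suites
labeled = logs.dropna(subset=ECS_TARGETS)           # depths/wells where ECS exists

X_train, X_test, y_train, y_test = train_test_split(
    labeled[TRIPLE_COMBO], labeled[ECS_TARGETS], test_size=0.2, random_state=42
)
model = RandomForestRegressor(n_estimators=300, random_state=42)
model.fit(X_train, y_train)
print("holdout R^2:", model.score(X_test, y_test))

# Apply to the much larger set of intervals that only have triple-combo logs
unlabeled = logs[logs[ECS_TARGETS].isna().any(axis=1)].dropna(subset=TRIPLE_COMBO)
predicted_lithology = model.predict(unlabeled[TRIPLE_COMBO])
```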

Like almost every company now, we’re also predicting performance. That’s how we’re able to make live economic decisions. We have a tool where we can put in a bunch of geologic and engineering inputs, and it’ll predict production rates through time that we can forecast, add new costs to, and run economics on live. We’re running Monte Carlo simulations on variable rates, volumes, lateral length, spacing, commodity pricing, and costs that are based on our best estimates to predict tens of thousands of outcomes and help us better understand what the best decision could possibly be. I think that’s the most impactful place it’s being used, and I think that trend is being adopted more and more in industry as I talk to my peers.

Type curve generation is no longer grabbing a set of wells, grouping them together, and fitting a curve to them; it’s trying to predict the infinite number of outcomes between the extremes.
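As a rough illustration of that outcome-distribution idea, here is a toy Monte Carlo sketch in Python; the Arps decline model, the input distributions, and the simplified economics are assumptions for illustration, not the company’s actual model.

```python
# Toy Monte Carlo sketch of the outcome-distribution idea described above.
# Decline parameters, price/cost distributions, and economics are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N = 20_000                                   # tens of thousands of scenarios
months = np.arange(1, 61)                    # 5-year horizon

qi = rng.normal(25_000, 5_000, N)            # initial rate, bbl/month
b = rng.uniform(0.8, 1.4, N)                 # hyperbolic b-factor
di = rng.uniform(0.10, 0.25, N)              # nominal monthly decline
price = rng.normal(50, 10, N)                # $/bbl
capex = rng.normal(8e6, 1e6, N)              # drill & complete cost, $

# Hyperbolic (Arps) decline evaluated for every scenario at once
q = qi[:, None] / (1 + b[:, None] * di[:, None] * months[None, :]) ** (1 / b[:, None])
revenue = (q * price[:, None]).sum(axis=1)
npv_proxy = revenue - capex                  # ignoring opex, taxes, discounting for brevity

print("P10 / P50 / P90 outcome ($MM):",
      np.round(np.percentile(npv_proxy, [90, 50, 10]) / 1e6, 1))
```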

Have you seen any success among your competitors using technology, specifically data science and analytics tools?

There’s some great work out there across the board. I had a lot of fun at Encana (now Ovintiv) seeing a lot of my peers, who are exceptionally smart, really trying to adopt new technology to solve problems. I’ve seen some amazing work getting people to adopt new ideas, new thoughts, new predictions. I like going to URTeC. I think that’s a fantastic conference. I always find a number of great pieces of technical work coming out of it.

I think the industry is doing a great job. There’s a ton of really smart people out there that know how to do this work. I think a lot of young people are really adopting coding and this bigger picture approach to subsurface, where it’s not just you’re an engineer or you’re a geoscientist, you really have to understand the fluid, the pore system, the stresses, what you’re doing to it. There’s no way we can be impactful unless we understand the really big picture, and people are getting much better at that, trying to use tools and develop skillsets that allow them to solve these problems a lot quicker.

How would you describe Petro.ai?

We see you guys as filling a gap we have. It’s the ability to pull data together and to apply big data to a dataset in a way we quite frankly don’t have the time or the capability to do in-house. Petro.ai provides us with a very important service that gets us, in only a couple of months, to a point that would take us 12 to 18 months to reach on our own. What we really like is that we’re developing something that’s unique and new and therefore has our input and involvement. We’re not just sending you a dataset and asking for an answer; we’re saying what we think drives the results, and we also want your feedback. You’re also a group of experts who not only have your own experiences but have seen other people’s assets and plays and how everyone else in industry is looking at them, so it’s nice to have a group of consultants with the same goal – to address a problem and try to figure it out. We want to get to an answer as quickly as we possibly can and start applying those learnings as quickly as we possibly can.

Kyle Gorynski is currently Director of Reservoir Characterization and Exploration at Bonanza Creek Energy. Kyle previously worked at Ovintiv, where he spent 7 years in various technical and leadership roles, most recently as the Manager of Reservoir Characterization for their Eagle Ford and Austin Chalk assets. Although he is heavily involved on the technical side of subsurface engineering and geoscience, his primary focus is on their practical applications in resource and business development. Kyle received his B.S. and M.S. in Geology from the University of Kansas in 2008 and 2011, respectively.

Passion for Change: Bonanza Creek Energy

with Kyle Gorynski, Director Reservoir Characterization and Exploration at Bonanza Creek Energy 

Tell us about your background and what you do now.

I’m from Kansas and got my Bachelor’s and Master’s degrees in Geoscience at the University of Kansas, then moved straight out to Denver. I spent the first seven years of my career with Ovintiv, which was previously Encana, and had various roles, starting with mainly geology functions and eventually working as a manager. I’ve always been interested in the technical side of the industry and novel approaches to petrophysics, geomechanics, and reservoir mapping, and how new data and new analyses can drive decision-making. It’s important that our decisions are driven by science and statistics and less by the drill bit and opinion alone. I joined Bonanza Creek about a year and a half ago.

I’m the Director of Reservoir Characterization and Exploration. This role has two primary functions. One is an asset development function and the other is Business Development/Exploration. Asset development is the value optimization of our asset to maximize key economic metrics by:

  • understanding the subsurface to determine baseline performance
  • understanding key engineering drivers that impact performance 
  • applying those insights to modify things in real time, like spacing, stacking, completion design, well flowback, etc.

At Bonanza Creek, one of the things our CEO Eric Greager likes to say is, “We’re unique because we have the agility of a small company, with the technical sophistication of a larger enterprise.” 

This allows us to respond to things that are changing quite rapidly, from the costs of goods and services to our own evolving understanding of the reservoir. By rapidly adapting, we’re able to maximize value and economic return.

That’s the main piece. The other part of the role is the exploration and business development function. We apply the same principles I just described to other assets inside and outside our basin and work with the greater operations and finance groups to determine an asset’s current value and what its potential future value could be.

How do you incorporate technology into your approach?

Technology is applied everywhere we possibly can. We have sharp, technically savvy people who can develop and use tools to guide all our decision-making.

That’s where Petro.ai comes in. We need help building additional tools to make real-time decisions. That’s going to help us stay lean and agile but also make sure we have the right information to be making the most informed decisions. It comes down to the right data and the right people.

We need the ability to make decisions at multiple levels within an organization, so decisions can be made quickly without a top-down approach but with a high level of trust. We need to make sure the technical work is vetted and have a culture of best practices built-in for engineering and geoscience evaluation so we can have a lot of trust in our workflows. When that expertise is already built into tools—a lot of the equations, the input, the math—that helps us have trust in the inputs as well as the outputs and allows us to make those quick decisions.

How do you define real-time?

Our ultimate goal on the completions side is to be making real-time changes while we’re pumping – so minute by minute. That’s what motivated the project we have going with Petro.ai, to start turning knobs during the job, making sure the reservoir is sufficiently stimulated and not over capitalized. 

Let’s say we have sixteen wells per section permitted, but all of a sudden commodity prices drop and oil is at forty bucks, so we’ll only drill eight of those wells. Once those eight wells are in the ground, you’re stuck with that decision. Then maybe commodity prices go up, or we get a good price on sand or water, and we rerun the economics and see what the type curves look like. Then we’ll make a new decision, for example, on the amount of sand or water or the size of our stages.

We are making really quick decisions in terms of completions on a well-by-well basis. As price fluctuates and we’re teetering on the edge of break-even, we have to be real flexible in terms of trying to maximize the economics. 

Our next step with Petro.ai is using our 3D seismic data, well architecture, geosteering, and drilling data to understand what kind of rock we’re actually treating to make sure we’re putting a specific design for that specific formation, and we’re also reading the rock at the same time. We’re taking that information, which is mainly pressure response during the job, and trying to learn from that live.

Why do you have a passion for change in this industry?

I’m passionate about understanding the subsurface and the whole E&P industry and our evolving understanding of unconventionals. I’ve always been passionate about the big picture, trying to zoom out and understand how everything interconnects.

Change is a reality. It’s unavoidable. The pace of change and the path you take differs between organizations. The industry as a whole has been incredibly slow to adopt change. We’re now on the verge of a large extinction event. The E&P companies that remain will be in a better position to thrive once this is all over.

Unfortunately, something even as simple as generating a return on your investments has eluded many companies. For these companies, it is often the stubborn top-down culture that has resisted change and is their greatest detriment. It’s an exciting and scary time today. However, these events allow the best to survive and force others to adapt and change – for the better. 

The investment in a single well can be approximately 6 to 10 million dollars for a 2-mile lateral in the U.S. The investment in learning is tens to hundreds of thousands of dollars for an entire year. There’s a lot of capital destruction that could have been avoided by doing homework, collecting data, doing analysis, and connecting the dots from math to modeling to statistics all the way to a barrel of oil.

Science often ends up in a folder. It takes good leadership, management, and technical work to ensure that you’re making decisions with all your data and information. The point is to make better decisions and to make better wells. 

The conversation continues here.

Kyle Gorynski is currently Director of Reservoir Characterization and Exploration at Bonanza Creek Energy. Kyle previously worked at Ovintiv, where he spent 7 years in various technical and leadership roles, most recently as the Manager of Reservoir Characterization for their Eagle Ford and Austin Chalk assets. Although he is heavily involved on the technical side of subsurface engineering and geoscience, his primary focus is on their practical applications in resource and business development. Kyle received his B.S. and M.S. in Geology from the University of Kansas in 2008 and 2011, respectively.

The Impact of Well Orientation on Production in North American Shale Basins

Full house at the January 22nd Calgary Lunch and Learn presented by Kyle LaMotta, VP of Analytics at Petro.ai, on the impact of well orientation in the Montney and Duvernay plays

As North American shale reservoirs reach maturity, the need to optimize development plans has become more demanding and essential. There are many variables that are within our control as we design a well: lateral length, landing zone, completion design, and so on. In situ stress is clearly not something we can directly control, but we can optimize around it using an under-appreciated mechanism: well orientation.

The vast scale of available data on monthly production and well orientation provides an opportunity for data science and machine learning to help optimize around this variable.

Our team at Petro.ai has done some clever work, investigating how well orientation with respect to the maximum horizontal stress can be an important variable in well design. This unique approach blends principles of geomechanics with data science techniques to uncover new insights in completions design.

We had the incredible opportunity to visit Calgary on January 22nd, during a thankfully mild week, for a lunch and learn about the effects of well orientation on production in the Montney and Duvernay. We greatly appreciated the warm welcome, fantastic turnout, and keen interest in our presentation by Petro.ai VP of Analytics Kyle LaMotta. He brought his passion and expertise to an awesome and insightful event! 

Because of the amazing response in Calgary, we wanted to bring this lunch and learn event to some upcoming cities. Coming soon to:

For more info on all our upcoming events, please join the Petro.ai community for updates. 


Rig Count and Operator Size: Recent Trends

Like many in our industry, the team here at Petro.ai keeps a close eye on oil prices, rig count, and analyst reports to stay in tune with what’s happening. Richard Gaut, our CFO, and I were discussing the recent trends in the rig count which led us to dive into the data. Essentially, we were curious as to how different types of E&P’s are adjusting their activity in the current market. There’s been a lot of news recently on how the super-majors are now finally up to speed in unconventionals and that smaller operators won’t be able to compete.

Figure 1: North American Rig Count (from Baker Hughes)

The figure above shows the North American rig count from Baker Hughes, and we can see the steady recovery from mid-2016 through 2018, followed by another decline in 2019. But has this drop been evenly distributed among operators? TPH provides a more detailed breakdown of the rig count, segmenting the rigs by operator. I put the operators into one of three buckets:

  • Large-cap and integrated E&P’s
  • Mid-cap and small-cap publicly traded E&P’s
  • Privately held E&P’s

The figure below shows the breakdown between these three groups overlaid with the total rig count. You can see the mid and small caps in yellow are a shrinking segment. Figure 3 shows these groups as percentages and makes the divergence between the groups extremely visible.

Figure 2: Rig count segmented by operator type

Figure 3: Percentage breakdown of rigs by operator type
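For readers who want to reproduce this kind of segmentation, here is a minimal pandas sketch; the input file, column names, and operator-to-bucket mapping are illustrative assumptions, not the TPH dataset.

```python
# Minimal sketch of the segmentation behind Figures 2 and 3; the input columns and the
# operator-to-bucket mapping are illustrative assumptions, not the TPH schema.
import pandas as pd

rigs = pd.read_csv("weekly_rig_count_by_operator.csv")   # assumed columns: date, operator, rig_count

bucket_map = {                                           # hypothetical examples of each bucket
    "ExxonMobil": "Large-cap / integrated",
    "Pioneer": "Mid/small-cap public",
    "CrownQuest": "Private",
}
rigs["bucket"] = rigs["operator"].map(bucket_map).fillna("Unclassified")  # classify the rest by hand

weekly = rigs.groupby(["date", "bucket"])["rig_count"].sum().unstack(fill_value=0)
share = weekly.div(weekly.sum(axis=1), axis=0) * 100     # percentage breakdown, as in Figure 3
print(share.tail())
```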

Since the recovery in 2016, privately held companies have taken on a larger share of the rigs, and that trend continues through the recent downturn in 2019. This is likely because they are tasked by their financial backers to deploy capital and have no choice but to keep drilling. The large-caps and integrated oil companies have been staying constant or growing slightly since mid-2016, as has been reported. These operators have deep pockets and can offset losses in unconventionals with profits made elsewhere – at least until unconventionals become profitable. The story is very different for the small and mid-caps. These operators have experienced the sharpest drop in activity as they are forced by investors to live within their cash flows.

The data used in this analysis were pulled at the end of September. We typically see a slowdown in activity in Q4 and recent news shows that this slowdown might be worse than normal. It’s likely the divergence we see will only continue through the end of the year.

Next, I split out the rig count by basin and found some interesting trends there which I’ll elaborate on in a second blog post.


Gun Barrel Diagram: Calculate and Visualize Well Spacing Part 2

Part two of this blog series builds on my last post and provides step-by-step procedures for dynamically calculating well spacing and building gun barrel diagrams with Petro.ai. The short video above from Kyle LaMotta highlights the key steps, but keep reading for a detailed procedure of how this is done.

Required Data

The gun barrel calculation requires two sets of data: wellbore directional surveys and formation structure grids. It is also possible to run the gun barrel calculation without structure grids, but an additional data function is required.

The surveys must have X, Y, Z, MD, and a well identifier, and the X and Y should be shifted to the correct geographic positions, not the X and Y offset values provided by the survey company. Check out this template if you need to convert a drilling survey with azimuth, inclination, and measured depth to XYZ. The formation grids must have an X, Y, Z, and formation name. It’s also important to note that the two data sources must be projected in the same coordinate reference system, with Z values referencing the same datum (e.g. TVDSS). Another useful template can be found here to convert a CRS in Spotfire. With this information, the Gun Barrel function will calculate the 3D distance between the midpoints of horizontal wellbores.
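Conceptually, the spacing math reduces to picking a lateral midpoint for each well and computing pairwise 3D distances. The Python sketch below illustrates the idea; the column names and the crude lateral-section pick are assumptions, not Petro.ai’s implementation.

```python
# Conceptual sketch of the distance the Gun Barrel function reports; column names and the
# lateral pick (deepest 10% of MD) are assumptions, not Petro.ai's implementation.
import itertools
import numpy as np
import pandas as pd

surveys = pd.read_csv("directional_surveys.csv")    # assumed columns: WELL_ID, MD, X, Y, Z

def lateral_midpoint(df):
    lateral = df[df["MD"] >= 0.9 * df["MD"].max()]  # crude lateral-section pick
    row = lateral.iloc[len(lateral) // 2]
    return row[["X", "Y", "Z"]].astype(float).values

midpoints = {w: lateral_midpoint(g.sort_values("MD")) for w, g in surveys.groupby("WELL_ID")}

rows = []
for a, b in itertools.combinations(midpoints, 2):
    d = midpoints[a] - midpoints[b]
    rows.append({"well_a": a, "well_b": b,
                 "dx": d[0], "dy": d[1], "dz": d[2],
                 "dist_3d": float(np.linalg.norm(d))})
spacing_report = pd.DataFrame(rows)
print(spacing_report.head())
```

The actual Gun Barrel function also classifies each pair against the horizon grids, which this sketch omits.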

After loading these data sources into Spotfire (either via Petro.ai or by directly adding the tables), click Tools > Subsurface > Classify subsurface intervals. Clicking this will bring up the “Classify Subsurface Intervals” window. Here you will be prompted to fill in the dropdown menus with the relevant information.


Figure 4: Classify Subsurface Intervals (and Gun Barrel) Menu

XYZ Input: Map to wellbore surveys

The top left section, shown below, is used to map columns to your wellbore surveys table. Select your wellbore surveys data table, then fill in the X, Y, and Z dropdowns. Be sure to select “use active filtering”. This will allow the user to filter and mark select groupings of wells and determine the Gun Barrel distances for that selected grouping.


Figure 5: Classify Subsurface Intervals (and Gun Barrel) Menu: XYZ Input Box

Horizons: Map to surface grids

The top middle section, shown below, is used to map columns to your wellbore surface grids. Select your horizons data table. This will be a table that has your horizon grid data points. This table should contain X, Y and Z data points for each individual horizon. See below for reference. Be sure to select “use active filtering”. This will allow the user to filter the surface grids around the wells of interest, which will significantly improve computation time.


Figure 6: Classify Subsurface Intervals (and Gun Barrel) Menu: Horizons Input Box


Figure 7: Formation Grids data table requirement (example)

Output: Table names and columns

The top right section, shown below, is used to configure the output table and column names. Transfer Columns are simply the additional columns that will be displayed in the output data table; check the column name boxes to include any metadata columns that will help you identify your wells.


Figure 8: Classify Subsurface Intervals (and Gun Barrel) Menu: Output options

Gun Barrel Settings

The bottom half of the window shown below enables the Gun Barrel calculations to run and allows the user to configure settings for the gun barrel.

To enable the gun barrel, check the “Enable Gun Barrel” checkbox. In the Wellbores section, select the Well identifier and MD from your Wellbore Survey data table. Next, use the input boxes to define buffer dimensions around the lateral. This allows the user to update the dimensions around the laterals, creating a rectangular prism for the calculations. In general, the preset values are sufficient.

The right side, Gun Barrel – Output, is used to name the spacing result table generated by the calculation. Use the input box to update the table name. At this point, you are ready to run the calculations: click the OK button.


Figure 9: Classify Subsurface Intervals (and Gun Barrel) Menu: Gun Barrel Input Box

Gun Barrel Spotfire Template

With all the proper data imported and mapped, it is now possible to use the Gun Barrel workflow. It’s possible to run the interval classifier and gun barrel calculator in any Spotfire DXP using Ruths.ai software, but we recommend starting with the Gun Barrel Spotfire template, as it’s preconfigured to automate this workflow. This template will be published soon; stay tuned for updates!

Select the Gun Barrel, mark a group of wells on the Map Chart, then click the “Update Gun Barrel” button on the left side menu. And that’s it! You have kicked off the calculation for the 3D distance between the horizontal midpoints of each selected wellbore. Just monitor the Spotfire notifications area to see when the function execution completes.

Figure 10: Map Chart: select wells

The map chart above is displaying the horizontal midpoint X Y points. Depending on your preference, you can change these to the heel or toe if you have those points available to you in your data set.

After running the gun barrel calculation, the results are displayed in four different visualizations to help the user interpret these findings. Each will be explained below.

Figure 11: Gun Barrel Spotfire Template: (Side menu Update Gun Barrel Button)

The “Spacing Report” cross table shows a tabular view of the data. This takes each combination of wells and displays the 3D distance between each of the combination pairs (e.g. A to B, A to C, and B to C). This view also provides the distances dx, dy, and dz between the midpoints of the well pairs, and a flag indicating whether the combination crosses a horizon interval.

Figure 12: Spacing Report: Tabular View

A table is great for looking up specific values; now let’s look at a visual representation of the same data. The Gun Barrel diagram displays the data in a 2D vertical cross-sectional view through, and perpendicular to, all wellbore lateral section midpoints, allowing the user to view the horizontal midpoint wellbore paths head on (as if staring down the barrel of a gun).

Figure 13: Gun Barrel Visualization

Important note: This is a 2D vertical cross section showing the midpoint along the lateral of the selected wells, regardless of whether they are in a direct line or are spatially staggered from a bird’s-eye view.
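For readers curious how such a head-on view can be constructed, the sketch below projects each lateral midpoint onto a vertical plane perpendicular to a representative lateral azimuth; the single shared azimuth and the axis conventions are simplifying assumptions for illustration, not the template’s exact method.

```python
# One possible way to build a gun barrel view: project each lateral midpoint onto a vertical
# plane perpendicular to the lateral azimuth. Conventions here are illustrative assumptions.
import numpy as np

def gun_barrel_coords(midpoints_xyz, azimuth_deg):
    """Return (horizontal offset, depth) pairs for a cross-section perpendicular
    to the given lateral azimuth (degrees from north)."""
    pts = np.asarray(midpoints_xyz, dtype=float)     # columns: X (east), Y (north), Z
    az = np.radians(azimuth_deg)
    normal = np.array([np.cos(az), -np.sin(az)])     # map-view unit vector perpendicular to laterals
    offset = pts[:, :2] @ normal                     # signed distance along the section
    offset -= offset.mean()                          # center the view on the selected wells
    return np.column_stack([offset, pts[:, 2]])

section = gun_barrel_coords([[100.0, 200.0, -2500.0],
                             [350.0, 190.0, -2550.0]], azimuth_deg=90.0)
print(section)   # plot offset (x) against depth (y) for the head-on view
```

Plotting offset against depth for the selected wells reproduces the kind of head-on view shown in Figure 13.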

Figure 14: Using the spacing report and Gun Barrel together to interpret the data.

With these two views it is possible to quickly interpret the gun barrel calculations and visualize the spacing results.

Lastly, viewing this data in a map view and a 3D subsurface view helps to further orient the user to the wells’ spatial positions in the field (shown below).

Figure 15: 3D Subsurface view and Map Chart view.

Note that the Map chart shows the horizontal portion of each selected well, with the heel and toe points colored by their respective formation intervals. In the above example, the black dots represent the lateral section midpoints of each well.

Let me know if you have any issues with the above procedure. Good luck making your gun barrel plots and spacing reports!


Gun Barrel Diagram: Calculate and Visualize Well Spacing Part 1

Introduction

The Petro.ai Gun Barrel workflow allows the user to quickly find the 3D distances between the midpoints of the lateral section of selected nearby horizontal wells. This critically important information was once only possible to calculate using specialized geoscience software or through painstaking and time-consuming manual work. With the Petro.ai integrations, we can now calculate this information directly from Spotfire:


Figure 1: Petro.ai Gun Barrel View


From Science Pads to Every Pad: Diagnostic Measurements Cross the Chasm

My energy finance career began in late 2014. As commodity prices fell from $100 to $35, I had a front row seat to the devastation. To quote Berkshire Hathaway’s 2001 letter, “you only find out who is swimming naked when the tide goes out.” It turned out that many unconventional operators and service businesses didn’t own bathing suits. My job was to identify fundamentally strong businesses that needed additional capital … not to survive, but to thrive in the new “low tide” environment. To follow Buffett’s analogy, I was buying flippers and goggles for the modest swimmers.


Following 18 months of carnage, commodity prices began to improve in 2016. Surviving operators had been forced to rethink their business models, pivoting from “frac ‘n’ flip” to “hydrocarbon manufacturing”. Over the following two years, I familiarized myself with hundreds of service companies and their operator customers. The entire industry was chasing two seemingly conflicting objectives: 1) creating wells that are more productive, and 2) creating wells that are less expensive. This is the Shale Operator Dual Mandate: make wells better while making them cheaper.

Shale Operator Dual Mandate


As an oil service investor, I was uniquely focused on how each company I met helped their customers accomplish the Shale Operator Dual Mandate. Services that achieved one goal would likely survive, but not thrive. However, those that helped customers meet both goals were sure to be the winners in the “new normal” price environment of $50 oil.

Transformations happened quickly throughout the OFS value chain. Zipper fracs drastically improved surface efficiencies, ultralong laterals were drilled further than ever before, in-basin sand mines appeared overnight, and new measurements came to the fore. These new measurements deliver incredible insights: fracture half length, well productivity by zone, vertical frac growth, optimal perforation placement, and much more.

In Basin Sand

New Measurements

While zipper fracs, long laterals, and local sand have taken over their respective markets, new measurements have struggled to gain traction outside of “science pads”. Frustrated technical service providers bemoan the resistance to change and slow pace of adoption in our industry. These obstacles failed to slow the advance of zipper fracs, long laterals, and local sand … why have disruptive new measurement technologies been on the outside looking in?

Challenge #1: Unclear Economics

The first challenge for new measurements is unclear economics. Despite the recent improvements, unconventional development remains cash flow negative … and has been since its inception.

The above data suggests ~ $400B of cash burn since 2001 … small wonder operators are wary of unproven returns on investment! (Note: to be fair, operators were incentivized by capital markets to outspend cash flow for the great majority of this period. Only recently has Wall Street evolved its thinking to contemplate cash on cash returns, as opposed to NAVs).

For any technology to become mainstream, it must either immediately lower costs (e.g. zipper fracs, local sand) or have obvious paybacks (e.g. long laterals). New measurements, by contrast, do not clearly map to economic returns. Instead, these service providers tend to focus on “interesting” engineering data and operational case studies. Operators will not put a technology into wide use until its economic impact is fully understood. This can mean waiting months for offset wells to come online or years for neighboring operators to release results.

Challenge #2: Changing How Customers Work

The second challenge, which is just as important to end users, is that service providers must deliver insights within a customer’s existing workflow. Operators are busier than ever before. E&P companies have experienced waves of layoffs, leaving those remaining to perform tasks previously done by now-departed colleagues.

In addition, many service providers don’t appreciate the opportunity cost of elongating an existing customer workflow to incorporate new variables. A smaller staff is already being asked to perform more work per person; it should be no surprise that customers are hesitant to allocate budget dollars to perform even more individual work.

Challenge #3: No Silver Bullets

While each new diagnostic data type is an important piece of the subsurface puzzle, no single element can complete the picture on its own. Instead, each measurement should be contextualized alongside others. For example, fiber optic measurements can be viewed alongside tracer data to better determine which stages are contributing the most to production. When each diagnostic data source is delivered in a different medium, it becomes nearly impossible to overlay these measurements into a single view.

The Oxbow Theory

The combination of the above factors leads to the “Oxbow Theory” of new measurement abandonment. As you may know, as rivers age, certain sections of the river meander off course. Over time, sediment is redistributed around the meander, further enhancing the river’s bend. Eventually, the force of the river overwhelms the small remaining ‘meander neck’, and an oxbow lake is created. Sediment deposited by the (now straight) river prevents the oxbow lake from ever rejoining the river’s flow. By the same token, new measurement techniques that do not cater to existing workflows may be trialed but will not gain full adoption. Instead, they become oxbow lakes: abandoned to the side of further-entrenched workflows.

Our Solution: Petro.ai

Petro.ai is the only analytics platform designed for oil and gas. If you’re an operator, we can help make sense of the tsunami of data delivered by a fragmented universe of service providers. If you’re a service company, we can help deliver your digital answer product in a format readily useable by your customers. Please reach out to info@petro.ai to learn more.


Extracting and Utilizing Valuable Rock Property Data from Drill Cuttings

Why the Fuss About Rock?

The relentless improvement in computing speed and power, furthered by the advent of essentially unlimited cloud-based computing, has allowed the upstream oil and gas industry to construct and run complex, multi-disciplinary simulations. We can now simulate entire fields – if not basins – at high resolution, incorporating multiple wells within highly heterogeneous structural and lithological environments. Computing power is no longer the limitation; we’re constrained by our input data.

Massive volumes of three-dimensional seismic data and petrophysical logs can be interpreted and correlated to produce detailed geological models. However, populating those models with representative mineralogical, geomechanical, and flow properties relies on interpolation between infrequent physical measurements and well logs. This data scarcity introduces significant uncertainty into the model, making it harder to history match with actual well performance and reducing predictive usefulness.

Physical measurements – to which wireline logs are calibrated and from which critical correlation coefficients are determined – are typically made on samples taken from whole cores. Some measurements can be performed on side wall cores collected after the well has been drilled using a wireline tool. However, intact side wall cores suitable for making rock property measurements aren’t always retrieved, and the sampling depth is relatively inaccurate compared to the precise drilling of test plugs from a whole core at surface.

The drilling, retrieval, and analysis of a full core adds significantly to well construction time and can cost anywhere from $0.5 to $2.0 million. In today’s cash-constrained world, that’s a tough ask. Core analysis has become faster and cheaper, but cores are still only collected from less than one percent of wells drilled. This produces a very sparse data set of rock property measurements across an area of interest.

Overlooked Rock Samples

Drill cuttings are an often-overlooked source of high-density rock samples. Although most wells are mud logged for operational geology purposes, such as picking casing points or landing zones, the cuttings samples are frequently discarded without performing any further analysis.

Geochemical analysis at the wellsite is sometimes used to assist with directional drilling. However, detailed characterization typically requires transporting the samples to a central laboratory where they can be properly inspected, separated from extraneous material, and processed to ensure a consistent and representative measurement.

At Premier, we encourage our clients to secure and archive cuttings samples from every well they drill. Even if the cuttings won’t be analyzed right away, they represent a rich, dense sample set from which lateral heterogeneity can be measured and used to fill in the gaps between wells where whole core has been cut and evaluated.

Figure 1: The images above show how rock properties can be collected, analyzed, and visualized from cuttings.

Rapid, high-resolution x-ray fluorescence (XRF) measurements and state-of-the-art x-ray diffraction (XRD) can be used to match mineralogical signatures from cuttings to chemofacies observed in offset cores.

Information from cuttings samples expedited from the wellsite for fast-turnaround analysis can be used to optimize completion and stimulation of the well being drilled. For example, geomechanical properties correlated with the chemofacies identified along a horizontal well can be used to adjust stimulation stage boundaries. The objective is for every fracture initiation point within a stage to encounter rock with similar geomechanical characteristics, increasing the probability of successful fracture initiation at each point.
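One simplified way to frame that idea in code is to cluster cuttings-derived properties along measured depth into chemofacies and place candidate stage boundaries where the facies change. The sketch below is illustrative only; the input file, feature columns, cluster count, and boundary logic are assumptions, not Premier’s workflow.

```python
# Hedged sketch of the stage-adjustment idea: cluster cuttings-derived properties along
# measured depth into chemofacies, then flag depths where the facies change as candidate
# stage boundaries. Inputs and parameters are illustrative assumptions.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

cuttings = pd.read_csv("lateral_xrf.csv")       # assumed columns: MD plus XRF-derived proxies
features = ["SI_CA_RATIO", "CLAY_INDEX", "CARBONATE_PCT"]   # hypothetical inputs

X = StandardScaler().fit_transform(cuttings[features])
cuttings["chemofacies"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

cuttings = cuttings.sort_values("MD")
facies_changes = cuttings["MD"][cuttings["chemofacies"].diff().fillna(0) != 0]
print("candidate stage boundaries (MD):", facies_changes.round(0).tolist())
```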

On a slower timeline, a more detailed suite of laboratory measurements can be completed, providing information about porosity, pore structure, permeability, thermal maturity, and hydrocarbon content.

The Cuttings Motherlode

In late 2017, Premier Oilfield Group acquired the Midland International Sample Library – now renamed the Premier Sample Library (PSL), which houses an incredible collection of drill cuttings and core samples dating back to the 1940s. The compelling story of how we saved it from imminent demise is the subject of another article. Suffice to say the premises needed some TLC, but the collection itself is in remarkable condition.

Figure 2: Original sample library at the time it was acquired by Premier.

The library is home to over fifty million samples from an estimated two hundred thousand wells – most of them onshore in the United States, and many within areas of contemporary interest like the Permian Basin. It gives us the ability to produce high-density rock property datasets that can be used to reduce the uncertainty in all manner of subsurface models and simulations.

New samples are donated to the library each week, many of them from horizontal wells. This provides invaluable insight into lateral facies changes and reservoir heterogeneity. Instead of relying only on the sparse core measurements discussed earlier, our 3D reservoir models can now be populated with superior geostatistical distributions conditioned with data from hundreds of sets of horizontal well cuttings. In an ideal world, cuttings samples from every new well drilled would be contributed to the collection, preserving that information for future generations of explorers and developers.

We have already helped many clients study previously overlooked intervals by reaching back in time to test samples from wells that passed through them on the way to historically more prolific horizons. Thanks to their predecessors’ rock-squirreling habits, these clients have access to an otherwise unobtainable set of data.

Our processing team works around the clock to prepare and characterize library samples, adding more than 2,000 consistently generated data points to our database each day. Since it would take years to work through the entire collection, we have prioritized wells that will give us broad data coverage across basins of greatest industry interest. Over time, guided by our clients’ data needs, we will expand that coverage and create even higher-density data sets.

Share and Apply the Data

At Premier Oilfield Group, we believe that generating and sharing rock and fluid data is the key to making more efficient and more effective field development decisions. The Premier Sample Library is a prime example of that belief in action.

Following an intense program of scanning, digitization, well identification, and location, we are now able to produce a searchable, GIS-based index for a large part of the collection. Many of the remaining boxes are hand-labeled with long-since-acquired operating companies and non-unique well names but forensic work will continue in an attempt to match them with API-recognized locations.

We are excited to have just launched the first version of our datastak™ online platform. This will finally make the PSL collection visible to everyone. Visitors can see detailed, depth-based information on sample availability and any measurements that have already been performed. Subscribers gain access to data purchase and manipulation tools and, as the platform develops, we will add click-through functionality to display underlying test results and images.

Subscriptions for datastak™ cater to everyone from individual geologists to multi-national corporations. We want the data that’s generated to be as widely available and applicable as possible.

Rock properties available through datastak™ provide insights during several critical workflows. These often require integrating rock properties with other data sets, such as offset well information, pressure pumping data, and historical production. Many of the data types generated through cuttings analysis can already be brought into Petro.ai® and made readily available for engineers and geologists. This provides a rich data set, ready for analytics.

For example, Petro.ai® can be used to build machine learning models that tease apart the effects of completions design and reservoir quality on well performance. Premier and Ruths.ai continue working collaboratively to identify additional data types and engineering workflows that will help ground advanced analytics with sound geologic properties.

Armed with a consistent, comprehensive set of rock property data, developers will be in a position to separate spurious correlation from important, geology-driven causality when seeking to understand what drives superior well performance in their area of interest. Whether that work is being carried out entirely by humans or with the assistance of data science algorithms, bringing in this additional information will enable more effective field development and increase economic hydrocarbon recovery.

For further information on Premier Oilfield Group, please visit www.pofg.com.


Demystifying Completions Data: Collecting and Organizing Data for Analytics (Part 3)

As mentioned in my previous post, in order to really be of value, we need to extend this analysis to future wells where we won’t have such a complete data set. We need to build multivariate models using common, “always” data – like pump curves, geologic maps, or bulk data. Our approach has been for engineers to build these models directly in Spotfire through a side panel we’ve added, but to save the models back to a central location so that they can be version controlled and accessed by anyone in the organization. They can quickly iterate through a variety of models trained on our data set to review model performance and sensitivity.

If we have access to historical information from previous wells, we can run our model on a variety of data sets to confirm its performance. This could be past wells that had microseismic or where we knew there were issues with containment. Based on these diagnostics, we can select a model to be applied by engineers on future developments. In order to make sure the model is used correctly, we can set fences on the variables based on our training set to ensure the models are used appropriately. Because the models are built by your team – not a third-party vendor – they know exactly what assumptions and uncertainties went into the model. This approach empowers them to explore their data and answer questions without the stigma of a black-box recommendation.

Figure 1: Your team builds the models in Petro.ai – not a third-party vendor – so you know exactly what assumptions and uncertainties went into it. This approach empowers you to explore your data in new ways and answer questions without the limitations of black-box recommendations.
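As a simple illustration of the variable “fences” mentioned above, the sketch below records the training range of each input and flags new wells that fall outside it before the prediction is trusted; the feature names, file, and model are placeholders, not the Petro.ai implementation.

```python
# Simple illustration of the "fences" idea: store the training envelope of each input and
# flag predictions that extrapolate beyond it. Names and model are placeholder assumptions.
import pandas as pd
from sklearn.linear_model import LinearRegression

FEATURES = ["proppant_per_ft", "fluid_per_ft", "stage_spacing"]   # hypothetical "always" data

train = pd.read_csv("training_wells.csv")
model = LinearRegression().fit(train[FEATURES], train["oil_eur"])
fences = train[FEATURES].agg(["min", "max"])       # training range for each variable

def predict_with_fences(new_wells: pd.DataFrame) -> pd.DataFrame:
    out = new_wells.copy()
    out["predicted_eur"] = model.predict(new_wells[FEATURES])
    inside = (new_wells[FEATURES] >= fences.loc["min"]) & (new_wells[FEATURES] <= fences.loc["max"])
    out["outside_fence"] = ~inside.all(axis=1)      # True when any input extrapolates beyond training data
    return out
```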

However, in addition to fences, we need to make sure engineers understand how and when to apply the models correctly. I won’t go into this topic much, but will just say that the direction our industry is moving requires a basic level of statistics and data science understanding from all engineers and geologists. Because of this, Ruths.ai has incorporated training into our standard engagements.

Slightly different hypothesis

This example used a variety of data, but it only answers one question. It’s important to note that even slight variations in the question we ask can alter what data is needed. In our example, if instead of asking whether a specific frac design would stay within our selected interval we wanted to know whether the vertical fracture length changed over time, we would need a different data set. Since microseismic is a snapshot in time, we wouldn’t know if the vertical frac stays open. A different data type would be needed to show these transient effects.

Data integration is often the biggest hurdle to analytics

We can start creating a map that ties the questions we are interested in answering back to the data they require. The point of the diagram shown here is not to demonstrate the exact mapping of questions to data types, but rather to illustrate how data integration quickly becomes a critical part of this story. This chart shows only a couple of questions we may want to ask, and you can see how complicated the integration becomes. Not only are there additional questions, but new data types are constantly being added, none of which add value in isolation – there is no silver bullet, no one data type that will answer all our questions.

Figure 2: Data integration quickly becomes complicated based on the data types needed to build a robust model. There is no silver bullet. No single data type can answer all your questions.

With the pace of unconventional development, you probably don’t have time to build dedicated applications and processes for each question. You need a flexible framework to approach this analysis. Getting to an answer cannot take 6 or 12 months; by then the questions have changed and the answers are no longer relevant.

Wrap up

Bringing these data types together and analyzing them to gain cross-silo insights is critical in moving from science to scale. This is where we will find step changes in completions design and asset development that will lead to improving the capital efficiency of unconventionals. I focused on completions today, but the same story applies across the well lifecycle. Understanding what’s happening in artificial lift requires inputs from geology, drilling and completions. Petro.ai empowers asset teams to operationalize their data and start using it for analytics.


Three key takeaways:

  • Specific questions should dictate data collection requirements.
  • Data integration is key to extracting meaningful answers.
  • We need flexible tools that can operate at the speed of unconventionals.

I’m excited about the progress we’ve already made and the direction we’re going.