
Passion for Change: Diamondback Energy

with David Cannon, Senior Vice President of Geoscience and Technology at Diamondback Energy

The views expressed here are not the official views of Diamondback Energy, but those solely of the subject of this interview.

If you missed the first part in this series, you can find the start of our conversation here.

What do you see as some of the key technologies that are going to help you succeed in this price environment?

There are a lot of very cost-intensive technologies out there. During good times in the oil and gas industry, people go out and spend millions and millions of dollars collecting high-resolution data: microseismic surveys, wide-azimuth seismic shoots, pilot holes, logs in the laterals, and so on. This data comes at an extremely high cost and can be very detailed, but during times such as these, these programs are usually the first items cut, because they’re considered discretionary spending. To follow the theme from my first post, the social proof is no longer there. So during poor price environments, we must fall back upon our scientific understanding and dust off our mental models to contextualize all that high-end data we collected during the good times.

Pilot projects, high-density data pad projects, and a lot of those wellbore-centric data collection efforts are really just that: wellbore-centric. What does that tell us about a complex reservoir two miles away? Not much. It’s hard to take high-resolution data from a very finite area and extrapolate it to other parts of the reservoir without knowing the context.

Geologists utilize many mental models. That’s what we do day in, day out. Sequence stratigraphy is a mental model. Structural geology concepts are mental models. Depositional environments are mental models, and we have to take all those data sources that we’ve collected and put them within that mental model. This is a key step in our progression toward data analytics and machine learning.

In data analytics and machine learning, the algorithms need context. You can’t just throw numbers into a black box and expect results to appear that make any physical sense. You need to contextualize these things, and that is where I’ve seen successes recently. Data science and machine learning are getting better at contextualization. That has been my biggest issue with data science in the past: very early on, it was very black-boxy. You threw your data in the box, never to be seen again, then a result came out. Well, how did that work?

At Diamondback, we’re very curious about that workflow and critically interrogate whoever we partner with to provide these services. We want to see how it works. We want to be able to understand the entire process from data loading to result creation, because if we can’t explain it physically, then there’s no point acquiring it. It’s just a number at that point, a result with no uncertainty bounds whatsoever. How can I put any trust in that value if I don’t understand it? That’s where a lot of advancements have been made, and I think will continue to be made, in 2020 and beyond within this particular industry: pushing geologic contextualization within data science constructs.

What successes have you seen with data science or machine learning?

At Diamondback in particular, I think we’ve run the gamut. We’ve done some projects around machine learning early on, back in 2016. One of the questions we wanted to answer was: 

Here’s a whole mess of data on horizontal wells—gamma ray profiles, stimulation designs, post-job reports, drilling data. Here’s all the geological information: what formation they’re in, all this other stuff. Basically, they smooshed it all together and spit out a couple of answers for what drives production. Those two things? Better production came from wells that were toe-up and had little undulation. So does the subsurface play a role at all in this? When we interrogated the provider on that, they said, “we don’t have enough data.”

What does that mean? We gave them 300+ wells to utilize. But it turns out that’s not nearly enough data. We needed an order of magnitude more. In that particular example, what really enlightened us was the effect of data population: in order for an algorithm to properly learn from data, you need a ton of it.

If you don’t have a lot of data, the learning band that algorithm has is very, very narrow. You do not create an environment for that algorithm to adapt to new inputs, to outlier inputs. If you have a population of 200 wells and you know that 20 of them have outlier-type production, that’s 10% of the population. Those outlier results start becoming part of the distribution. That’s a problem.
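The effect of a small population can be sketched numerically. In this hypothetical illustration (made-up production figures, not Diamondback data), a 10% slice of outlier wells materially shifts the statistics an algorithm would learn from:

```python
import random
import statistics

random.seed(42)

# Hypothetical well population: 180 typical wells and 20 outliers (10%).
# Values are invented production figures, e.g. boe/d, for illustration only.
typical = [random.gauss(100, 10) for _ in range(180)]
outliers = [random.gauss(200, 10) for _ in range(20)]
population = typical + outliers

# With only 200 samples, the 10% outlier slice pulls the "average well"
# an algorithm would learn noticeably upward.
print(round(statistics.mean(typical), 1))
print(round(statistics.mean(population), 1))
```

With a population an order of magnitude larger, the same count of outliers would sit in the tail of the distribution instead of reshaping its center.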

One of the things we see as a success in our partnership with Petro.ai is proper contextualization of a data science project. Tapping into geoscience expertise to build better contextualization, along with more data, will result in better outputs: outputs we can trust when directing a multi-billion-dollar capital budget.

It seems like Diamondback has had success with data trades. Do you see that as a trend in the industry or is that sort of unique to what you guys are doing?

No one outside of the majors has the type or volume of data necessary to make a machine learning algorithm output results that are meaningful. The majors have thousands and thousands of wells, which is why they have whole departments dedicated to data science and machine learning for their own internal use. There’s a reason why, a lot of times, they don’t contract vendors: they have their own data scientists building the algorithms themselves and using them within the mental models at Exxon or Chevron.

For companies like us, we have to rely upon outside data sources, either through data trades or other means of incorporating information, and that’s one of the things we’ve been able to do very successfully: acting upon those data trades and squeezing a ton of value out of that data.

One of the justifications I’ve used with management teams for high-resolution data is that the expense can be scaled by the number of trades I can make with the data. If it costs $1 million to collect and analyze a core, it will effectively cost about $167,000, because I will trade that data with competitors for five other cores nearby. That in turn allows us to gain more knowledge of our reservoirs without increasing capital cost.
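The arithmetic behind that justification is simple cost-sharing (hypothetical figures matching the example above):

```python
# Illustrative arithmetic behind scaling data cost through trades
# (hypothetical figures, matching the $1 million core example).
core_cost = 1_000_000   # dollars to collect and analyze one core
traded_cores = 5        # cores received from nearby competitors in trades

# One acquired core plus five traded cores: six datasets for one core's cost.
effective_cost_per_core = core_cost / (1 + traded_cores)

print(round(effective_cost_per_core))  # about 166,667 dollars, the ~$167,000 cited
```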

For our industry to continue to move forward as the preeminent source of energy, either stand alone or in conjunction with other sustainable energy sources, we all have to work together. Working with others with different perspectives keeps our innovative skills sharp. If you are insular, you fall into complacency. 

We have built a system that’s competitive through public trading of companies. We want to compete with Apache, Concho, and Pioneer, but the problem is, we’re all extracting the same commodity. It’s not like my oil is any different than Pioneer’s oil and it’s not like the end game for that particular barrel of oil is any different either. It’s still going to get refined into the same suite of products that we all use and enjoy in our modern society. 

We need to start combining our efforts to focus on more industry-friendly consortiums that allow us to share not just data but ideas. That’s really what it comes down to at the end of the day. If your rock is good, your production is good, full stop, but more efficient ways of extracting could potentially lift areas with marginal rock and make them economic. That is the power of collaboration. We should all be working together to continue our ability to provide energy to our world and lift up the human experience.

Are there any good books or blogs you would recommend?

I’m a huge fan of social psychology, the study of human interaction. I am also a fan of how human interaction and environmental interaction create the basis for all cultures and social constructs. What are the things you can learn from interacting with humans that can provide a better basis for doing business with each other? There are a couple of authors whose books I really enjoy. One is Yuval Noah Harari. He wrote one called Sapiens: A Brief History of Humankind. It’s a real interesting read about how our species evolved and got to the ‘top of the pyramid’. He’s written a few others. One of them is Homo Deus, which is about how our technological adaptation, and our ability to communicate on a much grander scale now because of the proliferation of information, have set into motion an evolutionary change within humans that could be permanent. As we move forward, he believes, this is the time where we may be able to classify ourselves as something different from Homo sapiens, in terms of our ability to coordinate with each other and how modern stimuli have changed us mentally and physically.

The other one I’m reading right now is 21 Lessons for the 21st Century. It’s a very good book. I’m about halfway through it right now. There are a lot of interesting topical things in there that contextualize modern issues. The other author I really like is Malcolm Gladwell. He’s a journalist by profession, but he’s a fan of investigating how people interact and push ideas through society. He’s a firm believer in outliers: the very few within our society who push a majority of our advancements and ideas forward.

How would you describe Petro.ai?

Petro.ai provides a differential product that allows our company to have an edge on understanding what we’re doing in the subsurface and how we interact with it.

As we were selling this project internally, I think what differentiates you from other people, is the fact that, simply put, you provide the platform and access to expertise to allow us to thrive within your platform. And that is something that is differential. There’s a lot of people who push machine learning, who push their own platforms for being able to come to terms with how we interact with the subsurface, but it’s usually all project-based. They don’t give you the keys. It’s like an Uber. You have to call them and they show up and they pick you up, whereas you guys are selling us the car. We’re able to drive it ourselves, but then we go back to the dealership every so often: something’s wrong here and you help us fix it, but we still own the car at the end of the day. So that’s kind of how I view Petro.ai.

That’s the kind of partnership we want, as I said before, we want to understand. We want to be able to know where to fix the issue if something breaks down. That is important to us because it allows our organization to learn and adapt in the proper manner to new information. 

David Cannon has served as Senior Vice President of Geoscience and Technology since February 2019. Previously, he served as Vice President of Geoscience from April 2017 to February 2019, Exploration Manager from March 2015 to April 2017, and as a Senior Geologist since March 2014. Before joining Diamondback, Mr. Cannon served as a Senior Geologist for Newfield Exploration, from January 2013 to March 2014, where he assisted in exploration, assessment, and development of SCOOP/STACK properties in the Anadarko Basin. Prior to that, he held the position of District Geologist for Samson Resources, from August 2011 to January 2013, where he held a position in the Corporate Exploration Department assessing Rockies, Mid-Continent, and East Texas unconventional plays. He was recruited by ConocoPhillips in 2008, where he held various positions in exploration and development of Rockies and Bakken assets. Mr. Cannon received his BS Geology from the State University of New York, College at Brockport in 2005 and obtained his MS in Geoscience from Pennsylvania State University in 2008 with a focus on structural geology and rock mechanics.

Petro.ai uses AI and Machine Learning to empower domain experts, data scientists, and executives with instant information in the right context, energizing teams and transforming data into action. Partnerships with global leaders in cloud computing (AWS) and geoscience (Dr. Mark Zoback) allow Petro.ai to deliver differentiated technology and teams to Accelerate Discovery.


Passion for Change: Diamondback Energy

with David Cannon, Senior Vice President of Geoscience and Technology at Diamondback Energy

The views expressed here are not the official views of Diamondback Energy, but those solely of the subject of this interview.

Tell us about your background and what you do now.

I’m the Senior Vice President of Geoscience and Technology at Diamondback. Mostly my purview is on the subsurface side—being able to better characterize and extract Diamondback’s resources in a cost-efficient manner.

I started out in the industry back in 2007, a geoscientist by trade and by formal education. I got my Master’s Degree at Penn State University in Structural Geology with a focus mostly on rock mechanics and fracture mechanics, so I’m one of those weird geologists that actually likes math.

In our industry’s history, geoscientists have always worn the mantle of those who spend all the money without regard for economics, under the guise of the pursuit of truth. As I moved into the oil and gas industry, I think I’ve always had an inherent businessman’s mind, which resulted in marrying the science and technology of the subsurface with sound business decisions.

I know there are a lot of others who think this way, too. Can we marry the pursuit of data and truth in the subsurface with sound economic practice? We can do these things in a cost-efficient manner that not only allows us to succeed in creating a business case for data collection or data manipulation (which is where I see players like Petro.ai being an integral part), but also allows us to get stakeholders invested in the process.

Engineers tend to have an economic mindset:  what’s the most cost-effective practice to be able to maximize my return? And if you can sing along those same lines, I think that you can get a lot of buy-in on new technologies, a lot of buy-in on something that might be disruptive for your particular industry. That’s honestly the way all companies should approach it. 

We have to stop thinking that this is an antagonistic setting where the geologists have one way of thinking, engineers another, and the finance guys another. We all have to meld our thought processes together to be able to get the best outcomes.

Why do you have a passion for change in this industry?

Our industry should always have a passion for change.  The way I see it is we must always evolve, and if we do not evolve, our industry will perish. It’s borne out in the data. You see historical references to it all the time. 

One particular industry that went through a period of low innovation, and now feels the repercussions, is the coal industry, one of the preeminent sources of energy for the world from the 1800s through, I think, about the 1980s. Well, they got complacent. That complacency led to a slowdown in innovation around things like more efficient extraction techniques and more efficient, environmentally friendly conversion techniques. Are there ways to thermally alter coal that can reduce the amount of greenhouse gases and pollution released into our environment? I just don’t think they were ever ahead in that respect.

They always reacted whenever new regulations came out or whenever there was social pressure. Only then would the coal industry react, and they usually reacted in a very minimal way, just enough to get by. They were on top. Coal would reign forever. Why would they have to change? 

Technology evolves as new social pressures and new paradigm shifts occur throughout our human society. Other alternative sources for that energy started to come to the forefront, and I think the biggest displacement technology we see now is just switching over to natural gas. It’s simple, right? 

You can take a coal plant and convert it to a natural gas plant relatively easily, and with the advent of horizontal drilling and hydraulic stimulation in our industry, this resulted in an oversupply of natural gas. That fuel source became extremely cheap, so all the utilities that ran these coal fire plants saw an opportunity to behave in a much more efficient manner. At the same time, gas was able to better answer the calls for change from environmentalists and from general social pressure around pollution. So, they made the switch. 

More and more coal plants are continuing to switch to natural gas even today, and the market share of coal continues to slide. They are no longer the number one producer of electricity in the world. They’ve been displaced and they will continue to be displaced. The reason they were displaced isn’t really that natural gas is just better; it’s that they stopped innovating. They stopped seeing where they fit within the future and how they could adapt to that new reality. That is a problem a lot of industries have, and that’s something I’m passionate about with our industry, as I can see us traveling down that same road really easily.

If we continue to be complacent about where we fall within the energy industry and overall giving energy to the human population, we can easily be displaced if we’re not thinking about new technologies. 

There’s some research that’s being done now with the Earth and Mineral Sciences Department at Penn State in conjunction with private industry in Appalachia, looking at ways to take natural gas methane streams and thermally alter them with microwave plasma. The process breaks down methane and converts it into molecular hydrogen, for hydrogen fuel cell technologies, and graphene for structural additives to steel and concrete. So, they’re thinking about how our industry, the hydrocarbon industry, can be a part of the solution of renewable energy.

It’s not an or statement. It’s an and statement. One of the things that I’m passionate about is that our industry can be a part of that solution. We just have hurdles, mostly philosophical hurdles, of people saying, “we’ve always done it this way. We’ve always provided energy this way. This is how we’re going to do it.”

That is the mentality that kills you. That’s exactly what happened to the coal industry in the 60s and the 70s, and they refused to change because of that mindset. They basically said, “where else will you get your energy?” Well, they found out where else the world could get their energy. So, we can’t stand by and let that happen. It will be the death knell of our industry if we don’t find ways to couple into the new research around providing energy in a more sustainable way to the world.

It’s a full-cycle problem. It’s not just extraction; extraction is one part of the problem. It’s also taking that product and converting it to energy that’s consumable for whoever your consumer base is. One of the issues I think we have is that we’re always really decoupled in that respect.

The independent E&P companies are just worried about extraction and production, right? Once they sell the product, the value proposition ends. End of story. Move on to the next barrel. But some of the engineering, some of the science, some of the application of technologies that we’re using in the extraction realm could also help on the consumption realm, because at the end of the day, the physics, chemistry, and biology that we use on a daily basis wrap up into the work done to convert that resource to energy.

Are we going to completely focus on the science of resource conversion? No, because we have our business models that state that we have to extract this resource. But are there ways that we can bring ideas to the forefront that we could change the game on how those resources are then used? Yes, and we as extraction companies should be part of that research and ultimately that solution.

In 2015 and 2016, the downturn gave rise to this digital transformation in oil and gas. How do you see this 2020 downturn affecting things?

What I always fall back upon is the diffusion of innovation. It’s a concept that was actually described back in the early 1960s, about how a technology, or even an idea, moves through and is adopted by a population. It’s an elegant construct, really.

When a particular idea or technology starts, you have the innovators, the ones who are creating that process, the ones who are getting the bloody nose because they’re the first ones through the wall. Then you have early adopters, folks who say, “That’s a really cool idea. I want to adopt it and potentially make it better. Let’s push this technology forward.”

One perfect example is Elon Musk. Plug-in electric vehicles have been around since the 1960s. They actually made functional models of plug-in electric vehicles back then, so what he’s been able to deliver with the Tesla vehicle is nothing new. He was an early adopter of that technology. We have better complementary technology around battery tech and energy efficiency to increase range and operate more technologically advanced vehicles. So, he’s an early adopter. I wouldn’t consider him an innovator. 

Then once you go past the early adopter phase, you have what’s called the technological chasm. It’s at that point where the technology has to reach some sort of social proof, and social proof can be defined by anything, depending on the social system that you’re trying to push an innovation or an idea through. That social proof can change and vary, dependent upon the answer and solution that technology is trying to address. 

To continue with the example of electric vehicles, the social proof is usually cost and value. A middle-class family of four in Iowa wants not only a vehicle with long range, which Tesla has, but also a vehicle with a low cost. When that family buys a Toyota Camry for $30,000, they get both range and value. This is why the Tesla Model S hasn’t jumped the chasm: that vast consumer base, middle-class America, cannot couple range AND value. Right now, electric vehicles have not attained the social proof of value.

In our industry, oil and gas, the social proof for technological innovation is also monetary. It’s essentially the price of oil. If the price of oil drops below a certain threshold, a lot of technological advancements can’t jump the chasm to wider adoption in our industry.

What we do, what we spend money on, is inherently tied to the price of the product we sell. How much money we make, how much revenue we make on a quarter over quarter basis, year over year basis, is going to be our war chest to be able to go out and spend dollars on innovation and potentially be more efficient. 

Times like 2015, 2016, and today are times when the social proof concept falls apart for our industry, and a lot of innovations fall backwards on that curve. They no longer have social proof, and people start to drop a technology because it’s no longer viable within that social proof context.

The interesting part about that entire thesis from E.M. Rogers around technological adoption is that when something falls out of social acceptance, something else replaces it. There’s always something waiting in the wings to get social proof in times like this. The constant struggle between competing technologies, as social proof shifts and changes with time, causes a very stilted technological history within our industry. When times are good, we focus innovation on subsurface assessment. When times are poor, we focus innovation on operational efficiency.

Things like data science and machine learning met social proof in 2015 and 2016, as they were disruptors on the efficiency side. Then as things started getting better, there was a stasis, a plateauing of data science. There wasn’t this huge rush of, “Let’s keep pushing the paradigm and pushing the technology for our industry.” Instead, we fell into, “Well, this works, let’s just keep using it.” I predict we will see another creaming event for those technologies in 2020 and beyond, because I think the learnings that were acquired in 2015 and 2016 are going to be brought back up to the forefront and the value propositions are going to be shown again. Then there’s going to be more people paying attention to the technology. In turn, that attention is then going to continue to evolve it, and that evolution then keeps that technology moving forward and doesn’t allow it to get complacent and lag like the coal industry did.

The conversation continues here.


Passion for Change: Bonanza Creek Energy

with Kyle Gorynski, Director Reservoir Characterization and Exploration at Bonanza Creek Energy

If you missed the first part in this series, you can find the start of our conversation here.

Why do you think there’s been so much hesitation around change?

There’s a strongly ingrained culture in oil and gas, with a generation of people who’ve been doing this for decades, although on completely different rocks, play types, and extraction techniques. There’s been pushback on adopting new geology or engineering software because people become so comfortable and familiar with the tools they use. I think a lot of it is simply the unique culture in oil and gas. Shale is still new, and our guesses are still evolving. I think we’re getting closer and closer to what that right answer is.

New organizations will have to adopt new technologies and adapt to new trends, because there’s no other way to make this business work.

As a scientist or engineer or manager in this space, we really have to be cognizant that the goal of a lot of vendors out there is not to help you get the right answer, it’s to make money. The onus is on us to vet everyone and make sure we’re getting the answers we want. 

Machine learning is simply another tool, like a physics-based model, aimed at helping us predict an outcome and increasing the precision and accuracy of those predictions. People have made pretty fantastic inferences from these kinds of tools.

You can’t just pay a company to apply machine learning to a project. You need to help them utilize the correct inputs, the relationships, and ensure the predictions and outcomes match the other observations you have from other datasets.

I don’t think any organization should be cutting-edge for the sake of being cutting-edge. The goal is to solve these very specific technical challenges quicker and with more accuracy. Our job is to extract hydrocarbons in the most safe, environmentally friendly, and economic fashion. Technologies like machine learning and AI are tools that can help us achieve these goals, and they need to be applied correctly.

Can you share any successes around data science or machine learning at your company?

The industry has been using these techniques for a long time. In their simplest form, people have been cross-plotting data since the early days of the oil and gas industry, trying to build relationships between things. At the beginning of my career, I remember using neural networks to predict log response.

Now we use predictive algorithms to help us predict log response where we don’t have certain logs. Let’s say we want to predict lithologies, the carbonate-clay-quartz-feldspar content in a well. We’ll build relationships between triple-combo logs and the more sophisticated but scarcer elemental capture spectroscopy (ECS) logs. We don’t have ECS logs everywhere, but we have triple-combo everywhere, so if you can build a relationship between the two, you have a massive dataset you can use to map your asset. That’s a simple way we use this type of technology.
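As a toy sketch of that workflow, one could fit a regression between a triple-combo measurement and a scarce ECS-derived property in wells where both exist, then predict the property in wells that only have triple-combo logs. The data below is synthetic and the relationship is invented for illustration; real workflows use multiple curves and more sophisticated models:

```python
import random

random.seed(7)

# Synthetic "training wells" where both a triple-combo curve (gamma ray) and an
# ECS-derived clay fraction exist. Numbers are invented for illustration.
gr = [random.uniform(20, 150) for _ in range(50)]               # gamma ray, API
clay = [0.002 * g + 0.05 + random.gauss(0, 0.01) for g in gr]   # clay fraction

# Ordinary least-squares fit: clay ≈ slope * gr + intercept.
n = len(gr)
mx, my = sum(gr) / n, sum(clay) / n
slope = sum((x - mx) * (y - my) for x, y in zip(gr, clay)) / \
        sum((x - mx) ** 2 for x in gr)
intercept = my - slope * mx

# Predict the clay fraction in a well that only has a triple-combo log.
predicted = slope * 120 + intercept
print(round(predicted, 3))
```

The payoff is coverage: a property measured in a handful of wells becomes mappable across every well that has the cheap, ubiquitous log.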

Like almost every company now, we’re also predicting performance. That’s how we’re able to make live economic decisions. We have a tool where we can put in a bunch of geologic and engineering inputs and it’ll predict production rates through time that we can forecast, add new costs to, and run economics on live. We’re running Monte Carlo simulations on variable rates, volumes, lateral lengths, spacing, commodity pricing, and costs, based on our best estimates, to predict tens of thousands of outcomes and help us better understand what the best decision could possibly be. I think that’s the most impactful place it’s being used, and I think that trend is being adopted more and more in industry as I talk to my peers.
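A minimal sketch of that kind of Monte Carlo run, with made-up distributions standing in for the geologic and engineering inputs (the real tool forecasts full production profiles and discounted economics; everything here is a hypothetical placeholder):

```python
import random
import statistics

random.seed(1)

def npv_one_trial():
    """One draw of a deliberately simplified single-well economics model.
    All distributions and costs are hypothetical placeholders."""
    eur = random.gauss(500_000, 100_000)       # ultimate recovery, bbl
    price = random.uniform(30, 60)             # realized oil price, $/bbl
    capex = random.gauss(7_000_000, 500_000)   # drill-and-complete cost, $
    opex_fraction = 0.30                       # share of revenue consumed by opex
    return eur * price * (1 - opex_fraction) - capex  # undiscounted, for brevity

trials = [npv_one_trial() for _ in range(10_000)]

# The decision rests on the distribution of outcomes, not a single estimate.
deciles = statistics.quantiles(trials, n=10)
p10, p50, p90 = deciles[0], deciles[4], deciles[8]
print(round(p50 / 1e6, 1))  # median outcome in $MM
```

The P10/P50/P90 spread is what turns a point forecast into a risked decision: the same well can look good at the median and unacceptable at the downside case.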

Type curve generation is no longer grabbing a set of wells, grouping them together, and fitting a curve to them; it’s trying to predict the infinite number of outcomes that lie between the extremes.

Have you seen any success among your competitors using technology, specifically data science and analytics tools?

There’s some great work out there across the board. I had a lot of fun at Encana (now Ovintiv) seeing a lot of my peers who are exceptionally smart really trying to adopt new technology to solve problems. I’ve seen some amazing work getting people to adopt new ideas, new thoughts, new predictions. I like going to URTeC. I think that’s a fantastic conference. I always find a number of great sets of technical work that has come out. 

I think the industry is doing a great job. There’s a ton of really smart people out there who know how to do this work. I think a lot of young people are really adopting coding and this bigger-picture approach to the subsurface, where it’s not just that you’re an engineer or a geoscientist; you really have to understand the fluid, the pore system, the stresses, what you’re doing to it. There’s no way we can be impactful unless we understand the really big picture, and people are getting much better at that, trying to use tools and develop skillsets that allow them to solve these problems a lot quicker.

How would you describe Petro.ai?

We see you guys as filling a gap we have. It’s the ability to pull data together, the ability to apply big data to a dataset that, quite frankly, we don’t have the time or the capability to work in-house. Petro.ai provides us with a very important service that gets us in a couple of months to a point that would take us 12-18 months on our own. What we really like is that we’re developing something unique and new, with our input and involvement. We’re not just sending you a dataset and asking for an answer; we’re saying what we think drives the results, and we also want your feedback. You’re also a group of experts who not only have your own experiences, but have seen other people’s assets and plays and how everyone else in industry is looking at them. It’s nice to have a group of consultants with the same goal: to address a problem and try to figure it out. We want to get to an answer as quickly as we possibly can and start applying those learnings just as quickly.

Kyle Gorynski is currently Director of Reservoir Characterization and Exploration at Bonanza Creek Energy. Kyle previously worked at Ovintiv, where he spent 7 years in various technical and leadership roles, most recently as Manager of Reservoir Characterization for their Eagle Ford and Austin Chalk assets. Although he is heavily involved on the technical side of subsurface engineering and geoscience, his primary focus is on their practical applications in resource and business development. Kyle received his B.S. and M.S. in Geology from the University of Kansas in 2008 and 2011, respectively.
Categories
Data Science & Analytics Reservoir Engineering

Analytics Autonomy using Petrons

The most powerful Petro.ai concept, which instantly captures an engineer’s attention, is the Petron.  Petrons are the most intuitive, efficient, and natural way to interact with oil and gas data.  There is no need to combine spreadsheets, run JOINs in SQL, or create a specialized BI view to form an integrated dataset.  Instead, disparate data sources loaded into a Petron are auto-normalized, classified, and contextualized.   

The result is Analytics Autonomy. 

Do valuable work 

In oil and gas, almost every data source depends on the others: lateral length informs decline curves, completion intensity impacts well spacing, bit aggressiveness impacts failure rates.  Today, when an engineer wants to investigate an interesting question, most of their time is spent doing custodial work on the appropriate interconnected datasets.  With Petrons, data is delivered on-demand to users in an analytics-ready format. 
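To make the "custodial work" concrete, here is a minimal sketch of the traditional approach an engineer has to take before any analysis can start: hand-written SQL joins across separate databases. The table and column names here are entirely hypothetical, not Petro.ai's or any operator's actual schema.

```python
import sqlite3

# Hypothetical, minimal versions of the interconnected datasets: well
# headers, completion designs, and production all live in separate tables
# and must be joined before the real question can even be asked.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE wells (api TEXT PRIMARY KEY, lateral_length_ft REAL);
CREATE TABLE completions (api TEXT, proppant_lbs_per_ft REAL);
CREATE TABLE production (api TEXT, cum_oil_bbl_90d REAL);
INSERT INTO wells VALUES ('42-001', 9800), ('42-002', 7400);
INSERT INTO completions VALUES ('42-001', 2100), ('42-002', 1700);
INSERT INTO production VALUES ('42-001', 61000), ('42-002', 44000);
""")

# The custodial join that precedes the actual analysis (e.g. does
# completion intensity explain early cumulative oil?).
rows = con.execute("""
SELECT w.api, w.lateral_length_ft, c.proppant_lbs_per_ft, p.cum_oil_bbl_90d
FROM wells w
JOIN completions c ON c.api = w.api
JOIN production p ON p.api = w.api
""").fetchall()

for api, length_ft, proppant, cum_oil in rows:
    # Normalize production by lateral length before comparing wells.
    print(api, cum_oil / length_ft)
```

Every new question means another round of this plumbing; the Petron's value proposition is that the joined, normalized view already exists.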

Add new data types 

There are a plethora of new measurement technologies on the market today: DNA tracers, electromagnetics, offset pressure gauges, etc.  The challenge lies in integrating these powerful new datatypes into a model that already exists.  Outside of images in presentations, it is extremely difficult to incorporate these new measurements into the analysis.  Petrons empower engineers to incorporate new measurements into their analysis with just a few clicks. 

Scale insights 

Engineers are perpetually identifying new areas for improvement and performing analysis to support their conclusions.  However, it is difficult to scale a single engineer’s work product in a format that is usable by others.  Petrons are a shared workspace that is automatically populated with each engineer’s latest models and work products.   

Accelerate time to value 

By eliminating analytics friction, Petrons are empowering engineers to 1) solve problems faster and 2) deliver new workflows to their peers.  Organizations can move new ideas from the whiteboard to the field faster than ever before, operationalizing high-value engineering insights in days, not months. 

By eliminating tedious, data normalization work while quickly delivering insights to the business, Petrons transform engineering teams from cost centers to profit centers. 

Categories
Data Science & Analytics Geology & Geoscience

SPE Permian Basin Data Analytics Kick-Off

Our VP of Analytics Kyle LaMotta is speaking at the SPE Data Analytics Kick-Off about large-scale properties of stress orientation, well orientation, and their combined effects on well productivity.

Originally scheduled to be in-person in Midland, the event will now be hosted by webinar on Wednesday, April 29th. Details and registration link here.

And check out this wonderful interview with Kyle and Ronnie Arispe, Data and Analytics Expert at Concho.

Categories
Business Intelligence Tools Data Science & Analytics

Connected Analytics

The world changes too fast to depend on top-down decision making.  Engineering teams must be empowered to generate, and act on, their own technical analysis. At the same time, modern technical challenges require cross-functional teams that may be distributed globally. How, then, do fragmented teams of domain experts collaborate to deliver high-quality decisions?

The answer is Connected Analytics. 

Step 1: Merge subsurface and operational data 

Petro.ai delivers an industry-first environment for blending and interrogating 100+ operational and geotechnical data types: the Petron.  Petrons allow operational data types (e.g. WITS streams, frac van files, production volumes) to be contextualized by subsurface data types (e.g. directional surveys, SEG-Y, fiber optics).  Data blending and co-visualization is now as easy as a drag-and-drop. 

Step 2: Apply your own expertise to the work product 

After loading the appropriate data types into a Petron, you can begin to perform your own analysis.  Use previously developed models (stored in other Petrons), or uncover new relationships using Petro.ai’s inbuilt machine learning.  No need to create a PDF or upload a shared drive; your new Petron is automatically stored and updated across your team. 

Step 3: Collaborate with your peers 

Petrons provide a social workspace for experts to annotate data, share newly created models, and collaborate like never before.  Each team member can independently perform their own knowledge work, all from the same source – a shared Petron.  As new analysis is performed, the Petron and all dependent workflows are updated to reflect the team’s latest insight. 

Step 4: Build new value and start the next wave of optimization 

With the power of the Petron, engineering teams can focus their energy where it matters: making the next well more efficient and more productive.  By iteratively and fluidly incorporating new learnings, the possibilities are endless. 

Connected analytics is a new assembly line.  Rather than passing physical products to the next station, modern engineers pass digital knowledge work to the next area of expertise.  This modern workforce must be able to interact with peers digitally, expects a high level of interactivity, and needs everyday tools that support remote work.  Incumbent software was not designed for the challenges of this century; Petro.ai, and the Petron, were born in it. 

Categories
Business Intelligence Tools Data Science & Analytics Updates

Hacking for Houston 2020: Improving Care in Our Community

As a proud member of the community, Petro.ai wanted to give back. We created the “Hacking for Houston” event to give the Petro.ai user base a voice for good in the communities in which we live and work. 

Bringing together O&G technical experts and public health professionals

Uche Arizor, Team Lead at PHI Lab commented that, “Our mission is to facilitate cross-sector collaboration, creativity, and innovation in public health practice. Partnering with Petro.ai for Hacking for Houston 2020 was a great opportunity to bring people together from the sectors of oil & gas, data science, and public health to work on real issues that HCPH would like to address.” 

All of us were surprised when the night before the hackathon, a water main burst in downtown Houston. The 610 East Loop was closed for several hours due to flooding. Our team exchanged e-mails to decide if we needed to cancel the hackathon, but felt reluctant to do so, after all the hard work put into organizing the event with our partner, the Public Health Innovations Lab (PHI Lab) at Harris County Public Health. Employees arrived early and decided to press on with the Hackathon; we are so glad that we did!  

We encouraged anyone with a passion for data science to attend, especially our clients and partners, as well as university students in data science and public health. We were unsure if attendees would still be able to join us in light of the water main break—but even the turnout for the optional two-hour morning workshop was fantastic. Shota Ota, Support Engineer, and Jason May, Data Scientist at Petro.ai covered tools and topics useful for the Hackathon. 

After lunch, the hackathon began with a high-intensity couple of hours in which participants worked in teams of 1-3 people to build and code their projects. Teams were not restricted to any particular software or tools, and people deployed a wide variety, including Power BI, Spotfire, R, Python, ArcGIS, Excel, Jupyter notebooks, and even open-source 3D visualization software. 

Three Challenges were laid out to participants, each with actual data provided by HCPH. Teams then chose one of the available challenges to work on during the event. 

Go Upstream for Downstream Costs

Objectives:   

  • Identify the rates of preventable hospitalization types and charges from zip codes with the highest rates of preventable visits.  
  • Create profiles of select zip codes that explore trends in socio-demographics, health outcomes, and issues in health care access.   

Increase Government Efficiency

Objectives:   

  • Model the overlap and gap of services provided by current facility locations based on community need (population density, poor health outcomes, etc.) 
  • Identify pilot sites for the co-location of public health, clinical health, and mental health services, while justifying community needs around the site. 
  • Explore the impact of other public and private facilities that may offer similar services in affected communities.   

Reducing West Nile virus (WNV) Disease Risk

 Objectives:   

  • Use disease, mosquito, environmental, and population data from the past 4 years to develop a model that predicts which areas in Harris County are at higher risk for WNV disease transmission than others.   
  • Identify the key factors that influence WNV disease risk in Harris County as a whole or in different clustered communities. 
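As a toy illustration of the kind of risk model the WNV challenge calls for, the sketch below scores hypothetical areas with a logistic function of a few indicators. Every feature, weight, and zip code here is invented for illustration; a real entry would fit coefficients to the historical HCPH data.

```python
import math

# Hypothetical per-area indicators: (mosquito trap positivity rate,
# summer temperature anomaly in degC, population density per sq mi).
areas = {
    "77002": (0.18, 1.2, 15000),
    "77338": (0.05, 0.4, 2500),
    "77449": (0.11, 0.9, 6800),
}

# Made-up weights standing in for coefficients a fitted model would learn.
WEIGHTS = (8.0, 0.6, 0.00003)
BIAS = -2.5

def wnv_risk(trap_positivity, temp_anomaly, pop_density):
    """Logistic risk score in (0, 1) from three hypothetical indicators."""
    z = (BIAS
         + WEIGHTS[0] * trap_positivity
         + WEIGHTS[1] * temp_anomaly
         + WEIGHTS[2] * pop_density)
    return 1.0 / (1.0 + math.exp(-z))

# Rank areas from highest to lowest predicted risk.
for zipcode, feats in sorted(areas.items(), key=lambda kv: -wnv_risk(*kv[1])):
    print(zipcode, round(wnv_risk(*feats), 2))
```

The winning teams went well beyond a hand-weighted score like this, but the shape of the problem, turning surveillance indicators into a ranked risk map, is the same.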

At 5 pm, each team gave a 5-minute presentation, or “pitch,” to the panel of judges and other participants. Their projects were judged according to four categories: communication, technical achievement, creativity, and aesthetics. Our 2020 judges included: 

The judges were impressed by all the teams and how much they were able to accomplish in just four hours. Each team presented their findings and their recommendations for HCPH. The winning team consisted of Callie Hall from the Houston Health Department, Elena Feofanova, a PhD candidate at UT Health, and Alex Lach, a reservoir engineer at Oxy. Their team chose Challenge 2, Increase Government Efficiency, and combined outstanding data analysis with a great pitch.  

Dr. Beckham, Director of the Office of Science, Surveillance, and Technology at HCPH, said, “The hackathon was a great way to network with future leaders and address public health issues in a creative and innovative way. Information taken back will be implemented to assist with making better business decisions to provide services to Harris County residents. It was a great opportunity for government (HCPH) and private industry (Petro.ai) to work together for equity and better health outcomes for the community.” 

The success of Hacking for Houston 2020 made it an easy decision for us to bring it back in the future. If you missed the event, join the Petro.ai Community to stay up to date and hear about our next hackathon. 

Categories
Data Science & Analytics Production & Operations Transfer

JPT Reflection: Tearing Down the Walls Among Disciplines

Happy New Year to everyone. I took some time over the last couple of weeks to reflect on the decade behind us, as well as the decade ahead of us.  As all of us found out on January 1st, our friends at the JPT took some time to do exactly that!  In case you missed it, here’s a link to the article.

A Changing Industry 

It is obvious that data science and analytics are front-of-mind for the SPE. The group has gone so far as to redefine the role of Management and Information Director as Data Science and Engineering Analytics Director. There cannot be a clearer signal that data science and analytics are critical to the next chapter of oil and gas.  At Petro.ai, we are proud to have been at the forefront of delivering Data Science and Petroleum Analytics to the industry since 2013. 

Common Vision 

Each member of the technical leadership of SPE shares a vision for the future of petroleum technology.  The technical directors unanimously declared that our “traditionally fragmented industry must become more integrated and collaborative.  A primary solution to breaking down those barriers: the continued evolution and adoption of digital technologies.” 

Photo by Mitchell Luo on Unsplash

While there are many great quotes in the article (you’ll find several below), this is the most striking.  The group is acknowledging that the status quo isn’t good enough and is issuing a call to action to the industry.  All of us have experienced the pain of the last 5 years; we’ve made great strides to streamline and improve our processes, but the work isn’t done yet.  I am convinced that Petro.ai will help the industry achieve this goal. 

Data Science and Engineering 

“Work flows will be more consolidated and integrated, a departure from the current status quo according to discipline, de-facto norms dictated by software, or the way things have always been done … Organizations will have to break down traditional work flow-deadline mandated “compartments” through a fundamental change in their culture …”

—Birol Dindoruk, Data Science and Engineering Analytics Director

This quote is incredibly exciting for me to read, since the team at Petro.ai shares the same view.  Today, data is stored and curated according to the OFS service line that collected the data: drilling data in a drilling database, completions data in a completions database, and so on.  In order to perform any meaningful data science or analytics at the well level (much less the reservoir level), a great deal of data cleansing, engineering, and normalization must be done.  Petro.ai eliminates these repetitive tasks and empowers engineers by delivering high-caliber data and analytics tools. 

Completions 

“Ultimately, the industry will need a better understanding of the production mechanism of unconventional wells. It’s not the same as in a conventional well where it’s just plain Darcy flow through a matrix [and the industry is] not going to solve these completions challenges with just completions engineers. This is a cross-discipline issue, and our biggest companion in this is reservoir engineers.” 

—Terry Palisch, Completions Director

The gap between completions engineers and reservoir engineers remains wide, even within single asset teams.  During a recent Petro.ai training course, we asked completion engineers and reservoir engineers to list the 5 most important factors in delivering a highly productive well.  The two groups did not share a single common factor within the top five. At Petro.ai, we believe that geomechanics is critical to bridging the gap between completions engineers and reservoir engineers.  We have partnered with Dr. Mark Zoback to incorporate his expertise into Petro.ai, delivering powerful geomechanical insights to engineers of all disciplines. 

Reservoir 

“When it comes to reservoir technologies, the industry has neglected [unconventionals] for quite some time because it was always about drilling and completions. Now that cash flow has shrunk and the treadmill of drilling and completing wells has slowed, the reservoir discipline is getting more attention. More emphasis is being placed on recovery factors as companies try to squeeze more out of each existing well … For this approach to be successful … the industry needs to further improve its understanding of the unconventional reservoir.”

—Erdal Ozkan, Reservoir Director

I absolutely agree with this quote; economically increasing recovery factor is the ultimate challenge in unconventionals.  One of my colleagues calls this the Shale Operator’s Dual Mandate: increase production while decreasing spend.  More simply: do more with less.  Engineers are learning every day what levers they can (and cannot) pull to achieve this goal.  The challenge is disentangling the multiplicity of factors that can impact a well’s productivity: lateral length, completion intensity, fluid system, landing zone, parent/child (horizontal spacing), parent/cousin (vertical spacing), etc.  There are simply too many factors for a human brain to internalize and reason about.  The good news? Machine learning is the perfect tool to solve interconnected, large-scale problems like this.  Petro.ai delivers pre-made machine learning models that allow operators to identify which AFE dollars matter the most, allowing engineers to spend time (and money) on things that matter and eliminate things that don’t. 
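To make the disentangling problem concrete, here is a deliberately simplistic sketch that ranks candidate drivers by their correlation with productivity. The well data is entirely synthetic, and a univariate ranking like this is only a crude first pass compared to the multivariate machine learning models the text describes.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Synthetic wells (all numbers invented): lateral length (ft),
# proppant intensity (lb/ft), well spacing (ft), 90-day cum oil (bbl).
wells = [
    (7500, 1800, 660, 41000),
    (9800, 2400, 880, 63000),
    (8200, 2000, 660, 47000),
    (10400, 1600, 1320, 58000),
    (6900, 2200, 440, 39000),
]

factors = ["lateral_length", "proppant_intensity", "spacing"]
cum_oil = [w[3] for w in wells]

# Rank each candidate driver by |r| against cumulative oil.
ranked = sorted(
    ((name, abs(pearson([w[i] for w in wells], cum_oil)))
     for i, name in enumerate(factors)),
    key=lambda t: t[1],
    reverse=True,
)
for name, r in ranked:
    print(f"{name}: |r| = {r:.2f}")
```

Even in this toy dataset the factors are confounded with one another, which is exactly why a human brain, and simple correlations, struggle and why machine learning models that reason over all factors jointly are the better tool.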

Categories
Data Science & Analytics Transfer

The Future of AI in O&G

On November 25th, Petro.ai Founder and CEO, Dr. Troy Ruths, was a guest on the Invest with James West podcast series hosted by James West, Senior Managing Director & Partner at Evercore ISI. During the 30-minute podcast, James and Troy discuss trends of artificial intelligence and machine learning in the oil and gas industry and how Petro.ai is changing the way E&P companies plan, develop, and operate their assets.

The Role of AI

The creation and application of artificial intelligence requires a lot of data. Oil and gas operators have always generated large quantities of data, but the massive increase in activity the industry has seen as a result of unconventionals created an ideal environment for AI. Each well, and even each stage, can be seen as a unique data point where operators are constantly changing and experimenting. The real power of AI is in unlocking all this data.

“People think of AI at the top of the pyramid,” says Troy. “But the future is with AI at the bottom of the pyramid—the new backbone that serves information up to the enterprise, and humans are going to remain at the top of the pyramid.” This view represents a departure from how many individuals see AI but promises a much greater impact to operators. Engineers today think about their data in terms of spreadsheets or databases. The data layer of the future provides significantly more context while being much more intuitive. This is the role played by Petro.ai, intelligently storing, integrating, and activating more than 60 types of oil and gas specific data, as well as associated metadata. Many of the data types ingested by Petro.ai, like microseismic events, fiber, or electromagnetic imaging data, don’t have a standard home today.

Challenges to AI Adoption and Change

 “I would negatively correlate ability to adopt new technology to oil price. The better the oil price is, the harder it is to get technology adoption,” remarked Troy. The current price environment is ideal for technology adoption, especially when it comes to AI. Operators are at a point now where they need digital tools to help them do more with less. The other impediment to AI adoption revolves around education. AI can mean a lot of different things to different people and there is a level of education that still needs to take place to inform the industry on how AI can best fit into their organizations.

Troy goes on to explain another challenge, “AI can only extrapolate from what it’s seen, and that can be a problem in a world where the solution may be outside of what we’ve actually tried in the past.” Petro.ai incorporates principles of geomechanics into our workflows, bridging the gap between what we know from physics with machine learning.

AI in Upstream O&G

When prompted by James on Petro.ai’s differentiated approach to upstream analytics, Troy noted that “A lot of the new software that has entered the space is focused on operational efficiency and labor … but honestly, those aren’t going to be needle-moving enough for the industry. We’re focused on the needle-moving problem, which is how we can reengineer. We need to reengineer how we approach these unconventional assets.” Good engineering done in the office is going to drive real improvements.

With recovery factors, well spacing, or frac hits, operators really need to focus on the productivity drivers for a resource unit. These questions cannot be investigated in isolation and some of the best practices we have seen come from bundling disparate workflows together. For example, a completions engineer may want to look at several different data types simultaneously. They may want to look at and ask questions about geology, drilling or surface constraints. This example goes back to humans being on top of the pyramid. The engineer needs to be fed with the relevant information, which is where AI can really help. Petro.ai not only serves up this data, but also uses a complex system model built using geomechanics and machine learning that takes engineers through an 8-step workflow to understand the key productivity drivers for a resource unit.

2020 Outlook

The industry has clearly learned that unconventionals are extremely difficult to develop profitably – even in the Permian. These are very complex systems with stacked pay that will require good engineering to be properly developed. This is good news for digital companies in 2020. In a broader sense, Troy sees operators evolving “towards surgical development; we’re going to go away from factory drilling and go more towards surgical.” However, some operators are clearly embracing digital more than others, and so we expect a clear bifurcation in operator performance.

Listen to the podcast for the full discussion on AI and machine learning in oil and gas and the future for data in the energy sector.

Categories
Data Science & Analytics Geology & Geoscience Transfer

Q&A with Dr. Mark Zoback: The Petro.ai Approach to Unconventionals

Every well that gets drilled is an opportunity to gain more insight and understanding. The team behind Petro.ai believes in building tools for people to use the data they have to draw the right conclusions. How can machine learning aid geomechanics? Learn more about the Petro.ai approach in the above video.