
Passion for Change: Birchcliff Energy

with Theo van der Werken, Asset Manager at Birchcliff Energy

Tell us about your background and what you do now.

I am currently employed with Birchcliff Energy, a Canadian-based intermediate producer with a large acreage position in the Montney unconventional resource play. As the Asset Manager, I manage a multidisciplinary team of engineers and geoscientists who are busy optimizing the development of our unconventional resource in the Montney.

I’m originally from Holland, where I graduated with a degree in Mining Engineering. Before graduating, I completed an internship working offshore in the North Sea and realized that an oil and gas career path was more aligned with my interests.

My first job took me to the Middle East, where I worked for an oil and gas service company. I was involved with a major operator that used underbalanced drilling to explore for oil in the Omani desert. I was based out of Dubai, where I started my career in the drilling engineering department. Subsequently, I relocated to Houston, where, after some good field experience in Texas and Canada, I pivoted from drilling into reservoir engineering.

After several years I made the conscious decision to switch to the operator side and joined a large multinational and relocated from Houston to Calgary where I started working as an Exploitation engineer in an asset team. It was a great experience, I got to see a lot of different things and work on a variety of reservoirs. Subsequently, I went to a start-up and spent about two and a half years with them. This is where I picked up and learned a lot of the surface side of the business: production engineering, facilities, pipelines, joint venture and so forth.

I joined Birchcliff Energy in 2011, where I started as a Senior Development Engineer and took on the role of Asset Manager at the end of 2011. I’ve been in this position for nine years, which has been very rewarding, and I have never looked back.

During the last nine years we have seen tremendous growth in my team and in the company as a whole, primarily through the drill bit, where we have grown from approximately 16,000 Boe/d to 80,000 Boe/d.

Can you tell us why you have a passion for change in this industry and what else you’re passionate about?

I think the oil and gas industry is a very exciting and dynamic industry that is under-appreciated as it relates to technological innovation. 

In addition, the industry often gets vilified by the general public without really understanding the disconnect between end-user habits and the associated energy requirements.  This motivates me to not just advocate for our industry, but also strive to continuously improve on the responsible extraction of hydrocarbons. 

In the last 20 years, with the rise of the unconventionals, we have seen a tremendous amount of technological innovation to both hardware and software that is used to extract hydrocarbons from tight oil and gas reservoirs. 

Above ground, rig automation has evolved with fit-for-purpose rig designs that are really well suited for large scale pad development. At Birchcliff, we are now developing our field with surface pads that can accommodate up to 28 wells from one surface location, minimizing the environmental footprint. These types of walking rigs are surprisingly agile and really help reduce cycle time and ultimately drive down finding and development cost. 

In the subsurface, we continue to see innovations on the software side with more advanced integrated physics-based models as well as data driven models that can guide completion design and field development strategies. In the field, the use of advanced diagnostics such as fiber, geophones and pressure data are really insightful to capture real time system behaviour as we zipper frac these massive pads. 

The diagnostic data is very useful for further advancing the modelling space: it lets us calibrate and validate not just our hypotheses, but often also test the reliability of these models.

Because the system is so complex and the industry continues to innovate there is a great opportunity for continuous improvement on optimizing “where to drill, how to drill, where to frac and how to frac.” 

That’s my number one passion that I rally my team around—we have an opportunity to do better every year, or even every well, based on integrating more data sets, looking at new technology and just continuously pushing the envelope of how we develop these unconventional reservoirs.

As technology advances, it allows asset teams to move down the grain size, if you will. Rock that was viewed as poor quality back in 2011 has since yielded multiple horizons that we’ve added to our development. Technology is allowing us to explore economically, even with declining commodity prices and mounting external pressures and taxes.

To this point, tight reservoirs have revolutionized the supply side and really driven down prices, particularly in the natural gas sector. Notwithstanding that, we can still make a go of it with more room on the upside. That’s just really fascinating and motivating to me and my team. That’s the passion for what I do at work.

Outside of work, I would say my passion—and why I’m living in Canada as a Dutchman—is the mountains. I really like the outdoors, always have. Holland is a pretty busy place. There are about 17 million people in an area the size of Southern Alberta, whereas all of Canada has about 38 million people. There is a ton of room here. It’s absolutely beautiful country in summer and winter. I really enjoy spending time with my friends and family in the mountains. I’m pretty passionate about that.

Do you think this downturn we’re experiencing now will accelerate digital transformation or put it on pause until we see better commodity prices?

I think we’re stalling a bit to be honest with you. It seems that we are all very rattled with everything that’s going on from biological viruses to this price war. In North America, there’s 35 billion dollars of capital that’s basically been pulled out of the 2020 budget plans.

On top of that, you layer on remote working and suddenly implementing a digital strategy becomes daunting. When I think about implementing a digital transformation, it’s really a management of change process that is quite culturally involved.

A successful digital transformation is a lot more than buying software. It’s a lot more than hiring a data scientist.  Making sure you assemble the right team and align with the right industry partner are all important components that need to be interlaced. Then building enough internal buy-in and getting people to culturally rally around that is very involved. 

If you’ve already started that journey, I think you can continue to reap the benefits and dig in on specific projects. But if you haven’t started yet, I don’t think this price shock alone gives you the push.

I feel very comfortable with what we are doing here at Birchcliff. We have detailed road maps and projects and inventory of things that we’re working on. We’ve got the people, the data engineering, and the data pipelines. I feel good about that. 

How much do you see your culture tying to your competitive advantage, being able to capture that next generation of knowledge?

It goes back to this passion for change, passion for continuous improvement. We’ve built an internal framework with some tangible tools. How can you continuously improve? We want to improve everything: from trialing different completion styles in the field, to new technology using physics-based and data-driven models, to spending time collaborating with peers, to competitor intelligence to learn from “best in class” competitors.

In addition, we have set up the business processes that support these efforts. How do you design a pilot or operational trial? How do you define success with appropriate KPIs? Who is responsible for scouting for new technology? All these items are important to maximize value. 

The interlacing of these various components is part of our Continuous Improvement framework. Data analytics and science is just one of the tools that fit within this framework. If our team decides to set up a field trial, we need the right sensors in the field so we can collect the right data to feed a data-driven model or calibrate our physics-based model. The culture of continuous improvement—and people rallying around that—allows us to get the buy-in for something like data analytics or an investment in physics-based modeling.

We just got approval to run downhole fiber in this environment. We’re making this investment because our people are bought in on how data, physics, and analytics interplay to drive continuous improvement. So, they all go hand-in-hand.

Some would argue that our Birchcliff sandbox is not the most competitive sandbox from a pressure and permeability perspective. If you look at the Montney, we’ve got a great position. It’s all contiguous land and we own our own plant and we’ve done a series of things which make our strategy highly successful and profitable compared to most of our peers.

Can you make a go of it without spending any money on modeling, field trials or diagnostics? If you have superior rock, you can probably still be very competitive. In the long run, I think the winners and survivors need to strive for this Continuous Improvement culture that is very much alive at Birchcliff. Our technical teams have been able to demonstrate that by year-over-year improvements on type curves and a variety of economic indicators.

Have you seen specific examples of success with data science and machine learning projects? And was there skepticism? How did you convince people to go along with it?

Analytics adds value in two places: 1) by improving the efficiency of things you already do, and 2) by enabling better decisions, because you’re able to interact with data sets that you weren’t able to interface with before.

On the first point, we all want to be more efficient; that applies to everyone within Birchcliff. You can spend a lot of money moving data from external sources into your organization and then cleaning and staging it for an end product. The fact of the matter is, you don’t need to spend a lot of money to make some improvements on that. You just need some smart people and some software tools. 

If you can help make better decisions in terms of how you manage your production, that’s directly going to hit your revenue line. So, initially we focused a lot on production engineering with visualization dashboards. Those were some of the early use cases. By no means have we figured it all out, but we started small. Back in 2013, we organized data, then slowly but surely, we started to evaluate different tools. We started to build a network of people that we thought were like-minded and hired our first engineer with advanced analytics skills. 

That small group accelerated the adoption of a lot of things. We slowly started to add people and showed more value in projects; that has led to where we are today. We have a dedicated data analytics team, which is rolled up under corporate development and competitor intelligence. We’ve since hired a data engineer and data scientist.

That team didn’t make a lot of noise until we felt that it was worth bringing up with the broader organization. Many people, myself included, need “soak time” on these things. Having some tact around how you slowly but surely get your organization to adapt, there’s some strategy involved. For us, things have accelerated in a very positive direction.

Have you seen any competitors that are having success with data science and analytics? 

I think you see very different things depending on the size of the company. Larger players often employ highly technical and specialized people with great skill sets. But individuals can feel isolated within the larger organizational chart. Some of those smart people are building some really cool analytics workflows, but they’re having a difficult time making it a broader initiative or socializing it to a larger group.

On the other end, you see these large companies with a global mandate and initiative to implement a digital strategy, but when users actually need support on the gory details of data engineering and data pipelines in specific business units, support is lacking.  Instead, the initiative is rolled out at the corporate level but it is not well supported in the business unit where the focus should be on a specific problem in a specific basin. 

That brings me back to what I see here in Calgary. There are a lot of highly skilled technical people, but they appear to be quite siloed, working on niche projects. It could be a function of the wrong expectations being set from the beginning. As per my previous comments, a successful digital strategy is really rooted in a management of change process that can be daunting and can make or break these kinds of initiatives.

Do you see Birchcliff being on the cutting edge? And is that where you want to live or do you want to be somewhere in the early majority?

The leading edge is great; the bleeding edge is not so great. We’ve found that the analytics space is changing quickly; there are a lot of smart suppliers and vendors developing and building tools and workflows. Looking at Birchcliff specifically, we try to balance this evolving space by always asking ourselves whether we should build versus buy. Sometimes it makes sense to buy: if you have to maintain and support a tool yourself, the cost of maintenance can become prohibitive, and it can be better suited for a third-party vendor to host on the cloud. In addition, if the workflow or technology is evolving, you can leverage improvements driven by other operators as well, especially for applications we may not use 24/7.

So in general, I would say if it is truly novel we continue to develop it in-house, simply because there is nothing like it on the market; otherwise, we consider buying if there are obvious advantages, as previously highlighted.

Is there one specific project you could talk about that you were really proud of?

I guess there are a lot of them. We’ve got workflows now for automated lookbacks on recently drilled wells. There are field dashboards that we use to optimize production that have significantly impacted the bottom line. We’re using blending analytics tools to help our marketing group blend the right grades to maximize returns. There are ingestion tools that we’ve built for third-party pressure data that allow us to pipeline and database it. There are custom apps that we’ve built to house datasets that don’t typically have a home, with processes that ingest that data and allow easy access by our engineers. There are many examples.

Over the last few years, we’ve been spending more and more time building up competency related to multivariate modeling and specifically machine learning projects. We are now a long way down that road: we have staged large ‘featurized’ data sets, allowing us to operationalize these types of workflows. We’re really excited about trying to blend a data-driven empirical data set with a physics-based model. That’s the stretch goal we’re working toward, but we expect to have an operationalized workflow by the end of this year.

Is Birchcliff using any cloud computing resources today?

We’re a bit of a hybrid. This is not my area of expertise. Some things are on the cloud. Some things are on-prem. Security, scalability, latency, control—all these things have pros and cons in both buckets, but we’re doing a little bit of both.

Control of the data: that was a really big thing. We own our plant. We don’t do contractors. We need to control everything, but now we’re starting to see the benefits of cloud, so we are trying to find a happy medium.

We haven’t seen any limitations with having on-prem servers during this time when everyone’s working remotely. Everyone’s on VPN and it was flawless for us. We didn’t skip a beat. 

Any books or blogs that you’d suggest reading?

I don’t seem to have too much time for reading between work, family and, of course, the mountains. I think Bill Gates always has a few interesting things to say in Gates Notes where he shares his perspective on the complicated challenges our world is facing today.

A second thought leader I enjoy learning from is Peter Tertzakian, a Geophysicist by training who is the Executive Director of the ARC Energy Research Institute. He’s basically a historian of energy transitions. He’s got a really cool website called Energyphile and has written several books as well. 

He also hosts the ARC Energy Ideas podcast with Jackie Forrest that explains the latest trends in Canadian energy and beyond. They’re not just focused on upstream E&P, they’re focused on energy.  

In your own words, how would you describe

I think there’s a lot of shared vision in terms of the value of analytics and where it could play a really critical role in this continuous improvement journey. I think you have a great group and a good culture. I’m really happy for the success that your team has seen.

Theo van der Werken is a highly resourceful and results-oriented manager with a deep understanding of all aspects in upstream oil and gas disciplines. Specialties include asset management of tight oil and gas reservoirs with proven leadership experience.

SPE Permian Basin Data Analytics Kick-Off

Our VP of Analytics Kyle LaMotta is speaking at the SPE Data Analytics Kick-Off about large-scale properties of stress orientation, well orientation, and their combined effects on well productivity.

Originally scheduled to be in-person in Midland, the event will now be hosted by webinar on Wednesday, April 29th. Details and registration link here.

And check out this wonderful interview with Kyle and Ronnie Arispe, Data and Analytics Expert at Concho.

Year in Review 2019

It’s been an incredible year! We launched into 2019 with determination to become even more focused on what we love doing — helping our clients accomplish more with their data. We care about the journey that knowledge takes from the source to the board room, and about helping our clients deliver new insights.

In the process of aligning our company, product, and focus, we changed our name so that it reflects our mission — to build and deliver not only a new way of organizing information in oil and gas, but also a new way to run the E&P business, built solidly on data. We’ve worked in drilling, reserves, completions, production, portfolio management, subsurface, artificial lift, EOR and A&D. We’ve had the opportunity to visit our customers in oil and gas cities across North America, and we’ve hosted and attended some incredible events. Here are some highlights from 2019.

Building the Team

The team doubled in size. We welcomed so many amazing team members, not just in the Houston office, but in our new Montreal office as well. We’re a group that likes to work hard and play hard, supporting each other and our customers every step of the way.

To make room for everybody to do their best work, our Houston headquarters moved a few blocks to its new location in Houston’s downtown historic district at 114 Main, Suite 200. If you haven’t had a chance to come by and check out the new space, reach out and plan a visit! We love having guests come by and experience a part of our culture.

Back at the Ranch

We held our annual retreat near Brenham, Texas to bring the whole team together — the Houston and the Montreal offices — for in-depth brainstorming, teambuilding, and of course, some Texas barbecue.

The retreat served as an opportunity to discuss the latest trends in data science, oil and gas, and ways to keep pushing the edge cases of our industry.

Four Weeks, Seven Cities

When we launched our petroleum analytics meetups in Houston this past summer, we had a vision to take them on the road as well—one O&G town at a time. We picked cities where we had clients, conferences, and new connections we wanted to make. The 2019 Pub Crawl kicked off in Dallas, then went to Midland, Oklahoma City, Tulsa, Denver, Calgary, and ultimately back home for one last event in Houston.

Here’s our team at the Midland Beer Garden.

The Power of Purple

If you saw us at one of the conferences we attended this year, from the SPE OKC Oil and Gas Symposium in Oklahoma to the ATCE in Calgary—you may recognize our colorful purple booth. The power of purple is giving customers a seamless user experience and delivering even better tools for oil and gas workflows.

The Channel

We created almost 30 videos this year showcasing some of our workflows and new features. You can see them all and subscribe to our YouTube channel. Some highlights include our Q&A series with world geomechanics expert, Stanford professor, and technical advisor Dr. Mark Zoback. You can check out the full playlist here.

Standing Room Only

Not only have we been able to build some incredible tools this year, but we’ve also had the honor and opportunity to offer week-long geomechanics courses and many, many training sessions on client sites throughout the year. Our instructors work hard to make technical concepts not only accessible but engaging. One recent highlight: our Technical Director Lucas Wood captivated a crowd at the TIBCO Analytics Forum. When the crowd asked for more advanced material, he adapted examples on the fly to match the curiosity in the room. We are so grateful for all the engagement, not only at this event, but at all our training sessions throughout the year, from Houston to Midland to Denver.

Troy Talks

Founder and CEO Troy Ruths made multiple appearances on expert panels and interviews throughout the year. Not only does he remain accessible to every single member of the team, but also to the whole community. He’s passionate and committed to sharing big ideas about the future of the industry, and that drive shapes the whole company. As an expert in oil and gas, data science, and tech culture, Troy can talk about it all. Check out his guest appearance on a podcast discussing “The Future of AI in O&G.”

Lastly, but most importantly…

We have incredible customers—both long-time clients and newcomers this year—and we feel so grateful to keep offering the very best products and services. We really take the trust placed in us very seriously and look forward to all the projects and milestones ahead in 2020.


Q&A with Mark Zoback: Merging Theory with Operations

Learn how to merge theory with operational data to inform asset development strategies in this short video with geomechanics expert, Stanford professor, and Technical Advisor Dr. Mark Zoback.


Q&A with Dr. Mark Zoback: The Approach to Unconventionals

Every well that gets drilled is an opportunity to gain more insight and understanding. The team behind believes in building tools for people to use the data they have to draw the right conclusions. How can machine learning aid geomechanics? Learn more about the approach in the above video.


Q&A with Mark Zoback: Unconventional Asset Development

What are the opportunities and challenges with unconventionals? What can geoscience offer to unconventional development? What role can data science play? For a brief discussion, watch this video of Founder and CEO Dr. Troy Ruths with Stanford University Professor of Geophysics, and Technical Adviser, Dr. Mark Zoback.


Q&A with Mark Zoback: Geomechanics Principles

The next installment of our Q&A series with geomechanics expert, Stanford professor, and Technical Advisor Dr. Mark Zoback touches on how a better understanding of geomechanics can lead to completion designs that improve well performance.


What Do Cancer Models Have to do with Oil and Gas?

Dr. Luay Nakhleh, J. S. Abercrombie Professor of Computer Science at Rice University, spoke at the second Petro Community event on the importance of accounting for uncertainty when modeling complex systems like cancer growth or well performance.

Insights gained from one domain, like computational biology, can often be leveraged in other domains, like petroleum analytics, because they uncover better ways of modeling complex data to discover patterns and make more informed observations.

“Computer scientists get training in modeling and formulating problems,” states Dr. Nakhleh. “A model is an abstract representation of a system to explain some of its features, and how detailed a model is depends on the questions being asked.”

Watch more of Dr. Nakhleh’s talk to discover how he deals with modeling uncertainty in his research. Do you think this resonates with your work in oil and gas? Why or why not?


Multivariate modelling using the diamond dataset

Today’s post was created by data analyst Omar Ali.

We’ll demonstrate how to create a multivariate model using the well-known diamond dataset from Kaggle. For this project, we’ll be utilizing the new Models feature. Our most recent release makes it extremely easy to run predictive algorithms on any type of dataset. This tool is constantly being upgraded with added functionality and features based on our customers’ feedback. That being said, let’s predict prices!

Before we begin, please download the diamond dataset from the Kaggle page here.

Before building a model, let’s first explore the data. We want to find all the blank values or anything that has an odd distribution first to make sure we’re going to build a model on clean data.

Some quick preprocessing shows us that there are some zero values in x, y, and z. The depth and table distributions should be fenced to keep outliers from skewing the dataset, and there are no NULL values in our cut and color samples.
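Outside of the product, these cleaning steps can be sketched in pandas. The column names follow the Kaggle diamond dataset; the table fence of 50–70 is an illustrative assumption (only the depth fence of 55–69 is specified later in this post), and the tiny sample frame is synthetic:

```python
import pandas as pd

def clean_diamonds(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows with zero dimensions and fence depth/table against outliers."""
    # Zero x/y/z are physically impossible measurements, so treat them as bad rows.
    df = df[(df["x"] > 0) & (df["y"] > 0) & (df["z"] > 0)]
    # Fence depth and table at the bulk of their distributions (ranges assumed here).
    df = df[df["depth"].between(55, 69) & df["table"].between(50, 70)]
    return df.reset_index(drop=True)

# Tiny synthetic sample with the same columns, for illustration only.
sample = pd.DataFrame({
    "x": [3.95, 0.0, 4.05], "y": [3.98, 3.84, 4.07], "z": [2.43, 2.31, 2.31],
    "depth": [61.5, 62.8, 45.0], "table": [55.0, 57.0, 58.0],
})
cleaned = clean_diamonds(sample)
```

Here the second row is dropped for its zero x value and the third for its out-of-fence depth.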

The next portion of this exercise will familiarize you with the new Models feature.

Click on the PetroPanel icon in Spotfire and select your respective database. Test your connection, and if it works, click on the “Models (beta)” tab.

Once you’re in the Models (beta) tab, you should see the following:

Click on “New Model,” name your model, and click on “Create.” This will bring you to the main tab, where you can edit your inputs, choose a machine learning algorithm, and save and train a model.

My Model Options tab looks like this:

To predict prices, we’ll use a random forest with 50 trees and a 20% test set hold out.
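For readers who want to see the equivalent setup in code, the same configuration — a random forest with 50 trees and a 20% test hold-out — can be sketched in scikit-learn on a synthetic stand-in for the diamond table (this is purely illustrative, not the product’s implementation):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic regression data standing in for the cleaned diamond table.
X, y = make_regression(n_samples=500, n_features=6, noise=10.0, random_state=0)

# Hold out 20% of the rows as a test set, matching the setup described above.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Random forest with 50 trees.
model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

r2 = model.score(X_test, y_test)  # R^2 on the held-out 20%
```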

As you scroll down, you will find the Model Inputs tab. In this tab, you will point the PetroPanel to your table, select your predictors, and assign them to either “Categorical” or “Continuous.” For continuous variables, you can “fence” your dataset to keep outliers from skewing your model. For categorical variables, hot coding is the default, but a lookup table can be built as well. Here’s my Model Inputs tab:

As you can see, I fenced the depth parameter from 55 to 69 to keep the outliers from skewing the dataset.
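The two input treatments described above — fencing a continuous variable and hot (one-hot) coding a categorical one — look like this in pandas, using a few illustrative rows with diamond-style column names:

```python
import pandas as pd

df = pd.DataFrame({
    "cut": ["Ideal", "Premium", "Ideal", "Good"],
    "depth": [61.5, 59.8, 70.2, 62.4],
})

# Fence the continuous variable: keep depth between 55 and 69, dropping outliers.
fenced = df[df["depth"].between(55, 69)]

# Hot coding expands each categorical level into a 0/1 indicator column.
encoded = pd.get_dummies(fenced, columns=["cut"])
```

The 70.2 depth row falls outside the fence, and the remaining `cut` values become `cut_Good`, `cut_Ideal`, and `cut_Premium` indicator columns alongside `depth`.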

Finally, the “Model Outputs” tab is where we can choose what we’re predicting, also with the opportunity to fence. Here’s mine:

Once everything is ready, just click “Save.” Once you save, you can go back to the “Loaded Models” tab and should see the model populated there. All you need to do now is train the model and see your results! The predicted values for your dataset will be added to the original dataset. Below you can see a visualization of the price predicted by the actual price of the diamond.

The model result metrics can be found on the Suite under the “Machine Learning” tab. The Suite is independent of the Petro Panel within Spotfire and is fully functional on its own. From here, you can view the model inputs and outputs, the variable importance, and many other metrics that are required for model evaluation. Here’s what it looks like for the model I just trained:
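Variable importance, one of the metrics mentioned above, comes for free from a trained random forest. In scikit-learn terms (with hypothetical feature labels and synthetic data, for illustration only) it can be read like this:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Synthetic data; the feature labels below are hypothetical placeholders.
X, y = make_regression(n_samples=300, n_features=4, random_state=1)
model = RandomForestRegressor(n_estimators=50, random_state=1).fit(X, y)

# Importances sum to 1.0; larger values mean the feature drove more of the splits.
feature_names = ["carat", "depth", "table", "clarity_code"]
importance = dict(zip(feature_names, model.feature_importances_))
```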

As you can see, we have a model with 84% accuracy on the test set. A key feature of the tool is that the model is already stored in our database, so we can create a job that runs predictions with this model at whatever frequency we decide on. This means that if we have a model that predicts diamond prices, the Suite can put it into production and the database will store the results. So, if we get diamond data daily, we can load in the data and predict the price through the Suite. If you’re looking for a one-time prediction, you can also manually enter values for your predictions and generate a result to see how the model functions. An example of this can be found below:
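The “train once, score on a schedule” pattern described above can be sketched generically with joblib persistence (a minimal sketch on synthetic data — not how the Suite stores or runs models internally):

```python
import joblib
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Train once and persist the fitted model.
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)
joblib.dump(
    RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y),
    "price_model.joblib",
)

# Later, in a scheduled job: reload the stored model and score the day's new rows.
model = joblib.load("price_model.joblib")
daily_batch = rng.normal(size=(5, 3))
batch_predictions = model.predict(daily_batch)

# A one-time manual prediction is just a single-row predict.
one_off = model.predict([[0.1, -0.2, 0.3]])
```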

I hope you enjoyed my demo of the new Multivariate Modeling feature in and the functionality of putting machine learning into production through the Suite. If you have any questions or concerns, please comment below!