Data Collaboration is a Rising Tide that Lifts All Ships

Kelly Bennett

Co-Founder & CEO

Water management is evolving quickly, no matter what industry you consider. Whether it’s conservation, mining, utility supply, or oilfield water management, the need for data – much more data – has come into focus. Data is widely understood as the backbone of our decision-making, and as the role of artificial intelligence grows in decision-support applications, the value of accurate data is growing fast. But so are the challenges associated with making data useful.

The last decade has seen a proliferation of tools and technologies to collect ever more data. The result is what many only dreamt of a handful of years ago: massive amounts of data flowing into operations centers from the field. I can’t imagine many people in related roles who haven’t been pitched a project to build a ‘data lake’. Building these massive repositories made for big business among the major consulting firms, and many companies have pursued costly digitalization projects to become more ‘data driven’. Yet often, the reality is more data, more problems. Why? Because with so much data flowing in from so many different systems, measuring so many different things, it’s hard to manage and make sense of it all, let alone make it decision-useful.

Managing and utilizing data at scale is a tough equation to crack, no matter the industry in question. It gets harder to get one’s arms around as the pace of data generation grows and technology enables us both to capture more and to do more with it. Like many others, I get wide-eyed and my pulse quickens when I see the huge amounts of data we can get our hands on at B3. It’s easy to dream of the many possibilities to create valuable analytics. It’s much harder to ensure that data collection, quality controls, management, and translation into usable structures can meet needs, today and into the future. With so many competing priorities and fragmented approaches to solving big industry challenges, where do we start?

One of the more interesting factors influencing much of our work at B3 has been the evolving role of collaboration. When I started working in the information business 17 years ago, the industry was deeply concerned with competitive advantage, and companies were highly secretive about nearly everything. Antitrust concerns were another root source of secrecy. Companies typically saw information as providing a narrow but critical advantage, and working together to create common data models or to build broad understanding of major shared challenges was rare. Fast forward to today and the cultural shift in industries like oil and gas is remarkable.

Collaborative efforts engaging private industry, academia, and regulators are producing tangible results that are helping solve big issues. The data nexus is transformative: while stakeholders may still have differing technical models and proprietary data, collaboration provides a means to identify and define problems and, critically, to delineate the data needed to address them. How does this lead to better data? For B3, this collaboration has led to prioritized new data collection, the creation of industry-accepted approaches to QA and data enhancement, and even new data models. In less than a month, we will release the fourth version of our most widely used dataset, a unique collection of data for understanding water and gas injection. Its evolution over the last year has largely been informed by the outcomes of unprecedented collaboration in the oil and gas industry to address seismicity and pressure concerns.

Are we on a path to an egalitarian data utopia where transparency means all stakeholders have access to all the same information? Of course not. Many companies deeply engaged in collaboration will continue to derive competitive advantage from having access to more proprietary data and more sophisticated tools to leverage them. But often that proprietary insight ends at the edge of their assets, leaving them without a similarly robust understanding of the big issues that affect regional landscapes.

Collaborative work to harmonize a broader understanding of which data are important, and to set standards for quality, makes for more informed stakeholders and provides a data backbone that can serve as a common basis for analysis. This common understanding is key to bridging the gap between deep knowledge of proprietary operations and what’s happening at a regional scale. It will be interesting to see where this all leads. Over the last few years, we have seen a groundswell of interest in finding ways to facilitate data sharing, integrate proprietary model outputs in derivative forms, and pursue other collaborative means of improving the data used to make critical decisions. These developments are exciting and are already driving industry behavior and regulatory engagement. Debates about which data are important, how the data should be collected and used, and other governance concerns will remain ongoing, but the fact that they are happening at all should be celebrated.
