TECHNICAL PROGRAMME | Energy Technologies – Future Pathways
The Energy Transition: The Role of Digitalisation, AI, and Cybersecurity
Forum 23 | Technical Programme Hall 4
29 April | 14:30–16:00 UTC+3
Digitalisation, AI, and cybersecurity are key enablers of the energy transition, providing the tools and frameworks needed to manage complex energy systems, optimise operations, and protect against cyber threats. AI's scope is broad: it includes the creation of virtual replicas of physical assets, processes, and systems using real-time data and simulations, and it can automate complex and repetitive tasks such as drilling and production, improving the efficiency, quality, and consistency of operations while reducing costs. This session will explore the latest advancements in these areas and discuss how they are transforming the energy industry to meet future challenges.
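To make the "virtual replica" idea concrete, here is a minimal illustrative sketch (not from the session itself): a toy digital twin that mirrors one sensor channel of a physical asset by exponentially smoothing a stream of real-time readings. The class name, smoothing factor, and readings are all hypothetical.

```python
# Illustrative sketch only: a minimal "virtual replica" that mirrors a
# physical asset by smoothing a stream of real-time sensor readings.
class DigitalTwin:
    def __init__(self, alpha=0.3):
        self.alpha = alpha   # weight given to each new reading
        self.state = None    # twin's current estimate of the asset's state

    def ingest(self, reading):
        # Blend each incoming reading into the mirrored state.
        if self.state is None:
            self.state = reading
        else:
            self.state = self.alpha * reading + (1 - self.alpha) * self.state
        return self.state

twin = DigitalTwin(alpha=0.5)
for temp in [100.0, 104.0, 98.0]:   # simulated temperature feed
    twin.ingest(temp)
```

A real twin would track many coupled variables and run physics-based simulation alongside the data feed; the point here is only the update loop that keeps the replica synchronised with live measurements.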
Objective
Despite the efforts directed toward adopting machine learning algorithms to address upstream challenges, the prospect of leveraging quantum machine learning (QML) for these challenges has not been widely investigated. Current predictions suggest that the growth of classical computational power will reach a plateau, and since hardware capacity sets the limit of what a machine can learn, this work highlights the potential of maximizing the value of machine learning by adopting QML in various upstream cases.
Novelty
Quantum machine learning is an evolving field that runs machine learning algorithms on quantum computers. Employing quantum computing as a processing platform offers an advantage because quantum computers perform computation with subatomic particles, allowing algorithms to capitalize on quantum-mechanical phenomena such as superposition, entanglement, and quantum interference.
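As a hedged, library-free illustration of the superposition property mentioned above (a toy model, not part of the abstract), the sketch below represents a single qubit as two complex amplitudes and applies a Hadamard gate to put the |0⟩ state into an equal superposition:

```python
import math

# Toy one-qubit statevector: (amplitude of |0>, amplitude of |1>).
def hadamard(state):
    # Hadamard gate: maps |0> -> (|0> + |1>)/sqrt(2), |1> -> (|0> - |1>)/sqrt(2).
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    # Born rule: measurement probability is the squared amplitude magnitude.
    return tuple(abs(amp) ** 2 for amp in state)

qubit = (1.0, 0.0)            # start in |0>
qubit = hadamard(qubit)       # equal superposition of |0> and |1>
probs = probabilities(qubit)  # 50/50 chance of measuring 0 or 1
```

Real QML frameworks (e.g. parameterised variational circuits) build on exactly this amplitude arithmetic, scaled to many entangled qubits.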
Methodology
This work sheds light on the feasibility of adopting QML technologies in computationally intensive oil and gas cases, based on the current state of the art. Through a comprehensive analysis of the theoretical concept of quantum computing, we provide an overview of quantum machine learning, its various types, and its implementation. In addition, we highlight the key differences between the proposed quantum-analog approach and the machine learning applications currently employed in the oil and gas industry, analyzing the advantages gained and the burden of adoption.
Results
Many of the proposed quantum algorithms remain theoretical because no stable, large-scale quantum computer is yet available to accommodate their execution cost. Although building a stable quantum computer remains a challenge, researchers have found that several quantum algorithms, including quantum machine learning algorithms, are compatible with noisy intermediate-scale quantum (NISQ) computers, which are already emerging. The potential of quantum machine learning is not limited to speedup; it also includes enhancing the generalization of machine learning algorithms, so that a model is more likely to perform well on new inputs from a given environment. These capabilities can be harnessed to address the challenges and complexity of hydrocarbon plays; we therefore encourage further exploration tailored to the specific requirements of the oil and gas industry, since developments in quantum machine learning can set forth a new generation of solutions, resolving long-standing issues beyond the limits of existing processing power and advancing upstream processes toward operational excellence.
The energy transition is a complex and multifaceted challenge that requires innovative solutions to manage complex energy systems and optimise operations. Digitalisation and AI are key enablers of this transition, providing the tools and frameworks needed to transform the energy industry and meet future challenges. This presentation explores the latest advancements in these areas and discusses how they are transforming the energy industry.
One such advancement is the development of SOFFIA-AI (Satellite Observation For Forecasting, Intelligence and Analytics), an advanced platform that integrates satellite data, economic indicators, and artificial intelligence to provide actionable insights for energy market stakeholders. By utilizing H3 hexagonal grids—subcity-level spatial units—SOFFIA-AI bridges the gap between complex data streams and actionable intelligence, enabling precise, data-driven decision-making in downstream energy markets.
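The cell-level aggregation idea can be sketched as follows. This is a hypothetical, self-contained stand-in: a production system would use the real H3 library (e.g. `h3.latlng_to_cell`), whereas here a coarse lat/lng bucket plays the role of an H3 index, and the observation values are made up.

```python
# Hypothetical stand-in for an H3 index: a coarse lat/lng bucket.
def cell_index(lat, lng, step=0.5):
    return (round(lat / step), round(lng / step))

def aggregate_by_cell(observations):
    # observations: (lat, lng, value) triples, e.g. a demand proxy per point.
    totals = {}
    for lat, lng, value in observations:
        key = cell_index(lat, lng)
        totals[key] = totals.get(key, 0.0) + value
    return totals

# Two nearby Dubai points land in the same cell; Abu Dhabi gets its own.
obs = [(25.20, 55.27, 3.0), (25.21, 55.28, 2.0), (24.47, 54.37, 5.0)]
cells = aggregate_by_cell(obs)
```

Hexagonal grids such as H3 improve on this square-bucket toy by giving near-uniform cell areas and consistent neighbour distances, which is what makes subcity-level comparisons meaningful.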
The presentation will demonstrate SOFFIA-AI's application in trading and market analytics, focusing on its ability to enhance real-time nowcasting and forecasting of regional energy demand. By leveraging data sources such as methane emissions, nightlight intensity, and Google Trends, the platform provides traders with granular insights into market dynamics, empowering them to anticipate shifts in demand, improve hedging strategies, and make informed trading decisions.
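For intuition, a nowcast of this kind reduces, in its simplest form, to regressing demand on observable proxies. The sketch below is a deliberately minimal one-feature least-squares fit (nightlight intensity as the lone proxy, with invented numbers), not the platform's actual multi-source model:

```python
# One-feature ordinary least squares, fit by the closed-form solution.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx   # (slope, intercept)

# Historical (nightlight index, regional demand) pairs -- made-up values.
night = [1.0, 2.0, 3.0, 4.0]
demand = [10.0, 12.0, 14.0, 16.0]
slope, intercept = fit_line(night, demand)

# Nowcast: plug in today's satellite-derived nightlight reading.
nowcast = slope * 5.0 + intercept
```

A real system would combine many such signals (methane, nightlights, search trends) with regularisation and temporal structure; the closed-form fit just shows where the satellite observable enters the demand estimate.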
Furthermore, the presentation will explore SOFFIA's innovative use of large language models (LLMs) to extract, summarize, and analyze unstructured data, including market reports, news articles, and regulatory updates. This integration enables seamless access to relevant market information, significantly enhancing decision-making processes. By combining these AI-driven tools with high-resolution geospatial data, SOFFIA provides an unparalleled framework for informed decision-making.
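One common preprocessing step behind such LLM pipelines is splitting long unstructured documents into overlapping chunks sized for a model's context window. The sketch below shows only that step, with a hypothetical word-count budget; no real model or SOFFIA component is called:

```python
# Split a long report into overlapping word-count chunks for an LLM.
def chunk_text(text, size=50, overlap=10):
    words = text.split()
    step = size - overlap           # advance by size minus overlap
    return [" ".join(words[i:i + size]) for i in range(0, len(words), step)]

# A stand-in 120-word "market report".
report = " ".join(f"w{i}" for i in range(120))
chunks = chunk_text(report, size=50, overlap=10)
```

The overlap preserves context across chunk boundaries so that a summarisation or extraction prompt does not lose sentences cut at the seam.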
Through case studies in the Middle East, the presentation highlights how SOFFIA's integrated approach reduces uncertainty, enhances decision accuracy, and drives sustainability. For instance, SOFFIA's real-time demand forecasts could enable traders in the Gulf region to optimise their portfolios and improve efficiency. Similarly, its ability to identify regional demand spikes has supported better logistical planning and inventory management for downstream players.
The discussion will also touch upon scalability, emphasizing how SOFFIA's modular architecture allows seamless adaptation to various downstream segments, including chemicals and retail. By focusing on actionable insights derived from sophisticated AI tools, SOFFIA empowers stakeholders across the downstream value chain to make data-driven decisions, leading to both economic and environmental benefits.
Overall, this presentation showcases the transformative potential of digitalisation, AI, and cybersecurity in the energy sector, highlighting the latest advancements and innovations that are driving a sustainable energy transition.
Introduction:
Fractures appear as sinusoidal curves on borehole images (BHIs) in horizontal wells and provide key insights into reservoir performance: they can indicate highly productive zones or flag potential waterflood shortcuts. Traditionally, interpretation is manual and inherently time-consuming, laborious, and subject to interpreter bias and variability. Automating this process could yield significant cost savings and free geological expertise for other work. However, computer vision algorithms face substantial data quality challenges, particularly in a Logging While Drilling (LWD) environment. LWD images often suffer distortions, noise, missing data patches, and artifacts, exacerbated by downhole shocks and vibrations that tend to increase with drilling depth. These imperfections can obscure true fractures or create patterns that mimic their sinusoidal shape, leading to high rates of false positives in automated detection. We aimed to develop an AI-assisted interpretation method that addresses many of these challenges.
Method & Applications:
Recognizing the impact of data quality on fracture interpretation reliability, we address the subjective and meticulous task of image quality labeling. We developed a deep learning-based workflow using an EfficientNet CNN, fine-tuned with borehole images classified as 'good', 'fair', and 'poor'. The resulting model achieves an F1 score of 80%, providing consistent and rapid identification of reliable data sections for interpretation, whether manual or automated.
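The F1 metric used to report the quality classifier's performance can be made concrete with a small stand-alone sketch (toy labels, not the real dataset), computing per-class F1 for the three-way 'good'/'fair'/'poor' task:

```python
# Per-class F1: harmonic mean of precision and recall for one label.
def f1_for_class(y_true, y_pred, cls):
    tp = sum(t == cls and p == cls for t, p in zip(y_true, y_pred))
    fp = sum(t != cls and p == cls for t, p in zip(y_true, y_pred))
    fn = sum(t == cls and p != cls for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

truth = ["good", "good", "fair", "poor", "fair"]
preds = ["good", "fair", "fair", "poor", "fair"]
f1_good = f1_for_class(truth, preds, "good")   # one missed 'good' image
```

In practice the per-class scores are averaged (macro or weighted) to give a single headline figure such as the reported 80%.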
This work also presents a dual approach to improve the accuracy and efficiency of fracture analysis from borehole images. First, we define a set of image quality criteria and design corresponding scoring measures; emphasis is placed on quality features because artifacts and missing data can significantly impact detection, either by obscuring actual fractures or by introducing false ones. Second, we leverage these quality features to minimize false positives while preserving true fracture detections, initially identified with a pre-existing sinusoid detection algorithm. This filtering process identifies key discriminative features that distinguish genuine natural fractures from false detections, reducing false detections by 98% while retaining 94% of true fractures.
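The sinusoid that a detector looks for follows directly from geometry: a planar fracture crossing a cylindrical borehole traces a cosine on the unwrapped image, with amplitude set by the dip angle and hole radius. The sketch below generates such a trace with invented well parameters; it illustrates the target pattern, not the authors' detection algorithm:

```python
import math

# Depth of a planar fracture's trace at each image azimuth (unwrapped view).
def fracture_trace(depth0, dip_deg, radius, azimuths_deg):
    # Amplitude of the sinusoid grows with dip angle and borehole radius.
    amp = radius * math.tan(math.radians(dip_deg))
    return [depth0 + amp * math.cos(math.radians(az)) for az in azimuths_deg]

# A 45-degree fracture in a 0.1 m radius hole, centred at 1500 m depth.
trace = fracture_trace(1500.0, 45.0, 0.1, [0, 90, 180, 270])
```

Noise patches that happen to follow the same cosine shape are exactly what produces the false positives the quality-based filter is designed to reject.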
In fields developed with extended-reach drilling (ERD) wells, borehole images are a critical element in understanding reservoir performance. Human interpretations can be highly subjective; AI-assisted methods significantly improve the repeatability and consistency required for field-wide reservoir modeling and performance analysis. In addition, AI-assisted methods reduce interpretation times from days to less than an hour.
Conclusions:
In conclusion, AI-assisted methods have significantly improved both the turnaround time and the robustness of borehole image interpretation.
As the world accelerates toward a net-zero future, the question is no longer if we’ll transition to sustainable energy, but how fast and which technologies will lead the way. This keynote explores the innovations transforming energy generation and storage. Built on GetFocus’s unique forecasting approach, this session delivers a data-driven view of what’s coming next and where the real opportunities lie.
The energy sector is undergoing a profound transformation, where the pace of innovation often outstrips decision-making cycles. In this environment, AI-enabled forecasting is rapidly becoming a strategic necessity for R&D and innovation leaders tasked with navigating complex supply chains, decarbonisation demands, and geopolitical uncertainty. This paper presents a unique approach developed by GetFocus, a Rotterdam-based AI company, in collaboration with MIT, to quantify and forecast technological change across energy-relevant domains.
Our proprietary model leverages global patent databases and natural language processing to compute Technology Improvement Rates (TIR), a predictive metric that reveals how fast technologies are advancing before they disrupt markets. With the U.S. Department of Defense as our launch customer, and now adopted by leading firms such as 3M, Hess, BASF, and Caterpillar, this framework is used to guide R&D strategy, reshape supply chain design, and accelerate sustainable product innovation.
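An improvement rate of this kind is, at its core, the slope of an exponential trend. The sketch below is a hypothetical simplification (a log-linear fit to a toy performance series, not GetFocus's patent-based method) showing how a yearly rate is extracted:

```python
import math

# Fit an exponential trend to a performance series; return yearly rate.
def improvement_rate(years, perf):
    logs = [math.log(p) for p in perf]
    n = len(years)
    my, ml = sum(years) / n, sum(logs) / n
    # Least-squares slope of log-performance against time.
    slope = sum((y - my) * (l - ml) for y, l in zip(years, logs)) / \
            sum((y - my) ** 2 for y in years)
    return math.exp(slope) - 1.0    # fractional improvement per year

# Toy series improving 20% per year (made-up values).
years = [2019, 2020, 2021, 2022]
perf = [1.0, 1.2, 1.44, 1.728]
rate = improvement_rate(years, perf)
```

The actual TIR methodology infers the trend from patent text rather than from measured performance, but the output has the same interpretation: a compounding percentage rate of advance per year.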
We share case examples from chemical process innovation (e.g., Direct Lithium Extraction), materials R&D (e.g., composites in EV applications), and consumer product sustainability (e.g., plastics-free packaging) that illustrate how predictive technology intelligence can inform investment and resource allocation decisions with a forward-looking lens. The model’s predictive power has helped reduce costly misalignments between R&D priorities and market reality, while also surfacing early opportunities in emerging domains like green hydrogen catalysts, solid-state batteries, and carbon-to-value platforms.
This session will explore the broader implications of embedding AI into strategic energy decisions, from identifying innovation white spaces to building early-mover advantage in volatile, high-tech segments of the energy economy. We also outline how cybersecurity, data governance, and system transparency are integrated into our platform to ensure trustworthy, audit-ready decision support for energy and industrial leaders.
David Smethurst
Chair
Oil and Gas Production Department Director
Board Member, Various Start-ups
Eszter Varga
Vice Chair
Downstream Portfolio Evaluation & Strategy Expert, Downstream Evaluation & Long-term planning
MOL plc
Objective
Despite the efforts directed toward adopting machine learning algorithms to address upstream challenges, the prospect of leveraging quantum machine learning (QML) to address these challenges has not been widely investigated. Based on the predictions, the advancement of the current computational powers will reach a plateau. As the hardware capacity determines the limit of what a machine can learn, this work highlights the potential of maximizing the exploitation of machine learning by adopting QML in various upstream cases.
Novelty
Quantum machine learning is an evolving field that utilizes quantum computing to perform machine learning algorithms. Employing quantum computing as a processing platform adds an advantage to machine learning due to the nature of quantum computing that employs subatomic particles for computation. This permits capitalizing on the properties of quantum mechanics phenomena such as superposition, entanglement, and quantum interference.
Methodology
This work sheds light on the feasibility of adapting QML technologies in various oil and gas cases that require computational complexity based on the current status of art. Through a comprehensive analysis of the theoretical concept of quantum computing, we provide an overview focusing on quantum machine learning and its various types and implementation. In addition, we highlight the key difference between the proposed approach to utilize quantum analogs with the current machine learning applications employed in the oil and gas industry, analyzing the gained advantage and the burden of adoption.
Results
Many of the proposed quantum algorithms remain a theoretical concept due to the unavailability of a stable large-scale quantum computer capable of accommodating the execution cost. Although the challenge of producing a stable quantum computer persists at the present time, researchers have discovered that several quantum algorithms, including quantum machine learning algorithms, are compatible with noisy intermediate-scale quantum computers (NISQ), which will be available in the near future. The potential for quantum machine learning is not limited to an extended speedup, but it also includes enhancing the generalization of machine learning algorithms, so that a machine is more likely to operate effectively with the new inputs of a given environment. These advanced capabilities can be harnessed to elevate the challenges and complexity of hydrocarbon plays; therefore, we encourage more exploration in this field tailored to the special requirements of the oil and gas industry since the development in the field of quantum machine learning can set forth a new generation of solutions. This will help with resolving long-standing issues beyond the limitation of existing processing power to advance upstream processes toward optimal operational excellence.
Despite the efforts directed toward adopting machine learning algorithms to address upstream challenges, the prospect of leveraging quantum machine learning (QML) to address these challenges has not been widely investigated. Based on the predictions, the advancement of the current computational powers will reach a plateau. As the hardware capacity determines the limit of what a machine can learn, this work highlights the potential of maximizing the exploitation of machine learning by adopting QML in various upstream cases.
Novelty
Quantum machine learning is an evolving field that utilizes quantum computing to perform machine learning algorithms. Employing quantum computing as a processing platform adds an advantage to machine learning due to the nature of quantum computing that employs subatomic particles for computation. This permits capitalizing on the properties of quantum mechanics phenomena such as superposition, entanglement, and quantum interference.
Methodology
This work sheds light on the feasibility of adapting QML technologies in various oil and gas cases that require computational complexity based on the current status of art. Through a comprehensive analysis of the theoretical concept of quantum computing, we provide an overview focusing on quantum machine learning and its various types and implementation. In addition, we highlight the key difference between the proposed approach to utilize quantum analogs with the current machine learning applications employed in the oil and gas industry, analyzing the gained advantage and the burden of adoption.
Results
Many of the proposed quantum algorithms remain a theoretical concept due to the unavailability of a stable large-scale quantum computer capable of accommodating the execution cost. Although the challenge of producing a stable quantum computer persists at the present time, researchers have discovered that several quantum algorithms, including quantum machine learning algorithms, are compatible with noisy intermediate-scale quantum computers (NISQ), which will be available in the near future. The potential for quantum machine learning is not limited to an extended speedup, but it also includes enhancing the generalization of machine learning algorithms, so that a machine is more likely to operate effectively with the new inputs of a given environment. These advanced capabilities can be harnessed to elevate the challenges and complexity of hydrocarbon plays; therefore, we encourage more exploration in this field tailored to the special requirements of the oil and gas industry since the development in the field of quantum machine learning can set forth a new generation of solutions. This will help with resolving long-standing issues beyond the limitation of existing processing power to advance upstream processes toward optimal operational excellence.
Introduction:
Fractures appear as sinusoidal curves on borehole images (BHI’s) in horizontal wells and provide key insights into reservoir performance, they can indicate highly productive zones or highlight potential concerns for waterflood shortcuts. Traditionally interpretations are manual and is inherently time-consuming, laborious, and subject to interpreter bias and variability. Automating this process could result in significant cost savings allowing geological expertise to be focused elsewhere. However, computer vision algorithms face substantial challenges due to complexities in data quality, particularly in a Logging While Drilling (LWD) environment. LWD Images often experience distortions, noise, missing data patches, and artifacts, exacerbated by downhole shocks and vibrations that tend to increase with drilling depth. These imperfections can obscure true fractures or create patterns that mimic the sinusoidal shape of fractures, leading to high rates of false positives in automated detection. We aimed to develop an AI-assisted interpretation method to address many of these challenges.
Method & Applications:
Recognizing the impact of data quality on fracture interpretation reliability, we address the subjective and meticulous task of image quality labeling. We developed a deep learning-based workflow using an Efficient Net CNN, fine-tuned with borehole images classified as 'good', 'fair', and 'poor'. The resulting model achieves an F1 score of 80%, providing consistent and rapid identification of reliable data sections for interpretation whether manual or automated.
This work also presents a dual approach to improve the accuracy and efficiency of fracture analysis from borehole images. First, we define a set of image quality criteria and design corresponding scoring measures. Emphasis is placed on image quality features, as artifacts and missing data can significantly impact detection either by obscuring actual fractures or introducing false ones. These quality features are leveraged to minimize false positives while preserving true fracture detections, initially identified using a pre-existing sinusoid detection algorithm. This filtering process identifies key discriminative features that help distinguish genuine natural fractures from false detections. We managed to reduce false detections by 98% while keeping 94% of true fractures.
In fields developed with ERD wells Borehole Images are a critical element to understand reservoir performance. Human interpretations can be highly subjective, AI-Assisted methods significantly improves the repeatability and consistency required in field wide reservoir modeling and understanding reservoir performance. In addition, AI-Assisted methods significantly reduce interpretation times from days to less than an hour.
Conclusions:
In conclusion, AI-Assisted methods have been used to significantly improve the timelines and robustness of Borehole Image interpretation.
Fractures appear as sinusoidal curves on borehole images (BHI’s) in horizontal wells and provide key insights into reservoir performance, they can indicate highly productive zones or highlight potential concerns for waterflood shortcuts. Traditionally interpretations are manual and is inherently time-consuming, laborious, and subject to interpreter bias and variability. Automating this process could result in significant cost savings allowing geological expertise to be focused elsewhere. However, computer vision algorithms face substantial challenges due to complexities in data quality, particularly in a Logging While Drilling (LWD) environment. LWD Images often experience distortions, noise, missing data patches, and artifacts, exacerbated by downhole shocks and vibrations that tend to increase with drilling depth. These imperfections can obscure true fractures or create patterns that mimic the sinusoidal shape of fractures, leading to high rates of false positives in automated detection. We aimed to develop an AI-assisted interpretation method to address many of these challenges.
Method & Applications:
Recognizing the impact of data quality on the reliability of fracture interpretation, we first address the subjective and meticulous task of image-quality labeling. We developed a deep-learning workflow based on an EfficientNet CNN, fine-tuned on borehole images classified as 'good', 'fair', and 'poor'. The resulting model achieves an F1 score of 80%, providing consistent and rapid identification of reliable data sections for interpretation, whether manual or automated.
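For reference, the macro-averaged F1 metric used to score a multi-class quality classifier like the one above can be computed as follows. This is a generic sketch; the hold-out labels shown are hypothetical and purely for illustration.

```python
def macro_f1(y_true, y_pred, labels):
    """Macro-averaged F1: the unweighted mean of per-class F1 scores."""
    scores = []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        scores.append(f1)
    return sum(scores) / len(scores)

# Hypothetical hold-out labels, for illustration only.
truth = ["good", "good", "fair", "poor", "fair", "poor"]
pred  = ["good", "fair", "fair", "poor", "fair", "good"]
print(round(macro_f1(truth, pred, ["good", "fair", "poor"]), 2))  # prints 0.66
```

Macro averaging treats the three quality classes equally, which matters when 'poor' sections are rare but costly to misclassify.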
This work also presents a dual approach to improving the accuracy and efficiency of fracture analysis from borehole images. First, we define a set of image-quality criteria and design corresponding scoring measures. Emphasis is placed on image-quality features, as artifacts and missing data can significantly impact detection, either by obscuring actual fractures or by introducing false ones. These quality features are leveraged to minimize false positives while preserving true fracture detections, initially identified using a pre-existing sinusoid detection algorithm. This filtering process identifies key discriminative features that help distinguish genuine natural fractures from false detections, reducing false detections by 98% while retaining 94% of true fractures.
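The quality-based filtering step can be sketched as a simple gate over per-detection scores. The feature names and thresholds below are hypothetical placeholders, not the actual criteria developed in this work:

```python
def keep_detection(d, min_coverage=0.7, max_artifact=0.25, min_snr=2.0):
    """Accept a candidate sinusoid only if local image quality makes a
    false positive unlikely. Thresholds are illustrative assumptions."""
    return (d["coverage"] >= min_coverage            # fraction of the trace on valid pixels
            and d["artifact_fraction"] <= max_artifact  # share of the trace crossing artifacts
            and d["amplitude_snr"] >= min_snr)          # trace contrast vs. background

# Hypothetical candidates from a sinusoid detector.
candidates = [
    {"id": 1, "coverage": 0.92, "artifact_fraction": 0.05, "amplitude_snr": 3.1},
    {"id": 2, "coverage": 0.40, "artifact_fraction": 0.60, "amplitude_snr": 1.2},
]
kept = [d["id"] for d in candidates if keep_detection(d)]
print(kept)  # prints [1]
```

In practice such thresholds would be tuned against labeled interpretations to balance false-positive removal against the retention of true fractures.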
In fields developed with extended-reach-drilling (ERD) wells, borehole images are a critical element in understanding reservoir performance. Human interpretations can be highly subjective; AI-assisted methods significantly improve the repeatability and consistency required for field-wide reservoir modeling and for understanding reservoir performance. They also reduce interpretation times from days to less than an hour.
Conclusions:
In conclusion, AI-assisted methods have significantly improved both the turnaround time and the robustness of borehole-image interpretation.
The energy transition is a complex, multifaceted challenge that requires innovative solutions to manage complex energy systems and optimise operations. Digitalisation and AI are key enablers of this transition, providing the tools and frameworks needed to transform the energy industry and meet future challenges. This presentation explores the latest advancements in these areas and discusses how they are reshaping the sector.
One such advancement is the development of SOFFIA-AI (Satellite Observation For Forecasting, Intelligence and Analytics), an advanced platform that integrates satellite data, economic indicators, and artificial intelligence to provide actionable insights for energy market stakeholders. By utilizing H3 hexagonal grids—subcity-level spatial units—SOFFIA-AI bridges the gap between complex data streams and actionable intelligence, enabling precise, data-driven decision-making in downstream energy markets.
The presentation will demonstrate SOFFIA-AI's application in trading and market analytics, focusing on its ability to enhance real-time nowcasting and forecasting of regional energy demand. By leveraging data sources such as methane emissions, nightlight intensity, and Google Trends, the platform provides traders with granular insights into market dynamics, empowering them to anticipate shifts in demand, improve hedging strategies, and make informed trading decisions.
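SOFFIA-AI's actual models are not described here, but the idea of combining proxy signals into a demand nowcast can be sketched minimally. Assuming hypothetical weekly series and weights (which in practice would come from a fitted model), each signal is standardised and then combined:

```python
from statistics import mean, pstdev

def zscores(xs):
    """Standardise a series to zero mean and unit variance."""
    m, s = mean(xs), pstdev(xs)
    return [(x - m) / s if s else 0.0 for x in xs]

def demand_index(signals, weights):
    """Weighted combination of standardised proxy signals per period."""
    cols = {k: zscores(v) for k, v in signals.items()}
    n = len(next(iter(cols.values())))
    return [sum(weights[k] * cols[k][t] for k in cols) for t in range(n)]

# Hypothetical weekly proxy series, for illustration only.
signals = {
    "methane": [1.0, 1.2, 1.5, 1.4],
    "nightlights": [10, 11, 13, 12],
    "search_trends": [50, 55, 70, 65],
}
index = demand_index(signals, {"methane": 0.3, "nightlights": 0.3, "search_trends": 0.4})
```

A composite of this kind is only a baseline; a production nowcast would learn the weights against observed demand and handle differing signal latencies.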
Furthermore, the presentation will explore SOFFIA's innovative use of large language models (LLMs) to extract, summarize, and analyze unstructured data, including market reports, news articles, and regulatory updates. This integration enables seamless access to relevant market information, significantly enhancing decision-making processes. By combining these AI-driven tools with high-resolution geospatial data, SOFFIA provides an unparalleled framework for informed decision-making.
Through case studies in the Middle East, the presentation highlights how SOFFIA's integrated approach reduces uncertainty, enhances decision accuracy, and drives sustainability. For instance, SOFFIA's real-time demand forecasts could enable traders in the Gulf region to optimise their portfolios and improve efficiency. Similarly, its ability to identify regional demand spikes has supported better logistical planning and inventory management for downstream players.
The discussion will also touch upon scalability, emphasizing how SOFFIA's modular architecture allows seamless adaptation to various downstream segments, including chemicals and retail. By focusing on actionable insights derived from sophisticated AI tools, SOFFIA empowers stakeholders across the downstream value chain to make data-driven decisions, leading to both economic and environmental benefits.
Overall, this presentation showcases the transformative potential of digitalisation, AI, and cybersecurity in the energy sector, highlighting the latest advancements and innovations that are driving a sustainable energy transition.
As the world accelerates toward a net-zero future, the question is no longer if we’ll transition to sustainable energy, but how fast and which technologies will lead the way. This keynote explores the innovations transforming energy generation and storage. Built on GetFocus’s unique forecasting approach, this session delivers a data-driven view of what’s coming next and where the real opportunities lie.
The energy sector is undergoing a profound transformation, where the pace of innovation often outstrips decision-making cycles. In this environment, AI-enabled forecasting is rapidly becoming a strategic necessity for R&D and innovation leaders tasked with navigating complex supply chains, decarbonisation demands, and geopolitical uncertainty. This paper presents a unique approach developed by GetFocus, a Rotterdam-based AI company, in collaboration with MIT, to quantify and forecast technological change across energy-relevant domains.
Our proprietary model leverages global patent databases and natural language processing to compute Technology Improvement Rates (TIR), a predictive metric that reveals how fast technologies are advancing before they disrupt markets. With the U.S. Department of Defense as our launch customer, and now adopted by leading firms such as 3M, Hess, BASF, and Caterpillar, this framework is used to guide R&D strategy, reshape supply chain design, and accelerate sustainable product innovation.
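The actual TIR computation is proprietary. As a simplified, hypothetical stand-in, one common way to quantify how fast a technology domain is moving is the compound annual growth rate from a log-linear fit to yearly patent counts:

```python
import math

def annual_growth_rate(years, counts):
    """Fit log(count) = a + b*year by least squares and return the
    implied compound annual growth rate, exp(b) - 1."""
    logs = [math.log(c) for c in counts]
    n = len(years)
    my, ml = sum(years) / n, sum(logs) / n
    b = (sum((y - my) * (l - ml) for y, l in zip(years, logs))
         / sum((y - my) ** 2 for y in years))
    return math.exp(b) - 1

# Hypothetical patent-filing counts for one technology domain.
years = [2018, 2019, 2020, 2021, 2022]
counts = [120, 150, 190, 240, 300]
rate = annual_growth_rate(years, counts)
print(f"{rate:.1%} per year")  # prints "25.9% per year"
```

This is only a count-based proxy; the TIR described above combines patent text features via NLP rather than raw filing volume.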
We share case examples from chemical process innovation (e.g., Direct Lithium Extraction), materials R&D (e.g., composites in EV applications), and consumer product sustainability (e.g., plastics-free packaging) that illustrate how predictive technology intelligence can inform investment and resource allocation decisions with a forward-looking lens. The model’s predictive power has helped reduce costly misalignments between R&D priorities and market reality, while also surfacing early opportunities in emerging domains like green hydrogen catalysts, solid-state batteries, and carbon-to-value platforms.
This session will explore the broader implications of embedding AI into strategic energy decisions, from identifying innovation white spaces to building early-mover advantage in volatile, high-tech segments of the energy economy. We also outline how cybersecurity, data governance, and system transparency are integrated into our platform to ensure trustworthy, audit-ready decision support for energy and industrial leaders.


