TECHNICAL PROGRAMME | Energy Technologies – Future Pathways
Smart Infrastructure for the Future Energy Industry: Digitalisation & Innovation
Forum 18 | Digital Poster Plaza 4
27 April | 15:30–17:30 UTC+3
As the energy industry evolves to meet the demands of a sustainable future, smart infrastructure is playing a crucial role in transforming the sector. This session will explore the cutting-edge technologies and strategies that are enabling smarter, more resilient, and adaptive energy systems. It will cover the latest developments in smart grids, intelligent energy management systems, IoT applications, AI-driven analytics, and the role of big data in optimising energy infrastructure. The session will bring together experts to discuss the challenges, opportunities, and future trends in smart energy infrastructure.
The Caspian Sea, positioned at the crossroads of Europe and Asia, stands as more than a hydrocarbon-rich basin; it serves as a strategic testing ground for advancing sustainable energy pathways. Amid global imperatives to guarantee energy security while accelerating the low-carbon transition, the Caspian Basin emerges as a pivotal hub where diversification, technological innovation, and environmental responsibility converge. This paper argues that offshore development in the region, enabled by advanced technologies and multilateral cooperation, offers actionable lessons for shaping resilient global energy systems.
Drawing on comparative case studies from leading offshore projects—including Kashagan, Azeri–Chirag–Gunashli, and new developments in Iranian waters—this study proposes a framework for sustainable offshore exploration. The framework highlights the integration of AI-enabled monitoring, full-scale digitalization, carbon capture readiness, and stringent environmental safeguards as essential pillars of next-generation offshore operations. Pilot assessments and simulation-based analyses suggest that such integrated approaches can potentially reduce methane emissions by up to 15% while lowering operating costs by nearly 10%, demonstrating that economic competitiveness and environmental stewardship can reinforce each other.
The Caspian’s distinctive conditions—its complex geology, semi-enclosed ecosystem, and multi-state governance—make it an indispensable arena for piloting energy transition strategies. Insights from the region emphasize the value of collaborative governance, technology-driven resilience, and regional capacity-building in sustaining market stability and ecological integrity. Moreover, the Caspian experience provides a replicable blueprint for other offshore provinces worldwide striving to align hydrocarbon development with sustainability and diversification.
The study concludes with policy-oriented recommendations that resonate with global net-zero ambitions and highlight synergies with Gulf Cooperation Council strategies, particularly in relation to hydrogen development, carbon management, and renewable energy integration under Vision 2030 frameworks.
Accurate flow rate estimation is critical for optimising gas production and enabling proactive reservoir management. At QatarEnergy LNG, conventional metering technologies such as wet gas meters, multiphase flow meters (MPFMs), and test separators are commonly used for surveillance and for allocating the rates of the separate phases. However, these systems have limitations in cost, scalability, and operational reliability, particularly under the dynamic flow conditions of wet gas wells with varying production characteristics. Test separators, though widely regarded as the reference method, are often constrained by infrastructure availability, limited test frequency, and operational accessibility, reducing their effectiveness in capturing timely gas, water, and condensate rates. This paper presents the deployment and evaluation of Virtual Flow Metering (VFM) as a scalable, non-intrusive alternative to traditional metering systems. By combining real-time surface data with physics-based and data-driven models, VFM offers continuous estimation of gas, condensate, and water rates at the wellhead. While test separators remain essential for initial model calibration and periodic validation, VFMs can significantly reduce the need for frequent physical testing and improve surveillance insights across the asset. Benchmarking results from QatarEnergy LNG field applications show that VFMs deliver accurate and reliable estimates across wells with variable and transient flow behaviour. The findings support VFMs as a key enabler of digital well surveillance, offering a cost-effective and operationally efficient tool for production optimisation. Looking ahead, integrating artificial intelligence (AI) into VFM systems presents a promising opportunity to further enhance adaptability, accuracy, and predictive capability.
This evolution positions VFMs as a foundational element in the future of autonomous production monitoring and reservoir management, contributing to QatarEnergy LNG's broader objectives in the digital transformation of subsurface operations.
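To make the idea concrete, the sketch below illustrates the data-driven half of a VFM: a linear surrogate is calibrated against synthetic "test separator" measurements and then estimates gas rate continuously from live surface data. The linear form and every number are illustrative assumptions, not QatarEnergy LNG's models.

```python
import numpy as np

# Sketch of the data-driven half of a VFM: calibrate a surrogate against
# separator reference data, then estimate rates from surface sensors only.
rng = np.random.default_rng(0)
n = 200
whp = rng.uniform(60, 120, n)      # wellhead pressure, bar
wht = rng.uniform(40, 80, n)       # wellhead temperature, degC
choke = rng.uniform(20, 100, n)    # choke opening, %
# Hypothetical true relation plus separator measurement noise.
gas_rate = 0.8 * whp + 0.1 * wht + 1.5 * choke + rng.normal(0, 2.0, n)

# Calibrate against the separator reference (least squares).
X = np.column_stack([whp, wht, choke, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, gas_rate, rcond=None)

def estimate_gas_rate(whp, wht, choke):
    """Continuous estimate from surface data; no physical well test needed."""
    return coef[0] * whp + coef[1] * wht + coef[2] * choke + coef[3]

print(round(estimate_gas_rate(90.0, 60.0, 50.0), 1))
```

In practice the surrogate would be periodically re-validated against separator tests, as the abstract notes, and the linear form would be replaced by a physics-informed or machine-learned model.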
Digital twin technology is crucial for refining enterprises aiming for operational excellence, yet traditional mechanistic and data-driven models struggle to capture the complexities of refining processes. A dual-driven digital twin that integrates data with theoretical frameworks is essential for balancing interpretability, accuracy, and adaptability: it embeds physicochemical constraints into machine learning architectures so that predictions adhere to conservation laws while dynamically learning unmodeled phenomena. Despite its theoretical promise, implementing a dual-driven digital twin in refining systems faces three critical challenges. The first is high-quality dataset engineering: diverse data sources introduce missing values and temporal misalignment, necessitating robust preprocessing techniques. The second is real-time high-fidelity modeling, which requires multi-scale approaches that integrate molecular-level kinetics and fluid dynamics while using reduced-order models and neural network pruning for efficient inference. The third is online optimization and decision-making, complicated by high-dimensional, non-convex objectives and necessitating a closed-loop “sense-decide-act” framework in which reinforcement learning agents dynamically adjust operating parameters while maintaining safety margins.
This study addresses these challenges through a novel methodology combining domain knowledge embedding and holistic optimization. First, high-quality datasets are developed through expert- and theory-guided protocols, including outlier detection and imputation constrained by first-law (conservation) principles, K-means clustering combined with domain expertise to define parameter correlations, and automated cloud simulation platforms that generate physics-compliant samples. Second, a unified computational architecture integrates multi-scale constraints, such as molecular-scale reaction kinetics and equipment-scale hydrodynamics, into loss functions using Lagrange multipliers; continuous online learning adapts model parameters to real-time sensor data, achieving prediction errors below 2% for product properties. Third, a holistic gradient-descent optimization algorithm synchronizes operational variables, including distillation cut points and pump frequencies, reducing optimization cycles from hours to minutes. Implemented in a 10-million-ton-per-year atmospheric-vacuum distillation unit, the framework enabled closed-loop operation, yielding a 1.8% increase in light oil yield and annual economic benefits exceeding CNY 25 million. This work illustrates that the fusion of theory-constrained data and cross-scale modeling is essential for the advancement of next-generation intelligent refineries.
Key words: Digital twin, data and theory, high-quality datasets, refinery optimization.
Co-author/s:
JiaHua Zhang, Dalian West Pacific Petrochemical Co., Ltd.
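The general idea of embedding a conservation constraint into a learning loss, as described in the abstract above, can be sketched with a toy penalty-based fit. The data, the linear model, and the fixed constraint weight below are all invented for illustration; they stand in for the paper's Lagrange-multiplier treatment rather than reproduce it.

```python
import numpy as np

# Toy "dual-driven" fit: two product yield fractions are predicted from one
# operating variable, with a mass-balance constraint (yields sum to 1) added
# to the data loss as a fixed-weight penalty.
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 100)                   # operating variable
y1 = 0.3 + 0.4 * x + rng.normal(0, 0.01, 100)    # light-product yield
y2 = 1.0 - y1 + rng.normal(0, 0.01, 100)         # heavy-product yield

params = np.zeros(4)   # [a1, b1, a2, b2] for y_i ~ a_i * x + b_i
lam = 10.0             # constraint weight (fixed multiplier for the sketch)
lr = 0.02

for _ in range(5000):
    a1, b1, a2, b2 = params
    r1 = a1 * x + b1 - y1                        # data residuals
    r2 = a2 * x + b2 - y2
    c = (a1 + a2) * x + (b1 + b2) - 1.0          # mass-balance violation
    # Gradients of mean(r1^2) + mean(r2^2) + lam * mean(c^2)
    grad = np.array([
        2 * np.mean((r1 + lam * c) * x),
        2 * np.mean(r1 + lam * c),
        2 * np.mean((r2 + lam * c) * x),
        2 * np.mean(r2 + lam * c),
    ])
    params -= lr * grad

a1, b1, a2, b2 = params
# The constrained fit keeps predicted yields close to mass balance:
print(round(a1 + a2, 3), round(b1 + b2, 3))
```

A full Lagrangian treatment would update the multiplier as well; the fixed penalty weight here is the simplest variant of the same idea.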
There are more than 200,000 pumping-unit wells in CNPC, and accurate, reliable downhole data are essential for oil production. Acoustic liquid-level measurement is not precise enough, while cable-based methods are too expensive. The question was how to acquire accurate downhole data and transmit it to the surface at low cost. This paper introduces the problems encountered and the solutions developed during the pilot test of the WWCTP in a high-water-cut oil field.
WWCTP is a technology for monitoring the annulus pressure of pumping-unit wells and transmitting it to the surface wirelessly. Downhole pressure data are acquired by a pressure sensor and transmitted to the surface through the sucker rod; several typical wells are presented in this paper. In one well (1,191 m deep, 57 mm pump diameter, 5.2 strokes/min, 98.4% water cut), the WWCTP worked stably for 4 months. In another well with almost the same conditions (1,212 m deep, 57 mm pump, 6 strokes/min, 96.1% water cut), the signal was too weak to recognize. In total, the system was deployed in more than 30 wells. We found that many factors, such as pump size and well deliverability, could affect the data. Extensive optimization was carried out, and stable wellbore wireless communication was finally achieved.
The pilot test shows that this technique has the advantages of a simple structure, reliable transmission, low cost, and freedom from periodic pump inspection. The technology is capable of wireless uploading from wells 1,000 m deep and of capturing accurate downhole pressure variations; it also provides a reference for optimizing the production regime of low-production, low-efficiency, high-fluid-level wells.
This technology represents a new approach to wellbore data acquisition. Its integrated cost is expected to be more than 50% lower than that of wired data transmission, giving it broad application prospects.
Co-author/s:
Chenglong Liao, Engineer, Research Institute of Petroleum Exploration & Development, PetroChina.
Objective/Scope:
As the global energy transition accelerates, closed-loop geothermal systems are emerging as a promising source of sustainable baseload energy. This study introduces a robust geothermal screening model designed to evaluate the technical and economic viability of closed-loop systems at an early stage, enabling rapid assessment across diverse geological and operational conditions.
Methods, Procedures, Process:
The model integrates reservoir and wellbore thermal dynamics—such as mass flow rate, thermal conductivity, diffusivity, and temperature gradients—with cost components including drilling, operations, and maintenance. Leveraging AI-based simulation techniques, the screening framework estimates heat generation potential and associated lifecycle costs. The model is structured to rapidly process multiple scenarios, incorporating sensitivity to key subsurface and design parameters. This allows for the identification of high-potential configurations and regions suitable for further detailed analysis or field deployment.
Results, Observations, Conclusions:
The screening model effectively distinguishes between viable and non-viable system designs based on performance and economic indicators such as net present value (NPV), internal rate of return (IRR), and payback period. Results indicate that geofluid flow rates, temperature gradients, and drilling costs are the most influential parameters affecting system feasibility. The model provides a structured, data-driven approach to prioritize projects and guide decision-making before committing to more resource-intensive optimization or field development.
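The three economic indicators named above can be sketched as a minimal screening pass for a single closed-loop design. The cashflow figures below are invented placeholders, not values from the study.

```python
import numpy as np

# Screening-pass sketch for one closed-loop geothermal design.
def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
    # Bisection on npv(rate) = 0; assumes the usual capex-then-revenue profile.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(cashflows):
    # First year in which cumulative cashflow turns non-negative.
    cum = np.cumsum(cashflows)
    idx = int(np.argmax(cum >= 0))
    return idx if cum[idx] >= 0 else None

# Year 0: drilling capex; years 1-25: heat revenue minus O&M (flat).
cashflows = [-40e6] + [4.0e6] * 25
print(round(npv(0.08, cashflows) / 1e6, 2))   # NPV at 8% discount, MUSD
print(round(irr(cashflows) * 100, 2))         # IRR, %
print(payback_period(cashflows))              # payback year
```

A screening model would evaluate many such cashflow scenarios, each derived from the thermal and cost parameters listed in the methods section, and rank designs by these indicators.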
Novel/Additive Information:
This geothermal screening model offers a scalable and flexible tool for industry stakeholders to evaluate closed-loop geothermal systems in a wide range of settings. By integrating technical performance with economic considerations early in the assessment phase, the model supports faster, more informed decisions that can accelerate the adoption of geothermal energy within the global sustainable energy mix.
Co-author/s:
Ali Alshuwaikhat, Research Engineer, Saudi Aramco.
The goal of the project was to replace manual, interval-by-interval calculations for new well operations with a machine learning model that recommends the best candidate wells at each company site for each new date.
The models were tested on data from multiple fields with significantly different geological properties. Data on more than 3,000 oil well operation events were collected and aggregated, of which 1,739 were BZT (bottomhole zone treatment, in particular acid treatment) and the rest were measures to intensify oil production (IOP) by reducing bottomhole pressure. For each operation, historical production and injection data, the parameters of the measures taken, and reservoir characteristics were collected and prepared. The target feature (additional production) was cleaned of outliers using the properties of the quantile distribution. Unlike methods based on the normal distribution, this approach is not sensitive to the absolute values and scale of outliers, which allows anomalies to be eliminated more effectively.
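The quantile-based cleaning step can be sketched as follows. The 1.5x interquartile-range rule used here is one common choice; the exact quantile criterion used in the study is not specified.

```python
import numpy as np

# Quantile (IQR) outlier cleaning: the bounds depend only on quantiles,
# so the absolute magnitude of the outliers is irrelevant.
def iqr_clean(values, k=1.5):
    q1, q3 = np.percentile(values, [25, 75])
    lo, hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
    return values[(values >= lo) & (values <= hi)]

rng = np.random.default_rng(0)
extra_production = rng.normal(50.0, 10.0, 1000)          # target feature, t/day
extra_production[:5] = [1e6, -1e6, 5e5, 900.0, -400.0]   # gross outliers of any scale
clean = iqr_clean(extra_production)
print(len(clean), round(float(clean.mean()), 1))
```

A mean-and-standard-deviation filter would be dragged far off by the 1e6-scale values above; the quantile bounds are unaffected by them.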
Another key task was accounting for the mutual influence of wells. Traditional correlation methods required significant computing resources to process the full well histories, so an original algorithm based on the fast Fourier transform (FFT) was developed to calculate the optimal radius of influence of production and injection wells and to estimate the correlation between them. All calculations were moved to graphics processing units (GPUs), accelerating them by a factor of 10–20.
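A minimal FFT-based lagged correlation between an injector and a delayed producer response, in the spirit of the algorithm described above. The actual radius-of-influence computation is not detailed in the abstract; the series and the 30-step delay below are invented.

```python
import numpy as np

# FFT-based circular cross-correlation; peak index = lag of a behind b.
def xcorr_fft(a, b):
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    nfft = 1 << (len(a) + len(b) - 1).bit_length()   # next power of two
    return np.fft.irfft(np.fft.rfft(a, nfft) * np.conj(np.fft.rfft(b, nfft)), nfft)

rng = np.random.default_rng(2)
injection = rng.normal(0.0, 1.0, 512)
# Producer reacts 30 time steps after the injector, plus noise.
production = np.roll(injection, 30) + rng.normal(0.0, 0.3, 512)

corr = xcorr_fft(production, injection)
lag = int(np.argmax(corr[:64]))      # search positive lags only
print(lag, round(float(corr[lag]), 2))
```

The FFT reduces the cost of correlating long histories from O(n²) to O(n log n) per pair, which is what makes the GPU batch computation over many well pairs tractable.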
Various machine learning methods were tested to predict the success of well interventions (GTM): gradient boosting (CatBoost) for regression and classification tasks, as well as ranking models (Learning to Rank).
It was shown that directly predicting the value of additional production (regression) results in high error (RMSE, MAE), owing to the complexity of the process physics and the limited data available. A classification model predicting the probability of intervention success demonstrates acceptable accuracy (~0.7–0.8) but has a significant spread in its predictions. The best results are achieved with a recommendation ranking model, which does not predict exact values but ranks wells in descending order of expected relative efficiency (additional production at the field). This approach allows a limited number of the most promising intervention candidates to be selected. The recommendation model is distinguished by its ability to compare wells with each other: its loss function takes into account the rank of each well (Learning to Rank).
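A minimal pairwise learning-to-rank sketch in plain numpy, standing in for the CatBoost ranking model described above. The features, the linear scorer, and the hidden weights are invented for illustration; the point is that only relative ordering is learned, not exact production values.

```python
import numpy as np

# Pairwise learning-to-rank: learn to order candidate wells by expected effect.
rng = np.random.default_rng(3)
n, d = 300, 4
X = rng.normal(0.0, 1.0, (n, d))                 # per-well candidate features
true_w = np.array([1.0, -0.5, 0.0, 2.0])         # hidden "physics" (assumed)
effect = X @ true_w + rng.normal(0.0, 0.5, n)    # additional production

w = np.zeros(d)
lr = 0.1
for _ in range(200):
    i = rng.integers(0, n, 64)                   # random pairs of wells
    j = rng.integers(0, n, 64)
    sign = np.sign(effect[i] - effect[j])        # which well of each pair is better
    diff = X[i] - X[j]
    margin = diff @ w
    # Gradient of the pairwise logistic loss log(1 + exp(-sign * margin))
    grad = (-sign / (1.0 + np.exp(sign * margin)))[:, None] * diff
    w -= lr * grad.mean(axis=0)

# Rank wells by learned score; compare the top-30 picks against the truth.
overlap = len(set(np.argsort(X @ w)[-30:]) & set(np.argsort(effect)[-30:]))
print(overlap)
```

Production systems would use a gradient-boosted ranker (e.g. CatBoost's ranking objectives) rather than a linear scorer, but the pairwise loss structure is the same.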
Objectives:
- formulate a specific task to be solved and collect the necessary data from databases;
- conduct data analysis and train machine learning models to predict well operation success;
- develop an algorithm for accounting for the influence of neighboring wells on the target well;
- automate data collection and model forecasting.
As the global energy sector accelerates toward a low-carbon future, operators in carbon-intensive regions face the dual imperative of maximizing efficiency while minimizing environmental impact. This study presents a transformative framework that integrates digitalization with advanced nanomaterials to drive sustainable, high-performance oil and gas operations in the Middle East.
The methodology leverages IoT-enabled sensors, AI-driven predictive analytics, and graphene-based nanofluids with nanoceramic coatings, enabling real-time monitoring, predictive maintenance, and operational optimization of high-pressure/high-temperature (HPHT) offshore wells. Simulations under conditions representative of Arabian Gulf reservoirs validate the framework’s effectiveness, scalability, and operational relevance.
Results demonstrate substantial improvements: non-productive time decreased by 18% through predictive maintenance, fuel consumption dropped by 12% via operational optimization, and methane emissions fell by 15%, equating to approximately $2.1 million in annual savings per platform. Nanoparticle-enhanced drilling fluids increased drilling efficiency by 22% while reducing environmental impact. Digital twin technology provided real-time decision support, lowering operational risks by 30% and water consumption by 25% compared with conventional practices.
This integrated approach accelerates deployment by 40% in analogous formations and offers potential cost reductions of up to 50% when implemented across multiple assets. By merging cutting-edge digital tools with nanotechnology, the framework delivers measurable environmental and economic benefits while supporting operational resilience, safety, and regulatory compliance.
Beyond immediate operational gains, this study provides a replicable pathway for energy operators seeking to align hydrocarbon production with climate commitments. The combination of predictive analytics, digital twins, and nanomaterial innovations not only enhances efficiency but also enables significant reductions in carbon footprint and resource consumption. This dual benefit supports the broader global energy transition by advancing cleaner, more sustainable upstream operations.
These findings offer actionable insights for Middle Eastern operators and other carbon-intensive regions, demonstrating that strategic integration of digital technologies and nanomaterials can transform traditional oil and gas operations into sustainable, high-performing, and economically viable systems. The framework provides a model for achieving affordable, reliable, and clean energy production while meeting environmental stewardship goals and regional sustainability priorities.
In recent years, manufacturing industries in Japan have faced a serious decline in the working population due to a falling birthrate and an aging society, as well as rising employee turnover rates. There are concerns about a shortage of experienced personnel in the oil refining and petrochemical industries, and about an increase in problems caused by aging equipment. Prompt action is required to address these concerns.
Digital technologies have been evolving rapidly and continuously worldwide, and they can offer promising solutions to these concerns.
Currently, plant operations rely on round-the-clock monitoring and decision-making by operators. To cope with the decline in experienced operators, our company has developed AI systems that enable advanced operation of a plant while ensuring its stability and safety and preventing problems caused by aging equipment. Our goals also include improving production efficiency and reducing energy consumption in the plant beyond what experienced operators can achieve.
Since 2019, we have collaborated with Preferred Networks, a leading deep-learning company in Japan. As a result of this collaboration, we have succeeded in introducing the AI systems to a Crude Distillation Unit (CDU) and a butadiene extraction unit in our plants, and have achieved their continuous use under various operating conditions of the units.
Our AI, built on deep neural networks, has learned the intricate relationships between plant sensor values and environmental variables such as ambient temperature and precipitation, enabling it to predict future sensor values and select optimal operations as an alternative to human intervention. Unlike conventional advanced control systems, our AI excels at handling the non-linear dynamics between manipulated and controlled variables, allowing flexible adaptation to changing operating conditions.
The CDU in particular demands a high level of skill and experience, with many operational factors to control and many sensors to monitor. The world’s first AI-based continuous autonomous operation of a CDU has been achieved. The AI system for the unit continuously monitors dozens of key operational factors and simultaneously adjusts dozens of valves to stabilize fluctuations resulting from crude oil switching as well as changes in crude oil throughput. The AI system has demonstrated higher stability and efficiency compared with manual operation. Moving forward, we will consider deploying the co-developed AI systems to our other refineries. Moreover, we aim to refine our AI models and investigate their applicability in other industries, expanding our reach beyond our company’s core operations.
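The predict-then-act loop described above can be sketched in miniature. This is an illustrative stand-in only: a simple linear model replaces the deep neural network, and all variable names, coefficients, and setpoints are invented for the example, not taken from the deployed system.

```python
# Hypothetical sketch of the predict-then-act loop: forecast a sensor
# value from current readings plus environmental inputs, then choose a
# bounded valve move toward the setpoint. All numbers are illustrative.

def predict_next(sensors, weather, weights, bias):
    """One-step-ahead prediction of a sensor value from current sensor
    readings and environmental variables (e.g. ambient temperature)."""
    features = sensors + weather
    return sum(w * x for w, x in zip(weights, features)) + bias

def choose_valve_move(predicted, setpoint, gain=0.5, max_step=1.0):
    """Pick a bounded valve adjustment that drives the predicted value
    toward its setpoint (a crude stand-in for the AI's learned policy)."""
    step = gain * (setpoint - predicted)
    return max(-max_step, min(max_step, step))

# Example: predicted column temperature sits below setpoint, so the
# policy opens the (hypothetical) reboiler valve slightly.
pred = predict_next([350.0, 2.1], [25.0], weights=[1.0, 3.0, 0.02], bias=-6.0)
move = choose_valve_move(pred, setpoint=352.0)
```

In the real system a deep network learns both the prediction and the action jointly; the point of the sketch is only the closed loop of predicting future sensor values and acting on the gap to the target.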
With the ever-growing demand for electricity, large-scale renewable energy integration, and the need to strengthen grid resiliency, utilities are placing significant emphasis on expanding transmission infrastructure. These efforts involve developing modern transmission corridors, high-voltage direct current (HVDC) lines, digital substations, advanced monitoring systems, and grid management technologies to enable efficient long-distance power transfer and better utilization of existing assets. However, the addition of new transmission infrastructure requires long-term planning, regulatory approvals, environmental assessments, complex engineering, substantial capital investment, and extensive coordination among multiple stakeholders. As a result, transmission planning and expansion are often protracted and time-consuming, even as renewable integration and demand growth progress at a much faster pace. This mismatch increases stress on transmission corridors and leads to congestion challenges. For example, in the United Kingdom, renewable energy curtailment costs have reached nearly £1 billion annually because 3.8 million MWh of renewable generation had to be curtailed due to transmission congestion, despite being available for generation and consumption.
According to literature, renewable energy additions will continue at high levels over the next two decades. In fact, in some countries such as the United States, many renewable energy projects remain stuck in grid interconnection queues, awaiting transmission access. Since transmission infrastructure additions take much longer to materialize compared to renewable deployment, innovative solutions are required to maximize the utilization of existing networks, reduce congestion, and minimize curtailment.
In this paper, we propose a novel technology called Dynamic System Rating (DSR), which extends beyond conventional Dynamic Line Rating (DLR). While DLR considers thermal limits as the primary constraint on current flows, these limits are not the only factor restricting capacity. Consequently, DLR data alone often leaves a significant portion of spare capacity underutilized. DSR takes a holistic approach by evaluating not only thermal ratings but also voltage and angular stability margins to determine the true available capacity of each transmission line. This comprehensive assessment ensures that increasing line capacity does not compromise other system stability factors.
Through a case study, we demonstrate how DSR technology enhances network capacity, reduces renewable energy curtailment, and improves overall electricity network utilization, thereby delivering tangible benefits to consumers. By autonomously optimizing power flows, DSR increases effective grid capacity and supports renewable integration, serving as a practical bridge while new transmission corridors are being developed.
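The core DSR idea, that a line's usable capacity is the minimum across thermal, voltage, and angular-stability limits rather than the thermal limit alone, can be sketched as follows. The weather-uplift formula and all numbers are illustrative placeholders, not the paper's model.

```python
# Illustrative sketch of Dynamic System Rating (DSR): the binding
# constraint on a line may be voltage or angular stability, not heat,
# so the usable rating is the minimum of all three margins.

def dynamic_line_rating(ambient_c, wind_ms, static_mva=500.0):
    """Thermal rating uplift: cooler, windier weather allows more current.
    A toy placeholder for a proper conductor thermal model (IEEE 738)."""
    uplift = 1.0 + 0.02 * max(0.0, 20.0 - ambient_c) + 0.03 * min(wind_ms, 10.0)
    return static_mva * uplift

def dynamic_system_rating(thermal_mva, voltage_limit_mva, angular_limit_mva):
    """DSR: take the minimum across thermal, voltage, and angular limits."""
    return min(thermal_mva, voltage_limit_mva, angular_limit_mva)

thermal = dynamic_line_rating(ambient_c=10.0, wind_ms=5.0)  # thermal uplift
usable = dynamic_system_rating(thermal,
                               voltage_limit_mva=620.0,
                               angular_limit_mva=650.0)     # voltage binds here
```

The example shows why DLR alone can mislead: the weather uplift raises the thermal rating well above 620 MVA, but the (hypothetical) voltage margin caps what the line can actually carry.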
The digitalization of energy systems requires resilient, efficient, and sustainable infrastructure. A persistent challenge in the petroleum sector is the monetization of stranded and associated petroleum gas (APG), which is frequently flared due to the high cost of conventional transport or limited market access. This paper explores an integrated solution: converting APG into electricity via modular gas engines and applying this electricity to hydro-cooled, containerized data centers designed for high-performance computing (HPC), artificial intelligence (AI), and blockchain workloads.
By colocating power generation and digital infrastructure at oil and gas fields, operators reduce flaring, avoid costly midstream investments, and create a new form of globally exportable value: computing capacity. Economic modelling indicates project payback periods of 18–36 months, depending on local energy costs and computing market demand. Hydro cooling reduces operating expenses and achieves power usage effectiveness (PUE) close to 1.05, significantly outperforming conventional cooling methods in hot climates.
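The two headline figures in this paragraph, PUE and payback period, follow from simple definitional arithmetic, sketched below. The capex, revenue, and power inputs are invented for illustration, not project data.

```python
# Back-of-envelope sketch of the quoted economics. Input figures are
# illustrative assumptions chosen to land inside the stated ranges.

def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_load_kw

def payback_months(capex_usd, monthly_revenue_usd, monthly_opex_usd):
    """Simple (undiscounted) payback period in months."""
    net = monthly_revenue_usd - monthly_opex_usd
    if net <= 0:
        raise ValueError("project never pays back")
    return capex_usd / net

# Hydro cooling adds only ~5% overhead on top of the IT load:
site_pue = pue(total_facility_kw=1050.0, it_load_kw=1000.0)
# Hypothetical containerized site: $3M capex, $120k/month net margin:
months = payback_months(capex_usd=3_000_000,
                        monthly_revenue_usd=160_000,
                        monthly_opex_usd=40_000)
```

With these assumed inputs the sketch yields a PUE of 1.05 and a 25-month payback, consistent with the 18-36 month window cited above.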
An essential feature of this approach is the reuse of waste heat. In colder regions, thermal energy can be redirected to district heating, agriculture, greenhouses, or aquaculture. In hotter climates, such as the Middle East, hydro cooling enables heat-to-cold conversion through absorption chillers, producing chilled water for industrial processes, greenhouses, district cooling, or desalination. This dual-use capability aligns with circular economy principles and maximizes energy efficiency.
Global examples, such as the Green Data City in Oman, illustrate the viability of coupling energy assets with digital infrastructure. The paper argues that future leaders in the energy industry will be those who provide not only hydrocarbons but also sustainable power for digital infrastructure. For Saudi Arabia, this approach aligns directly with Vision 2030 objectives by reducing flaring under the Circular Carbon Economy framework, diversifying the economy into AI and digital services, and positioning the Kingdom as a regional hub for data-driven innovation.
Arseniy Kirichenko
Chair
Consultant, M&A, Valuation, Business development, Energy, Oil and Gas projects
Accurate flow rate estimation is critical for optimising gas production and enabling proactive reservoir management. At QatarEnergy LNG, conventional metering technologies such as wet gas meters, multiphase flow meters (MPFMs), and test separators are commonly used for surveillance and for allocating the rates of the separate phases. However, these systems present limitations in cost, scalability, and operational reliability, specifically under the dynamic flow conditions of wet gas wells with varying production characteristics. Test separators, though widely considered the reference method, are often constrained by infrastructure availability, limited test frequency, and operational accessibility, reducing their effectiveness in capturing timely gas, water, and condensate rates. This paper presents the deployment and evaluation of Virtual Flow Metering (VFM) as a scalable, nonintrusive alternative to traditional metering systems. By combining real-time surface data with physics-based and data-driven models, VFM offers continuous estimation of gas, condensate, and water rates at the wellheads. While test separators remain essential for initial model calibration and periodic validation, VFMs can significantly reduce the need for frequent physical testing and improve surveillance insights across the asset. Benchmarking results from QatarEnergy LNG field applications show that VFMs can deliver accurate and reliable estimates across wells with variable and transient flow behaviour. The findings support VFMs as a key enabler of digital well surveillance, offering a cost-effective and operationally efficient tool for production optimisation. Looking ahead, integrating artificial intelligence (AI) into VFM systems presents a promising opportunity to further enhance adaptability, accuracy, and predictive capabilities.
This evolution positions VFMs as a foundational element in the future of autonomous production monitoring and reservoir management, contributing to QatarEnergy LNG's broader objectives in the digital transformation of subsurface operations.
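A hybrid VFM of the kind described, a physics-based term corrected by a data-driven residual that separator tests keep calibrated, can be illustrated minimally. The orifice-style relation, coefficients, and test points below are assumptions for the sketch, not QatarEnergy LNG model details.

```python
# Hypothetical hybrid virtual flow meter: a simplified physics term
# (rate ~ C * sqrt(dP) across the choke) plus a bias learned from
# periodic test-separator measurements. All values are illustrative.
import math

def physics_rate(upstream_bar, downstream_bar, choke_coeff=120.0):
    """Simplified orifice-style relation between pressure drop and rate."""
    dp = max(0.0, upstream_bar - downstream_bar)
    return choke_coeff * math.sqrt(dp)

def calibrate_bias(test_points):
    """Average residual against separator tests; this is the data-driven
    correction that periodic separator testing keeps up to date."""
    residuals = [measured - physics_rate(up, dn)
                 for up, dn, measured in test_points]
    return sum(residuals) / len(residuals)

def vfm_rate(upstream_bar, downstream_bar, bias):
    """Continuous estimate: physics term plus calibrated correction."""
    return physics_rate(upstream_bar, downstream_bar) + bias

# Two (invented) separator tests: (upstream, downstream, measured rate)
tests = [(60.0, 35.0, 615.0), (55.0, 30.0, 612.0)]
bias = calibrate_bias(tests)
rate = vfm_rate(58.0, 33.0, bias)
```

Production VFMs use far richer physics and machine-learned residual models, but the calibrate-then-estimate split shown here is the reason separator tests remain necessary only for periodic validation rather than routine surveillance.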
Objective/Scope:
As the global energy transition accelerates, closed-loop geothermal systems are emerging as a promising source of sustainable baseload energy. This study introduces a robust geothermal screening model designed to evaluate the technical and economic viability of closed-loop systems at an early stage, enabling rapid assessment across diverse geological and operational conditions.
Methods, Procedures, Process:
The model integrates reservoir and wellbore thermal dynamics—such as mass flow rate, thermal conductivity, diffusivity, and temperature gradients—with cost components including drilling, operations, and maintenance. Leveraging AI-based simulation techniques, the screening framework estimates heat generation potential and associated lifecycle costs. The model is structured to rapidly process multiple scenarios, incorporating sensitivity to key subsurface and design parameters. This allows for the identification of high-potential configurations and regions suitable for further detailed analysis or field deployment.
Results, Observations, Conclusions:
The screening model effectively distinguishes between viable and non-viable system designs based on performance and economic indicators such as net present value (NPV), internal rate of return (IRR), and payback period. Results indicate that geofluid flow rates, temperature gradients, and drilling costs are the most influential parameters affecting system feasibility. The model provides a structured, data-driven approach to prioritize projects and guide decision-making before committing to more resource-intensive optimization or field development.
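The screening step on these economic indicators can be sketched with standard definitions of NPV and simple payback. The candidate cash flows, discount rate, and thresholds below are invented for illustration, not the model's calibrated values.

```python
# Minimal sketch of the economic screening step: compute NPV and simple
# payback for a candidate closed-loop design, then flag viability.

def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (negative) upfront cost."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def payback_years(cashflows):
    """First year at which cumulative cash flow turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None  # never pays back within the horizon

def screen(cashflows, discount_rate=0.08, max_payback=15):
    """Viable if NPV is positive and payback beats the threshold."""
    pb = payback_years(cashflows)
    return npv(discount_rate, cashflows) > 0 and pb is not None and pb <= max_payback

# Candidate design: $10M drilling capex, $1.5M/yr net heat revenue, 20 yrs.
flows = [-10_000_000] + [1_500_000] * 20
viable = screen(flows)
```

Running many such candidates over ranges of flow rate, gradient, and drilling cost is what lets the screening model rank configurations before any resource-intensive optimization.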
Novel/Additive Information:
This geothermal screening model offers a scalable and flexible tool for industry stakeholders to evaluate closed-loop geothermal systems in a wide range of settings. By integrating technical performance with economic considerations early in the assessment phase, the model supports faster, more informed decisions that can accelerate the adoption of geothermal energy within the global sustainable energy mix.
Co-author/s:
Ali Alshuwaikhat, Research Engineer, Saudi Aramco.
Zhongxian Hao
Speaker
Manager
Research Institute of Petroleum Exploration & Development, PetroChina
There are more than 200,000 pumping-unit wells in CNPC, and accurate, reliable downhole data is of prime importance during oil production. Acoustic-wave measurement of liquid level is not precise enough, while the cable method is too expensive. The question was how to obtain accurate downhole data and transmit it to the surface at low cost. This paper introduces the problems and solutions from the pilot test of the WWCTP in a high-water-cut oil field.
WWCTP is a technology for monitoring the annulus pressure of a pumping-unit well and transmitting it to the surface wirelessly. The downhole pressure data are acquired by a pressure sensor and transmitted to the surface through the rod; several typical wells are presented in this paper. For example, in one well, 1191 meters deep with a 57 mm pump diameter, a stroke rate of 5.2 strokes/min, and 98.4% water cut, the WWCTP worked stably for 4 months. In another well with almost the same conditions (1212 meters deep, a 57 mm pump, 6 strokes/min, and 96.1% water cut), the signal was too weak and subtle to recognize. In total, more than 30 wells were deployed. We found that many factors, such as pump size and well deliverability, could affect the data. Extensive optimization was carried out, finally achieving stable wellbore wireless communication.
The pilot test shows that this technique has the advantages of a simple structure, reliable transmission, low cost, and freedom from periodic pump inspection. The technology is capable of wireless uploading in wells 1000 m deep and of capturing accurate downhole pressure variations; it also provides a reference for optimizing the production regime of low-production, low-efficiency, high-fluid-level wells.
This technology represents a new approach to the acquisition of wellbore data. Furthermore, its integrated cost is expected to be more than 50% lower than wired data transmission, giving it broad application prospects.
Co-author/s:
Chenglong Liao, Engineer, Research Institute of Petroleum Exploration & Development, PetroChina.
Tao Liu
Speaker
Technology Planning Coordinator
Dalian West Pacific Petrochemical Co., Ltd.
Digital twin technology is crucial for refining enterprises aiming for operational excellence, yet traditional mechanistic and data-driven models struggle to capture the complexities of refining processes. A dual-driven digital twin that integrates data and theoretical frameworks is essential for achieving a balance of interpretability, accuracy and adaptability, as it embeds physicochemical constraints into machine learning architectures to ensure predictions adhere to conservation laws while dynamically learning unmodeled phenomena. Despite its theoretical promise, implementing a dual-driven digital twin in refining systems faces three critical challenges. One significant issue is high-quality dataset engineering due to diverse data sources causing issues like missing values and temporal misalignment, necessitating robust preprocessing techniques. Another critical challenge is achieving real-time high-fidelity modeling, which requires multi-scale approaches that integrate molecular-level kinetics and fluid dynamics while utilizing reduced-order models and neural network pruning for efficient inference. Furthermore, online optimization and decision-making are complicated by high-dimensional, non-convex objectives, necessitating a closed-loop “sense-decide-act” framework that allows reinforcement learning agents to dynamically adjust operating parameters while maintaining safety margins.
This study addresses these challenges through a novel methodology combining domain knowledge embedding and holistic optimization. First, high-quality datasets are developed through expert- and theory-guided protocols that include outlier detection and imputation based on first-law (conservation) principles, K-means clustering combined with domain expertise to define parameter correlations, and automated cloud simulation platforms that generate physics-compliant samples. Second, a unified computational architecture integrates multi-scale constraints, such as molecular-scale reaction kinetics and equipment-scale hydrodynamics, into loss functions via Lagrange multipliers. Continuous online learning adapts model parameters to real-time sensor data, achieving prediction errors below 2% for product properties. Third, a gradient-descent-based holistic optimization algorithm synchronizes operational variables, including distillation cut points and pump frequencies, reducing optimization cycles from hours to minutes. Implemented in a 10-million-ton-per-year atmospheric-vacuum distillation unit, this framework enabled closed-loop operation, delivering a 1.8% increase in light oil yield and annual economic benefits exceeding CNY 25 million. This work illustrates that the fusion of theory-constrained data and cross-scale modeling is essential for next-generation intelligent refineries.
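The idea of embedding a conservation constraint into a data-driven loss can be illustrated with a minimal sketch. All names and values here are hypothetical, and a fixed penalty weight stands in for the abstract's Lagrange-multiplier formulation; the actual architecture of the study is not public.

```python
def constrained_loss(y_pred, y_true, feed_rate, lam=10.0):
    """Data-fit loss plus a mass-balance penalty (hypothetical sketch).

    y_pred / y_true: predicted and measured product flow rates.
    The penalty forces predicted product flows to sum to the unit
    feed rate (conservation of mass); `lam` is a fixed penalty
    weight standing in for a Lagrange multiplier.
    """
    mse = sum((p - t) ** 2 for p, t in zip(y_pred, y_true)) / len(y_true)
    balance_violation = (sum(y_pred) - feed_rate) ** 2
    return mse + lam * balance_violation
```

A prediction that fits the data but violates the mass balance is penalized heavily, which is how physicochemical constraints keep purely statistical fits physically plausible.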
Key words: Digital twin, data and theory, high-quality datasets, refinery optimization.
Co-author/s:
JiaHua Zhang, Dalian West Pacific Petrochemical Co., Ltd.
With the ever-growing demand for electricity, large-scale renewable energy integration, and the need to strengthen grid resiliency, utilities are placing significant emphasis on expanding transmission infrastructure. These efforts involve developing modern transmission corridors, high-voltage direct current (HVDC) lines, digital substations, advanced monitoring systems, and grid management technologies to enable efficient long-distance power transfer and better utilization of existing assets. However, the addition of new transmission infrastructure requires long-term planning, regulatory approvals, environmental assessments, complex engineering, substantial capital investment, and extensive coordination among multiple stakeholders. As a result, transmission planning and expansion are often protracted and resource-intensive, even as renewable integration and demand growth progress at a much faster pace. This mismatch increases stress on transmission corridors and leads to congestion. For example, in the United Kingdom, renewable curtailment costs have reached nearly £1 billion annually, with 3.8 million MWh of available renewable generation curtailed due to transmission congestion.
According to literature, renewable energy additions will continue at high levels over the next two decades. In fact, in some countries such as the United States, many renewable energy projects remain stuck in grid interconnection queues, awaiting transmission access. Since transmission infrastructure additions take much longer to materialize compared to renewable deployment, innovative solutions are required to maximize the utilization of existing networks, reduce congestion, and minimize curtailment.
In this paper, we propose a novel technology called Dynamic System Rating (DSR), which extends beyond conventional Dynamic Line Rating (DLR). While DLR considers thermal limits as the primary constraint on current flows, these limits are not the only factor restricting capacity. Consequently, DLR data alone often leaves a significant portion of spare capacity underutilized. DSR takes a holistic approach by evaluating not only thermal ratings but also voltage and angular stability margins to determine the true available capacity of each transmission line. This comprehensive assessment ensures that increasing line capacity does not compromise other system stability factors.
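The core DSR idea, that usable capacity is the tightest of several independent limits rather than the thermal limit alone, can be sketched as follows. Function names and the MVA inputs are illustrative assumptions; real inputs would come from weather sensing and state estimation.

```python
def dynamic_system_rating(thermal_mva, voltage_mva, angular_mva):
    """Effective line capacity under DSR (hypothetical sketch):
    the tightest of the thermal, voltage-stability, and
    angular-stability limits, all expressed in MVA."""
    return min(thermal_mva, voltage_mva, angular_mva)

def headroom_over_static(static_mva, *dynamic_limits):
    """Extra capacity the dynamic assessment unlocks beyond the
    conservative worst-case static rating (never negative)."""
    return max(0.0, dynamic_system_rating(*dynamic_limits) - static_mva)
```

The point of taking the minimum is exactly the paper's caveat: raising the thermal rating alone can overstate capacity when a voltage or angular margin binds first.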
Through a case study, we demonstrate how DSR technology enhances network capacity, reduces renewable energy curtailment, and improves overall electricity network utilization, thereby delivering tangible benefits to consumers. By autonomously optimizing power flows, DSR increases effective grid capacity and supports renewable integration, serving as a practical bridge while new transmission corridors are being developed.
The goal of my project was to replace manual, per-interval calculations for new well operations with a machine learning model that recommends the best candidate wells at each company site for each new date.
Objectives:
- Formulate a specific task to be solved and collect the necessary data from databases.
- Conduct data analysis and train machine learning models to predict well operation success.
- Develop an algorithm that accounts for the influence of neighboring wells on the target well.
- Automate data collection and model forecasting.
The models were tested on data from multiple fields with significantly different geological properties. Data on more than 3,000 oil well operation events were collected and aggregated, of which 1,739 were bottomhole zone treatments (BZT, in particular acid treatments); the rest were measures to intensify oil production (IOP) by reducing bottomhole pressure. For each operation, historical production and injection data, the parameters of the measures taken, and reservoir characteristics were collected and prepared. The target feature (additional production) was cleaned of outliers using quantile-based bounds. Unlike methods based on the normal distribution, this approach is insensitive to the absolute values and scale of outliers, which allows anomalies to be eliminated more effectively.
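The quantile-based cleaning step can be illustrated with a standard interquartile-range filter, a common concrete instance of the approach described; the exact quantiles and multiplier used in the study are not specified, so those below are assumptions.

```python
def iqr_filter(values, k=1.5):
    """Quantile-based outlier removal (illustrative sketch).

    Keeps points within [Q1 - k*IQR, Q3 + k*IQR]. Because the bounds
    depend only on rank order, a single extreme outlier cannot drag
    them outward, unlike mean/standard-deviation (z-score) filtering.
    """
    s = sorted(values)

    def quantile(q):
        # Linear interpolation between order statistics.
        idx = q * (len(s) - 1)
        lo, hi = int(idx), min(int(idx) + 1, len(s) - 1)
        return s[lo] + (s[hi] - s[lo]) * (idx - lo)

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lo_b, hi_b = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if lo_b <= v <= hi_b]
```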
Another key task was to take into account the mutual influence of wells. Traditional correlation calculation methods required significant computing resources to process the entire history of wells. Therefore, an original algorithm based on the fast Fourier transform (FFT) was developed to calculate the optimal radius of influence of production and injection wells and to estimate the correlation between them. All calculations were moved to graphics processing units (GPUs), which accelerated them by a factor of 10–20.
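The FFT speed-up comes from computing cross-correlation in the frequency domain in O(n log n) instead of O(n²). A minimal CPU sketch of that step (the study's actual GPU algorithm is not public, and the normalization shown is one common choice):

```python
import numpy as np

def fft_cross_correlation(a, b):
    """Normalized cross-correlation of two rate histories via FFT
    (illustrative sketch). Index k of the result holds the
    correlation of `a` shifted by lag k against `b`; index 0 is
    the zero-lag correlation coefficient."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    n = 2 * len(a)                      # zero-pad to avoid circular wrap
    fa, fb = np.fft.rfft(a, n), np.fft.rfft(b, n)
    return np.fft.irfft(fa * np.conj(fb), n)
```

For identical inputs the zero-lag value is 1.0 and is the global maximum, which is the sanity check one would run before screening well pairs at scale.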
Various machine learning methods were tested to predict the success of well interventions (GTM): gradient boosting (CatBoost) for regression and classification tasks, as well as ranking models (Learning to Rank).
Direct prediction of the value of additional production (regression) was shown to yield high errors (RMSE, MAE), owing to the complexity of the process physics and the limited data available. A classification model predicting the probability of intervention success demonstrates acceptable accuracy (~0.7–0.8) but a significant spread in predictions. The best results are achieved with a recommendation ranking model, which does not predict exact values but ranks wells in descending order of expected relative efficiency (additional production at the field). This approach allows a limited number of the most promising intervention candidates to be selected. The distinguishing feature of this model is its ability to compare wells against each other: it optimizes a loss function that accounts for each well's rank (Learning to Rank).
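The ranking objective can be sketched as a pairwise loss. This simple hinge formulation is illustrative only: CatBoost's ranking losses (e.g. YetiRank) use more sophisticated pairwise objectives, and the study's exact loss is not stated.

```python
def pairwise_ranking_loss(scores, gains):
    """Pairwise hinge loss (illustrative sketch of Learning to Rank).

    scores: model scores per well; gains: observed additional
    production per well. Each pair where the lower-gain well is
    scored above (or within margin 1 of) the higher-gain well
    contributes a penalty, so only relative order matters, not
    the absolute production values."""
    loss, pairs = 0.0, 0
    for i in range(len(scores)):
        for j in range(len(scores)):
            if gains[i] > gains[j]:
                loss += max(0.0, 1.0 - (scores[i] - scores[j]))
                pairs += 1
    return loss / max(pairs, 1)
```

This is why a ranker tolerates noisy absolute targets: a scoring that orders the wells correctly incurs zero loss even if its scores are far from the true production values.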
As the global energy sector accelerates toward a low-carbon future, operators in carbon-intensive regions face the dual imperative of maximizing efficiency while minimizing environmental impact. This study presents a transformative framework that integrates digitalization with advanced nanomaterials to drive sustainable, high-performance oil and gas operations in the Middle East.
The methodology leverages IoT-enabled sensors, AI-driven predictive analytics, and graphene-based nanofluids with nanoceramic coatings, enabling real-time monitoring, predictive maintenance, and operational optimization of high-pressure/high-temperature (HPHT) offshore wells. Simulations under conditions representative of Arabian Gulf reservoirs validate the framework’s effectiveness, scalability, and operational relevance.
Results demonstrate substantial improvements: non-productive time decreased by 18% through predictive maintenance, fuel consumption dropped by 12% via operational optimization, and methane emissions fell by 15%, equating to approximately $2.1 million in annual savings per platform. Nanoparticle-enhanced drilling fluids increased drilling efficiency by 22% while reducing environmental impact. Digital twin technology provided real-time decision support, lowering operational risks by 30% and water consumption by 25% compared with conventional practices.
This integrated approach accelerates deployment by 40% in analogous formations and offers potential cost reductions of up to 50% when implemented across multiple assets. By merging cutting-edge digital tools with nanotechnology, the framework delivers measurable environmental and economic benefits while supporting operational resilience, safety, and regulatory compliance.
Beyond immediate operational gains, this study provides a replicable pathway for energy operators seeking to align hydrocarbon production with climate commitments. The combination of predictive analytics, digital twins, and nanomaterial innovations not only enhances efficiency but also enables significant reductions in carbon footprint and resource consumption. This dual benefit supports the broader global energy transition by advancing cleaner, more sustainable upstream operations.
These findings offer actionable insights for Middle Eastern operators and other carbon-intensive regions, demonstrating that strategic integration of digital technologies and nanomaterials can transform traditional oil and gas operations into sustainable, high-performing, and economically viable systems. The framework provides a model for achieving affordable, reliable, and clean energy production while meeting environmental stewardship goals and regional sustainability priorities.
The Caspian Sea, positioned at the crossroads of Europe and Asia, stands as more than a hydrocarbon-rich basin; it serves as a strategic testing ground for advancing sustainable energy pathways. Amid global imperatives to guarantee energy security while accelerating the low-carbon transition, the Caspian Basin emerges as a pivotal hub where diversification, technological innovation, and environmental responsibility converge. This paper argues that offshore development in the region, enabled by advanced technologies and multilateral cooperation, offers actionable lessons for shaping resilient global energy systems.
Drawing on comparative case studies from leading offshore projects—including Kashagan, Azeri–Chirag–Gunashli, and new developments in Iranian waters—this study proposes a framework for sustainable offshore exploration. The framework highlights the integration of AI-enabled monitoring, full-scale digitalization, carbon capture readiness, and stringent environmental safeguards as essential pillars of next-generation offshore operations. Pilot assessments and simulation-based analyses suggest that such integrated approaches can potentially reduce methane emissions by up to 15% while lowering operating costs by nearly 10%, demonstrating that economic competitiveness and environmental stewardship can reinforce each other.
The Caspian’s distinctive conditions—its complex geology, semi-enclosed ecosystem, and multi-state governance—make it an indispensable arena for piloting energy transition strategies. Insights from the region emphasize the value of collaborative governance, technology-driven resilience, and regional capacity-building in sustaining market stability and ecological integrity. Moreover, the Caspian experience provides a replicable blueprint for other offshore provinces worldwide striving to align hydrocarbon development with sustainability and diversification.
The study concludes with policy-oriented recommendations that resonate with global net-zero ambitions and highlight synergies with Gulf Cooperation Council strategies, particularly in relation to hydrogen development, carbon management, and renewable energy integration under Vision 2030 frameworks.


