Image generated by AI using OpenAI tools. For illustrative purposes only.
From interconnection studies to long-term adequacy planning, existing methods remain incomplete for emerging AI data center loads
Connecting new large loads to power systems requires careful interconnection studies to ensure system reliability and security. While interconnection frameworks for conventional loads are well established [1], practice is still evolving on which study types (e.g., steady-state, dynamic, and transient) and which associated load models are appropriate for these new loads. These gaps are explicitly documented in NERC’s 2026 gap assessment [2], recent practitioner guidance [3], and an IEEE standards community review [4].
For conventional loads, this starting point is solid: load modeling frameworks are already mature. Static and dynamic approaches, including ZIP and the WECC Composite Load Model (CMLD), are well established [5], [6], [7], and an extensive literature covers CMLD-based approaches and their parameterization and identification [8]–[13]. But data centers, as surveyed in the component-level, facility-level, and power-systems literature [14], [15], [16], differ fundamentally from conventional loads. Dense concentrations of power-electronic-interfaced servers and storage, thermally coupled motor-driven cooling loads, and workload-driven demand profiles produce voltage and frequency response characteristics that generic model parameterizations do not capture [2], [17].
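For readers outside the load-modeling community, the static ZIP model mentioned above represents load power as constant-impedance (Z), constant-current (I), and constant-power (P) fractions of a quadratic in voltage. A minimal sketch of that textbook form; the coefficient values are illustrative only, not taken from any cited reference:

```python
def zip_power(v_pu, p0, a_z, a_i, a_p):
    """Static ZIP load model: active power as a quadratic function of
    per-unit voltage. a_z, a_i, a_p are the constant-impedance,
    constant-current, and constant-power fractions (summing to 1)."""
    assert abs(a_z + a_i + a_p - 1.0) < 1e-9, "fractions must sum to 1"
    return p0 * (a_z * v_pu**2 + a_i * v_pu + a_p)

# Illustrative: a 100 MW load during a voltage sag to 0.95 pu.
p_sag = zip_power(0.95, 100.0, 0.4, 0.3, 0.3)   # ≈ 94.6 MW
```

The point of the contrast in the text is that no such fixed polynomial, however well parameterized, captures demand that moves because of the software workload rather than the voltage.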
A similar gap appears in the modeling of AI workload-driven dynamics. First, the PERC1 model was proposed as an extension of EV charger models for constant-power electronic loads [18], but neither it nor existing CMLD variants represent the AI workload-driven dynamics documented in recent grid-impact studies: distributed training ramps, synchronized checkpointing, bursty inference fluctuations, and disturbance ride-through behavior [19], [20], [21], [22]. Second, while EMT modeling of conventional data center power distribution has been demonstrated [23], and the PNNL Data Center Modeling Library provides generic EMT baselines for representative facility types [24], validated PSPD and EMT models for AI data centers that incorporate disturbance ride-through and workload-driven transients are explicitly documented as absent [2]. Third, no scientifically grounded interconnection framework specifying what data AI data center developers must provide at each interconnection stage exists in the peer-reviewed literature.
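To make the workload-driven dynamics described above concrete, the sketch below generates a purely synthetic minute-resolution power trace with a startup ramp and synchronized checkpoint dips. It is a cartoon of the phenomena the cited grid-impact studies describe, not a validated model; every parameter value is hypothetical:

```python
import random

def synthetic_training_trace(minutes=120, p_idle=20.0, p_train=95.0,
                             ramp=10, ckpt_period=30, ckpt_dip=40.0,
                             seed=0):
    """Illustrative minute-resolution power trace (MW) for a distributed
    training job: a startup ramp, periodic synchronized checkpoint dips,
    and small stochastic jitter. All parameters are hypothetical."""
    rng = random.Random(seed)
    trace = []
    for t in range(minutes):
        if t < ramp:                       # ramp-up from idle to full load
            p = p_idle + (p_train - p_idle) * t / (ramp - 1)
        elif t % ckpt_period == 0:         # synchronized checkpoint: dip
            p = p_train - ckpt_dip
        else:                              # steady training load
            p = p_train
        trace.append(p + rng.gauss(0.0, 1.5))  # small workload jitter
    return trace
```

Even this toy trace illustrates why static or composite load parameterizations miss the behavior: the largest excursions are driven by the software schedule, not by grid-side voltage or frequency.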
A broad literature exists, but it does not yet answer grid-operation needs
The literature on data center power modeling is already extensive. Established model taxonomies distinguish hardware-centric approaches, from circuit-level component aggregation up to the system-of-systems level, and software-centric approaches [14], [15]. At the server level, models span regression, utilization-based, stacked LSTM, multi-granularity deep learning, and domain-adaptation methods for IT power prediction [25]–[29]. Cooling systems are modeled through Power Usage Effectiveness (PUE)- and Coefficient of Performance (COP)-based frameworks, as well as thermodynamics-based digital twins [30], [31], [32]. Demand-response-oriented data center load models also address scheduling and curtailment optimization for grid signals [33], [34], [35].
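At their simplest, the PUE- and COP-based frameworks cited above reduce to two identities: PUE is the ratio of total facility power to IT power, and COP is the ratio of heat removed to electrical power drawn by the cooling plant. A minimal sketch of those definitions; the numbers in the usage lines are illustrative only:

```python
def facility_power_from_pue(p_it_kw, pue):
    """Total facility power implied by a Power Usage Effectiveness value,
    using the definition PUE = total facility power / IT power."""
    return p_it_kw * pue

def cooling_power_from_cop(heat_load_kw, cop):
    """Electrical draw of a cooling plant removing a given heat load,
    under an idealized constant Coefficient of Performance."""
    return heat_load_kw / cop

# Illustrative: a 1,000 kW IT load at PUE 1.3; chiller plant at COP 4.0.
total_kw = facility_power_from_pue(1000.0, 1.3)   # 1300.0 kW
chiller_kw = cooling_power_from_cop(1000.0, 4.0)  # 250.0 kW
```

Such ratio-based models describe steady-state energy balance well, which is exactly why they say little about the minute-scale dynamics discussed earlier.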
Short-term facility-level load forecasting has also been addressed through ARIMA–deep learning hybrids [36], hidden Markov models [37], LightGBM-based segmented forecasting [38], cluster-level IT power temporal forecasting [14], and AI data center short-term demand forecasting [39]. In parallel, AI workload-specific hardware characterization and power draw measurements are documented in recent empirical and survey work [19], [40], [41], [42].
Despite this breadth, a major practical gap remains. No existing model produces load profiles at the minute-to-hourly temporal resolution that power systems security, reliability, and flexibility assessments require [2], [19], [43]. Closely related, a minimum data specification framework—the minimum set of AI data center attributes required to generate a realistic operational load profile—has not been defined in the peer-reviewed literature. In other words, the field has many modeling pieces, but it still lacks the operationally usable framework the grid side needs.
Long-term forecasting faces an even deeper structural gap
The challenge becomes even sharper at longer horizons. Traditional long-term load forecasting methods are built on well-established probabilistic frameworks that employ meteorological, economic, and demographic drivers [44]. But these methods do not apply well to software-driven, project-driven loads where realization uncertainty and workload dynamics dominate over exogenous drivers [19], [45].
The broader data center forecasting literature addresses parts of this problem, but mostly at shorter horizons and at the facility layer. Calibrated operational load models at the server and cluster level, together with a minimum data specification, could in principle supply the demand-characterization foundation. From the perspective of long-term adequacy planning, however, utilities currently rely on heuristic maturity weighting applied to interconnection pipeline requests: proposed large-load projects are discounted or credited in the forecast according to perceived development readiness, such as contractual commitments, permitting status, construction progress, and service-agreement milestones. Yet [46] and [47] explicitly document that these practices lack empirical or methodologically transparent foundations.
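The maturity-weighting heuristic described above can be stated in a few lines. The stage names and weight values below are hypothetical placeholders, not any utility's actual figures; the point is only the mechanical form of the heuristic the cited reports examine:

```python
# Hypothetical maturity weights: probability-like discounts by stage.
MATURITY_WEIGHTS = {
    "inquiry": 0.05,
    "signed_service_agreement": 0.40,
    "permitted": 0.60,
    "under_construction": 0.90,
}

def weighted_pipeline_mw(projects):
    """Discount each proposed project's requested MW by a heuristic
    realization weight tied to its development stage, and sum."""
    return sum(p["mw"] * MATURITY_WEIGHTS[p["stage"]] for p in projects)

pipeline = [
    {"mw": 300, "stage": "inquiry"},              # 300 * 0.05 =  15 MW
    {"mw": 150, "stage": "under_construction"},   # 150 * 0.90 = 135 MW
]
expected_mw = weighted_pipeline_mw(pipeline)      # 150 MW
```

In practice the weights themselves would need empirical grounding, which is precisely the foundation [46] and [47] find missing.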
This is where the research gap becomes a structural one. No published methodology translates a pipeline of proposed large-load developments, stochastically filtered through project realization dynamics, into the 8760-hour annual load profiles required for system adequacy analysis. The compound uncertainty spanning project realization variability, facility type mix, and AI workload evolution has no established methodological precedent across the literature. This is not a minor extension problem. It is a structural gap amplified by the extraordinary scale and pace of current AI data center deployment [48], [49], [50].
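A minimal sketch of what the core step of such a missing methodology might look like, assuming each pipeline project reduces to a requested MW value and an assumed realization probability, and assuming a normalized 8760-hour shape is available. All names, probabilities, and shapes here are hypothetical, not an established method:

```python
import random

def adequacy_profiles(projects, base_shape, n_draws=200, seed=1):
    """Sketch of stochastic pipeline filtering: each Monte Carlo draw
    independently realizes or drops every proposed project according to
    an assumed probability, then scales a normalized 8760-hour shape by
    the realized capacity. Returns n_draws candidate annual profiles."""
    rng = random.Random(seed)
    profiles = []
    for _ in range(n_draws):
        realized_mw = sum(p["mw"] for p in projects
                          if rng.random() < p["prob"])
        profiles.append([realized_mw * h for h in base_shape])
    return profiles

flat_shape = [1.0] * 8760   # placeholder hourly shape, normalized to 1.0
profiles = adequacy_profiles(
    [{"mw": 300, "prob": 0.3}, {"mw": 150, "prob": 0.8}], flat_shape)
```

The hard research problems the text identifies sit precisely in what this sketch assumes away: defensible realization probabilities, facility-type-specific hourly shapes, and correlation across projects and workloads.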
Why these gaps matter
Taken together, the literature shows an important contrast. On one hand, interconnection frameworks for conventional loads are well established [1], conventional load modeling methods are mature [5]–[13], and data center energy and facility behavior have already been studied across many layers [14], [15], [16], [19], [20], [25]–[42], [51]. On the other hand, the specific requirements of emerging AI data center loads are still not matched by fully established interconnection guidance, validated grid-facing operational models, minimum data-specification frameworks, or long-term adequacy methodologies [2], [3], [4], [43], [46]–[50].
The issue is therefore not an absence of related research. It is that the existing research does not yet close the gap between conventional practice and the grid-facing realities of AI data center integration. For interconnection studies, the remaining need is methods and models that are tailored, or can be systematically customized, to AI data centers and their evolving technical designs and workload variations, together with clearer guidance on study scope and developer data requirements. For operational assessments, the missing pieces are realistic minute-to-hourly load profiles and the minimum information needed to generate them. For long-term planning, the missing piece is a transparent methodology that moves from uncertain project pipelines to adequacy-ready annual load profiles.
Acknowledgment: Dr. Xi Wang, Dr. Christine Cao and Dr. Amirhossein Ahmadi contributed to this review.
[1] IEEE, IEEE Standard for Interconnection and Interoperability of Inverter-Based Resources (IBRs) (IEEE Std 2800-2022), IEEE, 2022, doi: 10.1109/IEEESTD.2022.9762253.
[2] North American Electric Reliability Corporation, Large Loads Working Group, Assessment of Gaps in Existing Practices, Requirements, and Reliability Standards for Emerging Large Loads [White Paper], NERC, Mar. 2026.
[3] GridLab, Practical Guidance and Considerations for Large Load Interconnections — Interim Report, GridLab, Mar. 2025.
[4] Data Centers Standards Needs Analysis and Recommendations Activity, Review of Industry Efforts and Standards of Grid Readiness for Data Center Deployment [White Paper], IEEE Standards Association, Jan. 2026.
[5] J. V. Milanović, K. Yamashita, S. Martínez Villanueva, S. Ž. Djokić, and L. M. Korunović, “International industry practice on power system load modeling,” IEEE Transactions on Power Systems, vol. 28, no. 3, pp. 3038–3046, 2013, doi: 10.1109/TPWRS.2012.2231969.
[6] A. Arif, Z. Wang, J. Wang, B. Mather, H. Bashualdo, and D. Zhao, “Load modeling — A review,” IEEE Transactions on Smart Grid, vol. 9, no. 6, pp. 5986–5999, 2018, doi: 10.1109/TSG.2017.2700436.
[7] L. M. Korunović, J. V. Milanović, S. Z. Djokić, K. Yamashita, S. Martínez Villanueva, and S. Sterpu, “Recommended parameter values and ranges of most frequently used static load models,” IEEE Transactions on Power Systems, vol. 33, no. 6, pp. 5923–5934, 2018, doi: 10.1109/TPWRS.2018.2834725.
[8] X. Liang, “A New Composite Load Model Structure for Industrial Facilities,” IEEE Trans. Ind. Appl., vol. 52, no. 6, pp. 4601–4609, Nov. 2016, doi: 10.1109/TIA.2016.2600665.
[9] Electric Power Research Institute, Technical Reference on the Composite Load Model, EPRI, Palo Alto, CA, Tech. Rep. 3002019209, 2020.
[10] B. Tan, J. Zhao, and N. Duan, “Amortized Bayesian parameter estimation approach for WECC composite load model,” IEEE Transactions on Power Systems, vol. 39, no. 1, pp. 1517–1529, Jan. 2024, doi: 10.1109/TPWRS.2023.3250579.
[11] X. Wang, Y. Wang, D. Shi, J. Wang, and Z. Wang, “Two-stage WECC composite load modeling: A double deep Q-learning networks approach,” IEEE Transactions on Smart Grid, vol. 11, no. 5, pp. 4331–4344, 2020, doi: 10.1109/TSG.2020.2988171.
[12] S. Afrasiabi, M. Afrasiabi, M. Jarrahi, M. Mohammadi, J. Aghaei, M. Javadi, M. Shafie-Khah, and J. Catalao, “Wide-area composite load parameter identification based on multi-residual deep neural network,” IEEE Transactions on Neural Networks and Learning Systems, vol. 34, no. 9, pp. 6121–6131, 2023, doi: 10.1109/TNNLS.2021.3133350.
[13] K. Zhang, H. Zhu, and S. Guo, “Dependency analysis and improved parameter estimation for dynamic composite load modeling,” IEEE Transactions on Power Systems, vol. 32, no. 4, pp. 3287–3297, 2017, doi: 10.1109/TPWRS.2016.2623629.
[14] K. M. U. Ahmed, M. H. J. Bollen, and M. Alvarez, “A review of data centers energy consumption and reliability modeling,” IEEE Access, vol. 9, pp. 152536–152563, 2021, doi: 10.1109/ACCESS.2021.3125092.
[15] M. Dayarathna, Y. Wen, and R. Fan, “Data center energy consumption modeling: A survey,” IEEE Communications Surveys & Tutorials, vol. 18, no. 1, pp. 732–794, 2016, doi: 10.1109/COMST.2015.2481183.
[16] E. Ginzburg-Ganz, P. Lifshits, R. Machlev, J. Belikov, Z. Krieger, and Y. Levron, “Technical challenges of AI data center integration into power grids — A survey,” Energies, vol. 19, no. 1, p. 137, 2026, doi: 10.3390/en19010137.
[17] P. Mitra, J. Lu, and L. Sundaresh, “Emerging loads: Modeling for transmission reliability studies,” IEEE Power and Energy Magazine, vol. 23, no. 5, pp. 35–43, 2025, doi: 10.1109/MPE.2025.3543294.
[18] PowerWorld Corporation, “Load Characteristic Model: PERC1,” accessed: Feb. 19, 2026.
[19] X. Chen, X. Wang, A. Colacelli, M. Lee, and L. Xie, “Electricity demand and grid impacts of AI data centers: Challenges and prospects,” arXiv preprint arXiv:2509.07218, Nov. 2025, doi: 10.48550/arXiv.2509.07218. [PREPRINT — v4, November 2025.]
[20] Y. Li and Y. R. Li, “AI load dynamics — A power electronics perspective,” arXiv preprint arXiv:2502.01647, Feb. 2025, doi: 10.48550/arXiv.2502.01647. [PREPRINT — v2, Feb. 6, 2025.]
[21] M.-S. Ko and H. Zhu, “Wide-area power system oscillations from large-scale AI workloads,” arXiv preprint arXiv:2508.16457, Aug. 2025, doi: 10.48550/arXiv.2508.16457. [PREPRINT — v1, Aug. 22, 2025.]
[22] K.-B. Kwon, S. Mukherjee, and V. Adetola, “Operational risks in grid integration of large data center loads: Characteristics, stability assessments, and sensitivity studies,” arXiv preprint arXiv:2510.05437, Oct. 2025, doi: 10.48550/arXiv.2510.05437. [PREPRINT — v3, Oct. 28, 2025.]
[23] J. Sun, S. Wang, J. Wang, and L. M. Tolbert, “Dynamic model and converter-based emulator of a data center power distribution system,” IEEE Transactions on Power Electronics, vol. 37, no. 7, pp. 8420–8432, Jul. 2022, doi: 10.1109/TPEL.2022.3146354.
[24] Pacific Northwest National Laboratory, Electromagnetic Transient Modeling of Large Data Centers for Grid-Level Studies, PNNL, Richland, WA, Tech. Rep. PNNL-38817, 2024.
[25] W. Lin, G. Wu, X. Wang, and K. Li, “An artificial neural network approach to power consumption model construction for servers in cloud data centers,” IEEE Transactions on Sustainable Computing, vol. 5, no. 3, pp. 329–340, 2019.
[26] Z. Shen, X. Zhang, B. Liu, B. Xia, Z. Liu, and Y. Li, “PCP-2LSTM: Two stacked LSTM-based prediction model for power consumption in data centers,” in Proc. International Conference on Advanced Cloud and Big Data (CBD), 2019.
[27] Z. Shen, X. Zhang, B. Xia, Z. Liu, and Y. Li, “Multi-granularity power prediction for data center operations via long short-term memory network,” in Proc. IEEE ISPA/BDCloud/SustainCom/SocialCom, 2019.
[28] R. Mo, W. Lin, S. Lin, S. Fong, and K. Li, “NoSPF: Non-stationary long-term power consumption forecasting for servers in cloud data centers,” IEEE Transactions on Computers, 2025.
[29] S. Pagani, P. S. Manoj, A. Jantsch, and J. Henkel, “Machine learning for power, energy, and thermal management on multicore processors: A survey,” IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, vol. 39, no. 1, pp. 101–116, 2018.
[30] Q. Zhang, Z. Meng, X. Hong, Y. Zhan, J. Liu, J. Dong, T. Bai, J. Niu, and M. J. Deen, “A survey on data center cooling systems: Technology, power consumption modeling and control strategy optimization,” Journal of Systems Architecture, vol. 119, p. 102253, 2021.
[31] Y. Zhou, F. Wei, S. Li, Z. Wang, J. Liu, and D. Yu, “Data center load modeling through optimal energy consumption characteristics,” Applied Energy, vol. 393, p. 126095, 2025, doi: 10.1016/j.apenergy.2025.126095.
[32] S. Jadhav and Z. Liu, “Digital twin-based cooling system optimization for data center,” arXiv preprint arXiv:2603.01198, Mar. 2026, doi: 10.48550/arXiv.2603.01198. [PREPRINT — v3, Mar. 7, 2026.]
[33] M. Chen, C. Gao, M. Shahidehpour, Z. Li, S. Chen, and D. Li, “Internet data center load modeling for demand response considering the coupling of multiple regulation methods,” IEEE Transactions on Smart Grid, vol. 12, no. 3, pp. 2060–2076, May 2021, doi: 10.1109/TSG.2020.3048032.
[34] S. Bahrami, V. W. S. Wong, and J. Huang, “Data center demand response in deregulated electricity markets,” IEEE Transactions on Smart Grid, vol. 10, no. 3, pp. 2820–2832, 2019, doi: 10.1109/TSG.2018.2810830.
[35] K. Kim, F. Yang, V. M. Zavala, and A. A. Chien, “Data centers as dispatchable loads to harness stranded power,” IEEE Transactions on Sustainable Energy, vol. 8, no. 1, pp. 208–218, Jan. 2017, doi: 10.1109/TSTE.2016.2593607.
[36] Z. Zhao, C. Liang, Y. Zhang, L. Zhu, X. Wang, and J. Tang, “Short-term data center load forecasting method based on data-physics hybrid driven framework,” Energy Reports, vol. 15, p. 108953, 2026.
[37] A. Bajracharya, M. R. A. Khan, S. Michael, and R. Tonkoski, “Forecasting data center load using hidden Markov model,” in Proc. 2018 North American Power Symposium (NAPS), Fargo, ND, Sep. 2018, pp. 1–5, doi: 10.1109/NAPS.2018.8600677.
[38] S. Lei, H. Dai, J. Ding, X. Liang, X. Ge, X. Xia, and F. Wang, “Net load segmented forecasting method for data center based on GS-LightGBM model,” in Proc. IEEE IAS Global Conference on Renewable Energy and Hydrogen Technologies (GlobConHT), 2023.
[39] M. Mughees, Y. Li, Y. Chen, and Y. R. Li, “Short-term load forecasting for AI-data center,” in Proc. 2025 IEEE Power & Energy Society General Meeting (PESGM), IEEE, 2025.
[40] I. Latif et al., “Empirical measurements of AI training power demand on a GPU-accelerated node,” IEEE Access, vol. 13, pp. 61740–61747, 2025, doi: 10.1109/ACCESS.2025.3554728.
[41] H. P. Sampatirao, “Characterization and classification of AI workloads in modern internet data centers,” SSRN preprint 5875422, Dec. 2025, doi: 10.2139/ssrn.5875422. [PREPRINT.]
[42] C. Aghadinuno, S. Ahmed, M. Alamaniotis, B. Wang, and N. Gatsis, “Investigation of AI data center load impact on power system frequency using real-world datasets,” in Proc. IEEE Green Technologies Conference (GreenTech), 2026.
[43] North American Electric Reliability Corporation, Aggregated Report on NERC Level 2 Industry Recommendation: Large Load Interconnection, Study, Commissioning, and Operations [Report, Sept. 9, 2025], NERC, 2025.
[44] T. Hong and S. Fan, “Probabilistic electric load forecasting: A tutorial review,” International Journal of Forecasting, vol. 32, no. 3, pp. 914–938, 2016.
[45] D. Al Kez and A. Foley, “Instability risks from programmable AI load ramping in low-inertia grids,” SSRN preprint 5370875, 2025, doi: 10.2139/ssrn.5370875. [PREPRINT.]
[46] Energy Systems Integration Group, Forecasting for Large Loads: Current Practices and Recommendations. A Report by the Large Loads Task Force, ESIG, 2025.
[47] Energy and Environmental Economics (E3), Forecasting Large Loads in the Age of AI and Data Centers, E3, San Francisco, CA, 2025.
[48] International Energy Agency, Energy and AI. Paris: IEA, 2025.
[49] Electric Power Research Institute, Powering Intelligence: Analyzing Artificial Intelligence and Data Center Energy Consumption, EPRI, Palo Alto, CA, Tech. Rep. 3002028905, 2024.
[50] Alberta Electric System Operator, Large Load Integration Phase I: Interim Connection Limit and Assignment [Presentation, Jun. 4, 2025], AESO, 2025.
[51] Y. Wang et al., “DCITNet: A temporal forecasting scheme for IT power demand of data center,” IEEE Transactions on Industry Applications, 2025, doi: 10.1109/TIA.2025.3619909.







