This paper introduces an optimal model that uses machine learning to predict active power losses during the allocation and sizing of distributed generation (DG) units in power distribution networks. The model is based on Gradient Boosting Machine Regression (GBMR) and estimates DG location, DG size, bus voltages, and active power losses without conventional power flow calculations.
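A minimal sketch of the GBMR idea is given below, using scikit-learn's GradientBoostingRegressor. The feature names (dg_bus, dg_size_mw, load_level) and the synthetic training data are assumptions for illustration only; in the paper, the targets would come from power-flow simulations of the test feeder.

```python
# Illustrative GBMR sketch: predict active power loss from DG placement
# features. Data and feature names are hypothetical, not the paper's dataset.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(2, 34, n),     # dg_bus: candidate bus index (2..33)
    rng.uniform(0.1, 3.0, n),   # dg_size_mw: DG size in MW
    rng.uniform(0.5, 1.5, n),   # load_level: loading in p.u.
])
# Placeholder target: active power loss (kW); a stand-in for power-flow results.
y = 200.0 - 30.0 * X[:, 1] + 50.0 * (X[:, 2] - 1.0) ** 2 + rng.normal(0, 5, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
model.fit(X_train, y_train)
loss_pred = model.predict(X_test)  # predicted losses, no power flow required
```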
The results demonstrate that the proposed power-loss estimation and DG sizing method is effective, practical, and adaptable. Its accuracy has been validated using the R-squared and mean absolute percentage error (MAPE) metrics. For the fixed-load case, the GBMR predicts active power losses with a very low MAPE of 0.9281%, a root mean square error (RMSE) of 1.748, and an R-squared of 0.999. Under normalized load variation (NLV), it achieves an R-squared of 0.9991 with a low MAPE of 1.9815%. This approach enables grid operators to manage DG unit integration effectively by providing precise estimates and forecasts of power losses. The effectiveness of the proposed strategy is validated on the IEEE 33-bus test system using MATLAB.