Abstract:
This research aims to improve fault detection and classification in Gigabit Passive Optical Networks (GPON) by utilizing machine learning, focusing on the K-Nearest Neighbors (K-NN) algorithm. The GPON network is extensively simulated in OptiSystem, where three essential performance metrics are analyzed under various fault and interference scenarios: Optical Power (dBm), Bit Error Rate (BER), and Signal-to-Noise Ratio (SNR). The collected data undergoes preprocessing and normalization before classification with the K-NN algorithm implemented in MATLAB, using the Euclidean distance metric to measure similarity. Classification results, evaluated via confusion matrices, show accuracy rates between 63.16% and 75.00% across different Optical Network Units (ONUs): ONU 2 and ONU 8 achieved the highest accuracies of 75.00% and 72.73%, respectively, while ONU 1 and ONU 7 recorded the lowest, at 63.64% and 63.16%. Additionally, a detailed analysis of fiber attenuation effects on BER reveals significant signal degradation with increased attenuation; this degradation is notably more severe in the segment between the splitter and the ONUs than in the path from the Optical Line Terminal (OLT) to the splitter. These findings highlight the effectiveness of K-NN-based fault diagnosis systems in automating detection and enhancing GPON reliability, thereby reducing downtime and operational costs. Future work may explore more advanced machine learning classifiers, improved feature selection, and real-time monitoring techniques to further boost detection accuracy and network resilience.
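The classification pipeline summarized above (min-max normalization followed by a Euclidean-distance K-NN majority vote) can be sketched as follows. The paper's implementation is in MATLAB; this Python version is only an illustration, and the feature values, fault labels, and choice of k=3 are hypothetical, not taken from the study.

```python
import math
from collections import Counter

def min_max_normalize(rows):
    """Scale each feature column to [0, 1] (the preprocessing step)."""
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [
        [(v - l) / (h - l) if h > l else 0.0 for v, l, h in zip(r, lo, hi)]
        for r in rows
    ]

def knn_classify(train_x, train_y, query, k=3):
    """Majority vote among the k nearest neighbors by Euclidean distance."""
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train_x, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Features per sample: [Optical Power (dBm), BER, SNR (dB)].
# All numbers below are invented for illustration only.
samples = [
    ([-20.0, 1e-9, 25.0], "normal"),
    ([-21.0, 1e-8, 23.0], "normal"),
    ([-28.0, 1e-4, 12.0], "attenuation_fault"),
    ([-29.0, 1e-3, 10.0], "attenuation_fault"),
]
query = [-27.5, 5e-4, 11.0]  # degraded link: low power, high BER, low SNR

features = [x for x, _ in samples]
labels = [y for _, y in samples]
norm = min_max_normalize(features + [query])  # normalize jointly
prediction = knn_classify(norm[:-1], labels, norm[-1], k=3)
print(prediction)  # -> attenuation_fault
```

Normalizing before computing distances matters here because the raw features live on very different scales (BER spans orders of magnitude while power and SNR do not), and unscaled Euclidean distance would be dominated by whichever feature has the largest numeric range.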