Abstract: In this study, an improved method is proposed to address the oscillatory fluctuations and tendency to diverge of the traditional back-propagation (BP) neural network proportional-integral-derivative (PID) algorithm. First, the He initialization method is used to initialize the neural network, the learning rate is decayed, and the gradients are clipped. On this basis, the effects of different activation functions (Sigmoid, Tanh, ReLU, and Leaky ReLU) and different smoothing techniques (exponential smoothing, moving average, Savitzky-Golay filter, and Butterworth filter) on the performance of the algorithm are then compared. Finally, the robustness of the smoothers is tested under extreme disturbances. The results show that, compared with the traditional Sigmoid activation function, the ReLU, Leaky ReLU, and Tanh activation functions exhibit greater stability, with Leaky ReLU giving the best overall performance. In terms of smoothing, exponential smoothing and the Savitzky-Golay filter show clearer advantages and are better suited to applications that require fast response and precise smoothing. These smoothing techniques enable the algorithm to recover faster from disturbances and improve its stability.
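The following is a minimal illustrative sketch (not the paper's implementation) of the stabilization measures named above: He initialization, the Leaky ReLU activation, learning-rate decay, gradient clipping, and exponential smoothing of the controller output. The layer sizes, decay constants, smoothing coefficient, and the toy update loop are assumptions introduced only to show how the pieces fit together.

```python
# Sketch of the stabilization techniques from the abstract; all numeric
# constants and the toy training loop are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def he_init(n_in, n_out):
    # He initialization: weights drawn from N(0, 2/n_in), suited to ReLU-family activations
    return rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU keeps a small negative slope instead of zeroing negative inputs
    return np.where(x > 0, x, alpha * x)

def clip_gradient(grad, max_norm=1.0):
    # Rescale the gradient if its L2 norm exceeds max_norm
    norm = np.linalg.norm(grad)
    return grad if norm <= max_norm else grad * (max_norm / norm)

def decayed_lr(lr0, step, decay=1e-3):
    # Simple inverse-time learning-rate decay
    return lr0 / (1.0 + decay * step)

def exp_smooth(prev, new, beta=0.8):
    # First-order exponential smoothing of the control signal
    return beta * prev + (1.0 - beta) * new

# Toy usage: a two-layer network producing a scalar correction, updated with a
# clipped gradient and a decayed learning rate, its output smoothed before use.
W1, W2 = he_init(3, 8), he_init(8, 1)
u_smooth = 0.0
for step in range(100):
    e = rng.normal()                      # stand-in for the tracking error
    x = np.array([e, 0.1 * e, 0.01 * e])  # proportional / integral / derivative features
    h = leaky_relu(x @ W1)
    u = float(h @ W2)
    grad = clip_gradient(np.outer(h, u))  # gradient of 0.5*u^2 w.r.t. W2 (zero target)
    W2 -= decayed_lr(0.05, step) * grad
    u_smooth = exp_smooth(u_smooth, u)    # smoothed control signal applied to the plant
```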