煤炭工程 (Coal Engineering), 2025, Vol. 57, Issue (5): 148-155. doi: 10.11799/ce202505020

• Research & Discussion •

Research on Fault Diagnosis of the Hoist Braking System Based on an Improved Transformer

王凯旋, 张宏伟

  1. South Campus, Henan Polytechnic University
  • Received: 2024-06-29; Revised: 2024-08-17; Online: 2025-05-13; Published: 2025-07-03
  • Corresponding author: 王凯旋, E-mail: wkx221016@163.com

Keywords:

mine hoist, braking system, Transformer neural network, fault diagnosis, self-attention mechanism

Abstract: As the "throat" connecting the surface and the underground workings, the operating state of the mine hoist directly affects the production efficiency and safety of the mine, and the braking system is a key guarantee of the hoist's stable operation. To reduce dependence on expert experience and fully exploit the complex relationships in the data, a fault diagnosis method for the hoist braking system based on an improved Transformer neural network is proposed. First, the fault phenomena and causes of the braking system are analyzed, and the monitoring parameters are determined. Second, an improved Transformer fault diagnosis model is built: a multi-layer self-attention mechanism captures the correlations and fault relationships among the mine hoist monitoring data, and a pooling layer is introduced into the Transformer model to reduce the number of model parameters and mitigate the risk of over-fitting. Finally, experiments are carried out on actual hoist operating data, with the Adam optimizer used to update the model parameters. The results show that the improved Transformer achieves a fault classification accuracy of 97.5%, which is 6.1, 10.0, and 14.8 percentage points higher than the Transformer, CNN, and LSTM neural networks, respectively.
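The abstract describes the model only at a high level. Its two core ingredients, scaled dot-product self-attention over a window of monitoring samples (to capture correlations between sensor readings) and a pooling layer that shrinks the sequence before subsequent layers (to cut parameters and over-fitting risk), can be illustrated with a minimal NumPy sketch. All dimensions, weights, and the placement of the pooling step here are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over a sequence of samples.

    x: (seq_len, d_model) window of embedded monitoring readings.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])           # pairwise relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ v

def avg_pool(x, stride=2):
    """Average-pool along the sequence axis, halving the token count
    so that later layers see a smaller representation."""
    seq_len = (x.shape[0] // stride) * stride
    return x[:seq_len].reshape(-1, stride, x.shape[1]).mean(axis=1)

rng = np.random.default_rng(0)
d = 16                           # embedding width (assumed)
x = rng.normal(size=(8, d))      # 8 time steps of monitoring data
wq, wk, wv = (rng.normal(size=(d, d)) for _ in range(3))

h = self_attention(x, wq, wk, wv)  # (8, 16): correlations captured
p = avg_pool(h)                    # (4, 16): pooled, fewer activations
print(h.shape, p.shape)
```

Halving the sequence length this way roughly halves the activations (and attention cost) of every layer that follows the pooling step, which is the parameter/over-fitting trade-off the abstract attributes to the pooling layer.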

CLC number: