Deakin University

A lightweight GAN-based fault diagnosis method based on knowledge distillation and deep transfer learning

Version 2 2024-06-03, 02:05
Version 1 2023-12-06, 04:43
journal contribution
posted on 2024-06-03, 02:05 authored by Hongyu Zhong, Samson Yu, Hieu Trinh, R Yuan, Y Lv, Yanan Wang
Abstract

Generative adversarial networks (GANs) have shown promise in the field of small-sample fault diagnosis. However, generating synthetic data with GANs is time-consuming, and synthetic data cannot fully replace real data. To expedite GAN-based fault diagnosis, this paper proposes a hybrid lightweight method for compressing GAN parameters. First, three modules are constructed based on the knowledge distillation GAN (KD-GAN) approach: a teacher generator, a teacher discriminator, and a student generator. The distillation operation is applied between the teacher and student generators, while adversarial training is conducted between the teacher generator and the teacher discriminator. Furthermore, a joint loss function that combines distillation loss and adversarial loss is proposed to update the parameters of the student generator. Additionally, the proposed KD-GAN method is combined with deep transfer learning (DTL) and leverages real data to enhance the diagnostic model's performance. Two numerical experiments demonstrate that the proposed KD-GAN-DTL method outperforms other GAN-based fault diagnosis methods in both computational time and diagnostic accuracy.
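The joint loss described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the abstract does not specify the individual loss terms or their weighting, so the MSE distillation term, the non-saturating adversarial term, and the weight `lam` are all assumptions chosen for clarity.

```python
import numpy as np

def joint_student_loss(student_out, teacher_out, disc_scores, lam=0.5):
    """Hypothetical joint loss for the student generator: a weighted sum of
    a distillation term and an adversarial term, as the abstract describes."""
    # Distillation loss: MSE between student and teacher generator outputs
    # (a common choice; the paper's exact formulation is not given).
    l_distill = np.mean((student_out - teacher_out) ** 2)
    # Adversarial loss: non-saturating generator loss -log D(G(z)), using the
    # teacher discriminator's scores on the student's samples (assumption).
    eps = 1e-12
    l_adv = -np.mean(np.log(disc_scores + eps))
    # Weighted combination; `lam` balances imitation vs. realism (assumption).
    return lam * l_distill + (1.0 - lam) * l_adv
```

When the student exactly matches the teacher and the discriminator is fully fooled (scores near 1), both terms vanish and the joint loss approaches zero; in practice the two terms pull the student toward imitating the teacher while still satisfying the discriminator.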

History

Journal

Measurement Science and Technology

Volume

35

Article number

036103

Location

Bristol, Eng.

ISSN

0957-0233

eISSN

1361-6501

Language

eng

Publication classification

C1 Refereed article in a scholarly journal

Issue

3

Publisher

IOP Publishing