Predicting the B-H Loops of Power Magnetics with Transformer-based Encoder-Projector-Decoder Neural Network Architecture

Haoran Li, Diego Serrano, Shukai Wang, Thomas Guillod, Min Luo, Minjie Chen

Research output: Conference contribution (chapter in book/report/conference proceeding)

7 Scopus citations

Abstract

This paper presents a transformer-based encoder-projector-decoder neural network architecture for modeling the B-H hysteresis loops of power magnetics. The transformer-based encoder-decoder network maps a flux density excitation waveform (B) into the corresponding magnetic field strength (H) waveform. The predicted B-H loop can be used to estimate the core loss and to support magnetics-in-circuit simulations. A projector is added between the transformer encoder and decoder to capture the impact of additional inputs such as frequency, temperature, and dc bias. An example transformer neural network is designed, trained, and tested to demonstrate the effectiveness of the proposed architecture.
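The architecture described in the abstract can be sketched as follows. This is a minimal illustrative PyTorch implementation, not the authors' actual model: all layer sizes, layer counts, and the exact way the projector injects the scalar operating conditions (frequency, temperature, dc bias) into the encoded sequence are assumptions made for the sake of the example.

```python
import torch
import torch.nn as nn

class EncoderProjectorDecoder(nn.Module):
    """Sketch of a transformer encoder-projector-decoder that maps a sampled
    B waveform to the corresponding H waveform. Dimensions and hyperparameters
    are illustrative assumptions, not the values used in the paper."""

    def __init__(self, d_model=64, nhead=4, num_layers=2, n_scalars=3):
        super().__init__()
        self.embed_b = nn.Linear(1, d_model)   # per-sample embedding of the B waveform
        self.embed_h = nn.Linear(1, d_model)   # per-sample embedding of the decoder input
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        # Projector: fuses the encoded B sequence with the scalar operating
        # conditions (frequency, temperature, dc bias) before decoding.
        self.projector = nn.Sequential(
            nn.Linear(d_model + n_scalars, d_model), nn.ReLU(),
            nn.Linear(d_model, d_model),
        )
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers)
        self.head = nn.Linear(d_model, 1)      # per-sample H output

    def forward(self, b_wave, scalars, h_prev):
        # b_wave: (batch, T, 1), scalars: (batch, n_scalars), h_prev: (batch, T, 1)
        memory = self.encoder(self.embed_b(b_wave))
        cond = scalars.unsqueeze(1).expand(-1, memory.size(1), -1)
        memory = self.projector(torch.cat([memory, cond], dim=-1))
        out = self.decoder(self.embed_h(h_prev), memory)
        return self.head(out)

model = EncoderProjectorDecoder()
b = torch.randn(2, 128, 1)      # sampled flux density excitation waveforms
s = torch.randn(2, 3)           # frequency, temperature, dc bias (normalized)
h_in = torch.zeros(2, 128, 1)   # decoder input sequence (e.g. for teacher forcing)
h_pred = model(b, s, h_in)      # predicted H waveform, shape (2, 128, 1)
```

Pairing each predicted H sample with its B sample traces out the B-H loop, whose enclosed area per cycle gives the core loss estimate mentioned in the abstract.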

Original language: English (US)
Title of host publication: APEC 2023 - 38th Annual IEEE Applied Power Electronics Conference and Exposition
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1543-1550
Number of pages: 8
ISBN (Electronic): 9781665475396
DOIs
State: Published - 2023
Event: 38th Annual IEEE Applied Power Electronics Conference and Exposition, APEC 2023 - Orlando, United States
Duration: Mar 19, 2023 - Mar 23, 2023

Publication series

Name: Conference Proceedings - IEEE Applied Power Electronics Conference and Exposition - APEC
Volume: 2023-March

Conference

Conference: 38th Annual IEEE Applied Power Electronics Conference and Exposition, APEC 2023
Country/Territory: United States
City: Orlando
Period: 3/19/23 - 3/23/23

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering

Keywords

  • data-driven method
  • hysteresis loop
  • machine learning
  • neural network
  • power magnetics
  • transformer
