dc.contributor: Other
dc.contributor: Thompson, John
dc.creator: Luan, Dianxin
dc.creator: Thompson, John
dc.date: 2023-01-31T14:05:17Z
dc.date.accessioned: 2023-02-17T20:25:43Z
dc.date.available: 2023-02-17T20:25:43Z
dc.identifier: Luan, Dianxin; Thompson, John. (2023). Channelformer Neural Network Software, [software]. University of Edinburgh. School of Engineering. Institute for Digital Communications. https://doi.org/10.7488/ds/3801.
dc.identifier: https://hdl.handle.net/10283/4790
dc.identifier: https://doi.org/10.7488/ds/3801
dc.description: This code was prepared for the IEEE Transactions on Wireless Communications paper "Channelformer: Attention based Neural Solution for Wireless Channel Estimation and Effective Online Training" (https://hdl.handle.net/20.500.11820/244a98cb-c237-497c-bbf2-2d8f3ad0068b). The paper abstract is as follows:
In this paper, we propose an encoder-decoder neural architecture (called Channelformer) to achieve improved channel estimation for orthogonal frequency-division multiplexing (OFDM) waveforms in downlink scenarios. The self-attention mechanism is employed to precode the input features before they are processed in the decoder. In particular, we implement multi-head attention in the encoder and a residual convolutional neural architecture as the decoder. We also employ a customized weight-level pruning with a fine-tuning process to slim the trained neural network, which significantly reduces the computational complexity and yields a low-complexity, low-latency solution. This enables reductions of up to 70% in the parameters while maintaining almost identical performance compared with the complete Channelformer. We also propose an effective online training method based on the fifth generation (5G) new radio (NR) configuration for modern communication systems, which needs only the information available at the receiver. Using industry-standard channel models, simulations show that the attention-based solutions achieve superior estimation performance compared with other candidate neural network methods for channel estimation.
The software was prepared in MATLAB R2021b, and a README file is provided with the code to give a short description of how it works.
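The released software is written in MATLAB, but as a rough illustration of the encoder-decoder idea described in the abstract, the following is a minimal PyTorch sketch of a Channelformer-style estimator: a multi-head self-attention encoder applied to the pilot features, followed by a residual convolutional decoder. The layer sizes, the real/imaginary pilot-feature layout, and all class and function names are assumptions made for this sketch; for the actual implementation, including the pruning and online-training code, see the MATLAB files in the dataset.

# Illustrative PyTorch sketch of a Channelformer-style channel estimator:
# a multi-head self-attention encoder feeding a residual convolutional decoder.
# Layer sizes, the pilot-feature layout, and all names here are assumptions for
# illustration only; they are not taken from the released MATLAB software.
import torch
import torch.nn as nn


class ResidualConvBlock(nn.Module):
    """Two 1-D convolutions with a residual (skip) connection."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, padding="same")
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, padding="same")
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(x + self.conv2(self.act(self.conv1(x))))


class ChannelformerSketch(nn.Module):
    def __init__(self, embed_dim=64, num_heads=4, num_blocks=3):
        super().__init__()
        # Input precoding: embed the real/imaginary pilot features per subcarrier.
        self.embed = nn.Linear(2, embed_dim)
        # Encoder: multi-head self-attention over the subcarrier dimension.
        self.attention = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Decoder: a stack of residual convolutional blocks.
        self.decoder = nn.Sequential(*[ResidualConvBlock(embed_dim) for _ in range(num_blocks)])
        # Output head: real/imaginary channel estimate per subcarrier.
        self.head = nn.Linear(embed_dim, 2)

    def forward(self, pilots):                      # pilots: (batch, subcarriers, 2)
        x = self.embed(pilots)
        attended, _ = self.attention(x, x, x)
        x = x + attended                            # residual around the attention encoder
        x = self.decoder(x.transpose(1, 2)).transpose(1, 2)
        return self.head(x)                         # (batch, subcarriers, 2)


# Example: estimate the channel for a batch of 16 OFDM symbols with 72 subcarriers.
model = ChannelformerSketch()
h_est = model(torch.randn(16, 72, 2))
print(h_est.shape)  # torch.Size([16, 72, 2])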
dc.format: application/zip
dc.format: text/plain
dc.language: eng
dc.publisher: University of Edinburgh. School of Engineering. Institute for Digital Communications
dc.relation: https://hdl.handle.net/20.500.11820/244a98cb-c237-497c-bbf2-2d8f3ad0068b
dc.rights: Creative Commons Attribution 4.0 International Public License
dc.subject: wireless communications
dc.subject: channel estimation
dc.subject: neural networks
dc.subject: orthogonal frequency division multiplexing
dc.subject: self-attention mechanism
dc.subject: Engineering
dc.title: Channelformer Neural Network Software
dc.type: software