This project examines the fault tolerance of transformer-based error correction models, focusing on noise introduced during syndrome processing. It builds on the transformer architecture proposed in *Accelerating Error Correction Code Transformers*.
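To make the noise model concrete, here is a minimal sketch of what "noise in syndrome processing" can mean: a syndrome is computed from a parity-check matrix, then each syndrome bit is flipped with some probability before it reaches the decoder. The matrix, codeword, and flip probability below are toy values for illustration, not taken from this project's code.

```python
# Hypothetical illustration: computing a noisy syndrome for a linear code.
# The (7,4) Hamming parity-check matrix and flip probability p are toy values.
import numpy as np

rng = np.random.default_rng(0)

# Parity-check matrix of the (7,4) Hamming code (toy example).
H = np.array([
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

def noisy_syndrome(H, x, p):
    """Compute s = H x mod 2, then flip each syndrome bit with probability p."""
    s = H @ x % 2
    flips = rng.random(s.shape) < p
    return (s ^ flips).astype(int)

x = np.zeros(7, dtype=int)  # all-zero codeword: the clean syndrome is all zeros
print(noisy_syndrome(H, x, p=0.1))
```

With `p = 0` this reduces to ordinary syndrome computation; the project studies how the decoder degrades as `p` grows.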
This work was completed as the final project for ECE 537: Coding and Information Theory (Fall 2024), taught by Dr. Bane Vasic and Dr. Asit Kumar Pradhan.
- Overleaf Document: contains the project writeup and current analysis.
- Clone the repository:

  ```shell
  git clone https://github.com/yourusername/project-repo.git
  cd project-repo
  ```
- Install dependencies (if you haven't already):

  ```shell
  pip install -r requirements.txt
  ```
Use the following command to train a 6-layer AECCT with model dimension 128 on the LDPC(49,24) code:

```shell
python main.py --code LDPC_N49_K24 --N_dec 6 --d_model 128
```