This repository contains code and resources for the paper: "The Analog Activation Function (TAAF) of Emergent Linear Systems: Evaluating the Performance of The Analog Activation Function (TAAF) on MNIST and CIFAR-10 Datasets."
This paper introduces the Analog Activation Function (TAAF), a novel activation function designed for neural networks, inspired by the principles of emergent linearity. We evaluate TAAF's performance on image classification tasks using the MNIST and CIFAR-10 datasets, comparing it against standard activation functions.
Key Findings:
- MNIST Dataset: TAAF achieves a test accuracy of 99.39% on MNIST.
- CIFAR-10 Dataset: TAAF achieves a test accuracy of 79.37% on CIFAR-10.
This repository provides the code to reproduce our experiments and explore the TAAF activation function.
This repository includes:
- `src/` directory:
  - `mnist/`: Code for training and evaluating models on MNIST.
  - `cifar10/`: Code for training and evaluating models on CIFAR-10.
  - `activation_functions.py`: Python file defining the `TAAF` and `ELU` activation function classes.
  - `utils.py`: Utility functions and shared model components.
- `tests/` directory:
  - `mnist/`: Saved model checkpoints for MNIST experiments.
  - `cifar10/`: Saved model checkpoints for CIFAR-10 experiments.
- `LICENSE`: License file for the repository.
- `README.md`: This README file.
- `requirements.txt`: List of Python package dependencies.
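The `TAAF` class in `src/activation_functions.py` is presumably exposed as a standard `torch.nn.Module`, so it can be dropped into a model anywhere you would use `nn.ReLU`. A minimal sketch of that plug-in pattern (the `SimpleTAAF` formula below is a hypothetical stand-in for illustration, not the paper's actual definition):

```python
import torch
import torch.nn as nn

class SimpleTAAF(nn.Module):
    """Hypothetical stand-in for the repository's TAAF class.

    The real definition lives in src/activation_functions.py; this
    placeholder only illustrates the nn.Module plug-in pattern.
    """
    def forward(self, x):
        return x * torch.tanh(x)  # placeholder formula, not the paper's

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),
    SimpleTAAF(),  # used exactly like nn.ReLU()
    nn.Flatten(),
    nn.Linear(16 * 28 * 28, 10),
)

logits = model(torch.randn(2, 1, 28, 28))  # a batch of 2 MNIST-sized inputs
print(logits.shape)  # torch.Size([2, 10])
```

Swapping activations in and out this way is what makes the comparison against standard activation functions straightforward.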
- Python 3.x
- PyTorch
- Torchvision
- tqdm
- Other packages as listed in `requirements.txt`
1. Clone the repository:

   ```bash
   git clone [repository-url]
   cd [repository-directory]
   ```

   Replace `[repository-url]` and `[repository-directory]`.

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```
MNIST Experiments:
1. Navigate to the MNIST directory:

   ```bash
   cd src/mnist
   ```

2. Run the training script:

   ```bash
   python train.py
   ```
CIFAR-10 Experiments:
1. Navigate to the CIFAR-10 directory:

   ```bash
   cd src/cifar10
   ```

2. Run the training script:

   ```bash
   python train.py
   ```
Model Checkpoints:
Trained model checkpoints are saved in the tests/mnist/ and tests/cifar10/ directories.
- MNIST Dataset: Automatically downloaded by torchvision.
- CIFAR-10 Dataset: Automatically downloaded by torchvision.
The CNN architecture and training parameters are detailed in the paper.
Key Training Parameters:
- Optimizer: Adam
- Learning Rate: 0.001
- Loss Function: Cross-Entropy Loss
- Batch Size: 64
- Epochs: 10 for MNIST, 20 for CIFAR-10
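The parameters above map onto a standard PyTorch training loop. A minimal sketch on a synthetic batch (the linear model here is a toy stand-in, not the paper's CNN):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-in model; the paper's CNN is defined in the repository sources
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))

optimizer = torch.optim.Adam(model.parameters(), lr=0.001)  # Adam, lr = 0.001
criterion = nn.CrossEntropyLoss()                           # cross-entropy loss

# One synthetic batch matching the batch size of 64
inputs = torch.randn(64, 1, 28, 28)
targets = torch.randint(0, 10, (64,))

losses = []
for epoch in range(10):  # 10 epochs for MNIST (20 for CIFAR-10)
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()
    losses.append(loss.item())

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

In the actual experiments the loop iterates over a `DataLoader` each epoch rather than a single fixed batch.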
Detailed results are in the paper.
- MNIST: ~99.39%
- CIFAR-10: ~79.37%
B. T. Chatfield. (2025). The Analog Activation Function (TAAF) of Emergent Linear Systems: Evaluating the Performance of The Analog Activation Function (TAAF) on MNIST and CIFAR-10 Datasets. OpenASCI & Genova Laboratories Research & Development Div.
This project is licensed under a customised version of the Hippocratic License v3 - see the LICENSE file for details.
- Personal: LinkedIn Profile
- OpenASCI
- GenoLabs
- Inspired by the Project-README-Template by YousefIbrahimismail.
- Make a Readme
- Shields
- SVG README
Discussion & Contributions are encouraged! Please feel free to open a Pull Request.