TAAF-for-Image-Classification
About

This repository contains code and resources for the paper: "The Analog Activation Function (TAAF) of Emergent Linear Systems: Evaluating the Performance of The Analog Activation Function (TAAF) on MNIST and CIFAR-10 Datasets."

This paper introduces The Analog Activation Function (TAAF), a novel activation function for neural networks inspired by the principles of emergent linearity. We evaluate TAAF on image-classification tasks using the MNIST and CIFAR-10 datasets, comparing it against standard activation functions.

Key Findings:

  • MNIST Dataset: TAAF achieves a test accuracy of 99.39% on MNIST.
  • CIFAR-10 Dataset: TAAF achieves a test accuracy of 79.37% on CIFAR-10.

This repository provides the code to reproduce our experiments and explore the TAAF activation function.

Key Features

This repository includes:

  • src/ Directory:
    • mnist/: Code for training and evaluating models on MNIST.
    • cifar10/: Code for training and evaluating models on CIFAR-10.
    • activation_functions.py: Python file defining the TAAF and ELU activation function classes.
    • utils.py: Utility functions and shared model components.
  • tests/ Directory:
    • mnist/: Saved model checkpoints for MNIST experiments.
    • cifar10/: Saved model checkpoints for CIFAR-10 experiments.
  • LICENSE: License file for the repository.
  • README.md: This README file.
  • requirements.txt: List of Python package dependencies.
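The class-based style used in activation_functions.py can be illustrated with ELU, whose formula is standard. TAAF's own definition is given in the paper, so the class below is a hypothetical stand-in that shows only the nn.Module pattern, not TAAF itself:

```python
import torch
import torch.nn as nn

class ELUActivation(nn.Module):
    """Custom activation wrapped as an nn.Module, mirroring the class-based
    style of activation_functions.py. ELU: x if x > 0, else alpha*(exp(x)-1).
    The class name and signature here are illustrative, not the repo's."""

    def __init__(self, alpha: float = 1.0):
        super().__init__()
        self.alpha = alpha

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Elementwise select between the linear and exponential branches.
        return torch.where(x > 0, x, self.alpha * (torch.exp(x) - 1))

act = ELUActivation()
out = act(torch.tensor([-1.0, 0.0, 2.0]))
```

Written this way, the activation drops into any model as a layer, e.g. inside an nn.Sequential.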

Getting Started

Prerequisites

  • Python 3.x
  • PyTorch (see requirements.txt for the version used)
  • Torchvision
  • tqdm
  • Other packages as listed in requirements.txt.

Installation

  1. Clone the repository:

    git clone [repository-url]
    cd [repository-directory]

    Replace [repository-url] and [repository-directory] with this repository's URL and the name of the cloned directory.

  2. Install Dependencies:

    pip install -r requirements.txt

Running the Code

MNIST Experiments:

  1. Navigate to the MNIST directory:

    cd src/mnist
  2. Run the training script:

    python train.py

CIFAR-10 Experiments:

  1. Navigate to the CIFAR-10 directory:

    cd src/cifar10
  2. Run the training script:

    python train.py

Model Checkpoints:

Trained model checkpoints are saved in the tests/mnist/ and tests/cifar10/ directories.
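Checkpoints stored as state_dicts can be restored with the usual PyTorch pattern. The stand-in model and file name below are placeholders: the actual CNN classes live in src/mnist and src/cifar10, and the checkpoint file names in tests/ may differ.

```python
import torch
import torch.nn as nn

# Stand-in model for illustration only; instantiate the repo's CNN in practice.
net = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))

# Save the learned parameters (state_dict), the format used for checkpoints.
torch.save(net.state_dict(), "checkpoint.pth")

# Restoring requires an identically shaped model instance.
restored = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
restored.load_state_dict(torch.load("checkpoint.pth", map_location="cpu"))
restored.eval()  # switch to inference mode before evaluating
```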

Datasets

  • MNIST Dataset: Automatically downloaded by torchvision.
  • CIFAR-10 Dataset: Automatically downloaded by torchvision.

Model and Training Details

The CNN architecture and training parameters are detailed in the paper.

Key Training Parameters:

  • Optimizer: Adam
  • Learning Rate: 0.001
  • Loss Function: Cross-Entropy Loss
  • Batch Size: 64
  • Epochs: 10 for MNIST, 20 for CIFAR-10
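The parameters above can be wired together in a minimal training loop. This sketch uses synthetic data and a stand-in linear model rather than the paper's CNN, so only the optimizer, loss, and batch-size choices mirror the list above:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)
# Synthetic MNIST-shaped data; the real scripts load torchvision datasets.
data = TensorDataset(torch.randn(256, 1, 28, 28), torch.randint(0, 10, (256,)))
loader = DataLoader(data, batch_size=64, shuffle=True)        # batch size 64

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))   # stand-in model
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)    # Adam, lr = 0.001
criterion = nn.CrossEntropyLoss()                             # cross-entropy loss

for epoch in range(2):  # the paper trains 10 epochs (MNIST) / 20 (CIFAR-10)
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```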

Results

Detailed results are in the paper.

  • MNIST: 99.39% test accuracy
  • CIFAR-10: 79.37% test accuracy
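Test accuracies such as these are conventionally computed by comparing argmax predictions against the labels; the tensors below are illustrative, not taken from the repo's evaluation code:

```python
import torch

# Hypothetical model outputs (logits) for three examples, two classes.
logits = torch.tensor([[2.0, 0.1], [0.2, 1.5], [0.9, 0.3]])
labels = torch.tensor([0, 1, 1])          # ground-truth classes

preds = logits.argmax(dim=1)              # predicted class per example
accuracy = (preds == labels).float().mean().item() * 100
```

Here two of three predictions match, giving roughly 66.7% accuracy.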

Citation

B. T. Chatfield. (2025). The Analog Activation Function (TAAF) of Emergent Linear Systems: Evaluating the Performance of The Analog Activation Function (TAAF) on MNIST and CIFAR-10 Datasets. OpenASCI & Genova Laboratories Research & Development Div.

License

This project is licensed under a customised version of the Hippocratic License v3; see the LICENSE file for details.

Contact

  • Personal: LinkedIn Profile
  • OpenASCI
  • GenoLabs

Contributing

Discussion and contributions are encouraged! Please feel free to open a Pull Request.

