HumanActivityClassifier is a SwiftUI-based iOS application for real-time human activity recognition using CoreMotion sensor data and a CoreML model. The app collects motion data, predicts activities with a sliding-window mechanism, and visualizes sensor readings as animated line charts in a modern, elegant UI. The data was first collected with the HumanActivityRecoderApp repo, the model was trained in the HumanActivityRecognition repo, and the trained model was then deployed on iOS in this app.
- Real-time accelerometer and gyroscope data collection
- Local, offline human activity prediction using a CoreML model
- Live sensor data visualization as smooth line graphs
- Sliding-window prediction (64-sample window with 50% overlap)
- Clean MVVM architecture with modular SwiftUI components
- Beautiful UI with a gradient background and card-style result display
The project uses a custom CoreML model trained to recognize human activities using 6-axis data (accelerometer and gyroscope).
- Input:
  - `MLMultiArray` of shape `[1, 64, 6]` (see the sketch below)
  - Each sample includes: `acc_x, acc_y, acc_z, gyro_x, gyro_y, gyro_z`
- Output:
  - Predicted label (e.g., `"Walking"`, `"Running"`, `"Sitting"`, ...)
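The snippet below is a minimal sketch of how one 64-sample window could be packed into an `MLMultiArray` and passed to the model. The generated model class name (`HumanActivityClassifier`) and its feature names (`input`, `classLabel`) are assumptions here; check the interface Xcode generates from the `.mlmodel` file for the actual names.

```swift
import CoreML

// Sketch only: feed one 64-sample window (64 x 6 values) to the model.
// `HumanActivityClassifier`, `input`, and `classLabel` are assumed names.
func predictActivity(window: [[Double]]) throws -> String {
    precondition(window.count == 64 && window.allSatisfy { $0.count == 6 })

    // Shape [1, 64, 6]: batch, time steps, channels (acc_x ... gyro_z).
    let array = try MLMultiArray(shape: [1, 64, 6], dataType: .float32)
    for t in 0..<64 {
        for c in 0..<6 {
            array[[0, NSNumber(value: t), NSNumber(value: c)]] =
                NSNumber(value: window[t][c])
        }
    }

    let model = try HumanActivityClassifier(configuration: MLModelConfiguration())
    let output = try model.prediction(input: array)
    return output.classLabel   // e.g. "Walking"
}
```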
- Tap Start Prediction to begin collecting sensor data.
- The app records sensor readings every ~20ms.
- Every time 64 samples are collected, the window is fed into the CoreML model (a buffering sketch follows this list).
- The prediction result is displayed and updated in real time.
- The most recent accelerometer data is plotted as a live curve.
- Tap Stop Prediction to halt motion tracking and prediction.
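As a rough illustration of this loop, the sketch below uses `CMMotionManager` device-motion updates at 50 Hz (~20 ms) and a 64-sample buffer that advances by 32 samples for 50% overlap. It assumes both accelerometer and gyroscope values come from a single `CMDeviceMotion` callback; the app may instead use separate raw accelerometer and gyro updates. The `classify` closure stands in for the CoreML call sketched above.

```swift
import CoreMotion

// Sketch of the sliding-window collection loop (names are illustrative).
final class MotionCollector {
    private let motionManager = CMMotionManager()
    private var buffer: [[Double]] = []      // each entry: [ax, ay, az, gx, gy, gz]
    private let windowSize = 64
    private let hop = 32                     // 50% overlap

    func start(classify: @escaping ([[Double]]) -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0   // ~20 ms per sample

        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self, let m = motion else { return }
            self.buffer.append([
                m.userAcceleration.x, m.userAcceleration.y, m.userAcceleration.z,
                m.rotationRate.x, m.rotationRate.y, m.rotationRate.z
            ])

            // Once a full window is available, classify it and slide forward by 32 samples.
            if self.buffer.count >= self.windowSize {
                classify(Array(self.buffer.suffix(self.windowSize)))
                self.buffer.removeFirst(self.hop)
            }
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```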
- Swift / SwiftUI
- CoreMotion (sensor access)
- CoreML (on-device deep learning inference)
- Xcode 15 or later
- iOS 16+ target
- Swift 5.9+
- Real iPhone device (CoreMotion does not work on Simulator)
```bash
git clone https://github.com//HumanActivityClassifier.git
cd HumanActivityClassifier
open HumanActivityClassifier.xcodeproj
```
- Select a physical iPhone device.
- Run the app from Xcode.
The app uses a custom LineGraphView to visualize accelerometer data (a simplified sketch follows the list below):
- X (Red), Y (Green), Z (Blue) lines
- Smooth transitions with Canvas rendering
- Auto-scales based on incoming values
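The sketch below shows the general idea: a Canvas-based, auto-scaling line graph for a single sensor axis. The type and property names here are illustrative, not the project's actual `LineGraphView` API.

```swift
import SwiftUI

// Illustrative Canvas line graph for one axis, auto-scaled to its value range.
struct SimpleLineGraph: View {
    let samples: [Double]    // e.g. recent acc_x readings
    let color: Color         // red / green / blue per axis

    var body: some View {
        Canvas { context, size in
            guard samples.count > 1,
                  let minV = samples.min(), let maxV = samples.max(),
                  maxV > minV else { return }

            var path = Path()
            for (i, value) in samples.enumerated() {
                let x = size.width * CGFloat(i) / CGFloat(samples.count - 1)
                // Map the value range onto the view height (y grows downward).
                let y = size.height * (1 - CGFloat((value - minV) / (maxV - minV)))
                if i == 0 {
                    path.move(to: CGPoint(x: x, y: y))
                } else {
                    path.addLine(to: CGPoint(x: x, y: y))
                }
            }
            context.stroke(path, with: .color(color), lineWidth: 2)
        }
    }
}
```

Stacking or overlaying three of these views, one per axis, would reproduce the red/green/blue live curves described above.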
Simple and intuitive UI with clear buttons to start and stop recording activities.
| Prediction Selection | Prediction Selection | Stop Prediction |
|---|---|---|
| ![]() | ![]() | ![]() |
If you find this project helpful, feel free to give it a ⭐️ star!


