An iOS-based application for real-time human activity data recording, supporting sensor data collection, manual labeling, and data export. Perfect for building Human Activity Recognition (HAR) datasets and training machine learning models.
The app collects motion data from the iPhone's built-in sensors (accelerometer and gyroscope) in real time and displays it as live charts.
Collected data includes:

- `acc_x`, `acc_y`, `acc_z` (acceleration)
- `gyro_x`, `gyro_y`, `gyro_z` (rotation rate)
- `timestamp`
- `label`
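One recorded sample could be modeled as a single `Codable` value. This is only a sketch: the field names follow the columns above, but the `MotionSample` type name is hypothetical, not taken from the app's source.

```swift
import Foundation

// Hypothetical shape of one recorded sample; field names mirror the
// exported columns (acc_*, gyro_*, timestamp, label).
struct MotionSample: Codable {
    let acc_x: Double
    let acc_y: Double
    let acc_z: Double
    let gyro_x: Double
    let gyro_y: Double
    let gyro_z: Double
    let timestamp: Double
    let label: String
}
```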
Users can manually assign labels during recording (e.g., "walking", "running", "climbing stairs"). Each label is automatically attached to the recorded samples, making later training and analysis easier.
Simple and intuitive UI with clear buttons to start and stop recording activities.
While recording, the app displays a dynamic chart of sensor data for quick visual feedback.
Export your recording in JSON or CSV format and share via AirDrop, Mail, WeChat, etc.
Example file name: motion_data_2025-04-15T20:42:12.json
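Export could be implemented by encoding the recorded samples with `JSONEncoder` and writing the file to the app's Documents directory. The snippet below is a minimal sketch under that assumption; the `Sample` type and `samples` array are placeholders for the app's actual data.

```swift
import Foundation

// Placeholder sample type and data; the real app records full sensor rows.
struct Sample: Codable {
    let timestamp: Double
    let label: String
}
let samples = [Sample(timestamp: 0, label: "walking")]

// Build a timestamped file name like motion_data_2025-04-15T20:42:12.json
let formatter = ISO8601DateFormatter()
let filename = "motion_data_\(formatter.string(from: Date())).json"

// Encode and write to the Documents directory via FileManager.
let url = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent(filename)
let data = try JSONEncoder().encode(samples)
try data.write(to: url)
```

From there, the file URL can be handed to a share sheet (`UIActivityViewController` / SwiftUI `ShareLink`) for AirDrop, Mail, and similar targets.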
- Swift / SwiftUI
- CoreMotion (sensor access)
- Combine (data binding)
- Codable (JSON encoding)
- FileManager (data saving)
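Reading the accelerometer and gyroscope with CoreMotion could look like the following. This is a sketch, not the app's actual code: the function names and the 50 Hz sampling rate are illustrative assumptions.

```swift
import CoreMotion

// A single CMMotionManager instance should be shared for the app's lifetime.
let manager = CMMotionManager()

func startRecording() {
    guard manager.isAccelerometerAvailable, manager.isGyroAvailable else { return }

    // Illustrative 50 Hz sampling rate.
    manager.accelerometerUpdateInterval = 1.0 / 50.0
    manager.gyroUpdateInterval = 1.0 / 50.0

    manager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // a.x, a.y, a.z map to acc_x, acc_y, acc_z
    }
    manager.startGyroUpdates(to: .main) { data, _ in
        guard let r = data?.rotationRate else { return }
        // r.x, r.y, r.z map to gyro_x, gyro_y, gyro_z
    }
}

func stopRecording() {
    manager.stopAccelerometerUpdates()
    manager.stopGyroUpdates()
}
```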
- Support for multiple labels and label switching during recording
- Export format selection (JSON / CSV)
- More detailed charts for extra data dimensions
- Real-time activity prediction using CoreML
| Label Selection | Sensor Data & Export |
|---|---|
| ![]() | ![]() |
If you find this project helpful, feel free to give it a ⭐️ star!


