# Camera Master - AI-Powered Education Monitoring

Camera Master is a comprehensive Python package that provides AI-powered features for education monitoring, including face recognition attendance, gesture detection, emotion analysis, attention tracking, and gamification.
## Features

### Phase 1 - Core Features

- Face Recognition Attendance: Automated attendance tracking using DeepFace and OpenCV
- Gesture Recognition: Hand gesture detection (numbers 0-5) using MediaPipe
- Emotion Analysis: Real-time emotion detection (7 emotions) using DeepFace
- Visualization: Charts and graphs with Matplotlib

### Phase 2 - Extensions

- Mask Detection: Face mask compliance monitoring
- Age/Gender Estimation: Demographic analysis using DeepFace
- Attention Tracking: Eye aspect ratio and head pose monitoring
- Audio Feedback: Text-to-speech notifications (pyttsx3)
- Automated Reports: CSV/JSON/HTML report generation

### Phase 3 - Advanced Features

- Gesture-to-Text: Hand sign to text conversion
- Fatigue Detection: Drowsiness and yawning detection
- Spoof Detection: Liveness check via blink detection
- Mood Tracker: Long-term emotion trend analysis

### Phase 4 - Enterprise Ready

- Access Control: Face-based authentication with authorization levels
- Gamification: Points, badges, levels, and leaderboards
- Dashboards: Interactive Streamlit/Gradio interfaces
- Anomaly Detection: Engagement alerts and notifications
## Installation

### Prerequisites

- Python 3.8 or higher
- Webcam or camera device
- Windows/Linux/macOS

### Install from source

```bash
# Clone the repository
git clone https://github.com/RNSsanjay/camera-master.git
cd camera-master

# Install dependencies
pip install -r requirements.txt

# Install the package
pip install -e .
```

### Install from PyPI (once published)

```bash
pip install camera-master
```
## Quick Start

### 1. Attendance System

```python
from camera_master import Attendance

# Initialize
attendance = Attendance()

# Register a face
attendance.register_face("John Doe")

# Start monitoring
df = attendance.start_monitoring(camera_index=0)

# Save report
attendance.save_report()
```

### 2. Emotion Analysis

```python
from camera_master import EmotionAnalyzer

# Initialize
analyzer = EmotionAnalyzer()

# Start analysis
analyzer.start_analysis(camera_index=0)

# Get statistics
stats = analyzer.get_emotion_statistics()
print(f"Dominant emotion: {stats['dominant_emotion']}")
```

### 3. Gesture Recognition

```python
from camera_master import GestureRecognizer

# Initialize
recognizer = GestureRecognizer()

# Start recognition
recognizer.start_recognition(camera_index=0)
```

### 4. Comprehensive Monitoring

```python
from camera_master import (
    Attendance, EmotionAnalyzer, AttentionTracker,
    Visualizer, ReportGenerator
)

# Initialize all components
attendance = Attendance()
emotion = EmotionAnalyzer()
attention = AttentionTracker()
visualizer = Visualizer()
reports = ReportGenerator()

# Run monitoring session
# ... (see examples/demo_comprehensive.py)
```
## Command-Line Interface

Camera Master provides convenient CLI commands:

### Attendance

```bash
# Register a new face
camera-master attendance --register "John Doe"

# Start monitoring
camera-master attendance --start --camera 0
```

### Emotion Analysis

```bash
camera-master emotion --start --camera 0
```

### Gesture Recognition

```bash
camera-master gesture --start --camera 0
```

### Attention Tracking

```bash
camera-master attention --start --camera 0
```

### Mask Detection

```bash
camera-master mask --start --camera 0
```

### Fatigue Detection

```bash
camera-master fatigue --start --camera 0
```

### Spoof Detection

```bash
camera-master spoof --start --camera 0
```

### Gamification Dashboard

```bash
camera-master gamification --user "John Doe" --leaderboard
```
## Examples

Explore the examples/ directory for complete working examples:

- `demo_attendance.py` - Basic attendance system
- `demo_emotion.py` - Emotion analysis with visualization
- `demo_comprehensive.py` - Full monitoring system with gamification

Run examples:

```bash
python examples/demo_attendance.py
python examples/demo_emotion.py
python examples/demo_comprehensive.py
```
## Package Structure

```
camera-master/
├── camera_master/
│   ├── __init__.py          # Package initialization
│   ├── attendance.py        # Face recognition attendance
│   ├── gesture.py           # Gesture recognition
│   ├── emotion.py           # Emotion analysis
│   ├── visualization.py     # Data visualization
│   ├── reports.py           # Report generation
│   ├── utils.py             # Utility functions
│   ├── attention.py         # Attention tracking
│   ├── mask_detection.py    # Mask detection
│   ├── age_gender.py        # Age/gender estimation
│   ├── fatigue.py           # Fatigue detection
│   ├── spoof.py             # Spoof detection
│   ├── mood_tracker.py      # Mood tracking
│   ├── access_control.py    # Access control
│   ├── gamification.py      # Gamification engine
│   └── cli.py               # Command-line interface
├── examples/
│   ├── demo_attendance.py
│   ├── demo_emotion.py
│   └── demo_comprehensive.py
├── setup.py
├── pyproject.toml
├── requirements.txt
└── README.md
```
## Features Breakdown
### Attendance System
- Face Detection: Multiple backend support (OpenCV, SSD, MTCNN, RetinaFace)
- Face Recognition: Multiple models (VGG-Face, Facenet, OpenFace, ArcFace)
- Database: Local face database storage
- Reports: CSV/JSON export with timestamps
### Emotion Analysis
- 7 Emotions: Happy, Sad, Angry, Fear, Surprise, Disgust, Neutral
- Real-time Detection: Frame-by-frame analysis
- Confidence Scores: Percentage for each emotion
- Trend Analysis: Mood tracking over time
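To illustrate how per-frame results can be aggregated into a trend, a session's dominant emotion can be computed as a majority vote across frames. This is a minimal sketch only; the `dominant_emotion` helper and its input shape are assumptions for illustration, not the package API:

```python
from collections import Counter

def dominant_emotion(frames):
    """Pick the emotion that wins the most frames.

    frames: list of dicts mapping emotion name -> confidence score,
    one dict per analyzed video frame.
    """
    winners = (max(scores, key=scores.get) for scores in frames)
    return Counter(winners).most_common(1)[0][0]

frames = [
    {"happy": 0.8, "neutral": 0.2},
    {"happy": 0.6, "sad": 0.4},
    {"neutral": 0.9, "happy": 0.1},
]
print(dominant_emotion(frames))  # happy
```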
### Gesture Recognition
- Hand Detection: MediaPipe Hands
- Number Recognition: 0-5 finger counting
- Special Gestures: OK, Thumbs up/down, Peace sign
- Custom Training: Train your own gestures
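The finger counting above can be approximated directly from hand landmarks. Below is a simplified sketch built on MediaPipe's 21-point landmark ordering; the helper is hypothetical (it ignores the thumb, which needs a horizontal test) and is not the package's implementation:

```python
def count_raised_fingers(landmarks):
    """Count raised non-thumb fingers from 21 (x, y) hand landmarks.

    Uses MediaPipe's landmark ordering: fingertips at indices 8/12/16/20,
    PIP joints at 6/10/14/18. In image coordinates y grows downward, so a
    finger counts as raised when its tip is above (smaller y than) its PIP.
    """
    tips, pips = (8, 12, 16, 20), (6, 10, 14, 18)
    return sum(landmarks[t][1] < landmarks[p][1] for t, p in zip(tips, pips))

# Dummy hand: all landmarks at y=0.5, then raise the index and middle fingers.
hand = [(0.5, 0.5)] * 21
hand[8] = (0.5, 0.3)   # index fingertip above its PIP joint
hand[12] = (0.5, 0.2)  # middle fingertip above its PIP joint
print(count_raised_fingers(hand))  # 2
```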
### Attention Tracking
- Eye Tracking: Eye Aspect Ratio (EAR) calculation
- Head Pose: Pitch, yaw, roll estimation
- Attention Score: Combined metric (0-1)
- Drowsiness Alert: Real-time warnings
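The Eye Aspect Ratio mentioned above has a standard closed form over six eye landmarks: EAR = (|p2-p6| + |p3-p5|) / (2 |p1-p4|), where p1/p4 are the eye corners and the other points sit on the eyelids. A minimal sketch (illustrative helper, not the package's code):

```python
import math

def eye_aspect_ratio(eye):
    """Eye Aspect Ratio from six (x, y) eye landmarks ordered p1..p6.

    p1/p4 are the horizontal corners, p2/p3 the upper lid, p5/p6 the
    lower lid. EAR stays roughly constant while the eye is open and
    drops toward zero when it closes.
    """
    vertical = math.dist(eye[1], eye[5]) + math.dist(eye[2], eye[4])
    horizontal = math.dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

open_eye = [(0, 0), (2, 3), (4, 3), (6, 0), (4, -3), (2, -3)]
closed_eye = [(0, 0), (2, 0.3), (4, 0.3), (6, 0), (4, -0.3), (2, -0.3)]
print(eye_aspect_ratio(open_eye))    # well above a 0.25 closure threshold
print(eye_aspect_ratio(closed_eye))  # near zero, signalling a closed eye
```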
### Fatigue Detection
- Eye Closure: Prolonged blink detection
- Yawning: Mouth aspect ratio analysis
- Fatigue Levels: Normal, Mild, Warning, Critical
- Alerts: Visual and audio warnings
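Prolonged eye closure, as opposed to a normal blink, can be detected by measuring how long the EAR stays below a threshold. The sketch below uses illustrative thresholds, not the package's actual defaults:

```python
def prolonged_closures(ear_series, fps=30, ear_threshold=0.21, min_seconds=1.0):
    """Count runs of consecutive frames with EAR below threshold that last
    at least min_seconds - the 'prolonged blink' signal behind drowsiness
    alerts. All parameter values here are illustrative assumptions.
    """
    min_frames = int(fps * min_seconds)
    count, run = 0, 0
    for ear in ear_series:
        if ear < ear_threshold:
            run += 1
        else:
            if run >= min_frames:
                count += 1
            run = 0
    if run >= min_frames:  # run still open at end of series
        count += 1
    return count

# At 30 fps: one quick blink (3 frames) and one 45-frame closure (~1.5 s).
series = [0.3] * 30 + [0.1] * 3 + [0.3] * 30 + [0.1] * 45 + [0.3] * 10
print(prolonged_closures(series))  # 1
```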
### Gamification
- Points System: Earn points for engagement
- Badges: 8+ achievement badges
- Levels: Progressive leveling system
- Leaderboards: Compete with peers
- Streaks: Attendance streak tracking
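A progressive leveling system like the one above typically makes each level cost more points than the last. The curve below is a hypothetical example for illustration; the package's actual point values and progression may differ:

```python
def level_for_points(points, base=100):
    """Map accumulated points to a level with an increasing cost per level:
    level 2 costs 100 points, level 3 another 200, level 4 another 300, ...
    (a hypothetical progression curve).
    """
    level, step, threshold = 1, base, base
    while points >= threshold:
        level += 1
        step += base
        threshold += step
    return level

for pts in (0, 99, 100, 299, 300, 600):
    print(pts, "->", "level", level_for_points(pts))
```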
## Configuration

### Camera Settings

```python
# Use a different camera
attendance = Attendance()
attendance.start_monitoring(camera_index=1)  # Use the second camera
```

### Recognition Thresholds

```python
# Adjust recognition sensitivity
attendance = Attendance(
    model_name="Facenet",
    threshold=0.5,  # Lower = more strict
    detector_backend="retinaface"
)
```

### Attention Parameters

```python
# Customize attention tracking
tracker = AttentionTracker(
    ear_threshold=0.25,      # Eye closure threshold
    attention_threshold=0.5  # Minimum attention score
)
```
## Reports and Analytics

### Generate Reports

```python
from camera_master import ReportGenerator

report_gen = ReportGenerator()

# Attendance report
report_gen.generate_attendance_report(attendance_data, output_format='html')

# Emotion report
report_gen.generate_emotion_report(emotion_data, output_format='csv')

# Comprehensive report
report_gen.generate_comprehensive_report(
    attendance_data=attendance_df,
    emotion_data=emotion_df,
    attention_data=attention_df,
    output_format='html'
)
```

### Visualizations

```python
from camera_master import Visualizer

visualizer = Visualizer()

# Emotion distribution pie chart
visualizer.plot_emotion_distribution(emotion_df)

# Emotion timeline
visualizer.plot_emotion_timeline(emotion_df)

# Attention metrics
visualizer.plot_attention_metrics(attention_df)

# Complete dashboard
visualizer.create_dashboard(attendance_df, emotion_df, attention_df)
```
## Use Cases

### Education

- Classroom Monitoring: Track student engagement and attention
- Online Learning: Monitor remote student participation
- Attendance Management: Automated attendance tracking
- Behavior Analysis: Understand student emotional patterns

### Corporate

- Meeting Analytics: Analyze meeting engagement
- Training Assessment: Monitor trainee attention
- Security: Face-based access control
- HR Analytics: Employee engagement metrics

### Healthcare

- Patient Monitoring: Track patient emotional state
- Therapy Sessions: Analyze emotional responses
- Elderly Care: Fatigue and attention monitoring
## Dependencies
- opencv-python: Computer vision
- mediapipe: Hand and face mesh detection
- deepface: Face recognition and analysis
- numpy: Numerical computing
- pandas: Data manipulation
- matplotlib: Visualization
- pyttsx3: Text-to-speech
- streamlit: Web dashboards
- gradio: ML interfaces
- tensorflow: Deep learning backend
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Author
RNS Sanjay
- GitHub: @RNSsanjay
## Acknowledgments
- DeepFace for face recognition
- MediaPipe for hand and face detection
- OpenCV for computer vision
- TensorFlow for deep learning
## Support
For support, please open an issue on GitHub or contact the maintainers.
## Roadmap

### Version 0.2.0

- Cloud sync (Firebase/Supabase)
- Real-time dashboards
- Mobile app integration
- Multi-camera support

### Version 0.3.0

- Advanced ML models
- Custom model training
- API endpoints
- Docker deployment

### Version 1.0.0

- Production-ready features
- Comprehensive documentation
- Performance optimization
- Enterprise features
## Disclaimer
This software is provided for educational and monitoring purposes. Ensure compliance with local privacy laws and regulations when using face recognition and monitoring technologies. Always obtain proper consent from individuals being monitored.
## Performance
- Face Recognition: ~100ms per frame (VGG-Face)
- Emotion Detection: ~150ms per frame
- Gesture Recognition: ~30ms per frame
- Attention Tracking: ~50ms per frame
Performance may vary based on hardware and model selection.
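Since these figures depend heavily on hardware and model selection, a small generic harness can reproduce them locally with any per-frame callable. The `ms_per_frame` helper below is an illustrative sketch, not part of the package:

```python
import time

def ms_per_frame(process, frames, warmup=3):
    """Average per-frame latency in milliseconds for a frame-processing
    callable. A few warmup calls are made first so that lazy model loading
    and caches do not skew the measurement.
    """
    for frame in frames[:warmup]:
        process(frame)
    start = time.perf_counter()
    for frame in frames:
        process(frame)
    return (time.perf_counter() - start) * 1000.0 / len(frames)

# Example with a stand-in workload instead of a real detector:
frames = list(range(50))
latency = ms_per_frame(lambda f: sum(i * i for i in range(1000)), frames)
print(f"{latency:.3f} ms/frame")
```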
## Privacy & Security
- Local Processing: All processing happens locally
- No Cloud Required: Works offline
- Data Control: You control all data
- Encrypted Storage: Option for encrypted face database
## Platform Support

- Windows 10/11: supported
- Linux (Ubuntu 20.04+): supported
- macOS (10.15+): supported
- Raspberry Pi: supported, with limited performance
Made with ❤️ by RNS Sanjay
---

## Release Files (PyPI, version 0.1.0)

### Source distribution: camera_master-0.1.0.tar.gz

- Size: 54.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0, CPython/3.13.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | `36adba9ad8f74fe59761fee847437b7839fcae93d702228a07b1edd8306ed2f0` |
| MD5 | `a2d98ac2421c5f638f42e40b5a8c7651` |
| BLAKE2b-256 | `8252863718966870f2bc013c6174c12fb43dd97929005d46e3114b07b51c853a` |

### Built distribution: camera_master-0.1.0-py3-none-any.whl

- Size: 54.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0, CPython/3.13.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | `3d2a6a0bb137211995ec40d4df338f1f67a7da66aad9a12024f84340f7410310` |
| MD5 | `9e228f032f8de000c0eed0970f58f65e` |
| BLAKE2b-256 | `a37f50b3ab5ec02862e0ee0e69d96c561284c87b0d77aa6d6ee9507d3e07de49` |