An eye gaze tracking library based on computer vision and neural networks, built for NeuralAction.
This repository is part of the NeuralAction project.
The current gaze tracking model's mean error is 3.2 cm at a distance of 50 cm without any calibration.
With calibration, the mean error is ~1.8 cm.
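The calibration method itself is not described here. As a minimal sketch of one common approach (a per-axis linear correction fitted by least squares from samples collected while the user fixates known on-screen targets), the snippet below illustrates the idea; the class, method names, and sample values are hypothetical and not part of this library's API.

```csharp
using System;
using System.Linq;

// Sketch only: fit target ≈ a * predicted + b per screen axis by ordinary least
// squares, then use (a, b) to correct raw model outputs. This is an illustration
// of the general idea, not the calibration algorithm used by this repository.
static class GazeCalibrationSketch
{
    public static (double A, double B) FitAxis(double[] predicted, double[] target)
    {
        double meanP = predicted.Average(), meanT = target.Average();
        double cov = 0, varP = 0;
        for (int i = 0; i < predicted.Length; i++)
        {
            cov += (predicted[i] - meanP) * (target[i] - meanT);
            varP += (predicted[i] - meanP) * (predicted[i] - meanP);
        }
        double a = varP > 1e-9 ? cov / varP : 1.0; // identity fallback for degenerate input
        return (a, meanT - a * meanP);
    }

    public static void Main()
    {
        // Example (hypothetical) predicted vs. true horizontal gaze positions in cm.
        double[] predictedX = { -9.8, -4.6, 0.7, 5.3, 10.4 };
        double[] targetX    = { -10.0, -5.0, 0.0, 5.0, 10.0 };
        var (a, b) = FitAxis(predictedX, targetX);
        Console.WriteLine($"corrected x = {a:F3} * raw + {b:F3}");
    }
}
```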
- Gaze tracking with calibration
- Research
  - More accurate eye-gaze tracker
  - More accurate eye-blink classification
  - More robustness to differences in a person's appearance
  - Fewer calibration attempts
- Develop
  - Support ARM64 (Windows on ARM)
  - Support WinML / ONNX models
  - Optimize for low-power devices (low battery, slow CPU or GPU)
  - Windows eye-tracking accessibility API integration
  - Channel-merged input
  - Neural-net-based calibrator
  - MobileNet-v3 training
  - Support IR cameras from Windows Hello
  - Auto-detect and load calibration data
  - OpenCvSharp native recompile
  - SharpFace native recompile
  - WinML backend
  - ONNX model runner (see the sketch after this list)
  - ONNX-formatted gaze model
  - ONNX-formatted eye-blink model
  - GPU support
  - FP16 computation
  - Grayscale (for IR cameras)
  - StyleAug
  - CycleGAN for race appearance transfer
  - Apply transfer learning
- Integration
  - Chrome UI accessibility exposure
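The "ONNX model runner" item above is a plan rather than an implementation. As a rough sketch of what running an ONNX-exported gaze model with ONNX Runtime for .NET could look like: the model file name, input name, and 1x1x36x60 eye-patch shape below are assumptions for illustration, not values taken from this repository.

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Sketch of running an ONNX-exported gaze model with Microsoft.ML.OnnxRuntime.
// "gaze.onnx" and the 1x1x36x60 NCHW eye-patch layout are illustrative assumptions,
// not this repository's actual model or tensor shape.
class OnnxGazeSketch
{
    static void Main()
    {
        using var session = new InferenceSession("gaze.onnx");
        string inputName = session.InputMetadata.Keys.First();

        // A normalized grayscale eye patch, flattened to NCHW (1 x 1 x 36 x 60).
        var eyePatch = new float[1 * 1 * 36 * 60];
        var tensor = new DenseTensor<float>(eyePatch, new[] { 1, 1, 36, 60 });

        var inputs = new List<NamedOnnxValue>
        {
            NamedOnnxValue.CreateFromTensor(inputName, tensor)
        };
        using var results = session.Run(inputs);

        // Assume the model emits a 2D gaze vector or screen offset.
        float[] gaze = results.First().AsEnumerable<float>().ToArray();
        System.Console.WriteLine($"gaze: ({gaze[0]}, {gaze[1]})");
    }
}
```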
- Gaze tracking calibration code
- More varied training data for the gaze tracking model
- Single-camera gaze tracking
- Gaze tracking service
- Abstractions around OpenCV
- Abstractions around TensorFlow
- Platform abstraction layer (files, audio, video, etc.)
- Face tracking (Tadas/OpenFace)
- Cascade object detection
- Some OpenCV examples
- Cross-platform webcam I/O
- Data sharing between OpenCV and TensorFlow
- Input image normalization (see the sketch after this list)
- GPU acceleration support
- Model imports
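Several items above (cross-platform webcam I/O, input image normalization) concern preparing camera frames for the network. The sketch below shows one plausible way to do that with OpenCvSharp directly; it is not this library's own API, and the grayscale conversion, 60x36 patch size, and mean/std standardization are illustrative assumptions.

```csharp
using OpenCvSharp;

// Sketch of webcam capture and input normalization using OpenCvSharp directly.
// The 60x36 patch size, grayscale conversion, and mean/std standardization are
// assumptions for illustration; they are not taken from this repository.
class NormalizationSketch
{
    static void Main()
    {
        using var capture = new VideoCapture(0);      // default webcam
        using var frame = new Mat();
        if (!capture.IsOpened() || !capture.Read(frame) || frame.Empty())
            return;

        using var gray = new Mat();
        Cv2.CvtColor(frame, gray, ColorConversionCodes.BGR2GRAY);

        using var patch = new Mat();
        Cv2.Resize(gray, patch, new Size(60, 36));    // assumed network input size

        // Standardize: (pixel - mean) / stddev, folded into a single ConvertTo call.
        Cv2.MeanStdDev(patch, out Scalar mean, out Scalar std);
        double scale = std.Val0 > 1e-6 ? 1.0 / std.Val0 : 1.0;
        using var normalized = new Mat();
        patch.ConvertTo(normalized, MatType.CV_32F, scale, -mean.Val0 * scale);
    }
}
```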