This project is maintained by NTUEE-ESLab

Smart Guidance Helmet

ESLab final project (2018 Fall) @ National Taiwan University
Inspired by YutaItoh/3D-Eye-Tracker
Authors: B04901003 許傑盛, B04901059 蔡承佑

Description

This project aims to use gaze detection to do something fun and fancy.

To reproduce this project, you need a wearable headset consisting of a Raspberry Pi (connected to the Internet and on the same subnet as the PC), an IR LED, and a Pi NoIR Camera V2 to capture the eye pupil.

Demonstration video: Video on YouTube

Why use IR

Since Asian eyes typically have dark irises, it is hard to distinguish the pupil from the iris in the visible spectrum. In the infrared band (~850 nm), however, the pupil absorbs the light while the iris reflects it, so the pupil shows up as the darkest region of the frame.
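This dark-pupil property is what makes IR detection simple: pick the darkest pixels and take their centroid. The sketch below is not taken from the project source (which builds on YutaItoh/3D-Eye-Tracker); it is a minimal illustration of the idea on a raw grayscale buffer.

```cpp
// Sketch (illustrative, not project code): locate the pupil in an IR frame by
// averaging the positions of pixels darker than a threshold. Under ~850 nm
// illumination the pupil absorbs light, so these pixels cluster on the pupil.
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

// `frame` is a row-major 8-bit grayscale image of size width x height.
// Returns the centroid (x, y) of all pixels below `threshold`,
// or (-1, -1) if no pixel qualifies.
std::pair<double, double> darkCentroid(const std::vector<uint8_t>& frame,
                                       std::size_t width, std::size_t height,
                                       uint8_t threshold) {
    double sx = 0.0, sy = 0.0;
    std::size_t n = 0;
    for (std::size_t y = 0; y < height; ++y)
        for (std::size_t x = 0; x < width; ++x)
            if (frame[y * width + x] < threshold) {
                sx += static_cast<double>(x);
                sy += static_cast<double>(y);
                ++n;
            }
    if (n == 0) return {-1.0, -1.0};  // no pupil candidate in this frame
    return {sx / n, sy / n};
}
```

A real tracker would follow this with ellipse fitting to get the gaze direction, but the thresholding step above is why the IR LED matters at all.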

Project Structure

How to build

Our source code is split into two parts: one for gaze capture and one for the UI display built with Qt; all of it is written in C++.
Everything was tested on Ubuntu 16.04, with a Raspberry Pi 3 Model B.

Prerequisite

PC side

Raspberry Pi side

Build the project

1. Run gaze tracking

(PC) $ git clone https://github.com/NTUEE-ESLab/2018Fall_Smart-Guidance-Helmet.git
(PC) $ cd 2018Fall_Smart-Guidance-Helmet/gaze-tracking
(PC) $ mkdir build; cd build
(PC) $ cmake ..; make
(PC) $ cmake ..; make  # a second pass may be needed; it's our first time writing CMake for a project this size
(PC) $ cd main
(Rpi)$ raspivid -cd MJPG -w 640 -h 480 -b 9000000 -fps 20 -vf -t 0 -o - | gst-launch-1.0 fdsrc ! jpegparse ! rtpjpegpay ! udpsink host=<PC IP> port=5000
(PC) $ ./main
Some debug keys are pre-assigned for better control of the software:

2. Run eye direction UI

(PC) $ cd 2018Fall_Smart-Guidance-Helmet/eye-direction-UI
(PC) $ qmake; make
(PC) $ ./eyedirection2

Acknowledgements