Eye Blink Detection: Integrated - Master of Computer Applications
by
January 2021
AMRITA VISHWA VIDYAPEETHAM
MYSURU CAMPUS
BONAFIDE CERTIFICATE
This is to certify that the project entitled "EYE BLINK DETECTION" submitted by,
for the award of the degree of Integrated - Master of Computer Applications at Amrita
School of Arts and Sciences is a bonafide record of the work carried out by them under my
guidance and supervision at Amrita Vishwa Vidyapeetham, Mysuru.
CHAIRPERSON
AMRITA VISHWA VIDYAPEETHAM
MYSURU CAMPUS
DECLARATION
We,
ANUDEEP DASARI [MY.SC.I5MCA18019]
ABHIJNA P S [MY.SC.I5MCA18024]
NAVNEET KUMAR SINGH [MY.SC.I5MCA18033]
hereby declare that this project report, entitled “EYE BLINK DETECTION” is a record of
the original work done by us under the guidance of Mr. AKSHAY S, Assistant Professor,
Department of Computer Science, Amrita School of Arts and Sciences, Mysuru and that to
the best of our knowledge, this work has not formed the basis for the award of any
degree/diploma/associate-ship/fellowship or a similar award, to any candidate in any
University.
1. Abhijna P S
2. Navneet Kumar Singh
3. Anudeep Dasari
ACKNOWLEDGEMENT
We would like to express our sincere thanks to Amma, our beloved chancellor
“Mata Amritanandamayi Devi”.
We would like to express our sincere thanks to Br. Sunil Dharmapal, Director,
who supported us with continuous encouragement and motivation and guided us in the
right direction throughout our tenure of studies.
We would like to express our sincere thanks to Br. Venugopal, Correspondent, for
providing us with a peaceful and well-supported study environment with excellent
infrastructure.
Further, we extend our thanks to all the faculty members and technical staff of our
department and of ICTS for their suggestions, support and for providing resources
when needed.
Finally, we express our sincere gratitude to our parents and friends, whose love and
affection helped us carry out the project smoothly and successfully.
ABSTRACT
With a growing number of computing devices around us, the more time we spend interacting
with them, the greater the adverse effect on our health. The aim of this project is to remind
users to take care of their eyes and so avoid computer vision syndrome. Nowadays almost all
computers come with a camera, so we make use of it to build an eye blink detector application.
Advances in technology have made people more work-driven, and we use technology itself to
support their healthcare. This is not the first application ever built to track a person's eye blink
count, but our application provides a user-friendly as well as an efficient way of doing so.
Since the application runs in the background, it lets the user work with other applications while
reminding them to take care of their eyes through regular notifications.
Table of Contents
CERTIFICATE……………………….. i
DECLARATION………………………. ii
ACKNOWLEDGEMENT……………… iii
ABSTRACT……………………………. iv
LIST OF FIGURES…………………….. v
LIST OF TABLES……………………… vi
4.2 Explanation of key function……………22
4.3 Result analysis………………………….25
4.4 Method of implementation……………..26
4.5 Conclusion……………………………...28
5. Testing and Validation 29
5.1 Introduction……………………………..30
5.2 Design of test case and scenarios……......30
5.3 Validation……………………………….30
5.4 Conclusion………………………………31
6. Conclusion 32
6.1 Future Enhancement……………………..33
7. References 34
LIST OF FIGURES
LIST OF TABLES
CHAPTER 1
INTRODUCTION
1.1 Introduction
Viewing a computer or digital screen often makes the eyes work harder. As a result, the unique
characteristics and high visual demands of computer and digital screen viewing make many
individuals susceptible to vision-related symptoms. The project focuses on helping users
protect their own eyes and avoid such vision problems. After hours of computer use we tend
to forget the strain we place on our eyes; this project is a small effort to prevent that and to
keep vision healthy.
1.2 Motivation
Watching a screen for long periods can lead to several eye and vision syndromes. Keeping
that in mind, we undertook this project to help individuals avoid such syndromes in the
future and take better care of their eyes.
CHAPTER 2
ANALYSIS
2.1 Introduction
This chapter describes the software used to build the project and the hardware requirements,
explains how the project works, and presents the flowchart of the eye blink detection module.
Figure 2.4.1- OpenCV
PyCharm
PyCharm is an integrated development environment (IDE) used in computer programming,
specifically for the Python language. It is developed by the Czech company JetBrains.
PyCharm provides a wide range of essential tools for Python developers, tightly integrated
to create a convenient environment for productive Python, web, and data science
development.
Dlib
Dlib is a modern C++ toolkit containing machine learning algorithms and tools for creating
complex software in C++ to solve real-world problems. It is used in both industry and
academia in a wide range of domains including robotics, embedded devices, mobile phones,
and large high-performance computing environments. Dlib's open-source licensing allows
you to use it in any application, free of charge. We build the Python bindings for Dlib using
the CMake build tool.
PyQt5
PyQt5 is a comprehensive set of Python bindings for the cross-platform GUI toolkit Qt v5,
implemented as more than 35 extension modules. It enables Python to be used as an
alternative application development language to C++ on all supported platforms, including
iOS and Android. PyQt5 is free software developed by the British firm Riverbank Computing.
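As a quick illustration of what these bindings look like, a minimal PyQt5 window can be created in a few lines (a generic sketch, not part of the project code):

import sys
from PyQt5.QtWidgets import QApplication, QLabel

# Minimal PyQt5 sketch (illustrative only, not part of the project code).
app = QApplication(sys.argv)
label = QLabel("Hello from PyQt5")
label.show()
sys.exit(app.exec_())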
2.5 Content diagram of Project
2.6 Flowchart of blink detection module.
CHAPTER 3
DESIGN
3.1 ARCHITECTURE DIAGRAM
3.3 USE CASE DIAGRAM
3.4 Module design
Figure 3.4
CHAPTER 4
IMPLEMENTATION & RESULTS
4.1 Introduction
Traditional image processing methods for detecting blinks typically involve some
combination of:
1. Eye localization.
2. Thresholding to find the whites of the eyes.
3. Determining whether the "white" region of the eyes disappears for a period of time
(indicating a blink).
The eye aspect ratio (EAR) is instead a much more elegant solution that involves a very
simple calculation based on the ratio of distances between facial landmarks of the eyes. This
method for eye blink detection is fast, efficient, and easy to implement.
In the first part we’ll discuss the eye aspect ratio and how it can be used to determine if a
person is blinking or not in a given video frame.
Importing the packages other than Dlib is straightforward; they can be installed using conda
commands or from within PyCharm itself. PyQt5 is used for the layout design, and the .ui file
it produces is loaded by the program to run the application.
With the necessary packages imported, we define the application class and a function that
implements the EAR calculation.
class startb(QDialog):
    def __init__(self):
        print("Application initialization...")
        super(startb, self).__init__()
        loadUi("untitled.ui", self)                          # load the PyQt Designer layout
        self.startbutton.clicked.connect(self.blinkdetect)   # START button triggers detection
Starting with the class: __init__ is the function in which the .ui file is loaded to build the
application layout, and startbutton is the object name of the START button, whose click
signal is connected to the blinkdetect method.
def eye_aspect_ratio(self, eye):
    # compute the euclidean distances between the two sets of
    # vertical eye landmarks (x, y)-coordinates
    A = dist.euclidean(eye[1], eye[5])
    B = dist.euclidean(eye[2], eye[4])
    # compute the euclidean distance between the horizontal
    # eye landmarks, then return the eye aspect ratio
    C = dist.euclidean(eye[0], eye[3])
    return (A + B) / (2.0 * C)

def blinkdetect(self):
    print("START Button.....")
    EYE_AR_THRESH = 0.2           # EAR below this value means the eye is closed
    EYE_AR_CONSEC_FRAMES = 2      # closed frames needed before a blink is counted
    COUNTER = 0                   # consecutive frames with EAR below the threshold
    TOTAL = 0                     # total number of blinks detected
    vs = VideoStream(src=0).start()   # start the webcam stream
    fileStream = False
    time.sleep(1.0)
    while True:
        if fileStream and not vs.more():
            break
        frame = vs.read()
        frame = imutils.resize(frame, width=800)
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        rects = detector(gray, 0)              # detect faces in the grayscale frame
        j = False
        for rect in rects:
            shape = predictor(gray, rect)      # locate the 68 facial landmarks
            shape = face_utils.shape_to_np(shape)
            leftEye = shape[lStart:lEnd]       # slice out both eye regions
            rightEye = shape[rStart:rEnd]
            leftEAR = self.eye_aspect_ratio(leftEye)
            rightEAR = self.eye_aspect_ratio(rightEye)
            ear = (leftEAR + rightEAR) / 2.0   # average EAR of both eyes
            leftEyeHull = cv2.convexHull(leftEye)
            rightEyeHull = cv2.convexHull(rightEye)
            cv2.drawContours(frame, [leftEyeHull], -1, (0, 255, 0), 1)
            cv2.drawContours(frame, [rightEyeHull], -1, (0, 255, 0), 1)
            if ear < EYE_AR_THRESH:
                COUNTER += 1
            else:
                # count a blink once the eye has been closed long enough
                if COUNTER >= EYE_AR_CONSEC_FRAMES:
                    TOTAL += 1
                COUNTER = 0
            cv2.putText(frame, "Blinks: {}".format(TOTAL), (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 0, 255), 2)
            self.blinkcount.setText("Blinks: {}".format(TOTAL))
            cv2.putText(frame, "EAR: {:.2f}".format(ear), (300, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 0, 255), 2)
            self.earvalue.setText("EAR: {:.2f}".format(ear))
            cv2.imshow("Frame", frame)
            key = cv2.waitKey(1) & 0xFF
            if key == ord("q"):
                j = True
                cv2.destroyAllWindows()
                vs.stop()
                break
        if j:
            break
When the START button is clicked, the function above is executed.
If the eye aspect ratio falls below a certain threshold and then rises above it again, we register
a "blink". EYE_AR_THRESH is this threshold value; we default it to 0.2, as this is what has
worked best in our application.
We then have an important constant, EYE_AR_CONSEC_FRAMES. This value is set to 2 to
indicate that two successive frames with an eye aspect ratio below EYE_AR_THRESH must
occur for a blink to be registered.
Inside the for loop, using array slicing we extract the (x, y)-coordinates of the left and right
eye, respectively. From there, we compute the eye aspect ratio of each eye in the lines
leftEAR = self.eye_aspect_ratio(leftEye) and rightEAR = self.eye_aspect_ratio(rightEye).
app = QApplication(sys.argv)
mainwindow = startb()                  # build the dialog defined above
widget = QtWidgets.QStackedWidget()
widget.addWidget(mainwindow)           # host the dialog inside a stacked widget
widget.setFixedWidth(971)
widget.setFixedHeight(533)
widget.show()
app.exec_()                            # start the Qt event loop
This block comes at the end of the code, outside the loop and the class. It creates the
application, instantiates the user interface initialized in the __init__ function, and runs the Qt
event loop before any of the blink detection work begins.
When the program is started it initializes the user interface and displays it as shown in figure
_______. When the START button is pressed, the pre-trained facial landmark file
shape_predictor_68_face_landmarks.dat is loaded into the program, the camera is accessed,
and the camera window pops up for the user.
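The loading itself uses dlib's standard API together with the eye landmark index ranges from imutils. A sketch of these initialization lines, which define the detector, predictor, lStart, lEnd, rStart and rEnd names used by blinkdetect (the exact form in the original source is not reproduced in this report), would be:

# Assumed initialization of the names used by blinkdetect(); the .dat file is
# expected to sit next to the script.
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

# landmark index ranges for the left and right eye in the 68-point model
(lStart, lEnd) = face_utils.FACIAL_LANDMARKS_IDXS["left_eye"]
(rStart, rEnd) = face_utils.FACIAL_LANDMARKS_IDXS["right_eye"]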
Figure 4.3.2- Application and blink detection
Figure 4.4.1- The eye landmark coordinates used to calculate the EAR
Based on this image, we should take away one key point:
There is a relation between the width and the height of these coordinates.
Based on the work by Soukupová and Čech in their 2016 paper, Real-Time Eye Blink
Detection Using Facial Landmarks, we can derive an equation that reflects this relation,
called the eye aspect ratio (EAR):
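In the form given by that paper, with p1 to p6 denoting the six (x, y) eye landmark coordinates, where p1 and p4 are the eye corners, the relation is:

    EAR = ( ||p2 - p6|| + ||p3 - p5|| ) / ( 2 * ||p1 - p4|| )

The numerator sums the two vertical distances between the upper and lower eyelid landmarks, while the denominator is the horizontal distance between the eye corners; this is why the EAR falls toward zero when the eye closes.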
Figure 4.4.3- Top-left: a visualization of the eye landmarks when the eye is open.
Figure 4.4.4- Top-right: the eye landmarks when the eye is closed.
Figure 4.4.5- Bottom: plot of the eye aspect ratio over time; the dip in the eye aspect ratio
indicates a blink.
On the top-left we have an eye that is fully open — the eye aspect ratio here would be
large(r) and relatively constant over time.
However, once the person blinks (top-right) the eye aspect ratio decreases
dramatically, approaching zero.
The bottom figure plots a graph of the eye aspect ratio over time for a video clip. As
we can see, the eye aspect ratio is constant, then rapidly drops close to zero, then
increases again, indicating a single blink has taken place.
To implement the eye blink module, we need to import the necessary packages in
detect_blinks.py; the code snippet below shows these imports.
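A representative version of this import block, reconstructed from the functions called in this chapter (the exact list in the original detect_blinks.py may differ slightly):

# Reconstructed imports; the exact list in the original detect_blinks.py may differ.
from scipy.spatial import distance as dist     # dist.euclidean for the EAR
from imutils.video import VideoStream          # threaded webcam capture
from imutils import face_utils                 # shape_to_np and eye landmark indexes
from PyQt5.QtWidgets import QApplication, QDialog
from PyQt5.uic import loadUi                   # loads the untitled.ui layout
from PyQt5 import QtWidgets
import imutils
import time
import dlib
import cv2
import sys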
4.5 Conclusion
In this chapter we described the facial landmarks of both eyes and computed the eye aspect
ratio for each eye, which gives us a single value relating the distances between the vertical
eye landmark points to the distance between the horizontal landmark points. Once we have
the eye aspect ratio, we can threshold it to determine whether a person is blinking: the eye
aspect ratio remains approximately constant while the eyes are open, rapidly approaches zero
during a blink, and then increases again as the eye opens.
CHAPTER 5
TESTING & VALIDATION
5.1 Introduction
We tested eye blink detection by taking the number of milliseconds the eyes were closed as
the test case that generates a blink; the participant's eye colour was also recorded, as shown
in the table below.
5.3 Validation
The output of the blink detection algorithm was evaluated by two separate individuals. Both
individuals watched the recording of each trial and noted each time the participant blinked. If
a participant blinked and the algorithm did not detect it, a miss was counted. If the algorithm
detected a blink when none was observed, it was classified as a "false positive". Only 6.5
percent of blinks were missed, whereas 2.0 percent of blinks were falsely detected. Overall,
the blink detection algorithm had an accuracy rating of 93.7 percent.
5.4 Conclusion
During detection there were no errors in locating the eyes, but there were some false blink
counts caused by head movement. Over roughly 90 blinks there were 13 to 15 false blinks.
Because the webcam on a laptop is usually positioned at the top of the screen, some blinks
were counted when the person looked downward, which is not actually a blink but an artifact
of the algorithm and the camera position. The accuracy on true blinks was about 93 percent,
and it can be tuned further by adjusting the threshold and frame count values.
CHAPTER 6
CONCLUSION
Human-computer interaction is important for everyone in this century, and in today's era most
work is done through computers. Technology has its own advantages in our lives, but overuse
of technology and gadgets also creates problems, especially devices with screen displays,
which affect the eyes, the valuable organ that lets us see. This project is an approach to help
individuals realize the threat they bring upon themselves and to avoid the syndromes caused
by modern technology.
To make our blink detector more robust to these challenges, we can recommend:
1. Computing the eye aspect ratio for the N-th frame, along with the eye aspect ratios for the
N − 6 and N + 6 frames, then concatenating these eye aspect ratios to form a 13-dimensional
feature vector.
2. Training a Support Vector Machine (SVM) on these feature vectors.
The combination of this temporal feature vector and an SVM classifier helps reduce
false-positive blink detections and improves the overall accuracy of the blink detector, as
illustrated in the sketch below.
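A compact sketch of this idea, assuming scikit-learn and a precomputed array of per-frame EAR values with matching per-frame blink labels (neither of which is part of the current project code):

import numpy as np
from sklearn.svm import SVC

def ear_window(ear_values, n, half_window=6):
    # 13-dimensional feature vector: the EAR values for frames n-6 .. n+6.
    return ear_values[n - half_window : n + half_window + 1]

def train_blink_svm(ear_values, labels, half_window=6):
    # ear_values: array of per-frame EAR values; labels: 1 if the frame belongs
    # to a blink, else 0 (hypothetical inputs, not produced by the current code).
    X, y = [], []
    for n in range(half_window, len(ear_values) - half_window):
        X.append(ear_window(ear_values, n, half_window))
        y.append(labels[n])
    clf = SVC(kernel="linear")
    clf.fit(np.asarray(X), np.asarray(y))
    return clf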
Our project also does not have a polished UI and cannot be used on all types of devices. We
can improve it by implementing the project on Android and iOS, which are used by the
majority of users. Our project also uses a server-side camera rather than a client-side one,
which aids faster image processing; we could also implement this with a client-side camera,
but that has a performance bottleneck.
REFERENCES
1. Adrian Rosebrock, Eye blink detection with OpenCV, Python, and dlib, PyImageSearch blog.
2. PyCharm, Wikipedia, https://en.wikipedia.org/wiki/PyCharm.