Depression Detection Using EEG
Abstract:
Depression is a widespread mental health disorder with a significant global impact on individuals and
society. Early and accurate diagnosis of depression is essential for effective treatment and
intervention. This abstract presents an overview of research on the utilization of
Electroencephalography (EEG) in the detection of depression.
Recent studies have demonstrated that individuals with depression exhibit distinctive EEG patterns
compared to healthy controls. These patterns often involve alterations in brainwave frequencies,
connectivity, and asymmetry, which may serve as biomarkers for depression. By employing
advanced signal processing techniques and machine learning algorithms, researchers have made
significant strides in classifying depressed individuals with high accuracy using EEG data.
Furthermore, EEG-based depression detection has the potential to offer several advantages over
traditional assessment methods. It provides an objective and quantifiable measure of brain activity,
reducing the subjectivity associated with self-reporting and clinical assessments. Additionally, EEG is
cost-effective, non-invasive, and readily accessible, making it a viable option for widespread
screening and monitoring.
This abstract concludes by highlighting the potential of EEG as a valuable tool for depression
detection. While challenges such as data variability and interpretational complexity persist, ongoing
research in this field offers a promising avenue for developing reliable and accessible diagnostic tools
for depression. The integration of EEG with other neuroimaging techniques and clinical data may
further enhance the accuracy and comprehensiveness of depression diagnosis, ultimately
contributing to improved mental health outcomes.
Introduction:
Depression, a pervasive and debilitating mental health disorder, affects millions of people
worldwide, exerting a profound impact on their quality of life and well-being. Early and accurate
diagnosis of depression is essential for timely intervention and effective treatment. However,
diagnosing depression remains a complex challenge, often relying on subjective self-reporting and
clinical assessments. In this context, emerging technologies and neurophysiological approaches have
the potential to revolutionize depression detection.
One such innovative approach is the utilization of Electroencephalography (EEG) for the detection
and characterization of depression. EEG, a non-invasive neuroimaging technique, records the
electrical activity of the brain by measuring the voltage fluctuations resulting from the collective
neural activity of millions of neurons. The EEG method has gained prominence in recent years as a
promising tool for understanding the neurobiological basis of depression and as a means of
developing more objective and reliable diagnostic criteria.
This introduction provides an overview of the current landscape of depression diagnosis, highlighting
the challenges faced by traditional methods, and introduces EEG as an exciting avenue for improving
the accuracy and objectivity of depression detection. We will explore the unique features of EEG, the
neurophysiological markers associated with depression, and the potential benefits that EEG offers in
enhancing our understanding of this complex mental health condition.
As we delve into the evolving field of depression detection using EEG, it becomes evident that this
innovative approach has the potential to transform the way we diagnose and manage depression.
EEG's capacity to capture real-time, neural electrical activity offers a unique window into the
depressed brain, providing insights that extend beyond self-reporting and clinical evaluation. The
objective, quantifiable nature of EEG data, along with its non-invasive and cost-effective
characteristics, makes it an enticing prospect for both researchers and clinicians seeking more
accurate, accessible, and early detection methods for depression.
1. Literature Survey on EEG-Based Depression Detection:
Abstract: This literature review focuses on machine learning techniques applied to depression
detection. The survey presents an overview of various algorithms, data sources, and features used in
machine learning-based depression diagnosis. It discusses the challenges and opportunities
associated with these methods and highlights their potential for more accurate and early
identification of depressive symptoms.
2. The Role of Neuroimaging in Understanding Depression: A Comprehensive Review:
Abstract: This comprehensive review delves into the role of neuroimaging techniques, including
functional Magnetic Resonance Imaging (fMRI) and Positron Emission Tomography (PET), in
understanding depression. It summarizes key findings from neuroimaging studies, highlighting the
neural circuitry and structural abnormalities associated with depression. The review also discusses
the potential for neuroimaging to refine diagnostic criteria and inform treatment strategies.
3. Digital Mental Health Interventions for Depression: A Systematic Review:
Abstract: This systematic review focuses on digital mental health interventions for depression. It
summarizes research on the effectiveness and accessibility of mobile apps, online platforms, and
telehealth services in supporting individuals with depression. The review highlights the potential of
digital interventions as a scalable approach for early detection, management, and treatment of
depression.
Existing System: Conventional Clinical Diagnosis for Depression
Disadvantages:
1. Limited Objectivity: Clinical assessments may lack objective biological markers for
depression, which can reduce precision in distinguishing depression from other
mood disorders.
2. Stigma: Seeking clinical help for depression can carry social stigma, deterring
some individuals from seeking diagnosis and treatment.
Proposed System: Depression Detection Using EEG
Advantages:
1. Early Detection: EEG can detect subtle changes in brain activity associated with depression,
allowing for early detection and intervention, potentially preventing the progression of the
condition.
2. Non-Invasive: The use of EEG is non-invasive and does not involve exposure to radiation,
making it a safe and comfortable diagnostic tool for patients.
3. Data-Driven: Advanced signal processing techniques and machine learning algorithms can
analyze EEG data to identify depression with high accuracy, improving the efficiency of
diagnosis.
4. Remote Monitoring: EEG data can be collected remotely, enabling continuous monitoring of
patients' mental health without the need for frequent in-person visits.
5. Reduced Stigma: EEG-based diagnosis may reduce the social stigma associated with
traditional clinical assessments, as it focuses on neurophysiological markers rather than
subjective self-reporting.
6. Personalized Treatment: With more accurate diagnosis, healthcare providers can tailor
treatment plans to individual patients, improving the effectiveness of interventions.
HARDWARE REQUIREMENTS:
System : i3 processor or above
RAM : 4 GB
Hard disk : 40 GB
SOFTWARE REQUIREMENTS:
Operating System : Windows 7/8/10
Language : Python 3.7
Libraries : TensorFlow, NumPy, Pandas, Matplotlib, scikit-learn
The feasibility study is carried out for three key considerations:
• ECONOMICAL FEASIBILITY
• TECHNICAL FEASIBILITY
• SOCIAL FEASIBILITY
ECONOMICAL FEASIBILITY
This study is carried out to check the economic impact that the system will have on the
organization. The amount of funds that the company can pour into the research and
development of the system is limited, so the expenditures must be justified. The developed
system is well within the budget, and this was achieved because most of the technologies
used are freely available; only the customized products had to be purchased.
TECHNICAL FEASIBILITY
This study is carried out to check the technical feasibility, that is, the technical requirements
of the system. Any system developed must not place a high demand on the available technical
resources, as this would lead to high demands being placed on the client. The developed
system must have modest requirements, as only minimal or no changes are required to
implement this system.
SOCIAL FEASIBILITY
This aspect of the study checks the level of acceptance of the system by the user. This
includes the process of training the user to use the system efficiently. The user must not feel
threatened by the system, but must instead accept it as a necessity. The level of acceptance by
users depends on the methods employed to educate them about the system and to make them
familiar with it. Their confidence must be raised so that they can also offer constructive
criticism, which is welcomed, as they are the final users of the system.
4. SYSTEM DESIGN:
4.1 UML DIAGRAMS:
GOALS:
MODULES:
Upload EEG-Signal Dataset: use this button to upload the dataset and extract normal and
depressed records from it.
Features Extraction: use this button to replace missing values, extract features, and split the
dataset into train (80%) and test (20%) sets.
Run Existing SVM Algorithm: use this button to train the SVM baseline and evaluate its
accuracy and precision.
Run Propose CNN Algorithm: use this button to train the proposed CNN and evaluate its
accuracy and precision.
Predict Depression from Test Signals: use this button to upload test data and predict whether
each record is normal or depressed.
Comparison Graph: use this button to plot a comparison graph between SVM and CNN.
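A minimal sketch of how such a button-driven interface could be wired up with Tkinter (the callbacks here are illustrative placeholders, not the project's actual code):

import tkinter as tk

def upload_dataset():
    # placeholder callback: the real module would load the EEG-signal CSV here
    print("Dataset uploaded")

root = tk.Tk()
root.title("Depression Detection using EEG")
# one button per module; each command is a hypothetical callback
tk.Button(root, text="Upload EEG-Signal Dataset", command=upload_dataset).pack(fill="x")
tk.Button(root, text="Run Existing SVM Algorithm", command=lambda: print("SVM")).pack(fill="x")
tk.Button(root, text="Run Propose CNN Algorithm", command=lambda: print("CNN")).pack(fill="x")
root.mainloop()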
SOFTWARE ENVIRONMENT :
What is Python :
The Python language is used by almost all the tech giants: Google,
Amazon, Facebook, Instagram, Dropbox, Uber, etc.
The biggest strength of Python is its huge collection of standard libraries, which can be
used for the following:
Machine Learning
GUI Applications (like Kivy, Tkinter, PyQt etc. )
Web frameworks like Django (used by YouTube, Instagram, Dropbox)
Image processing (like OpenCV, Pillow)
Web scraping (like Scrapy, BeautifulSoup, Selenium)
Test frameworks
Multimedia
Advantages of Python :-
Let’s see how Python dominates over other languages.
1. Extensive Libraries
Python ships with an extensive library containing code for various purposes like
regular expressions, documentation generation, unit testing, web browsers, threading,
databases, CGI, email, image manipulation, and more. So, we don't have to write the
complete code for that manually.
2. Extensible
As we have seen earlier, Python can be extended to other languages. You can write
some of your code in languages like C++ or C. This comes in handy, especially in
projects.
3. Embeddable
Complimentary to extensibility, Python is embeddable as well. You can put your Python
code in your source code of a different language, like C++. This lets us add scripting
capabilities to our code in the other language.
4. Improved Productivity
The language’s simplicity and extensive libraries render programmers more
productive than languages like Java and C++ do. Also, you simply need to write
less code to get more done.
5. IoT Opportunities
Since Python forms the basis of new platforms like the Raspberry Pi, it finds a bright future
in the Internet of Things. This is a way to connect the language with the real world.
6. Simple and Easy
When working with Java, you may have to create a class to print 'Hello World'. But in
Python, just a print statement will do. It is also quite easy to learn, understand,
and code. This is why, when people pick up Python, they have a hard time adjusting to
other, more verbose languages like Java.
7. Readable
Because it is not such a verbose language, reading Python is much like reading English.
This is the reason why it is so easy to learn, understand, and code. It also does not need
curly braces to define blocks, and indentation is mandatory. This further aids the
readability of the code.
8. Object-Oriented
This language supports both the procedural and object-oriented programming
paradigms. While functions help us with code reusability, classes and objects let us
model the real world. A class allows the encapsulation of data and functions into one.
9. Portable
When you code your project in a language like C++, you may need to make some
changes to it if you want to run it on another platform. But it isn’t the same with Python.
Here, you need to code only once, and you can run it anywhere. This is called Write
Once Run Anywhere (WORA). However, you need to be careful enough not to include
any system-dependent features.
10. Interpreted
Python is an interpreted language. Since statements are executed one
by one, debugging is easier than in compiled languages.
11. Free and Affordable
Python is free, so individuals, small companies, and big organizations can leverage
the freely available resources to build applications. Python is popular and widely used, so it
also gives you better community support.
The 2019 GitHub annual survey showed us that Python has overtaken Java in the
most-popular-programming-language category.
Disadvantages of Python
So far, we’ve seen why Python is a great choice for your project. But if you choose it,
you should be aware of its consequences as well. Let’s now see the downsides of
choosing Python over another language.
1. Speed Limitations
We have seen that Python code is executed line by line. But since Python is interpreted, it
often results in slow execution. This, however, isn’t a problem unless speed is a focal
point for the project. In other words, unless high speed is a requirement, the benefits
offered by Python are enough to distract us from its speed limitations.
2. Design Restrictions
As you know, Python is dynamically typed. This means that you don't need to declare
the type of a variable while writing the code; Python uses duck typing. But wait, what's that?
Well, it just means that if it looks like a duck, it must be a duck. While this is easy on
programmers during coding, it can raise run-time errors.
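A small example of how duck typing can defer errors to run time:

class Duck:
    def quack(self):
        print("Quack!")

class Dog:
    pass

def make_it_quack(animal):
    # no type declaration: anything with a quack() method works
    animal.quack()

make_it_quack(Duck())   # fine
make_it_quack(Dog())    # raises AttributeError, but only at run time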
3. Simple
No, we’re not kidding. Python’s simplicity can indeed be a problem. Take my example. I
don’t do Java, I’m more of a Python person. To me, its syntax is so simple that the
verbosity of Java code seems unnecessary.
This was all about the Advantages and Disadvantages of Python Programming Language.
History of Python : -
What do the alphabet and the programming language Python have in common? Right,
both start with ABC. If we are talking about ABC in the Python context, it's clear that the
programming language ABC is meant. ABC is a general-purpose programming language
and programming environment, which had been developed in the Netherlands,
Amsterdam, at the CWI (Centrum Wiskunde & Informatica). The greatest achievement of
ABC was to influence the design of Python. Python was conceptualized in the late 1980s.
Guido van Rossum worked at that time on a project at the CWI called Amoeba, a
distributed operating system. In an interview with Bill Venners, Guido van Rossum said:
"In the early 1980s, I worked as an implementer on a team building a language called
ABC at Centrum voor Wiskunde en Informatica (CWI). I don't know how well people
know ABC's influence on Python. I try to mention ABC's influence because I'm indebted
to everything I learned during that project and to the people who worked on it." Later on
in the same interview, Guido van Rossum continued: "I remembered all my experience
and some of my frustration with ABC. I decided to try to design a simple scripting
language that possessed some of ABC's better properties, but without its problems. So I
started typing. I created a simple virtual machine, a simple parser, and a simple runtime. I
made my own version of the various ABC parts that I liked. I created a basic syntax, used
indentation for statement grouping instead of curly braces or begin-end blocks, and
developed a small number of powerful data types: a hash table (or dictionary, as we call
it), a list, strings, and numbers."
What is Machine Learning :-
Before we take a look at the details of various machine learning methods, let's start by
looking at what machine learning is, and what it isn't. Machine learning is often
categorized as a subfield of artificial intelligence, but I find that categorization can often
be misleading at first brush. The study of machine learning certainly arose from research
in this context, but in the data science application of machine learning methods, it's more
helpful to think of machine learning as a means of building models of data.
At the most fundamental level, machine learning can be categorized into two main types:
supervised learning and unsupervised learning.
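As a quick illustration of the two types, here is a minimal scikit-learn sketch (the toy data below is invented for demonstration):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X = np.array([[1, 2], [2, 1], [8, 9], [9, 8]])
y = np.array([0, 0, 1, 1])                   # labels available -> supervised

clf = LogisticRegression().fit(X, y)         # supervised learning
print(clf.predict([[1.5, 1.5]]))             # predicts a class for new data

km = KMeans(n_clusters=2, n_init=10).fit(X)  # unsupervised: no labels used
print(km.labels_)                            # clusters discovered from structure alone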
Challenges in Machine Learning:
Quality of data − Having good-quality data for ML algorithms is one of the biggest
challenges. Use of low-quality data leads to problems related to data preprocessing
and feature extraction.
No clear objective for formulating business problems − Having no clear objective and
well-defined goal for business problems is another key challenge for ML because this
technology is not that mature yet.
Applications of Machine Learning:
Machine Learning is the most rapidly growing technology, and according to researchers
we are in the golden years of AI and ML. It is used to solve many real-world complex
problems which cannot be solved with a traditional approach. The following are some real-
world applications of ML −
Emotion analysis
Sentiment analysis
Speech synthesis
Speech recognition
Customer segmentation
Object recognition
Fraud detection
Fraud prevention
Arthur Samuel coined the term “Machine Learning” in 1959 and defined it as a “Field of
study that gives computers the capability to learn without being explicitly
programmed”.
And that was the beginning of Machine Learning! In modern times, Machine Learning is
one of the most popular (if not the most!) career choices. According to Indeed, Machine
Learning Engineer Is The Best Job of 2019 with a 344% growth and an average base
salary of $146,085 per year.
But there is still a lot of doubt about what exactly Machine Learning is and how to start
learning it. So this section deals with the basics of Machine Learning and also the path
you can follow to eventually become a full-fledged Machine Learning Engineer. Now let's
get started!
This is a rough roadmap you can follow on your way to becoming an insanely talented
Machine Learning Engineer. Of course, you can always modify the steps according to
your needs to reach your desired end-goal!
Step 1 – Understand the Prerequisites
In case you are a genius, you could start ML directly but normally, there are some
prerequisites that you need to know which include Linear Algebra, Multivariate Calculus,
Statistics, and Python. And if you don’t know these, never fear! You don’t need a Ph.D.
degree in these topics to get started but you do need a basic understanding.
Both Linear Algebra and Multivariate Calculus are important in Machine Learning.
However, the extent to which you need them depends on your role as a data scientist. If
you are more focused on application heavy machine learning, then you will not be that
heavily focused on maths as there are many common libraries available. But if you want
to focus on R&D in Machine Learning, then mastery of Linear Algebra and Multivariate
Calculus is very important as you will have to implement many ML algorithms from
scratch.
Data plays a huge role in Machine Learning. In fact, around 80% of your time as an ML
expert will be spent collecting and cleaning data. And statistics is a field that handles the
collection, analysis, and presentation of data. So it is no surprise that you need to learn
it!!!
Some of the key concepts in statistics that are important are Statistical Significance,
Probability Distributions, Hypothesis Testing, Regression, etc. Bayesian Thinking is
also a very important part of ML, which deals with concepts like Conditional
Probability, Priors and Posteriors, Maximum Likelihood, etc.
Some people prefer to skip Linear Algebra, Multivariate Calculus and Statistics and learn
them as they go along with trial and error. But the one thing that you absolutely cannot
skip is Python! While there are other languages you can use for Machine Learning like R,
Scala, etc. Python is currently the most popular language for ML. In fact, there are many
Python libraries that are specifically useful for Artificial Intelligence and Machine
Learning such as Keras, TensorFlow, Scikit-learn, etc.
So if you want to learn ML, it's best if you learn Python! You can do that using various
online resources and courses, such as Fork Python, available for free on GeeksforGeeks.
Now that you are done with the prerequisites, you can move on to actually learning ML
(Which is the fun part!!!) It’s best to start with the basics and then move on to the more
complicated stuff. Some of the basic concepts in ML are:
Model – A model is a specific representation learned from data by applying some machine
learning algorithm. A model is also called a hypothesis.
Feature – A feature is an individual measurable property of the data. A set of numeric
features can be conveniently described by a feature vector. Feature vectors are fed as input
to the model. For example, in order to predict a fruit, there may be features like color,
smell, taste, etc.
Target (Label) – A target variable or label is the value to be predicted by our model. For the
fruit example discussed in the feature section, the label with each set of input would be the
name of the fruit like apple, orange, banana, etc.
Training – The idea is to give the model a set of inputs (features) and its expected
outputs (labels), so that after training, we will have a model (hypothesis) that will map
new data to one of the categories it was trained on.
Prediction – Once our model is ready, it can be fed a set of inputs and will provide a
predicted output (label). (A small sketch tying these concepts together follows the list of
learning types below.)
Supervised Learning – This involves learning from a training dataset with labeled data
using classification and regression models. This learning process continues until the
required level of performance is achieved.
Unsupervised Learning – This involves using unlabelled data and then finding the
underlying structure in the data in order to learn more and more about the data itself using
factor and cluster analysis models.
Semi-supervised Learning – This involves using unlabelled data like Unsupervised
Learning with a small amount of labeled data. Using labeled data vastly increases the
learning accuracy and is also more cost-effective than Supervised Learning.
Reinforcement Learning – This involves learning optimal actions through trial and error.
So the next action is decided by learning behaviors that are based on the current state and
that will maximize the reward in the future.
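Tying the concepts above together, here is a minimal sketch (with invented fruit data, as in the feature example) of training a supervised model and making a prediction:

from sklearn.tree import DecisionTreeClassifier

# features: [weight_in_grams, color_score]; the numbers are invented for illustration
X_train = [[150, 0.9], [170, 0.8], [120, 0.2], [110, 0.1]]
y_train = ["apple", "apple", "banana", "banana"]         # target labels

model = DecisionTreeClassifier().fit(X_train, y_train)   # training
print(model.predict([[160, 0.85]]))                      # prediction -> likely 'apple'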
Advantages of Machine Learning :-
1. Easily Identifies Trends and Patterns
Machine Learning can review large volumes of data and discover specific trends and
patterns that would not be apparent to humans. For instance, for an e-commerce website like
Amazon, it serves to understand the browsing behaviors and purchase histories of its users
to help cater to the right products, deals, and reminders relevant to them. It uses the results
to reveal relevant advertisements to them.
2. No Human Intervention Needed (Automation)
With ML, you don't need to babysit your project every step of the way. Since it means
giving machines the ability to learn, it lets them make predictions and also improve the
algorithms on their own. A common example of this is anti-virus software; it learns to
filter new threats as they are recognized. ML is also good at recognizing spam.
3. Continuous Improvement
As ML algorithms gain experience, they keep improving in accuracy and efficiency. This
lets them make better decisions. Say you need to make a weather forecast model. As the
amount of data you have keeps growing, your algorithms learn to make more accurate
predictions faster.
4. Handling Multi-Dimensional and Multi-Variety Data
Machine Learning algorithms are good at handling data that are multi-dimensional and
multi-variety, and they can do this in dynamic or uncertain environments.
5. Wide Applications
You could be an e-tailer or a healthcare provider and make ML work for you. Where it does
apply, it holds the capability to help deliver a much more personal experience to customers
while also targeting the right customers.
Disadvantages of Machine Learning :-
1. Data Acquisition
Machine Learning requires massive data sets to train on, and these should be
inclusive/unbiased, and of good quality. There can also be times where they must wait for
new data to be generated.
2. Time and Resources
ML needs enough time to let the algorithms learn and develop enough to fulfill their
purpose with a considerable amount of accuracy and relevancy. It also needs massive
resources to function. This can mean additional requirements of computing power for you.
3. Interpretation of Results
Another major challenge is the ability to accurately interpret results generated by the
algorithms. You must also carefully choose the algorithms for your purpose.
4. High error-susceptibility
Machine Learning is autonomous but highly susceptible to errors. Suppose you train an
algorithm with data sets small enough to not be inclusive. You end up with biased
predictions coming from a biased training set. This leads to irrelevant advertisements being
displayed to customers. In the case of ML, such blunders can set off a chain of errors that
can go undetected for long periods of time. And when they do get noticed, it takes quite
some time to recognize the source of the issue, and even longer to correct it.
Guido van Rossum published the first version of Python code (version 0.9.0) at alt.sources
in February 1991. This release already included exception handling, functions, and the core
data types of list, dict, str and others. It was also object-oriented and had a module system.
Python version 1.0 was released in January 1994. The major new features included in this
release were the functional programming tools lambda, map, filter and reduce, which
Guido van Rossum never liked. Six and a half years later, in October 2000, Python 2.0 was
introduced. This release included list comprehensions, a full garbage collector, and Unicode
support. Python flourished for another 8 years in the 2.x versions before the next
major release, Python 3.0 (also known as "Python 3000" and "Py3K").
Python 3 is not backwards compatible with Python 2.x. The emphasis in Python 3 was
on the removal of duplicate programming constructs and modules, thus fulfilling or coming
close to fulfilling the 13th law of the Zen of Python: "There should be one -- and preferably
only one -- obvious way to do it." Some notable changes in Python 3.0 include print
becoming a built-in function and integer division returning a float.
Purpose of Python :-
Python features a dynamic type system and automatic memory management. It supports
multiple programming paradigms, including object-oriented, imperative, functional and
procedural, and has a large and comprehensive standard library.
Python is Interpreted − Python is processed at runtime by the interpreter. You do not need
to compile your program before executing it. This is similar to PERL and PHP.
Python is Interactive − you can actually sit at a Python prompt and interact with the
interpreter directly to write your programs.
Python also acknowledges that speed of development is important. Readable and terse
code is part of this, and so is access to powerful constructs that avoid tedious repetition of
code. Maintainability also ties into this. Lines of code may be an all but useless metric, but
it does say something about how much code you have to scan, read, and/or understand to
troubleshoot problems or tweak behaviors. This speed of development, the ease with
which a programmer of other languages can pick up basic Python skills, and the huge
standard library are key to another area where Python excels: all its tools have been quick
to implement, have saved a lot of time, and several of them have later been patched and
updated by people with no Python background, without breaking.
Tensorflow
TensorFlow is a free and open-source software library for dataflow and differentiable
programming across a range of tasks. It is a symbolic math library, and is also used
for machine learning applications such as neural networks. It is used for both research
and production at Google.
TensorFlow was developed by the Google Brain team for internal Google use. It was
released under the Apache 2.0 open-source license on November 9, 2015.
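As a small illustration (not the project's actual network), a minimal Keras model can be defined and compiled in a few lines:

import tensorflow as tf

# a tiny binary classifier over 10-dimensional inputs; sizes are illustrative
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()  # prints the layer structure and parameter counts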
NumPy
NumPy is the fundamental package for scientific computing with Python. It contains
various features, including these important ones:
• A powerful N-dimensional array object
• Sophisticated (broadcasting) functions
• Tools for integrating C/C++ and Fortran code
• Useful linear algebra, Fourier transform, and random number capabilities
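For example, the N-dimensional array object and broadcasting in action:

import numpy as np

a = np.array([[1, 2, 3], [4, 5, 6]])  # a 2-D array
print(a.shape)         # (2, 3)
print(a * 10)          # broadcasting: multiplies every element
print(a.mean(axis=0))  # column-wise mean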
Pandas
Pandas is an open-source Python library providing high-performance, easy-to-use data
structures and data analysis tools. Its primary data structure, the DataFrame, makes loading,
cleaning, and analyzing tabular data (such as the EEG-signal CSV used in this project)
straightforward.
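For instance, loading a CSV such as this project's EEG-signal file into a DataFrame (the file name here is an assumption):

import pandas as pd

df = pd.read_csv("eeg_signals.csv")  # hypothetical file name
print(df.shape)   # (number of rows, number of columns)
print(df.head())  # first five records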
Matplotlib
Matplotlib is a plotting library for Python. For simple plotting, the pyplot module provides
a MATLAB-like interface, particularly when combined with IPython. For the power user,
you have full control of line styles, font properties, axes properties, etc., via an
object-oriented interface or via a set of functions familiar to MATLAB users.
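A minimal pyplot example:

import matplotlib.pyplot as plt

x = [1, 2, 3, 4]
y = [1, 4, 9, 16]
plt.plot(x, y, "o-", label="y = x^2")  # line plot with circle markers
plt.xlabel("x")
plt.ylabel("y")
plt.legend()
plt.show()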
Scikit-learn
Scikit-learn is a free machine learning library for Python that provides simple and efficient
tools for data mining and data analysis, including classification, regression, and clustering
algorithms such as the SVM used as this project's baseline. It is built on NumPy, SciPy,
and Matplotlib.
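Because the project's baseline uses SVM, here is a hedged sketch of training and evaluating an SVM classifier with scikit-learn (the file name and the assumption that the last column is the label are illustrative):

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, precision_score

# hypothetical dataset layout: last column is the label (0 = normal, 1 = depressed)
df = pd.read_csv("eeg_signals.csv").fillna(0)
X, y = df.iloc[:, :-1], df.iloc[:, -1]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)  # 80/20 split

svm = SVC().fit(X_train, y_train)
pred = svm.predict(X_test)
print("Accuracy :", accuracy_score(y_test, pred))
print("Precision:", precision_score(y_test, pred))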
There have been several updates to Python over the years. The question is: how do you
install Python? It might be confusing for a beginner who is willing to start learning
Python, but this tutorial will solve your query. The latest version of Python is
version 3.7.4; in other words, it is Python 3.
Note: Python version 3.7.4 cannot be used on Windows XP or earlier devices.
Before you start with the installation process of Python, you first need to know your
system requirements. Based on your system type, i.e., operating system and processor,
you must download the matching Python version. My system type is a Windows 64-bit
operating system, so the steps below install Python version 3.7.4 (i.e., Python 3) on a
Windows 7 device. The steps on how to install Python on Windows 10, 8 and 7 are
divided into 4 parts to help you understand better.
Step 1: Go to the official site to download and install python using Google Chrome or any
other web browser. OR Click on the following link: https://www.python.org
Step 2: Now, check for the latest and the correct version for your operating system.
Step 3: You can either select the yellow Download Python 3.7.4 for Windows button, or
scroll further down and click on the download link for the respective version.
Here, we are downloading the most recent Python version for Windows, 3.7.4.
Step 4: Scroll down the page until you find the Files option.
Step 5: Here you see a different version of python along with the operating system.
• To download Windows 32-bit python, you can select any one from the three options:
Windows x86 embeddable zip file, Windows x86 executable installer or Windows x86
web-based installer.
• To download Windows 64-bit python, you can select any one from the three options:
Windows x86-64 embeddable zip file, Windows x86-64 executable installer or Windows
x86-64 web-based installer.
Here we will install the Windows x86-64 web-based installer. With this, the first part,
regarding which version of Python to download, is completed. Now we move ahead with
the second part: installing Python.
Note: To know the changes or updates that are made in the version you can click on the
Release Note Option.
Installation of Python
Step 1: Go to Download and Open the downloaded python version to carry out the
installation process.
Step 2: Before you click on Install Now, Make sure to put a tick on Add Python 3.7 to
PATH.
Step 3: Click on Install Now. After the installation is successful, click on Close.
With the above three steps, you have successfully and correctly installed Python. Now it
is time to verify the installation.
Note: The installation process might take a couple of minutes.
Step 4: Open IDLE, the editor installed with Python, and select File > New File to create
a new program file.
Step 5: Name the file, and the 'save as type' should be Python files. Click on SAVE. Here
I have named the file as Hey World.
Step 6: Now, for example, enter print("Hey World") and run the program (Run > Run
Module) to see the output.
6. SYSTEM TEST:
The purpose of testing is to discover errors. Testing is the process of trying to discover every
conceivable fault or weakness in a work product. It provides a way to check the functionality
of components, subassemblies, assemblies, and/or a finished product. It is the process of
exercising software with the intent of ensuring that the software system meets its
requirements and user expectations and does not fail in an unacceptable manner. There are
various types of tests; each test type addresses a specific testing requirement.
TYPES OF TESTS
Unit testing
Unit testing involves the design of test cases that validate that the internal
program logic is functioning properly, and that program inputs produce valid outputs. All
decision branches and internal code flow should be validated. It is the testing of individual
software units of the application; it is done after the completion of an individual unit before
integration. This is structural testing that relies on knowledge of the unit's construction and is
invasive. Unit tests perform basic tests at the component level and test a specific business
process, application, and/or system configuration. Unit tests ensure that each unique path of a
business process performs accurately to the documented specifications and contains clearly
defined inputs and expected results.
Integration testing
Integration tests are designed to test integrated software components to
determine if they actually run as one program. Testing is event driven and is more concerned
with the basic outcome of screens or fields. Integration tests demonstrate that although the
components were individually satisfactory, as shown by successful unit testing, the
combination of components is correct and consistent. Integration testing is specifically aimed
at exposing the problems that arise from the combination of components.
Functional test
Functional tests provide systematic demonstrations that functions tested are
available as specified by the business and technical requirements, system documentation, and
user manuals.
Functional testing is centered on the following items:
Valid Input : identified classes of valid input must be accepted.
Invalid Input : identified classes of invalid input must be rejected.
Functions : identified functions must be exercised.
Output : identified classes of application outputs must be exercised.
Systems/Procedures : interfacing systems or procedures must be invoked.
System Test
System testing ensures that the entire integrated software system meets
requirements. It tests a configuration to ensure known and predictable results. An example of
system testing is the configuration oriented system integration test. System testing is based on
process descriptions and flows, emphasizing pre-driven process links and integration points.
Unit Testing
Unit testing is usually conducted as part of a combined code and unit test
phase of the software lifecycle, although it is not uncommon for coding and unit testing to be
conducted as two distinct phases.
Field testing will be performed manually and functional tests will be written
in detail.
Test objectives
All field entries must work properly.
Pages must be activated from the identified link.
The entry screen, messages and responses must not be delayed.
Features to be tested
Verify that the entries are of the correct format
No duplicate entries should be allowed
All links should take the user to the correct page.
Integration Testing
Software integration testing is the incremental integration testing of two or
more integrated software components on a single platform to produce failures caused by
interface defects.
The task of the integration test is to check that components or software applications, e.g.
components in a software system or – one step up – software applications at the company
level – interact without error.
Test Results: All the test cases mentioned above passed successfully. No defects
encountered.
Acceptance Testing
User Acceptance Testing is a critical phase of any project and requires significant
participation by the end user. It also ensures that the system meets the functional
requirements.
Test Results: All the test cases mentioned above passed successfully. No defects
encountered.
Depression Detection using EEG
In this project we use a deep learning CNN algorithm to predict depression
from an EEG-signals dataset. Nowadays humans are more prone to depression
due to the competitive environment in all fields, and timely detection of
depression can help them recover faster. Existing techniques depend on manual
counselling, or on traditional machine learning algorithms such as SVM, which
are not accurate. To predict depression accurately we apply a CNN algorithm,
which filters the training data at multiple layers to obtain optimized features,
resulting in increased prediction accuracy.
To train the CNN algorithm we downloaded a depression EEG-signals dataset
from KAGGLE; the screen below shows some dataset details.
In the above dataset screen, the first row contains the dataset column names and
the remaining rows contain the dataset values. These values were extracted from
EEG signals, and each row contains 989 columns.
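As a hedged sketch of how such a dataset could be loaded and prepared (the file name and the assumption that the last of the 989 columns is the label are illustrative, not confirmed by the dataset itself):

import pandas as pd
from sklearn.model_selection import train_test_split

df = pd.read_csv("Depression_EEG.csv")  # hypothetical KAGGLE file name
df = df.fillna(df.mean())               # replace missing values (module 2)
X = df.iloc[:, :-1].values              # 988 feature columns (assumed layout)
y = df.iloc[:, -1].values               # label column (assumed layout)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)  # 80/20 split
print(X_train.shape, X_test.shape)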
To implement this project we have designed the following modules:
1) Upload EEG-Signal Dataset: using this module we upload the dataset to the
application and then extract normal and depressed records from it.
2) Features Extraction: using this module we replace all missing values,
extract features from the dataset, and then split the dataset into train and
test sets; the application uses 80% of the dataset for training and 20% for testing.
3) Run Existing SVM Algorithm: using this module we train the SVM
algorithm on the 80% training dataset and then evaluate its performance on
the 20% test dataset in terms of accuracy and precision.
4) Run Propose CNN Algorithm: using this module we train the proposed
CNN algorithm on the 80% training dataset and then evaluate its
performance on the 20% test dataset in terms of accuracy and precision
(a sketch of such a model follows this list).
5) Predict Depression from Test Signals: using this module we upload a
test dataset, and the CNN predicts whether each test record is NORMAL or
DEPRESSED.
6) Comparison Graph: using this module we plot a comparison graph
between SVM and CNN.
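As referenced in module 4 above, here is a rough sketch of what the proposed 1-D CNN over the EEG feature vectors could look like (layer sizes, kernel sizes, and the label-column assumption are illustrative; the project's actual model may differ):

import tensorflow as tf

def build_cnn(n_features):
    # reshape each row of EEG features into (n_features, 1) so Conv1D can slide over it
    model = tf.keras.Sequential([
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu",
                               input_shape=(n_features, 1)),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # normal vs depressed
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# using the split from the earlier loading sketch, e.g. 988 features if the
# last of the 989 columns is the label:
# model = build_cnn(988)
# model.fit(X_train.reshape(-1, 988, 1), y_train, epochs=10, validation_split=0.1)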
SCREENSHOTS
To run the project, double-click on the 'run.bat' file to get the screen below.
In the above screen, SVM achieved 63% accuracy. In the confusion matrix graph, the
x-axis represents predicted classes and the y-axis contains true classes; we can
see that SVM predicted all records as Depressed, so its performance is not good.
Now close the graph and then click on the 'Run Propose CNN Algorithm'
button to train the CNN and get the output below.
In the above screen, CNN achieved 93% accuracy. In the confusion matrix graph, the
differently coloured boxes represent the correct prediction counts and the same-coloured
boxes represent the incorrect prediction counts: CNN made only 23 and 11 wrong
predictions against 173 and 289 correct predictions. Now close the graph and then click
on the 'Predict Depression from Test Signals' button to upload test data and get the
prediction output.
In the above screen, we select and upload the 'test_signal.csv' file and then press the
Open button to load the test data and get the prediction output below. In the output
screen, inside the square brackets we can see the TEST data values, and after the '='
symbol we can see the prediction as Depressed or Normal; you can scroll the output
screen to view all prediction outputs, as in the screen below.