Abiri 2019 - A Comprehensive Review of EEG-based Brain-Computer Interface Paradigms
*Corresponding author: 313 Perkins Hall, University of Tennessee, Knoxville, TN 37996 USA,
E-mail: [email protected]
Abstract
Advances in brain science and computer technology in the past decade have led to exciting
developments in Brain-Computer Interface (BCI), thereby making BCI a top research area in
applied science. The renaissance of BCI opens new methods of neurorehabilitation for physically
disabled people (e.g., paralyzed patients and amputees) and patients with brain injuries (e.g.,
stroke patients). Recent technological advances such as wireless recording, machine learning
analysis, and real-time temporal resolution have increased interest in electroencephalographic
(EEG) based BCI approaches. Many BCI studies have focused on decoding EEG signals
associated with whole-body kinematics/kinetics, motor imagery, and various senses. Thus, there
is a need to understand the various experimental paradigms used in EEG-based BCI systems.
Moreover, given that there are many available options, it is essential to choose the most
appropriate BCI application to properly manipulate a neuroprosthetic or neurorehabilitation
device. The current review evaluates EEG-based BCI paradigms regarding their advantages and
disadvantages from a variety of perspectives. For each paradigm, various EEG decoding
algorithms and classification methods are evaluated. The applications of these paradigms with
targeted patients are summarized. Finally, potential problems with EEG-based BCI systems are
discussed, and possible solutions are proposed.
1 Introduction
The concept of using brain signals to control prosthetic arms was developed in 1971 [1]. Since
that time, researchers have been attempting to interpret brain waveforms to establish a more
accurate and convenient control over external devices. Later, this research area was termed
brain-computer interface (BCI), and its applications spread rapidly [2].
BCI systems utilize recorded brain activity to communicate between the brain and computers to
control the environment in a manner that is compatible with the intentions of humans [3]. There
are two primary directions in which BCI systems have been applied. The first is studying brain
activity to investigate a feedforward pathway used to control the external devices without the aim
of rehabilitation [4]. The other dominant direction is closed-loop BCI systems for neurorehabilitation, in which the feedback loop plays a vital role in retraining neural plasticity or regulating brain activity [4].
Brain activity can be recorded through various neuroimaging methods [3, 5]. The methods can be
categorized into two groups: invasive and noninvasive. Electrocorticography (ECoG) and
electroencephalography (EEG) have become the most common invasive and noninvasive
technologies, respectively [3]. ECoG, also known as intracranial EEG, is recorded from the
cortical surface. Other invasive technologies record signals from within the brain using single-neuron action potentials (single units), multi-unit activity (MUA), and local field potentials (LFPs) [6, 7]. The high-quality spatial and temporal characteristics of these signals enable successful decoding of biomechanical parameters [8-12]. These decoding achievements for upper limb
kinematics using invasive electrodes in monkeys and humans have resulted in accurate control of
prosthetic devices in 3D space [13-17]. However, the invasive electrodes have significant
drawbacks due to the risk of performing surgery and the gradual degradation of the recorded
signals. Therefore, noninvasive approaches such as functional magnetic resonance imaging
(fMRI), magnetoencephalography (MEG), near-infrared spectroscopy (NIRS), and EEG have
become more widespread in human participants.
Although some noninvasive technologies provide higher spatial resolution (e.g., fMRI), EEG has proved to be the most popular method because it directly measures neural activity, is inexpensive, and is portable enough for clinical use [3]. EEG measures electrical brain activity
caused by the flow of electric currents during synaptic excitations of neuronal dendrites,
especially in the cortex, but also in the deep brain structures. The electric signals are recorded by
placing electrodes on the scalp [3]. EEG signals have been used to control devices such as
wheelchairs [18] and communication aid systems [19]. During the past decade, EEG methods
have also become a promising approach in controlling assistive and rehabilitation devices [20].
EEG signals could provide a pathway from the brain to various external devices resulting in
brain-controlled assistive devices for disabled people and brain-controlled rehabilitation devices
for patients with strokes and other neurological deficits [21-25]. One of the most challenging
topics in BCI is finding and analyzing the relationships between recorded brain activity and
underlying models of the human body, biomechanics, and cognitive processing. As a result,
investigation of relationships between EEG signals and upper limb movement, real and
imaginary, has become a fascinating area of research in recent years [26, 27].
To implement an EEG-based BCI system for a particular application, a specific protocol and paradigm have to be chosen for all phases of the experiment. First, the subject performs a
particular task (e.g., imagery task, visual task) in order to learn how to modulate their brain
activity while EEG signals are recorded from the scalp. Using the recorded EEG as training
data, a neural decoder for the paradigm is generated. Afterward, the subject performs the task
again and the neural decoder is used for BCI control.
Many EEG-based BCI review papers have been published [18, 23, 24, 28-32]; however, there is
a lack of review or guidance in comparing EEG-based BCI paradigms. Here we aim to review
the most commonly employed EEG-based BCI paradigms. Guidelines on the algorithms and classification methods deployed to generate control signals from these paradigms are summarized. Each paradigm has its advantages and disadvantages depending on a patient's physical condition and on user-friendliness. The current and future potential applications of these
paradigms in the manipulation of an external object, rehabilitation, restoration, enhancement, and
entertainment are investigated. Finally, present issues and limitations in EEG-based BCI systems
are examined, and future possibilities for developing new paradigms are discussed.
2 Motor Imagery Paradigms
Motor imagery is described as imagining a movement rather than executing a real movement (for
more detail on motor imagery see [27]). Previous studies have confirmed that imagination
activates areas of the brain that are responsible for generating actual movement [33]. The most
common motor imagery paradigms reported in literature are sensorimotor rhythms (SMR) and
imagined body kinematics (IBK). In the following sections, the paradigms are described in
detail.
2.1 Sensorimotor Rhythms (SMR) Paradigms
2.1.1 Overview
The sensorimotor rhythms paradigm is one of the most popular motor imagery paradigms (e.g.,
[34], [35]). In this paradigm, the imagined movement is defined as the imagination of kinesthetic
movements of large body parts such as hands, feet, and tongue, which could result in
modulations of brain activity [36].
Imagined movement in sensorimotor rhythm paradigms causes event-related desynchronization
(ERD) in mu (8-12 Hz) and beta rhythms (18-26 Hz). In contrast, relaxation results in event-
related synchronization (ERS), (for an in-depth review see [37]). The ERD and ERS modulations
are most prominent in EEG signals acquired from electrode locations C3 and C4 (10/20
international system); these electrode locations are above the sensorimotor cortex. These
modulated EEG signals in the aforementioned frequency domains (mu/beta) can be employed to
control prosthetic devices. Wolpaw et al. [38] controlled a one-dimensional cursor using mu
rhythms. Figure 1 shows examples of changes in the frequency spectra of SMR during imagined hand movements.
The main drawback of the SMR paradigm is that the training time for 2D and 3D cursor control
can take weeks or months. The training for this system requires subjects to learn how to
modulate specific frequency bands of neural activity to move a cursor in different directions to
select desired targets.
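As a rough illustration of how mu-band power at C3/C4 can serve as an SMR control feature, the following sketch computes log band power with SciPy on synthetic data. The filter order, band edges, and the C3-minus-C4 control signal are illustrative choices, not values taken from the cited studies.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def mu_band_logpower(eeg, fs, band=(8.0, 12.0)):
    """Log band power of each channel in the mu band.

    eeg: (n_channels, n_samples) array, e.g. one row each for C3 and C4.
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg, axis=1)
    return np.log(np.mean(filtered ** 2, axis=1))

# Synthetic example: stronger mu activity on C4 than C3, as might occur
# when ERD attenuates the rhythm contralateral to the imagined hand.
fs = 250
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
c3 = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
c4 = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
power = mu_band_logpower(np.vstack([c3, c4]), fs)
# A simple SMR control signal: the C3-minus-C4 log-power difference
control = power[0] - power[1]
```

In a real system, these log-power features would be mapped to cursor velocity or fed to a classifier after subject-specific calibration.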
Figure 1. An example of a change in frequency spectra for EEG recorded from C3 and C4. The top row
(a, b) shows spectral power changes in C3 and C4 electrodes while performing imagined movement of
right hand versus left hand. The middle row (c, d) shows spectral power change in C3 and C4 electrodes
while performing imagined movement of both hands versus rest. The bottom row (e, f) shows spectral
power change in C3 and C4 electrodes for imagined movement of right hand versus rest and left hand
versus rest, respectively (Figure copied from [39] with permission of a Creative Commons Attribution
(CC BY) license).
controlled by a linear equation in which the independent variable was a weighted combination of
the amplitudes in a mu or beta rhythm frequency band recorded from the right and left
sensorimotor cortices. These changes were generated as the result of right and left-hand
imaginary movements.
Bhattacharyya et al. [41] compared the performance of different classification methods for left/
right hand imagery in EEG features. They found that the accuracy of kernelized SVM
outperforms the other classifiers. Murguialday et al. [42] designed a hierarchical linear
classification scheme using the peak mu power band to differentiate between relaxation, left-
hand movement, and right-hand movement. For movement prediction of the right hand, left
hand, tongue, and right foot, Morash et al. [36] showed that time-frequency features could better
depict the non-stationary nature of EEG SMR. Using a parametric modeling approach, they
divided time into bins of 256ms and frequency into bins of 3.9 Hz and applied Naïve Bayesian
classification. However, parametric classification methods require a priori knowledge of subjects' EEG patterns, which is not always available for BCI control. In contrast, Chen et al. [43] used a three-layer neural network, a non-parametric approach, and investigated an adaptive classifier for controlling an orthotic hand by motor imagery. A summary of previous SMR work
is shown in Table 1.
Table 1. Previous SMR paradigms. DWT: Discrete wavelet transform, LMS: Least mean square, STFT:
short-time Fourier transform, CSP: Common spatial pattern, N/A: Not Applicable
Reference | Task | Feature | Classification method
[38] | Cursor control in 1D | Mu rhythm (8-12 Hz) amplitude | N/A
[44] | Cursor control in 2D | FFT + Mu rhythm amplitude (7.5-16 Hz) | Linear regression
[45] | Grasping and object manipulation | DWT over 12-14 Hz and 18-22 Hz | LDA
[40] | Cursor control in 2D | Mu (8-12 Hz) and Beta (18-26 Hz) rhythm amplitude | Linear regression + LMS to optimize weights
[42] | Control of a prosthetic hand | Peak Mu (8-12 Hz) band power | A logistic regression (relaxation and motor imagery) + a logistic regression (right hand and left hand)
[43] | Control of a hand orthosis | STFT over Mu band (8-14 Hz) | 3-layer feedforward NN classifying three classes (right hand, left hand, no imagination)
[46] | Control of a rehabilitation robot | CSP algorithm used to select features | N/A
[47] | Control of a robotic arm | Time-frequency power of EEG over the recorded locations in the [10.5, 13.5] Hz range | N/A
[48] | Control of a rehabilitation robot | Time-frequency power in EEG alpha [8, 13] Hz, sigma [14, 18] Hz, and beta [18, 30] Hz bands over C3, C4, and Cz | LDA
different hand clenching speeds using spatial-temporal characteristics of alpha (8-12 Hz) and
beta (18-26 Hz) bands. To translate multiple discrete speeds of hand imagery they developed
multiple linear regression models and smoothed the output with a low-pass 1 Hz filter. Although
they found a correlation between higher frequency bands and the speed of imagery, they did not
successfully find movement trajectory information. Bradberry et al. [63] conducted a prominent
study on IBK; they were able to extract two-dimensional hand imagery [63] and actual three-
dimensional hand movement trajectory [72] using low-frequency EEG signals (<1 Hz). A linear
decoding model with first-order temporal differences of EEG data was developed, and they
successfully modeled continuous cursor velocity, which was correlated with the defined
trajectory. They also showed that EEG data from 100ms before movement imagination onset is
correlated with the movement. The linear model was as follows:
x[t] − x[t−1] = a + Σ_{n=1}^{N} Σ_{k=0}^{L} b_{n,k} S_n[t − k]

The same equation was used for horizontal and vertical velocities. In this equation, x[t] − x[t−1] is the cursor velocity along one axis, N is the number of EEG channels, L is the number of time lags, S_n[t − k] is the EEG signal at channel n lagged by k samples, and a and b_{n,k} are the intercept and weights resulting from the linear regression.
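A minimal least-squares implementation of such a lagged linear decoder might look like the following sketch; the synthetic data and function name are illustrative, not the authors' code.

```python
import numpy as np

def fit_lagged_decoder(eeg, velocity, n_lags):
    """Least-squares fit of v[t] = a + sum_n sum_k b[n,k] * S_n[t-k].

    eeg: (n_channels, T) EEG array; velocity: (T,) cursor velocity
    (first-order difference of position). Returns intercept a and
    weights b of shape (n_channels, n_lags + 1).
    """
    n_ch, T = eeg.shape
    rows = []
    for t in range(n_lags, T):
        # Columns ordered S_n[t], S_n[t-1], ..., S_n[t-n_lags]
        lagged = eeg[:, t - n_lags:t + 1][:, ::-1]
        rows.append(lagged.ravel())
    X = np.column_stack([np.ones(len(rows)), np.array(rows)])
    y = velocity[n_lags:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[0], coef[1:].reshape(n_ch, n_lags + 1)

# Sanity check on synthetic data generated from known lagged weights
rng = np.random.default_rng(1)
S = rng.standard_normal((4, 2000))
true_b = rng.standard_normal((4, 3))
v = np.zeros(2000)
for t in range(2, 2000):
    v[t] = np.sum(true_b * S[:, t - 2:t + 1][:, ::-1])
a_hat, b_hat = fit_lagged_decoder(S, v, n_lags=2)
```

With noiseless synthetic data the fitted weights recover the generating weights exactly, which is a useful check before applying the decoder to real EEG.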
Using partial least squares (PLS) regression, Ofner and Müller-Putz [73] were able to reduce EEG artifacts and also eliminate highly correlated variables. They were also able to identify
relationships between latent predictors and desired response variables. By using different
electrode locations and different time lags as latent variables, the algorithm captured the user’s
source space contribution to the latent variables. Finally, Kim et al. [65] explored a nonlinear
decoding model called kernel ridge regression (KRR). They showed that the KRR algorithm significantly reduced eye movement contamination, which is common in linear models. Andres
et al. [66] and Kim et al. [65] examined the role of eye movement in the linear decoding of IBK.
By comparing the decoding performance with and without EOG contaminated brain signals, they
found that eye movement plays a significant role in IBK tasks. Additionally, in contrast to a report published by Korik et al. [75], Kim et al. [65] concluded that the SMR bands do not contain kinematic parameter information.
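A closed-form KRR sketch with an RBF kernel is shown below; the hyperparameters and toy data are illustrative, not values from [65].

```python
import numpy as np

def krr_fit_predict(X_train, y_train, X_test, gamma=0.1, lam=1.0):
    """Kernel ridge regression with an RBF kernel (closed form).

    A nonlinear decoder, in contrast with the linear models above;
    gamma and lam here are illustrative, not tuned values.
    """
    def rbf(A, B):
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * d2)

    K = rbf(X_train, X_train)
    # alpha solves (K + lam*I) alpha = y; predictions are k(x, X) @ alpha
    alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)
    return rbf(X_test, X_train) @ alpha

# Toy demonstration on a nonlinear target a linear decoder cannot capture
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (200, 2))
y = np.sin(3 * X[:, 0]) * X[:, 1]
y_hat = krr_fit_predict(X, y, X, gamma=5.0, lam=1e-3)
```

In practice, gamma and lam would be chosen by cross-validation on held-out EEG trials rather than fixed as here.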
operated with zero-training is also a promising future development. Abiri et al. [80, 81] used the
IBK in a zero-training BCI paradigm to control a quadcopter in 2D space.
Figure 2. ERP components after the onset of a visual stimulus (figure copied from [92] with permission).
significantly better than other classification methods. Moreover, their analysis indicated that the
P300 was stable across sessions and subjects.
Citi et al. [97] introduced a 2D cursor control P300-based BCI. They were able to extract an
analog control signal with a single-trial approach using a genetic algorithm. Also, there are other
single-trial classification approaches using P300 signals [98-101]. Most of the early P300 Speller
research had focused on EEG locations along the midline (e.g., Fz, Cz, and Pz). In [102], information from posterior locations such as PO7, PO8, and Oz was added to an SWLDA classifier. They showed that adding additional electrode locations significantly improved the
discriminability of data samples. Bell et al. [103] increased the information transfer rate (ITR) to
24 bits/min for a four-choice classification problem relying on the fact that P300 has a robust
response to multiple trials. They elicited P300-based control analyzing only five trials of P300
responses with 95% accuracy using SVMs. Edlinger et al. [104] and Chen et al. [105] applied the
paradigm in a virtual environment (VE) as an actuator for a smart building scenario and to
control a virtual hand, respectively. By dividing the screen into seven different regions, Fazel-Rezai and Abhari [106] were able to reduce distraction caused by adjacent items and, at the same time, were able to lower the stimulus probability. These changes resulted in larger P300
amplitudes, which resulted in higher detection accuracy and higher ITR [92].
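The benefit of averaging multiple P300 trials before classification can be sketched on simulated epochs; the waveform shape, noise level, and amplitude-window detector below are all illustrative assumptions, not parameters from the cited studies.

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 100                      # Hz; each epoch spans 0-800 ms post-stimulus
t = np.arange(0, 0.8, 1 / fs)
# A crude P300 template: positive deflection peaking near 300 ms
p300 = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

def epochs(has_p300, n_trials, noise=1.5):
    """Simulate single-trial epochs: template (or zeros) plus noise."""
    base = p300 if has_p300 else np.zeros_like(t)
    return base + noise * rng.standard_normal((n_trials, t.size))

def score(avg_epoch):
    """Mean amplitude in a 250-450 ms window: a crude P300 detector."""
    win = (t >= 0.25) & (t <= 0.45)
    return avg_epoch[win].mean()

# Averaging five trials raises SNR by roughly sqrt(5), enough here to
# separate target (P300 present) from non-target responses.
target_avg = epochs(True, 5).mean(axis=0)
nontarget_avg = epochs(False, 5).mean(axis=0)
```

Real P300 spellers replace the fixed amplitude window with a trained classifier such as SWLDA or an SVM, but the averaging step plays the same SNR-boosting role.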
An innovative checkerboard paradigm (CBP) was introduced in [107]. The CBP showed significantly higher mean accuracy than the row-column paradigm (RCP) (92% compared to 77%), and the mean ITR increased from 17 bits/min to 23 bits/min. The CBP avoids the stimulus adjacency-distraction error addressed in [106] and also increases P300 detection accuracy by lowering the probability of target occurrence. In [108], a language model was utilized to enhance typing speed. The authors examined P300 BCI paradigms including single-character presentation (SCP) and RCP, and they also tested a rapid serial visual presentation (RSVP) paradigm. They applied PCA over band-pass [1.5-42] Hz filtered EEG to extract a one-dimensional feature vector from multiple locations over frontal, central, and parietal regions.
[103] | Control of a humanoid robot | Band-pass filter [0.5, 30] Hz and downsampling to 100 Hz | SVM
[104] | Single Character (SC) speller | Band-pass filter [0.5, 30] Hz and downsampling to 60 Hz | LDA
[106] | Region-based (RB) speller | C1, C2, Cz, Pz, and Fz channels were used | Averaged Mexican-hat wavelet coefficients used as feature set
[107] | 8×9 Checkerboard (CB) speller | Cz, Pz, PO7, and PO8 channels were used | SWLDA
[2, 109] | Single Character (SC) speller | Scaling data samples into [-1, 1] and downsampling to 32 Hz | Bayesian Linear Discriminant Analysis (BLDA) and Fisher's Linear Discriminant Analysis (FLDA)
[110] | Target selection in 3D space | Channel selection and downsampling to 16 Hz | SWLDA
a light-emitting diode (LED) or a cathode ray tube (CRT). Multiple flickering targets with
distinct flickering frequencies are typically presented to the user. There is a strong correlation
between flicker frequency and the observed frequency of the EEG. The user’s intended target is
determined by matching the pattern of EEG activity to the command associated with the
particular frequency.
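The frequency-matching step can be sketched as a simple spectral-peak detector on a synthetic occipital signal; CCA-based detectors, discussed later in this section, are the more common choice in practice, and all values below are illustrative.

```python
import numpy as np

def detect_ssvep_target(eeg, fs, candidate_freqs):
    """Pick the flicker frequency with the largest spectral power.

    eeg: (n_samples,) occipital-channel signal. A minimal alternative
    to CCA, workable when candidate frequencies are well separated.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(powers))]

# Synthetic trial: the subject attends the 10 Hz stimulus
fs = 250
t = np.arange(0, 4.0, 1 / fs)
rng = np.random.default_rng(4)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
target = detect_ssvep_target(eeg, fs, [8.0, 10.0, 12.0, 15.0])
```

Longer windows sharpen the frequency resolution and thus the reliability of the match, at the cost of a slower command rate.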
There are advantages associated with the SSVEP paradigm. Because the stimuli are exogenous,
it is a no-training paradigm that can be used by many subjects. The stimuli flash at many
different frequencies, thereby resulting in many commands and more degrees of freedom to
control prosthetic devices. In addition, the SSVEP frequencies can be more reliably classified
than event-related potentials. However, the use of flickering stimuli can lead to subject fatigue, particularly at low flicker frequencies [119-122]. This paradigm is also not well suited for people with visual impairments due to the required gaze shifts during use. However,
Min et al. [123] have recently proposed a new SSVEP paradigm that uses a grid-shaped line
array. They suggested that this novel presentation is gaze-independent. There are also steady-
state somatosensory evoked potentials (SSSEP) [124] and hybrid SSSEP and P300 applications
[125].
subjects in order to investigate the scope of applicability of SSVEP-based BCIs. In addition to
performance, they examined a number of covariates including age, gender, and level of tiredness.
Chen et al. [131] examined the correlation coefficients between stimulus frequency and subject’s
EEG frequency using canonical correlation analysis (CCA). Considering accuracy and ITR
simultaneously, they determined a user-specific optimal stimulation duration and phase interval.
In a text input application, Chen et al. [132] attempted to enhance ITR by employing entropy
coding algorithms such as Huffman coding. An advantage of the SSVEP paradigm is that it is
less susceptible to motion artifacts. Thus, it is a suitable choice for a mobile subject. Pfurtscheller et al. [133] showed that a gait-assistance exoskeleton could be accurately controlled. They evaluated online and offline performance of CCA and k-nearest neighbors (kNN) classifiers.
Most studies conducted with the SSVEP paradigm are based on decoding bottom-up visual
information. Thus, these systems are gaze-shift dependent. Min et al. [123] examined a top-down
visual condition within the paradigm. The results in the top-down condition showed a different
pattern over the occipital lobe than the pattern produced by the bottom-up condition. Moreover, a
randomly-shuffled LDA (rLDA) classifier performed more accurately in the top-down condition
than the more commonly used CCA classifier. An overview of previous SSVEP studies with
accuracy and ITR is shown in Table 3.
Bio-inspired intelligent information processing techniques can also help researchers understand human perceptual systems and incorporate their biological models and features into the processing of physiological signals for BCI. For instance, entropy can be used to measure the dynamic complexity of EEG
complexity evaluation, which can apply to SSVEP.
[126] | Control of an electrical prosthesis | (6-13) Hz with four electrodes over occipital lobe | Maximum Likelihood
[129] | A brain-to-brain motion control system | (6-13) Hz with four electrodes over occipital lobe | LDA
[130] | Spelling | (7-10) Hz with eight electrodes over occipital lobe | MEC
4 Error-Related Potential
4.1 Overview
The error-related potential (ErrP) has recently been used as an ERP component that can correct BCI errors [138]. The ErrP occurs when there is a mismatch between a subject’s intention
to perform a given task and the response provided by the BCI. For example, assume a user
wishes to move a cursor from the middle of a monitor to the left side of the monitor. If the cursor
erroneously moves to the right, an error-related potential will be generated. The ErrP is most pronounced over frontal and central regions and has a latency of 200-700 ms. Figure 3 shows a
schematic of how an ErrP is generated and how it can be used to teach an intelligent agent to
control a BCI. Unlike P300, the paradigm does not rely on averaging over multiple trials; instead, it uses a short window on a single-trial basis. Ferrez and Millan [139] decoded errors following the misrecognition of user intent by the BCI system. Subsequently, Chavarriaga and
Millan [140] utilized the ErrP to control an external autonomous device within the concept of
shared autonomy. Shared autonomy describes the situation in which the user has only supervisory control over the actions of a system that he or she does not command directly. Consistent
with the previous reports, they reported an ERP response located over the medial-frontal cortex
with a negative amplitude around 260 ms after an error was detected by the subject. Moreover, the amplitude of the ERP is inversely modulated by the frequency of the autonomous system’s errors [140].
Figure 3. A schematic of how an ErrP paradigm can be used in a BCI system (figure copied from [138] with permission). Left: Detecting an error and correcting the last movement. Right: Using the ErrP in a learning process to update a BCI classifier.
A real-time and closed-loop BCI system can be regarded as a control problem. The ErrP can be
used to adjust the input control signals to the device. While in a traditional control system the adjustment is performed using linear or nonlinear controllers, in a BCI system, where the brain plays the role of the controller, the adjustment can be performed automatically by brain signals (for more information see review [141]). Finding a suitable controller in a
traditional control system has become a solvable problem; however, understanding brain-
controlled signals and translating them into logical and stable commands for usage in an external
device remains challenging. This investigation is further discussed in a study by Artusi et al.
[142].
The process of using ErrP in a closed-loop BCI system could be considered as analogous to
“learn from your mistake.” In contrast to a traditional control system, in which an error signal can be sensed in milliseconds, the brain does not produce an ErrP until 200-700 ms after the
subject receives feedback [139, 142]. The feedback is the relevant event whose onset engages the
brain circuits to process error-related information. The delay and non-stationarity of the signal
slows the system and makes real-time implementation difficult. Additionally, since the ErrP does
not contain any information about direction or magnitude, there is still the challenge of how to
adjust command signals based on detected ErrP in a multi-degree-of-freedom control system.
Thus, most BCI systems are designed using pre-learned algorithms to perform a task in a closed-
loop BCI [140, 143]. Recently, Iturrate et al. [144] developed a BCI system using the ErrP to
autonomously complete a task after a training time of several minutes. In their task, a brain-
controlled robotic arm learned how to reach a specific target based on a pre-learned algorithm using the ErrP paradigm.
4.2 ErrP Analysis & Classification Methods
One approach to extracting the ErrP is to detect the discrepancy of the observed action and the
translated action in the BCI platform. Ferrez and Millan [139] found an interaction between the
subject and the BCI system. They observed positive peaks at 200ms and 450ms after feedback
and negative peaks 250ms and 450ms after feedback. They also observed that ErrP amplitude is
higher as the error rate decreases. Chavarriaga and Millan [140] investigated the consequences of
the subject monitoring an external agent that the subject does not have control over. They used a
cursor movement paradigm and estimated the posterior probability of moving the cursor in the wrong direction, P_err, by classifying the EEG signal with a Gaussian classifier. They found that electrode locations FCz and Cz were most closely correlated with the ErrP response.
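The posterior-probability step can be sketched with a shared-covariance Gaussian classifier; all numbers below are made up for illustration, whereas [140] fit such parameters on real EEG features.

```python
import numpy as np

def gaussian_posterior_error(x, mu_err, mu_corr, cov, prior_err=0.5):
    """Posterior probability that a trial is an error under a two-class
    Gaussian model with shared covariance. In practice the class means
    and covariance would be estimated from labeled training epochs.
    """
    inv = np.linalg.inv(cov)

    def loglik(mu):
        d = x - mu
        return -0.5 * d @ inv @ d

    p_e = prior_err * np.exp(loglik(mu_err))
    p_c = (1 - prior_err) * np.exp(loglik(mu_corr))
    return p_e / (p_e + p_c)

# Toy feature vector (e.g., FCz and Cz amplitudes in the ErrP window)
mu_err, mu_corr = np.array([2.0, 1.5]), np.array([0.0, 0.0])
cov = np.eye(2)
p_err = gaussian_posterior_error(np.array([1.8, 1.4]), mu_err, mu_corr, cov)
```

An observation near the error-class mean yields a posterior well above 0.5, which the BCI can use to veto or undo its last command.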
Iturrate et al. [145] designed a study where a subject observed a virtual robot performing a
reaching task. The subject was instructed to judge the robot motion based on prior information of
the correct path. The averaged EEG waveforms at each electrode location were calculated, and
the results showed a significant difference between the correct and incorrect operation of the
robot. On error trials, a sharp positive peak at approximately 300ms was observed and was
followed by a negative peak at approximately 400ms. The averaged EEG waveforms were
derived in two steps: First, bipolar channels in the medial and posterior regions within the range
[150-700ms] were selected, offset components were removed, a bandpass filter of 0.5-10Hz was
applied, and the result was down-sampled to 64Hz; Second, they applied a Functional Decision
Tree in their AdaBoost classification algorithm to the resulting feature vector. Ten-fold cross-validation suggested that the resulting averaged EEG waveforms distinguished between correct and incorrect motion of the robot.
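A pipeline of this kind (offset removal, 0.5-10 Hz bandpass, cropping to the 150-700 ms window, downsampling to 64 Hz) can be sketched with SciPy; the filter order and epoch parameters are illustrative, not the exact values from [145].

```python
import numpy as np
from scipy.signal import butter, filtfilt, resample

def preprocess_errp_epoch(epoch, fs, fs_out=64, band=(0.5, 10.0),
                          window=(0.15, 0.70)):
    """Bandpass, crop to the ErrP window, and downsample one epoch.

    epoch: (n_samples,) single-channel EEG aligned to feedback onset.
    The output feature vector can feed any classifier, e.g. AdaBoost.
    """
    epoch = epoch - epoch.mean()                      # remove offset
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epoch)
    start, stop = int(window[0] * fs), int(window[1] * fs)
    cropped = filtered[start:stop]
    n_out = int(round(cropped.size * fs_out / fs))
    return resample(cropped, n_out)

# Synthetic 1 s epoch sampled at 512 Hz with a DC offset and slow wave
fs = 512
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(5)
epoch = np.sin(2 * np.pi * 4 * t) + 0.3 * rng.standard_normal(t.size) + 5.0
features = preprocess_errp_epoch(epoch, fs)
```

Downsampling after bandpass filtering keeps the feature vector short, which matters for classifiers trained on the limited number of error trials a session provides.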
arms [149, 153], and it has been used to teach a robotic BMI system how to reach a particular
target in a 2D plane [144].
The ErrP can provide additional information to improve closed-loop BCI systems. It is likely
that, in the future, the ErrP will allow a user to observe and spontaneously make the desired
change in a BCI system without the need for directly performing a control task [154, 155].
5 Hybrid Paradigms
5.1 Overview
A hybrid paradigm refers to a combination of two or more physiological measures in which at
least one is EEG (for review see [156-158]). The other physiological measures could be bio-signals such as heart rate (ECG), eye movements (EOG), or hemodynamic signals recorded by fNIRS [159]. In hybrid paradigms, sequential or simultaneous processing structures can be used
to output control commands to the BCI system [157]. Figure 4 shows a schematic of each
system. In the simultaneous processing configuration, bio-signals concurrently enter two (or
more) parallel decoding systems while in a sequential setting one decoding paradigm acts as a
gating function for another decoding system. Visual P300, SSVEP, and SMR paradigms are the
most prevalent paradigms in the development of hybrid BCI systems [82, 116].
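The two structures can be sketched as minimal gating and fusion wrappers; the class names and the dict-based "window" interface below are hypothetical, for illustration only.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class SequentialHybrid:
    """Paradigm 1 gates paradigm 2: a command passes only when the
    gating decoder reports that the user is engaged (a 'brain switch')."""
    gate: Callable[[dict], bool]
    decoder: Callable[[dict], str]

    def step(self, window: dict) -> Optional[str]:
        return self.decoder(window) if self.gate(window) else None

@dataclass
class SimultaneousHybrid:
    """Two decoders run in parallel and their outputs are fused; fusion
    here is simple agreement, a placeholder for a real combiner."""
    decoder_a: Callable[[dict], str]
    decoder_b: Callable[[dict], str]

    def step(self, window: dict) -> Optional[str]:
        a, b = self.decoder_a(window), self.decoder_b(window)
        return a if a == b else None

# Toy usage with stand-in decoders operating on a dict "signal window"
seq = SequentialHybrid(gate=lambda w: w["engaged"], decoder=lambda w: w["cmd"])
out_engaged = seq.step({"engaged": True, "cmd": "left"})
out_idle = seq.step({"engaged": False, "cmd": "left"})
sim = SimultaneousHybrid(decoder_a=lambda w: w["p300"],
                         decoder_b=lambda w: w["ssvep"])
out_fused = sim.step({"p300": "go", "ssvep": "go"})
```

The sequential form naturally suppresses false positives during rest, while the simultaneous form trades independence of the two decoders for a higher command rate.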
Figure 4. A schematic of two employed structures in hybrid BCI systems; a) Sequential form, b)
Simultaneous form.
In recent BCI studies, combining the aforementioned paradigms, or combining a BCI paradigm with another interface, has been shown to enhance BCI performance. For example, Luth et al. [160]
paired P300 and SSVEP in controlling an assistive robotic arm. In a 2D cursor task, Li et al.
[161] used Mu and Beta rhythms for controlling horizontal movement and P300 for vertical
movement. Bi et al. [162] also used a combination of SSVEP and P300. The SSVEP paradigm
was used to extract directional information (clockwise/counterclockwise), and the P300 was used
to decode the speed of the cursor. To minimize false positive rates of the user’s resting state,
Pfurtscheller et al. [137] introduced a hybrid BCI that combined event-related synchronization (ERS) and SSVEP, collected from an EEG channel located above the motor cortex and another electrode located above the visual cortex. Allison et al. [163] developed a 2D cursor control BCI
incorporating SSVEP for decoding horizontal and event-related desynchronization (ERD) for
vertical movements.
[162] | 2D cursor control | Simultaneous SSVEP and P300 | RBF SVM, FLDA
[164] | Robotic grasp control | Sequential SSVEP and Mu rhythm | CCA, STFT
[171] | Robotic control | Sequential EOG and ERP | LDA
6 Other Paradigms
In addition to the most common BCI paradigms detailed above, other paradigms have been
examined in a limited number of studies. Table 5 shows a number of previously generated EEG-
based BCI paradigms and a brief description of each system. Among the paradigms shown in
Table 5, the “covert and overt attention,” “discrete movement intention,” and “auditory” paradigms have shown promise for BCI applications.
Covert and Overt Attention Paradigm: Hunt and Kingstone [174] were among the first to use a
covert attention BCI paradigm. They discovered the existence of a dissociation between
voluntary shifts in overt and covert attention. In a covert attention paradigm, the subject is
instructed to look at a centrally located fixation point. The subject’s task is to follow another point (e.g., a cursor) without overt eye movement. In contrast, in an overt attention task the subject is instructed to use overt eye movements while attending to a moving object.
Both of these approaches depend on visual attention, and the EEG signals are typically recorded
from the posterior cortex. Additional studies using this paradigm were performed by Kelly et al.
[175, 176]. In [176], they investigated parieto-occipital alpha-band (8-14 Hz) EEG activity in a covert attention paradigm to classify spatial attention to the left or right. Later, they
confirmed the existence of distinct patterns in overt and covert attention during preparatory
processes [175]. Tonin and colleagues [177, 178] used a covert attention paradigm in a 2-class
classification problem (i.e., attention to right corner target of a monitor vs. attention to left corner
target of a monitor) to control a BCI system in online mode and provide feedback to the subject
by showing the result of classification. Additionally, Treder et al. [179] employed a covert attention paradigm for two-dimensional BCI control to covertly choose among six targets equally distributed around a circle on a screen.
Discrete Movement Intention Paradigm: In the movement intention paradigm, EEG signals
collected before movement onset are used to detect the intended movement of a BCI user and
manipulate the environment accordingly. In these studies, the subject may or may not be able to
physically execute an actual movement. However, their EEG signals can confirm the intention of
movement before movement occurs [180]. In some studies, the terminology “attempted” [181] or
“planned” [182] movement is used to describe the intention of movement. This paradigm is particularly fruitful in motor rehabilitation: in robotic rehabilitation, a patient’s intention can initiate the movement of a robot. Frisoli et al.
[183] used a gaze-dependent variation of this paradigm for upper limb rehabilitation. EEG
signals were used to adjust jerk, acceleration, and speed of the exoskeleton. As a means of
therapy for post-stroke patients, Muralidharan et al. [181] successfully extracted intention from
EEG signals to open or close a paretic left/right hand. A similar study by Lew et al. [184] was
performed using two able-bodied subjects and two post-stroke patients with an overall success
rate of 80% in detection of movement. Bai et al. [185] investigated EEG signals reflecting the intention of right-hand and left-hand movements, and Bai et al. [180] predicted wrist extension movements in seven healthy subjects. Zhou et al. [186] classified the information from
EEG signals during the moment in which the subjects (four healthy, two stroke) intended to
perform shoulder abduction or elbow flexion movements. Also, EEG data were analyzed for a
chronic stroke patient before the onset of hand movement toward a target [187].
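A minimal sketch of detecting movement intention from a single pre-onset EEG epoch is shown below, using the slow negative drift (readiness potential) that precedes voluntary movement as a crude marker. The channel choice, window length, and amplitude threshold are illustrative assumptions and not the decoders used in the studies above.

```python
import numpy as np

FS = 256  # sampling rate in Hz (assumed)

def detect_intention(pre_onset_epoch, fs=FS, window_s=0.5, threshold_uv=-3.0):
    """Flag intended movement if the mean amplitude (in microvolts) over
    the final `window_s` seconds of a pre-onset epoch falls below
    `threshold_uv`, a crude proxy for the slow negative readiness
    potential that builds up before voluntary movement."""
    n_samples = int(window_s * fs)
    return bool(pre_onset_epoch[-n_samples:].mean() < threshold_uv)
```

In an online BCI, a positive detection would be used to trigger the assistive device before the (attempted) movement itself occurs.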
Auditory Paradigm: Auditory paradigms have also been investigated by a number of BCI
researchers [83]. Brain signals can be modulated either by using an intention-driven
(endogenous) BCI or a stimulus-driven (exogenous) BCI, depending on the paradigm. For example, both the auditory P300 [188] and the auditory steady-state response (ASSR) [189] are considered exogenous stimulation approaches. The ASSR is an auditory evoked potential elicited by rapid auditory stimuli; Picton et al. [189] showed that the ASSR maximum amplitude is recorded from
the vertex of the scalp. Sellers and Donchin [188] compared P300 auditory and visual paradigms
in patients with ALS. Although they showed proof of principle with the auditory P300 BCI,
performance was significantly better in the visual condition. Nijboer et al. [84] also validated the
feasibility of an auditory-based BCI by comparing with visual-based BCI. Ferracuti et al. [190]
used a novel paradigm where five classes of auditory stimuli were presented in five different
locations of space.
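Because the ASSR appears as a spectral peak locked to the stimulus modulation rate, a simple detector can score a channel by the ratio of power at that rate to the average power of neighboring frequency bins. The sketch below does exactly that; the 40 Hz modulation rate and bin counts are common illustrative choices, not parameters taken from [189, 190].

```python
import numpy as np

def assr_snr(signal, fs, mod_freq, neighbor_bins=5):
    """Power at the stimulus modulation frequency divided by the mean
    power of the surrounding bins; a large ratio suggests an ASSR
    phase-locked to the rapid auditory stimulus."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    target = int(np.argmin(np.abs(freqs - mod_freq)))
    lo = max(target - neighbor_bins, 1)
    hi = min(target + neighbor_bins + 1, len(psd))
    neighbors = np.concatenate([psd[lo:target], psd[target + 1:hi]])
    return psd[target] / neighbors.mean()
```

A multi-class auditory BCI could compute this ratio for each candidate modulation frequency and select the stimulus with the largest score.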
Somatosensory (Tactile) Paradigm: In recent years, the use of somatosensory paradigms for patients with visual impairment has become popular. In this paradigm, vibrotactile stimulators are placed on predetermined parts of the body and deliver stimulation at different frequencies [191]. These stimulations are reflected in the EEG signals recorded from the scalp.
Muller-Putz et al. [124] investigated the usability of the steady-state somatosensory evoked
potential paradigm. Other researchers employed tactile P300 paradigms in their BCI systems
[192]. Imagined tactile paradigms were also investigated by Yao et al. [85]. The somatosensory
paradigm was utilized in assisting patients with locked-in syndrome [193, 194].
Reflexive Semantic Conditioning Paradigm: BCIs for communication purposes have been developed since the late 1980s; however, it remains a great challenge to provide reliable results for people with severe motor disabilities, such as completely locked-in syndrome (CLIS). A paradigm named “reflexive semantic conditioning” (based on Pavlovian conditioning) was developed and tested in healthy participants as well as in people with a diagnosis of ALS. The main goal of the paradigm is to address communication problems in CLIS and ALS patients [195-199].
Somatosensory (Tactile) paradigm: tactile sensors are used to stimulate parts of the body at different frequencies while the EEG signals are recorded for classification and for generating control commands ([85], [191], [212], [124], [192], [193], [194], [213]).
Passive paradigm: passive EEG signals recorded without the purpose of voluntary control, reflecting the user’s intentions, situational interpretations, and emotional states, are utilized as a complementary BCI ([154], [155]).
Non-motor mental imagery paradigm: EEG signals originate from non-motor imagery tasks such as math calculation ([214]).
Slow cortical potentials paradigm: low-frequency EEG signals recorded from the prefrontal cortex are modulated through a long training time on a cognitive task while neurofeedback is provided ([215], [19], [216], [217], [218]).
Observation-based paradigm: EEG signals are collected while the subject observes different actions performed by an external device, such as a prosthetic hand ([219], [220], [221]).
Eye-movement paradigm: EEG signals are recorded while the subject is instructed to move the eyes in different directions; discrete classes are extracted from the EEG signals for controlling external objects ([222], [223], [224]).
Reflexive semantic conditioning paradigm: EEG signals are modulated by presenting various statements; the paradigm is primarily used for communication in ALS and CLIS populations ([195], [196], [197], [198], [199]).
Training Time and Fatigue: One of the most significant challenges in BCI is the training
required for a subject to become proficient with the system. Most paradigms have lengthy
training times, which can cause fatigue in subjects. Although there are examples of long-term use
of stimulus-based BCI such as [112, 226], overall external stimulus paradigms such as P300-
based systems may cause fatigue over extended periods of use. Moreover, subject-dependency
and even inter-session variability can make it necessary for BCI researchers to collect calibration
data at the beginning of each session. To mitigate this problem, some recent studies have used
methods such as transfer learning to develop a zero training/generic BCI model that generalizes
to most subjects [81, 227-230].
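A zero-training model of this kind can be sketched by pooling labeled trials from previously recorded subjects and fitting one generic classifier that a new subject uses without any calibration session. The nearest-class-mean classifier below is a deliberately simple stand-in for the transfer-learning methods in [81, 227-230]; all names and feature dimensions are illustrative.

```python
import numpy as np

def train_generic_model(subject_datasets):
    """Pool (features, labels) pairs from several source subjects and
    return the per-class mean feature vectors of the pooled data."""
    X = np.vstack([X_s for X_s, _ in subject_datasets])
    y = np.concatenate([y_s for _, y_s in subject_datasets])
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(class_means, trial):
    """Classify a new subject's trial by its nearest pooled class mean,
    with no subject-specific calibration data."""
    return min(class_means, key=lambda c: np.linalg.norm(trial - class_means[c]))
```

In practice, transfer-learning BCIs add subject-alignment steps before pooling, but the calibration-free usage pattern is the same: train once on source subjects, then predict directly for the new user.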
Signal Processing and Novel Decoders: Many different decoding methods, signal processing
algorithms [231], and classification algorithms [30] have been recently investigated.
Nevertheless, the information extracted from EEG signals does not have a high enough signal-to-
noise ratio to control a system such as a neuroprosthetic arm with multiple degrees of freedom.
More robust, accurate, and fast online algorithms are required to be able to control a multi-DOF
system. In recent years, some researchers have suggested that source localization of EEG [232]
and active data selection [233] can improve classification performance. Other researchers have
suggested the use of advanced machine learning and deep learning methods [234-237], which
have potential to extract additional features that can improve classification. Furthermore, other
researchers have proposed adaptive classifiers and decoders in order to compensate for the non-
stationary nature of EEG signals [238]. Meanwhile, a standardized evaluation framework is essential for assessing the performance of decoding algorithms in specific applications and BCI systems [239].
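One simple form of such an adaptive decoder is an exponentially weighted update that nudges each class prototype toward newly labeled trials, so the classifier tracks slow feature drift across a session. The learning rate and prototype representation below are illustrative assumptions, not the adaptive schemes of [238].

```python
import numpy as np

def adapt_class_mean(class_means, trial, label, eta=0.05):
    """Move the stored prototype for `label` a fraction `eta` of the way
    toward the newly observed trial, letting the decoder follow the
    non-stationary drift of EEG features over time."""
    class_means[label] = (1.0 - eta) * class_means[label] + eta * np.asarray(trial)
    return class_means
```

A small `eta` keeps the decoder stable against noisy single trials while still absorbing session-to-session shifts.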
From Shared Control to Supervisory Control in Closed-Loop BCIs: A closed-loop BCI is
considered to be a co-adaptive and mutual learning system where the human and computer learn
from each other, while adaptation in mathematical algorithms and neural circuits also occurs.
Millan [240] described the closed-loop BCI system as a “two-learner system.” The terms “shared
control” and “hybrid control” were also used to describe the contributions of both human and
machine in performing the control task [20, 55, 143, 241-243]. The shared BCI system includes
both high-level and low-level control systems. High-level control commands are generated by
the brain and traditional control systems are responsible for low-level control tasks. Interestingly,
in high-level control, there is always a tradeoff between the natural way of control and subject
fatigue. The ideal BCI system with mutual interaction can be described as a supervisory control
system in which the subject is the leader with minimum involvement (in high-level control), and
the BCI system serves as an intelligent agent (in low-level control) [140, 244]. Through cognitive monitoring, the user can act as a supervisor of the external autonomous system instead of continuously issuing control commands.
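This division of labor can be sketched as a decoded high-level command that only selects a goal, while an autonomous low-level loop generates the full trajectory. The one-dimensional effector, the target map, and the step size below are illustrative assumptions.

```python
TARGETS = {"left": -1.0, "right": 1.0}  # hypothetical goal positions

def low_level_controller(position, target, step=0.2):
    """Autonomous low-level loop: drive a 1-D effector toward `target`
    in fixed increments without further user involvement."""
    while abs(target - position) > step:
        position += step if target > position else -step
    return target  # snap onto the goal once within one step

def supervisory_bci(position, brain_command):
    """Map a sparse, decoded high-level command to a goal position and
    delegate the entire trajectory to the low-level controller."""
    return low_level_controller(position, TARGETS[brain_command])
```

The user issues a single decoded command per goal, which keeps high-level control natural while limiting the fatigue associated with continuous control.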
The definition of a closed-loop control system is currently a controversial issue [141, 245]. In practice, an EEG-based BCI should incorporate some type of artificial sensory feedback beyond visual feedback [246] to give the subject the strongest feeling of control in a closed-loop form. In contrast, invasively controlled prosthetic arms include the sense of touch, which
increases the perception of a closed-loop control system [247]. In EEG-based BCI platforms,
various feedback mechanisms have been investigated, including brain stimulation [35], reaction
force [248], and somatosensory stimulation [42].
Development of New EEG Technologies: Since scalp EEG is a low-cost and affordable brain monitoring technology, it has the potential to be commercialized for the general public [3]. For example, some studies determine alertness/drowsiness from brain dynamics while evaluating behavioral changes, with applications to drowsy driving. A portable EEG headset helps in understanding the brain dynamics underlying the integration of perceptual functions in different scenarios; some studies evaluate behavioral changes in response to auditory signals in a driving environment and find correlations between brainwaves and other sensory inputs such as haptic feedback. To support such applications, many researchers have investigated the development of wearable and wireless EEG headsets [173, 249]. Dry EEG sensors have also been developed [250-253]. These sensors do not require skin
preparation or gel applications that are required of conventional wet sensors. The development of
these new EEG headsets could facilitate the application of BCIs beyond current levels. For
example, a forehead EEG-based BCI can be used as a sleep management system that can assess
sleep quality. The device could also be used as a depression treatment screening system that
could evaluate and predict the efficacy of rapid antidepressant agents. Nevertheless, there are
still limitations to dry electrode technology. For example, the sensors can be uncomfortable against the scalp, and they are very sensitive to muscle and movement artifacts. In addition, the recording quality of current dry headsets typically degrades after approximately one hour.
Neurofeedback and the Future Paradigms: One future direction of BCI is its application in
neurofeedback [254]. Neurofeedback, a type of biofeedback, is the process of self-regulating
brainwaves to improve various aspects of cognitive control. In some cases, neurofeedback-based
BCIs could potentially replace medications, thereby reducing the negative side effects of
medication. For example, this technology could help to alleviate cognitive and pathological
neural diseases, such as migraine headaches: a headache detection and management system could notify migraine patients of imminent headaches days in advance while offering treatment in the form of neurofeedback. Neurofeedback-based BCIs could also be developed to assist
the treatment of people with addiction, obesity, autism, and asthma [255]. New EEG paradigms
can also be developed to facilitate cognitive control [256] and interaction with the environment
[154, 155]. For instance, ErrP can be used as a useful mechanism to enhance neurofeedback
since it allows a user to observe and spontaneously make the desired change in a BCI system
without the need to directly perform a control task. Moreover, new cognitive models of
neurofeedback can be developed for neuro-rehabilitation of cognitive deficits, such as ADHD,
anxiety, epilepsy, Alzheimer’s disease, traumatic brain injury, and post-traumatic stress disorder
[257-263].
8 Conclusions
Currently, there is a high level of interest in non-invasive BCI technology. Many variables have
facilitated the popularity of these systems. Because of wireless recording, low-cost amplifiers,
higher temporal resolution, and advanced signal analysis methodology, the systems are more
accessible to researchers in many scientific domains. As described in this review, a critical aspect
of employing a BCI system is to match the appropriate control signal with the desired
application. It is essential to choose the most reliable, accurate, and convenient paradigm to
manipulate a neuroprosthetic device or implement a specific neurorehabilitation program. The
current review has evaluated several EEG-based BCI paradigms in terms of their advantages and
disadvantages from a variety of perspectives. Each paradigm was described and presented in
terms of the control signals, various EEG decoding algorithms, and classification methods, and
target populations of each paradigm were summarized. Finally, potential problems with EEG-
based BCI systems were discussed, and possible solutions were proposed.
Acknowledgments
The authors are grateful to Dr. Jose Millan for his insightful comments on an early draft of this manuscript. The assistance of Megan Pitz with the manuscript is also appreciated. This work was partially supported by NeuroNET at UTK.
Author Contributions
R.A. and S.B. contributed equally and have shared first authorship. E.S. and Y.J. revised the
paper and contributed with insightful comments. X.Z. was involved in all aspects of the study.
References
[1] L. M. Nirenberg, J. Hanley, and E. B. Stear, "A New Approach to Prosthetic Control:
EEG Motor Signal Tracking With an Adaptively Designed Phase-Locked Loop,"
Biomedical Engineering, IEEE Transactions on, vol. BME-18, no. 6, pp. 389-398, 1971.
[2] J. R. Wolpaw, N. Birbaumer, D. J. McFarland, G. Pfurtscheller, and T. M. Vaughan,
"Brain-computer interfaces for communication and control," (in eng), Clin Neurophysiol,
vol. 113, no. 6, pp. 767-91, Jun 2002.
[3] L. F. Nicolas-Alonso and J. Gomez-Gil, "Brain computer interfaces, a review," Sensors,
vol. 12, no. 2, pp. 1211-1279, 2012.
[4] M. A. Lebedev and M. A. Nicolelis, "Brain-Machine Interfaces: From Basic Science to
Neuroprostheses and Neurorehabilitation," Physiological Reviews, vol. 97, no. 2, pp. 767-
837, 2017.
[5] R. A. Ramadan and A. V. Vasilakos, "Brain computer interface: control signals review,"
Neurocomputing, vol. 223, pp. 26-44, 2017.
[6] S. Waldert, T. Pistohl, C. Braun, T. Ball, A. Aertsen, and C. Mehring, "A review on
directional information in neural signals for brain-machine interfaces," Journal of
Physiology-Paris, vol. 103, no. 3, pp. 244-254, 2009.
[7] M. W. Slutzky and R. D. Flint, "Physiological properties of brain-machine interface input
signals," Journal of neurophysiology, vol. 118, no. 2, pp. 1329-1343, 2017.
[8] D. W. Moran and A. B. Schwartz, "Motor cortical representation of speed and direction
during reaching," Journal of Neurophysiology, vol. 82, no. 5, pp. 2676-2692, 1999.
[9] L. R. Hochberg et al., "Neuronal ensemble control of prosthetic devices by a human with
tetraplegia," Nature, vol. 442, no. 7099, pp. 164-171, 2006.
[10] S.-P. Kim, J. D. Simeral, L. R. Hochberg, J. P. Donoghue, and M. J. Black, "Neural
control of computer cursor velocity by decoding motor cortical spiking activity in
humans with tetraplegia," Journal of neural engineering, vol. 5, no. 4, p. 455, 2008.
[11] G. H. Mulliken, S. Musallam, and R. A. Andersen, "Decoding trajectories from posterior
parietal cortex ensembles," the Journal of Neuroscience, vol. 28, no. 48, pp. 12913-
12926, 2008.
[12] M. Hauschild, G. H. Mulliken, I. Fineman, G. E. Loeb, and R. A. Andersen, "Cognitive
signals for brain–machine interfaces in posterior parietal cortex include continuous 3D
trajectory commands," Proceedings of the National Academy of Sciences, vol. 109, no.
42, pp. 17075-17080, 2012.
[13] L. R. Hochberg et al., "Reach and grasp by people with tetraplegia using a neurally
controlled robotic arm," Nature, vol. 485, no. 7398, pp. 372-375, 2012.
[14] S.-P. Kim, J. D. Simeral, L. R. Hochberg, J. P. Donoghue, G. M. Friehs, and M. J. Black,
"Point-and-click cursor control with an intracortical neural interface system by humans
with tetraplegia," Neural Systems and Rehabilitation Engineering, IEEE Transactions on,
vol. 19, no. 2, pp. 193-203, 2011.
[15] J. Vogel et al., "An assistive decision-and-control architecture for force-sensitive hand–
arm systems driven by human–machine interfaces," The International Journal of
Robotics Research, vol. 34, no. 6, pp. 763-780, 2015.
[16] D. M. Taylor, S. I. H. Tillery, and A. B. Schwartz, "Direct cortical control of 3D
neuroprosthetic devices," Science, vol. 296, no. 5574, pp. 1829-1832, 2002.
[17] M. Velliste, S. Perel, M. C. Spalding, A. S. Whitford, and A. B. Schwartz, "Cortical
control of a prosthetic arm for self-feeding," Nature, vol. 453, no. 7198, pp. 1098-1101,
2008.
[18] L. Bi, X.-A. Fan, and Y. Liu, "EEG-based brain-controlled mobile robots: a survey,"
Human-Machine Systems, IEEE Transactions on, vol. 43, no. 2, pp. 161-176, 2013.
[19] N. Birbaumer et al., "A spelling device for the paralysed," Nature, vol. 398, no. 6725, pp.
297-298, 1999.
[20] J. Meng, S. Zhang, A. Bekyo, J. Olsoe, B. Baxter, and B. He, "Noninvasive
Electroencephalogram Based Control of a Robotic Arm for Reach and Grasp Tasks,"
Scientific Reports, vol. 6, p. 38565, 2016.
[21] J. J. Daly and J. R. Wolpaw, "Brain-computer interfaces in neurological rehabilitation,"
(in eng), Lancet Neurol, vol. 7, no. 11, pp. 1032-43, Nov 2008.
[22] N. Birbaumer and L. G. Cohen, "Brain–computer interfaces: communication and
restoration of movement in paralysis," The Journal of Physiology, vol. 579, no. 3, pp.
621-636, 2007.
[23] S. Machado, L. F. Almada, and R. N. Annavarapu, "Progress and Prospects in EEG-
Based Brain-Computer Interface: Clinical Applications in Neurorehabilitation," Journal
of Rehabilitation Robotics, vol. 1, no. 1, pp. 28-41, 2013.
[24] S. Moghimi, A. Kushki, A. Marie Guerguerian, and T. Chau, "A Review of EEG-Based
Brain-Computer Interfaces as Access Pathways for Individuals with Severe Disabilities,"
Assistive Technology, vol. 25, no. 2, pp. 99-110, 2013/04/03 2012.
[25] N. Birbaumer, "Breaking the silence: brain-computer interfaces (BCI) for communication
and motor control," (in eng), Psychophysiology, vol. 43, no. 6, pp. 517-32, Nov 2006.
[26] J. L. Contreras-Vidal, A. Presacco, H. Agashe, and A. Paek, "Restoration of Whole Body
Movement: Toward a Noninvasive Brain-Machine Interface System," Pulse, IEEE, vol.
3, no. 1, pp. 34-37, 2012.
[27] T. Mulder, "Motor imagery and action observation: cognitive tools for rehabilitation,"
Journal of neural transmission, vol. 114, no. 10, pp. 1265-1278, 2007.
[28] T. M. Vaughan, J. R. Wolpaw, and E. Donchin, "EEG-based communication: prospects
and problems," IEEE transactions on rehabilitation engineering, vol. 4, no. 4, pp. 425-
430, 1996.
[29] H.-J. Hwang, S. Kim, S. Choi, and C.-H. Im, "EEG-Based Brain-Computer Interfaces: A
Thorough Literature Survey," International Journal of Human-Computer Interaction,
vol. 29, no. 12, pp. 814-826, 2013/12/02 2013.
[30] F. Lotte, M. Congedo, A. Lécuyer, F. Lamarche, and B. Arnaldi, "A review of
classification algorithms for EEG-based brain–computer interfaces," Journal of neural
engineering, vol. 4, no. 2, p. R1, 2007.
[31] G. Pfurtscheller, B. Graimann, and C. Neuper, "EEG‐Based Brain‐Computer Interface
System," Wiley Encyclopedia of Biomedical Engineering, 2006.
[32] S. Machado et al., "EEG-based brain-computer interfaces: an overview of basic concepts
and clinical applications in neurorehabilitation," Reviews in the neurosciences, vol. 21,
no. 6, pp. 451-468, 2010.
[33] G. Pfurtscheller and C. Neuper, "Motor imagery activates primary sensorimotor area in
humans," Neuroscience letters, vol. 239, no. 2, pp. 65-68, 1997.
[34] H. Yuan and B. He, "Brain-Computer Interfaces Using Sensorimotor Rhythms: Current
State and Future Perspectives," 2014.
[35] B. He, B. Baxter, B. J. Edelman, C. C. Cline, and W. Y. Wenjing, "Noninvasive brain-
computer interfaces based on sensorimotor rhythms," Proceedings of the IEEE, vol. 103,
no. 6, pp. 907-925, 2015.
[36] V. Morash, O. Bai, S. Furlani, P. Lin, and M. Hallett, "Classifying EEG signals preceding
right hand, left hand, tongue, and right foot movements and motor imageries," Clinical
neurophysiology, vol. 119, no. 11, pp. 2570-2578, 2008.
[37] G. Pfurtscheller and F. L. Da Silva, "Event-related EEG/MEG synchronization and
desynchronization: basic principles," Clinical neurophysiology, vol. 110, no. 11, pp.
1842-1857, 1999.
[38] J. R. Wolpaw, D. J. McFarland, G. W. Neat, and C. A. Forneris, "An EEG-based brain-
computer interface for cursor control," Electroencephalography and Clinical
Neurophysiology, vol. 78, no. 3, pp. 252-259, 3// 1991.
[39] K. LaFleur, K. Cassady, A. Doud, K. Shades, E. Rogin, and B. He, "Quadcopter control
in three-dimensional space using a noninvasive motor imagery-based brain–computer
interface," Journal of neural engineering, vol. 10, no. 4, p. 046003, 2013.
[40] J. R. Wolpaw and D. J. McFarland, "Control of a two-dimensional movement signal by a
noninvasive brain-computer interface in humans," Proceedings of the National Academy
of Sciences of the United States of America, vol. 101, no. 51, pp. 17849-17854, 2004.
[41] S. Bhattacharyya, A. Khasnobish, A. Konar, D. N. Tibarewala, and A. K. Nagar,
"Performance analysis of left/right hand movement classification from EEG signal by
intelligent algorithms," in Computational Intelligence, Cognitive Algorithms, Mind, and
Brain (CCMB), 2011 IEEE Symposium on, 2011, pp. 1-8.
[42] A. R. Murguialday et al., "Brain-Computer Interface for a Prosthetic Hand Using Local
Machine Control and Haptic Feedback," in Rehabilitation Robotics, 2007. ICORR 2007.
IEEE 10th International Conference on, 2007, pp. 609-613.
[43] C.-W. Chen, C.-C. K. Lin, and M.-S. Ju, "Hand orthosis controlled using brain-computer
interface," Journal of Medical and Biological Engineering, vol. 29, no. 5, pp. 234-241,
2009.
[44] J. R. Wolpaw and D. J. McFarland, "Multichannel EEG-based brain-computer
communication," Electroencephalography and Clinical Neurophysiology, vol. 90, no. 6,
pp. 444-449, 6// 1994.
[45] G. R. Muller-Putz, R. Scherer, G. Pfurtscheller, and R. Rupp, "EEG-based
neuroprosthesis control: a step towards clinical practice," (in eng), Neurosci Lett, vol.
382, no. 1-2, pp. 169-74, Jul 1-8 2005.
[46] K. K. Ang et al., "A clinical study of motor imagery-based brain-computer interface for
upper limb robotic rehabilitation," in Engineering in Medicine and Biology Society, 2009.
EMBC 2009. Annual International Conference of the IEEE, 2009, pp. 5981-5984: IEEE.
[47] B. S. Baxter, A. Decker, and B. He, "Noninvasive control of a robotic arm in multiple
dimensions using scalp electroencephalogram," in Neural Engineering (NER), 2013 6th
International IEEE/EMBS Conference on, 2013, pp. 45-47: IEEE.
[48] M. Sarac, E. Koyas, A. Erdogan, M. Cetin, and V. Patoglu, "Brain Computer Interface
based robotic rehabilitation with online modification of task speed," (in eng), IEEE Int
Conf Rehabil Robot, vol. 2013, p. 6650423, Jun 2013.
[49] D. J. McFarland, W. A. Sarnacki, and J. R. Wolpaw, "Electroencephalographic (EEG)
control of three-dimensional movement," (in eng), J Neural Eng, vol. 7, no. 3, p. 036007,
Jun 2010.
[50] C. Guger, W. Harkam, C. Hertnaes, and G. Pfurtscheller, "Prosthetic control by an EEG-
based brain-computer interface (BCI)," in Proc. aaate 5th european conference for the
advancement of assistive technology, 1999, pp. 3-6.
[51] G. Pfurtscheller, G. R. Müller, J. Pfurtscheller, H. J. Gerner, and R. Rupp, "‘Thought’ –
control of functional electrical stimulation to restore hand grasp in a patient with
tetraplegia," Neuroscience Letters, vol. 351, no. 1, pp. 33-36, 11/6/ 2003.
[52] S. Aiqin, F. Binghui, and J. Chaochuan, "Motor imagery EEG-based online control
system for upper artificial limb," in Transportation, Mechanical, and Electrical
Engineering (TMEE), 2011 International Conference on, 2011, pp. 1646-1649.
[53] R. Roy, A. Konar, D. N. Tibarewala, and R. Janarthanan, "EEG driven model predictive
position control of an artificial limb using neural net," in Computing Communication &
Networking Technologies (ICCCNT), 2012 Third International Conference on, 2012, pp.
1-9.
[54] A. J. Doud, J. P. Lucas, M. T. Pisansky, and B. He, "Continuous three-dimensional
control of a virtual helicopter using a motor imagery based brain-computer interface,"
PloS one, vol. 6, no. 10, p. e26322, 2011.
[55] T. Li, J. Hong, J. Zhang, and F. Guo, "Brain–machine interface control of a manipulator
using small-world neural network and shared control strategy," Journal of neuroscience
methods, vol. 224, pp. 26-38, 2014.
[56] S. Fok et al., "An EEG-based brain computer interface for rehabilitation and restoration
of hand control following stroke using ipsilateral cortical physiology," (in eng), Conf
Proc IEEE Eng Med Biol Soc, vol. 2011, pp. 6277-80, 2011.
[57] C. Wang et al., "A feasibility study of non-invasive motor-imagery BCI-based robotic
rehabilitation for Stroke patients," in Neural Engineering, 2009. NER'09. 4th
International IEEE/EMBS Conference on, 2009, pp. 271-274: IEEE.
[58] E. Buch et al., "Think to move: a neuromagnetic brain-computer interface (BCI) system
for chronic stroke," stroke, vol. 39, no. 3, pp. 910-917, 2008.
[59] A. Ramos-Murguialday et al., "Proprioceptive feedback and brain computer interface
(BCI) based neuroprostheses," PloS one, vol. 7, no. 10, p. e47048, 2012.
[60] A. Ramos‐Murguialday et al., "Brain–machine interface in chronic stroke rehabilitation:
a controlled study," Annals of neurology, vol. 74, no. 1, pp. 100-108, 2013.
[61] V. Kaiser, I. Daly, F. Pichiorri, D. Mattia, G. R. Müller-Putz, and C. Neuper,
"Relationship between electrical brain responses to motor imagery and motor impairment
in stroke," Stroke, vol. 43, no. 10, pp. 2735-2740, 2012.
[62] J. Pereira, P. Ofner, A. Schwarz, A. I. Sburlea, and G. R. Müller-Putz, "EEG neural
correlates of goal-directed movement intention," NeuroImage, vol. 149, pp. 129-140,
2017.
[63] T. J. Bradberry, R. J. Gentili, and J. L. Contreras-Vidal, "Fast attainment of computer
cursor control with noninvasively acquired brain signals," Journal of neural engineering,
vol. 8, no. 3, p. 036010, 2011.
[64] P. Ofner and G. R. Müller-Putz, "EEG-Based Classification of Imagined Arm
Trajectories," in Replace, Repair, Restore, Relieve–Bridging Clinical and Engineering
Solutions in Neurorehabilitation: Springer, 2014, pp. 611-620.
[65] J.-H. Kim, F. Biessmann, and S.-W. Lee, "Decoding Three-Dimensional Trajectory of
Executed and Imagined Arm Movements from Electroencephalogram Signals," 2014.
[66] A. Úbeda, J. M. Azorín, R. Chavarriaga, and J. d. R. Millán, "Evaluating decoding
performance of upper limb imagined trajectories during center-out reaching tasks," in
Systems, Man, and Cybernetics (SMC), 2016 IEEE International Conference on, 2016,
pp. 000252-000257: IEEE.
[67] Y. Gu, K. Dremstrup, and D. Farina, "Single-trial discrimination of type and speed of
wrist movements from EEG recordings," Clinical Neurophysiology, vol. 120, no. 8, pp.
1596-1600, 8// 2009.
[68] Y. Gu, D. Farina, A. R. Murguialday, K. Dremstrup, P. Montoya, and N. Birbaumer,
"Offline identification of imagined speed of wrist movements in paralyzed ALS patients
from single-trial EEG," (in English), Frontiers in Neuroscience, Original Research vol. 3,
2009-August-10 2009.
[69] A. Vuckovic and F. Sepulveda, "Delta band contribution in cue based single trial
classification of real and imaginary wrist movements," (in eng), Med Biol Eng Comput,
vol. 46, no. 6, pp. 529-39, Jun 2008.
[70] T. Chakraborti et al., "Implementation of EEG based control of remote robotic systems,"
in Recent Trends in Information Systems (ReTIS), 2011 International Conference on,
2011, pp. 203-208: IEEE.
[71] A. K. Mohamed, T. Marwala, and L. R. John, "Single-trial EEG discrimination between
wrist and finger movement imagery and execution in a sensorimotor BCI," (in eng), Conf
Proc IEEE Eng Med Biol Soc, vol. 2011, pp. 6289-93, 2011.
[72] T. J. Bradberry, R. J. Gentili, and J. L. Contreras-Vidal, "Reconstructing Three-
Dimensional Hand Movements from Noninvasive Electroencephalographic Signals," The
Journal of Neuroscience, vol. 30, no. 9, pp. 3432-3437, March 3, 2010 2010.
[73] P. Ofner and G. R. Müller-Putz, "Using a noninvasive decoding method to classify
rhythmic movement imaginations of the arm in two planes," IEEE Transactions on
Biomedical Engineering, vol. 62, no. 3, pp. 972-981, 2015.
[74] H. Yuan, C. Perdoni, and B. He, "Relationship between speed and EEG activity during
imagined and executed hand movements," (in eng), J Neural Eng, vol. 7, no. 2, p. 26001,
Apr 2010.
[75] A. Korik, R. Sosnik, N. Siddique, and D. Coyle, "Imagined 3D hand movement trajectory
decoding from sensorimotor EEG rhythms," in Systems, Man, and Cybernetics (SMC),
2016 IEEE International Conference on, 2016, pp. 004591-004596: IEEE.
[76] R. Abiri, G. Heise, F. Schwartz, and X. Zhao, "EEG-based control of a unidimensional
computer cursor using imagined body kinematics," in Biomedical Engineering Society
Annual Meeting (BMES 2015), 2015.
[77] R. Abiri, X. Zhao, G. Heise, Y. Jiang, and F. Abiri, "Brain computer interface for gesture
control of a social robot: An offline study," in Electrical Engineering (ICEE), 2017
Iranian Conference on, 2017, pp. 113-117: IEEE.
[78] R. Abiri, S. Borhani, X. Zhao, and Y. Jiang, "Real-time brain machine interaction via
social robot gesture control," in ASME 2017 Dynamic Systems and Control Conference,
2017, pp. V001T37A002-V001T37A002: American Society of Mechanical Engineers.
[79] R. Abiri, J. Kilmarx, S. Borhani, X. Zhao, and Y. Jiang, "A Brain-Machine Interface for a
Sequence Movement Control of a Robotic Arm," in Society for Neuroscience (SfN 2017),
2017.
[80] R. Abiri, J. Kilmarx, M. Raji, and X. Zhao, "Planar Control of a Quadcopter Using a
Zero-Training Brain Machine Interface Platform," in Biomedical Engineering Society
Annual Meeting (BMES 2016), 2016.
[81] S. Borhani, R. Abiri, X. Zhao, and Y. Jiang, "A Transfer Learning Approach towards
Zero-training BCI for EEG-Based Two Dimensional Cursor Control," in Society for
Neuroscience (SfN 2017), 2017.
[82] D. Kapgate and D. Kalbande, "A Review on Visual Brain Computer Interface," in
Advancements of Medical Electronics: Springer, 2015, pp. 193-206.
[83] S. Gao, Y. Wang, X. Gao, and B. Hong, "Visual and auditory brain–computer interfaces,"
IEEE Transactions on Biomedical Engineering, vol. 61, no. 5, pp. 1436-1447, 2014.
[84] F. Nijboer et al., "An auditory brain–computer interface (BCI)," Journal of neuroscience
methods, vol. 167, no. 1, pp. 43-50, 2008.
[85] L. Yao, X. Sheng, D. Zhang, N. Jiang, D. Farina, and X. Zhu, "A BCI System Based on
Somatosensory Attentional Orientation," IEEE Transactions on Neural Systems and
Rehabilitation Engineering, vol. 25, no. 1, pp. 81-90, 2017.
[86] M. Fabiani, G. Gratton, D. Karis, and E. Donchin, "Definition, identification, and
reliability of measurement of the P300 component of the event-related brain potential,"
Advances in psychophysiology, vol. 2, no. S 1, p. 78, 1987.
[87] J. Polich, "Updating P300: an integrative theory of P3a and P3b," Clinical
neurophysiology, vol. 118, no. 10, pp. 2128-2148, 2007.
[88] L. A. Farwell and E. Donchin, "Talking off the top of your head: toward a mental
prosthesis utilizing event-related brain potentials," Electroencephalography and clinical
Neurophysiology, vol. 70, no. 6, pp. 510-523, 1988.
[89] S. Halder et al., "Prediction of P300 BCI aptitude in severe motor impairment," PloS one,
vol. 8, no. 10, p. e76148, 2013.
[90] R. Fazel-Rezai, B. Z. Allison, C. Guger, E. W. Sellers, S. C. Kleih, and A. Kübler, "P300
brain computer interface: current challenges and emerging trends," Frontiers in
neuroengineering, vol. 5, 2012.
[91] L. M. McCane et al., "Brain-computer interface (BCI) evaluation in people with
amyotrophic lateral sclerosis," Amyotrophic Lateral Sclerosis and Frontotemporal
Degeneration, vol. 15, no. 3-4, pp. 207-215, 2014.
[92] P. Cipresso et al., "The use of P300‐based BCIs in amyotrophic lateral sclerosis: from
augmentative and alternative communication to cognitive assessment," Brain and
behavior, vol. 2, no. 4, pp. 479-498, 2012.
[93] S. Sutton, P. Tueting, J. Zubin, and E. R. John, "Information delivery and the sensory
evoked potential," Science, vol. 155, no. 3768, pp. 1436-1439, 1967.
[94] E. Donchin, K. M. Spencer, and R. Wijesinghe, "The mental prosthesis: assessing the
speed of a P300-based brain-computer interface," IEEE Transactions on Rehabilitation
Engineering, vol. 8, no. 2, pp. 174-179, 2000.
[95] F. Piccione et al., "P300-based brain computer interface: reliability and performance in
healthy and paralysed participants," Clinical neurophysiology, vol. 117, no. 3, pp. 531-
537, 2006.
[96] D. J. Krusienski et al., "A comparison of classification techniques for the P300 Speller,"
Journal of neural engineering, vol. 3, no. 4, p. 299, 2006.
[97] L. Citi, R. Poli, C. Cinel, and F. Sepulveda, "P300-based BCI mouse with genetically-
optimized analogue control," IEEE transactions on neural systems and rehabilitation
engineering, vol. 16, no. 1, pp. 51-61, 2008.
[98] S. Silvoni et al., "P300-based brain-computer interface communication: evaluation and
follow-up in amyotrophic lateral sclerosis," Frontiers in neuroscience, vol. 3, p. 1, 2009.
[99] M. Marchetti, F. Piccione, S. Silvoni, and K. Priftis, "Exogenous and endogenous
orienting of visuospatial attention in P300-guided brain computer interfaces: A pilot
study on healthy participants," Clinical Neurophysiology, vol. 123, no. 4, pp. 774-779,
2012.
[100] M. Marchetti, F. Piccione, S. Silvoni, L. Gamberini, and K. Priftis, "Covert visuospatial
attention orienting in a brain-computer interface for amyotrophic lateral sclerosis
patients," Neurorehabilitation and neural repair, vol. 27, no. 5, pp. 430-438, 2013.
[101] S. Silvoni, M. Cavinato, C. Volpato, C. A. Ruf, N. Birbaumer, and F. Piccione,
"Amyotrophic lateral sclerosis progression and stability of brain-computer interface
communication," Amyotrophic lateral sclerosis and frontotemporal degeneration, vol.
14, no. 5-6, pp. 390-396, 2013.
[102] D. J. Krusienski, E. W. Sellers, D. J. McFarland, T. M. Vaughan, and J. R. Wolpaw,
"Toward enhanced P300 speller performance," Journal of neuroscience methods, vol.
167, no. 1, pp. 15-21, 2008.
[103] C. J. Bell, P. Shenoy, R. Chalodhorn, and R. P. Rao, "Control of a humanoid robot by a
noninvasive brain–computer interface in humans," Journal of neural engineering, vol. 5,
no. 2, p. 214, 2008.
[104] G. Edlinger, C. Holzner, C. Groenegress, C. Guger, and M. Slater, "Goal-oriented control
with brain-computer interface," in International Conference on Foundations of
Augmented Cognition, 2009, pp. 732-740: Springer.
[105] W.-d. Chen et al., "A P300 based online brain-computer interface system for virtual hand
control," (in English), Journal of Zhejiang University SCIENCE C, vol. 11, no. 8, pp.
587-597, 2010/08/01 2010.
[106] R. Fazel-Rezai and K. Abhari, "A region-based P300 speller for brain-computer
interface," Canadian Journal of Electrical and Computer Engineering, vol. 34, no. 3, pp.
81-85, 2009.
[107] G. Townsend et al., "A novel P300-based brain–computer interface stimulus presentation
paradigm: moving beyond rows and columns," Clinical Neurophysiology, vol. 121, no. 7,
pp. 1109-1120, 2010.
[108] M. Moghadamfalahi, U. Orhan, M. Akcakaya, H. Nezamfar, M. Fried-Oken, and D.
Erdogmus, "Language-model assisted brain computer interface for typing: a comparison
of matrix and rapid serial visual presentation," IEEE Transactions on Neural Systems and
Rehabilitation Engineering, vol. 23, no. 5, pp. 910-920, 2015.
[109] U. Hoffmann, J.-M. Vesin, T. Ebrahimi, and K. Diserens, "An efficient P300-based
brain–computer interface for disabled subjects," Journal of Neuroscience methods, vol.
167, no. 1, pp. 115-125, 2008.
[110] I. Iturrate, J. M. Antelis, A. Kubler, and J. Minguez, "A noninvasive brain-actuated
wheelchair based on a P300 neurophysiological protocol and automated navigation,"
IEEE Transactions on Robotics, vol. 25, no. 3, pp. 614-627, 2009.
[111] F. Nijboer et al., "A P300-based brain–computer interface for people with amyotrophic
lateral sclerosis," Clinical neurophysiology, vol. 119, no. 8, pp. 1909-1916, 2008.
[112] E. W. Sellers, T. M. Vaughan, and J. R. Wolpaw, "A brain-computer interface for long-
term independent home use," Amyotrophic lateral sclerosis, vol. 11, no. 5, pp. 449-455,
2010.
[113] E. W. Sellers, D. B. Ryan, and C. K. Hauser, "Noninvasive brain-computer interface
enables communication after brainstem stroke," Science translational medicine, vol. 6,
no. 257, p. 257re7, 2014.
[114] E. A. Aydın, Ö. F. Bay, and İ. Güler, "Implementation of an Embedded Web Server
Application for Wireless Control of Brain Computer Interface Based Home
Environments," Journal of medical systems, vol. 40, no. 1, pp. 1-10, 2016.
[115] S. He et al., "A P300-Based Threshold-Free Brain Switch and Its Application in
Wheelchair Control," IEEE Transactions on Neural Systems and Rehabilitation
Engineering, vol. 25, no. 6, pp. 715-725, 2017.
[116] S. Amiri, A. Rabbi, L. Azinfar, and R. Fazel-Rezai, "A review of P300, SSVEP, and
hybrid P300/SSVEP brain-computer interface systems," Brain-Computer Interface
Systems—Recent Progress and Future Prospects, 2013.
[117] F.-B. Vialatte, M. Maurice, J. Dauwels, and A. Cichocki, "Steady-state visually evoked
potentials: focus on essential paradigms and future perspectives," Progress in
neurobiology, vol. 90, no. 4, pp. 418-438, 2010.
[118] C. S. Herrmann, "Human EEG responses to 1–100 Hz flicker: resonance phenomena in
visual cortex and their potential correlation to cognitive phenomena," Experimental brain
research, vol. 137, no. 3-4, pp. 346-353, 2001.
[119] M. H. Chang, H. J. Baek, S. M. Lee, and K. S. Park, "An amplitude-modulated visual
stimulation for reducing eye fatigue in SSVEP-based brain–computer interfaces,"
Clinical Neurophysiology, vol. 125, no. 7, pp. 1380-1391, 2014.
[120] G. G. Molina and V. Mihajlovic, "Spatial filters to detect steady-state visual evoked
potentials elicited by high frequency stimulation: BCI application," Biomedizinische
Technik/Biomedical Engineering, vol. 55, no. 3, pp. 173-182, 2010.
[121] S. M. T. Müller, P. F. Diez, T. F. Bastos-Filho, M. Sarcinelli-Filho, V. Mut, and E.
Laciar, "SSVEP-BCI implementation for 37–40 Hz frequency range," in Engineering in
Medicine and Biology Society, EMBC, 2011 Annual International Conference of the
IEEE, 2011, pp. 6352-6355: IEEE.
[122] I. Volosyak, D. Valbuena, T. Luth, T. Malechka, and A. Graser, "BCI demographics II:
How many (and what kinds of) people can use a high-frequency SSVEP BCI?," IEEE
Transactions on Neural Systems and Rehabilitation Engineering, vol. 19, no. 3, pp. 232-
239, 2011.
[123] B.-K. Min, S. Dähne, M.-H. Ahn, Y.-K. Noh, and K.-R. Müller, "Decoding of top-down
cognitive processing for SSVEP-controlled BMI," Scientific Reports, vol. 6, 2016.
[124] G. R. Muller-Putz, R. Scherer, C. Neuper, and G. Pfurtscheller, "Steady-state
somatosensory evoked potentials: suitable brain signals for brain-computer interfaces?,"
IEEE transactions on neural systems and rehabilitation engineering, vol. 14, no. 1, pp.
30-37, 2006.
[125] C. Pokorny, C. Breitwieser, and G. R. Müller-Putz, "The role of transient target stimuli in
a steady-state somatosensory evoked potential-based brain–computer interface setup,"
Frontiers in neuroscience, vol. 10, p. 152, 2016.
[126] G. R. Muller-Putz and G. Pfurtscheller, "Control of an electrical prosthesis with an
SSVEP-based BCI," (in eng), IEEE Trans Biomed Eng, vol. 55, no. 1, pp. 361-4, Jan
2008.
[127] Q. Liu, K. Chen, Q. Ai, and S. Q. Xie, "Review: recent development of signal processing
algorithms for SSVEP-based brain computer interfaces," Journal of Medical and
Biological Engineering, vol. 34, no. 4, pp. 299-309, 2013.
[128] M. Bryan et al., "An adaptive brain-computer interface for humanoid robot control," in
Humanoid Robots (Humanoids), 2011 11th IEEE-RAS International Conference on,
2011, pp. 199-204: IEEE.
[129] G. Li and D. Zhang, "Brain-Computer Interface Controlled Cyborg: Establishing a
Functional Information Transfer Pathway from Human Brain to Cockroach Brain," PloS
one, vol. 11, no. 3, p. e0150667, 2016.
[130] F. Gembler, P. Stawicki, and I. Volosyak, "Autonomous Parameter Adjustment for
SSVEP-Based BCIs with a Novel BCI Wizard," Frontiers in neuroscience, vol. 9, 2015.
[131] X. Chen, Y. Wang, M. Nakanishi, X. Gao, T.-P. Jung, and S. Gao, "High-speed spelling
with a noninvasive brain–computer interface," Proceedings of the National Academy of
Sciences, vol. 112, no. 44, pp. E6058-E6067, 2015.
[132] Y.-J. Chen, S.-C. Chen, I. A. Zaeni, C.-M. Wu, A. J. Tickle, and P.-J. Chen, "The
SSVEP-Based BCI Text Input System Using Entropy Encoding Algorithm,"
Mathematical Problems in Engineering, vol. 2015, 2015.
[133] N.-S. Kwak, K.-R. Müller, and S.-W. Lee, "A lower limb exoskeleton control system
based on steady state visual evoked potentials," Journal of neural engineering, vol. 12,
no. 5, p. 056009, 2015.
[134] Z. Cao and C.-T. Lin, "Inherent fuzzy entropy for the improvement of EEG complexity
evaluation," IEEE Transactions on Fuzzy Systems, vol. 26, no. 2, pp. 1032-1035, 2018.
[135] H. J. Hwang et al., "Clinical feasibility of brain‐computer interface based on steady‐state
visual evoked potential in patients with locked‐in syndrome: Case studies,"
Psychophysiology, vol. 54, no. 3, pp. 444-451, 2017.
[136] J. Chen, D. Zhang, A. K. Engel, Q. Gong, and A. Maye, "Application of a single-flicker
online SSVEP BCI for spatial navigation," PloS one, vol. 12, no. 5, p. e0178385, 2017.
[137] G. Pfurtscheller, T. Solis-Escalante, R. Ortner, P. Linortner, and G. R. Muller-Putz, "Self-
paced operation of an SSVEP-Based orthosis with and without an imagery-based “brain
switch:” a feasibility study towards a hybrid BCI," IEEE Transactions on Neural Systems
and Rehabilitation Engineering, vol. 18, no. 4, pp. 409-414, 2010.
[138] R. Chavarriaga, A. Sobolewski, and J. d. R. Millán, "Errare machinale est: the use of
error-related potentials in brain-machine interfaces," Using Neurophysiological Signals
that Reflect Cognitive or Affective State, p. 53, 2015.
[139] P. W. Ferrez and J. d. R. Millán, "Error-related EEG potentials generated during
simulated brain–computer interaction," IEEE transactions on biomedical engineering,
vol. 55, no. 3, pp. 923-929, 2008.
[140] R. Chavarriaga and J. d. R. Millán, "Learning from EEG error-related potentials in
noninvasive brain-computer interfaces," IEEE transactions on neural systems and
rehabilitation engineering, vol. 18, no. 4, pp. 381-388, 2010.
[141] J. Wright, V. G. Macefield, A. van Schaik, and J. C. Tapson, "A Review of Control
Strategies in Closed-Loop Neuroprosthetic Systems," Frontiers in Neuroscience, vol. 10,
2016.
[142] X. Artusi, I. K. Niazi, M.-F. Lucas, and D. Farina, "Performance of a simulated adaptive
BCI based on experimental classification of movement-related and error potentials,"
IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 1, no. 4, pp.
480-488, 2011.
[143] I. Iturrate, L. Montesano, and J. Minguez, "Shared-control brain-computer interface for a
two dimensional reaching task using EEG error-related potentials," in 2013 35th Annual
International Conference of the IEEE Engineering in Medicine and Biology Society
(EMBC), 2013, pp. 5258-5262: IEEE.
[144] I. Iturrate, R. Chavarriaga, L. Montesano, J. Minguez, and J. d. R. Millán, "Teaching
brain-machine interfaces as an alternative paradigm to neuroprosthetics control,"
Scientific reports, vol. 5, 2015.
[145] I. Iturrate, L. Montesano, and J. Minguez, "Robot Reinforcement Learning using EEG-
based reward signals," in Robotics and Automation (ICRA), 2010 IEEE International
Conference on, 2010, pp. 4822-4829: IEEE.
[146] T. Tsoneva, J. Bieger, and G. Garcia-Molina, "Towards error-free interaction," in 2010
Annual International Conference of the IEEE Engineering in Medicine and Biology,
2010, pp. 5799-5802: IEEE.
[147] M. K. Goel, R. Chavarriaga, and J. d. R. Millán, "Cortical current density vs. surface
EEG for event-related potential-based Brain-Computer interface," in Neural Engineering
(NER), 2011 5th International IEEE/EMBS Conference on, 2011, pp. 430-433: IEEE.
[148] H. Zhang, R. Chavarriaga, M. K. Goel, L. Gheorghe, and J. d. R. Millán, "Improved
recognition of error related potentials through the use of brain connectivity features," in
2012 Annual International Conference of the IEEE Engineering in Medicine and Biology
Society, 2012, pp. 6740-6743: IEEE.
[149] I. Iturrate, R. Chavarriaga, L. Montesano, J. Minguez, and J. d. R. Millán, "Latency
correction of error potentials between different experiments reduces calibration time for
single-trial classification," in 2012 Annual International Conference of the IEEE
Engineering in Medicine and Biology Society, 2012, pp. 3288-3291: IEEE.
[150] I. Iturrate, L. Montesano, and J. Minguez, "Task-dependent signal variations in EEG
error-related potentials for brain–computer interfaces," Journal of neural engineering,
vol. 10, no. 2, p. 026024, 2013.
[151] J. Omedes, I. Iturrate, R. Chavarriaga, and L. Montesano, "Asynchronous Decoding of
Error Potentials During the Monitoring of a Reaching Task," in Systems, Man, and
Cybernetics (SMC), 2015 IEEE International Conference on, 2015, pp. 3116-3121:
IEEE.
[152] A. Kreilinger, C. Neuper, and G. R. Müller-Putz, "Error potential detection during
continuous movement of an artificial arm controlled by brain–computer interface,"
Medical & biological engineering & computing, vol. 50, no. 3, pp. 223-230, 2012.
[153] J. Omedes, I. Iturrate, L. Montesano, and J. Minguez, "Using frequency-domain features
for the generalization of EEG error-related potentials among different tasks," in 2013
35th Annual International Conference of the IEEE Engineering in Medicine and Biology
Society (EMBC), 2013, pp. 5263-5266: IEEE.
[154] L. George and A. Lécuyer, "An overview of research on 'passive' brain-computer
interfaces for implicit human-computer interaction," in International Conference on
Applied Bionics and Biomechanics ICABB 2010-Workshop W1 "Brain-Computer
Interfacing and Virtual Reality", 2010.
[155] T. O. Zander and C. Kothe, "Towards passive brain–computer interfaces: applying brain–
computer interface technology to human–machine systems in general," Journal of neural
engineering, vol. 8, no. 2, p. 025005, 2011.
[156] S. Amiri, R. Fazel-Rezai, and V. Asadpour, "A review of hybrid brain-computer interface
systems," Advances in Human-Computer Interaction, vol. 2013, p. 1, 2013.
[157] G. Pfurtscheller et al., "The hybrid BCI," Frontiers in neuroscience, vol. 4, p. 3, 2010.
[158] H. Banville and T. Falk, "Recent advances and open challenges in hybrid brain-computer
interfacing: a technological review of non-invasive human research," Brain-Computer
Interfaces, vol. 3, no. 1, pp. 9-46, 2016.
[159] U. Chaudhary, B. Xia, S. Silvoni, L. G. Cohen, and N. Birbaumer, "Brain–computer
interface–based communication in the completely locked-in state," PLoS biology, vol. 15,
no. 1, p. e1002593, 2017.
[160] T. Luth, D. Ojdanic, O. Friman, O. Prenzel, and A. Graser, "Low level control in a semi-
autonomous rehabilitation robotic system via a brain-computer interface," in
Rehabilitation Robotics, 2007. ICORR 2007. IEEE 10th International Conference on,
2007, pp. 721-728: IEEE.
[161] Y. Li et al., "An EEG-based BCI system for 2-D cursor control by combining Mu/Beta
rhythm and P300 potential," IEEE Transactions on Biomedical Engineering, vol. 57, no.
10, pp. 2495-2505, 2010.
[162] L. Bi, J. Lian, K. Jie, R. Lai, and Y. Liu, "A speed and direction-based cursor control
system with P300 and SSVEP," Biomedical Signal Processing and Control, vol. 14, pp.
126-133, 2014.
[163] B. Z. Allison, C. Brunner, C. Altstätter, I. C. Wagner, S. Grissmann, and C. Neuper, "A
hybrid ERD/SSVEP BCI for continuous simultaneous two dimensional cursor control,"
Journal of neuroscience methods, vol. 209, no. 2, pp. 299-307, 2012.
[164] F. Duan, D. Lin, W. Li, and Z. Zhang, "Design of a Multimodal EEG-based Hybrid BCI
System with Visual Servo Module," IEEE Transactions on Autonomous Mental
Development, vol. 7, no. 4, pp. 332-341, 2015.
[165] T. Yu et al., "Enhanced motor imagery training using a hybrid BCI with feedback," IEEE
Transactions on Biomedical Engineering, vol. 62, no. 7, pp. 1706-1717, 2015.
[166] B. H. Kim, M. Kim, and S. Jo, "Quadcopter flight control using a low-cost
hybrid interface with EEG-based classification and eye tracking," Computers in biology
and medicine, 2014.
[167] M. Kim, B. H. Kim, and S. Jo, "Quantitative evaluation of a low-cost noninvasive hybrid
interface based on EEG and eye movement," IEEE Transactions on Neural Systems and
Rehabilitation Engineering, vol. 23, no. 2, pp. 159-168, 2015.
[168] K.-S. Hong and M. J. Khan, "Hybrid Brain–Computer interface Techniques for improved
Classification Accuracy and increased Number of Commands: A Review," Frontiers in
neurorobotics, vol. 11, 2017.
[169] I. Choi, I. Rhiu, Y. Lee, M. H. Yun, and C. S. Nam, "A systematic review of hybrid
brain-computer interfaces: Taxonomy and usability perspectives," PloS one, vol. 12, no.
4, p. e0176674, 2017.
[170] P. Horki, T. Solis-Escalante, C. Neuper, and G. Müller-Putz, "Combined motor imagery
and SSVEP based BCI control of a 2 DoF artificial upper limb," Medical & biological
engineering & computing, vol. 49, no. 5, pp. 567-577, 2011.
[171] J. Ma, Y. Zhang, A. Cichocki, and F. Matsuno, "A Novel EOG/EEG Hybrid Human–
Machine Interface Adopting Eye Movements and ERPs: Application to Robot Control,"
IEEE Transactions on Biomedical Engineering, vol. 62, no. 3, pp. 876-889, 2015.
[172] M. J. Khan, K.-S. Hong, N. Naseer, and M. R. Bhutta, "Hybrid EEG-NIRS based BCI for
quadcopter control," in Society of Instrument and Control Engineers of Japan (SICE),
2015 54th Annual Conference of the, 2015, pp. 1177-1182: IEEE.
[173] T. Malechka, T. Tetzel, U. Krebs, D. Feuser, and A. Graeser, "sBCI-Headset—Wearable
and Modular Device for Hybrid Brain-Computer Interface," Micromachines, vol. 6, no. 3,
pp. 291-311, 2015.
[174] A. R. Hunt and A. Kingstone, "Covert and overt voluntary attention: linked or
independent?," Cognitive Brain Research, vol. 18, no. 1, pp. 102-105, 2003.
[175] S. P. Kelly, J. J. Foxe, G. Newman, and J. A. Edelman, "Prepare for conflict: EEG
correlates of the anticipation of target competition during overt and covert shifts of visual
attention," European Journal of Neuroscience, vol. 31, no. 9, pp. 1690-1700, 2010.
[176] S. Kelly, E. Lalor, R. Reilly, and J. Foxe, "Independent brain computer interface control
using visual spatial attention-dependent modulations of parieto-occipital alpha," in
Neural Engineering, 2005. Conference Proceedings. 2nd International IEEE EMBS
Conference on, 2005, pp. 667-670: IEEE.
[177] L. Tonin, R. Leeb, and J. d. R. Millán, "Time-dependent approach for single trial
classification of covert visuospatial attention," Journal of neural engineering, vol. 9, no.
4, p. 045011, 2012.
[178] L. Tonin, R. Leeb, A. Sobolewski, and J. d. R. Millán, "An online EEG BCI based on
covert visuospatial attention in absence of exogenous stimulation," Journal of neural
engineering, vol. 10, no. 5, p. 056007, 2013.
[179] M. S. Treder, A. Bahramisharif, N. M. Schmidt, M. A. van Gerven, and B. Blankertz,
"Brain-computer interfacing using modulations of alpha activity induced by covert shifts
of attention," Journal of neuroengineering and rehabilitation, vol. 8, no. 1, p. 24, 2011.
[180] O. Bai et al., "Prediction of human voluntary movement before it occurs," Clinical
Neurophysiology, vol. 122, no. 2, pp. 364-372, 2011.
[181] A. Muralidharan, J. Chae, and D. Taylor, "Extracting attempted hand movements from
EEGs in people with complete hand paralysis following stroke," (in English), Frontiers
in Neuroscience, vol. 5, 2011.
[182] L. Yang, H. Leung, M. Plank, J. Snider, and H. Poizner, "EEG activity during movement
planning encodes upcoming peak speed and acceleration and improves the accuracy in
predicting hand kinematics," IEEE journal of biomedical and health informatics, vol. 19,
no. 1, pp. 22-28, 2015.
[183] A. Frisoli et al., "A New Gaze-BCI-Driven Control of an Upper Limb Exoskeleton for
Rehabilitation in Real-World Tasks," IEEE Transactions on Systems, Man, and
Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 1169-1179, 2012.
[184] E. Lew, R. Chavarriaga, S. Silvoni, and J. d. R. Millán, "Detection of Self-Paced
Reaching Movement Intention from EEG Signals," (in English), Frontiers in
Neuroengineering, vol. 5, 2012.
[185] O. Bai, P. Lin, S. Vorbach, J. Li, S. Furlani, and M. Hallett, "Exploration of
computational methods for classification of movement intention during human voluntary
movement from single trial EEG," Clinical Neurophysiology, vol. 118, no. 12, pp. 2637-
2655, 2007.
[186] J. Zhou, J. Yao, J. Deng, and J. P. Dewald, "EEG-based classification for elbow versus
shoulder torque intentions involving stroke subjects," (in eng), Comput Biol Med, vol. 39,
no. 5, pp. 443-52, May 2009.
[187] J. Ibáñez et al., "Detection of the onset of upper-limb movements based on the combined
analysis of changes in the sensorimotor rhythms and slow cortical potentials," Journal of
neural engineering, vol. 11, no. 5, p. 056009, 2014.
[188] E. W. Sellers and E. Donchin, "A P300-based brain–computer interface: initial tests by
ALS patients," Clinical neurophysiology, vol. 117, no. 3, pp. 538-548, 2006.
[189] T. W. Picton, M. S. John, A. Dimitrijevic, and D. Purcell, "Human auditory steady-state
responses: Respuestas auditivas de estado estable en humanos," International journal of
audiology, vol. 42, no. 4, pp. 177-219, 2003.
[190] F. Ferracuti, A. Freddi, S. Iarlori, S. Longhi, and P. Peretti, "Auditory Paradigm for a
P300 BCI system using Spatial Hearing," in 2013 IEEE/RSJ International Conference on
Intelligent Robots and Systems, 2013, pp. 871-876: IEEE.
[191] K. Hamada, H. Mori, H. Shinoda, and T. M. Rutkowski, "Airborne ultrasonic tactile
display brain-computer interface paradigm," arXiv preprint arXiv:1404.4184, 2014.
[192] A.-M. Brouwer and J. B. Van Erp, "A tactile P300 brain-computer interface," Frontiers
in neuroscience, vol. 4, p. 19, 2010.
[193] C. Guger et al., "Complete locked-in and locked-in patients: command following
assessment and communication with vibro-tactile P300 and motor imagery brain-
computer interface tools," Frontiers in neuroscience, vol. 11, p. 251, 2017.
[194] Z. R. Lugo et al., "A vibrotactile p300-based brain–computer interface for consciousness
detection and communication," Clinical EEG and neuroscience, vol. 45, no. 1, pp. 14-21,
2014.
[195] A. Furdea et al., "A new (semantic) reflexive brain–computer interface: In search for a
suitable classifier," Journal of neuroscience methods, vol. 203, no. 1, pp. 233-240, 2012.
[196] C. A. Ruf et al., "Semantic classical conditioning and brain-computer interface control:
encoding of affirmative and negative thinking," Frontiers in neuroscience, vol. 7, p. 23,
2013.
[197] N. Birbaumer, F. Piccione, S. Silvoni, and M. Wildgruber, "Ideomotor silence: the case
of complete paralysis and brain–computer interfaces (BCI)," Psychological research, vol.
76, no. 2, pp. 183-191, 2012.
[198] G. Gallegos-Ayala, A. Furdea, K. Takano, C. A. Ruf, H. Flor, and N. Birbaumer, "Brain
communication in a completely locked-in patient using bedside near-infrared
spectroscopy," Neurology, vol. 82, no. 21, pp. 1930-1932, 2014.
[199] U. Chaudhary, N. Birbaumer, and A. Ramos-Murguialday, "Brain–computer interfaces
for communication and rehabilitation," Nature Reviews Neurology, vol. 12, no. 9, p. 513,
2016.
[200] Y. Wang and S. Makeig, "Predicting intended movement direction using EEG from
human posterior parietal cortex," in Foundations of Augmented Cognition.
Neuroergonomics and Operational Neuroscience: Springer, 2009, pp. 437-446.
[201] P. S. Hammon, S. Makeig, H. Poizner, E. Todorov, and V. R. de Sa, "Predicting
Reaching Targets from Human EEG," Signal Processing Magazine, IEEE, vol. 25, no. 1,
pp. 69-77, 2008.
[202] A. Muralidharan, J. Chae, and D. M. Taylor, "Early detection of hand movements from
electroencephalograms for stroke therapy applications," (in eng), J Neural Eng, vol. 8,
no. 4, p. 046003, Aug 2011.
[203] N. A. Bhagat et al., "Detecting movement intent from scalp EEG in a novel upper limb
robotic rehabilitation system for stroke," in Engineering in Medicine and Biology Society
(EMBC), 2014 36th Annual International Conference of the IEEE, 2014, pp. 4127-4130:
IEEE.
[204] R. Xu, N. Jiang, C. Lin, N. Mrachacz-Kersting, K. Dremstrup, and D. Farina, "Enhanced
low-latency detection of motor intention from EEG for closed-loop brain-computer
interface applications," IEEE Transactions on Biomedical Engineering, vol. 61, no. 2, pp.
288-296, 2014.
[205] S. Halder et al., "Prediction of auditory and visual p300 brain-computer interface
aptitude," PloS one, vol. 8, no. 2, p. e53513, 2013.
[206] I. Käthner, C. A. Ruf, E. Pasqualotto, C. Braun, N. Birbaumer, and S. Halder, "A portable
auditory P300 brain–computer interface with directional cues," Clinical neurophysiology,
vol. 124, no. 2, pp. 327-338, 2013.
[207] D. S. Klobassa et al., "Toward a high-throughput auditory P300-based brain–computer
interface," Clinical Neurophysiology, vol. 120, no. 7, pp. 1252-1261, 2009.
[208] G. Placidi, D. Avola, A. Petracca, F. Sgallari, and M. Spezialetti, "Basis for the
implementation of an EEG-based single-trial binary brain computer interface through the
disgust produced by remembering unpleasant odors," Neurocomputing, vol. 160, pp. 308-
318, 2015.
[209] Y. J. Kim et al., "A study on a robot arm driven by three-dimensional trajectories
predicted from non-invasive neural signals," Biomedical engineering online, vol. 14, no.
1, p. 1, 2015.
[210] A. Úbeda, E. Hortal, J. Alarcón, R. Salazar-Varas, A. Sánchez, and J. M. Azorín, "Online
detection of horizontal hand movements from low frequency EEG components," in 2015
7th International IEEE/EMBS Conference on Neural Engineering (NER), 2015, pp. 214-
217: IEEE.
[211] K. Kiguchi, T. D. Lalitharatne, and Y. Hayashi, "Estimation of Forearm
Supination/Pronation Motion Based on EEG Signals to Control an Artificial Arm,"
Journal of Advanced Mechanical Design, Systems, and Manufacturing, vol. 7, no. 1, pp.
74-81, 2013.
[212] C. Breitwieser, C. Pokorny, C. Neuper, and G. R. Muller-Putz, "Somatosensory evoked
potentials elicited by stimulating two fingers from one hand--usable for BCI?," (in eng),
Conf Proc IEEE Eng Med Biol Soc, vol. 2011, pp. 6373-6, 2011.
[213] M. van der Waal, M. Severens, J. Geuze, and P. Desain, "Introducing the tactile speller:
an ERP-based brain–computer interface for communication," Journal of Neural
Engineering, vol. 9, no. 4, p. 045002, 2012.
[214] E. Hortal et al., "SVM-based Brain–Machine Interface for controlling a robot arm
through four mental tasks," Neurocomputing, vol. 151, pp. 116-121, 2015.
[215] N. Birbaumer, "Slow cortical potentials: their origin, meaning, and clinical use," ed:
Tilburg, The Netherlands: Tilburg Univ. Press, 1997, pp. 25-39.
[216] N. Birbaumer et al., "The thought translation device (TTD) for completely paralyzed
patients," IEEE Transactions on rehabilitation Engineering, vol. 8, no. 2, pp. 190-193,
2000.
[217] A. Kübler, B. Kotchoubey, J. Kaiser, J. R. Wolpaw, and N. Birbaumer, "Brain–computer
communication: Unlocking the locked in," Psychological bulletin, vol. 127, no. 3, p. 358,
2001.
[218] T. Hinterberger, J. M. Houtkooper, and B. Kotchoubey, "Effects of feedback control on
slow cortical potentials and random events," in Parapsychological Association
Convention, 2004, pp. 39-50.
[219] D. Tkach, J. Reimer, and N. G. Hatsopoulos, "Observation-based learning for brain–
machine interfaces," Current Opinion in Neurobiology, vol. 18, no. 6, pp. 589-594,
2008.
[220] H. Agashe and J. L. Contreras-Vidal, "Observation-based training for neuroprosthetic
control of grasping by amputees," in Engineering in Medicine and Biology Society
(EMBC), 2014 36th Annual International Conference of the IEEE, 2014, pp. 3989-3992:
IEEE.
[221] H. A. Agashe and J. L. Contreras-Vidal, "Observation-based calibration of brain-machine
interfaces for grasping," in Neural Engineering (NER), 2013 6th International
IEEE/EMBS Conference on, 2013, pp. 1-4: IEEE.
[222] A. N. Belkacem, H. Hirose, N. Yoshimura, D. Shin, and Y. Koike, "Classification of four
eye directions from EEG signals for eye-movement-based communication systems," J.
Med. Biol. Eng., 2013.
[223] R. Ramli, H. Arof, F. Ibrahim, M. Y. I. Idris, and A. Khairuddin, "Classification of eyelid
position and eyeball movement using EEG signals," Malaysian Journal of Computer
Science, vol. 28, no. 1, pp. 28-45, 2015.
[224] A. N. Belkacem et al., "Real-time control of a video game using eye movements and two
temporal EEG sensors," Computational intelligence and neuroscience, vol. 2015, p. 1,
2015.
[225] C. Brunner et al., "BNCI Horizon 2020: towards a roadmap for the BCI community,"
Brain-computer interfaces, vol. 2, no. 1, pp. 1-10, 2015.
[226] E. M. Holz, L. Botrel, T. Kaufmann, and A. Kübler, "Long-term independent brain-
computer interface home use improves quality of life of a patient in the locked-in state: a
case study," Archives of physical medicine and rehabilitation, vol. 96, no. 3, pp. S16-S26,
2015.
[227] P. Wang, J. Lu, B. Zhang, and Z. Tang, "A review on transfer learning for brain-
computer interface classification," in 2015 5th International Conference on Information
Science and Technology (ICIST), 2015, pp. 315-322: IEEE.
[228] V. Jayaram, M. Alamgir, Y. Altun, B. Scholkopf, and M. Grosse-Wentrup, "Transfer
learning in brain-computer interfaces," IEEE Computational Intelligence Magazine, vol.
11, no. 1, pp. 20-31, 2016.
[229] F. Lotte, "Signal processing approaches to minimize or suppress calibration time in
oscillatory activity-based brain–computer interfaces," Proceedings of the IEEE, vol. 103,
no. 6, pp. 871-890, 2015.
[230] N. R. Waytowich, J. Faller, J. O. Garcia, J. M. Vettel, and P. Sajda, "Unsupervised
adaptive transfer learning for Steady-State Visual Evoked Potential brain-computer
interfaces," in Systems, Man, and Cybernetics (SMC), 2016 IEEE International
Conference on, 2016, pp. 004135-004140: IEEE.
[231] A. Bashashati, M. Fatourechi, R. K. Ward, and G. E. Birch, "A survey of signal
processing algorithms in brain-computer interfaces based on electrical brain signals," (in
eng), J Neural Eng, vol. 4, no. 2, pp. R32-57, Jun 2007.
[232] B. J. Edelman, B. Baxter, and B. He, "EEG Source Imaging Enhances the Decoding of
Complex Right-Hand Motor Imagery Tasks," IEEE Transactions on Biomedical
Engineering, vol. 63, no. 1, pp. 4-14, 2016.
[233] N. Tomida, T. Tanaka, S. Ono, M. Yamagishi, and H. Higashi, "Active data selection for
motor imagery EEG classification," IEEE Transactions on Biomedical Engineering, vol.
62, no. 2, pp. 458-467, 2015.
[234] M. Längkvist, L. Karlsson, and A. Loutfi, "A review of unsupervised feature learning and
deep learning for time-series modeling," Pattern Recognition Letters, vol. 42, pp. 11-24,
2014.
[235] J. Schmidhuber, "Deep learning in neural networks: An overview," Neural Networks, vol.
61, pp. 85-117, 2015.
[236] I. Sturm, S. Lapuschkin, W. Samek, and K.-R. Müller, "Interpretable deep neural
networks for single-trial eeg classification," Journal of Neuroscience Methods, vol. 274,
pp. 141-145, 2016.
[237] A. H. Marblestone, G. Wayne, and K. P. Kording, "Toward an integration of deep
learning and neuroscience," Frontiers in computational neuroscience, vol. 10, 2016.
[238] S. Perdikis, R. Leeb, and J. d. R. Millán, "Context-aware adaptive spelling in motor
imagery BCI," Journal of neural engineering, vol. 13, no. 3, p. 036018, 2016.
[239] B. Dal Seno, M. Matteucci, and L. T. Mainardi, "The utility metric: a novel method to
assess the overall performance of discrete brain–computer interfaces," IEEE Transactions
on Neural Systems and Rehabilitation Engineering, vol. 18, no. 1, pp. 20-28, 2010.
[240] J. d. R. Millán, "Brain-Machine Interfaces: The Perception-Action Closed Loop: A Two-
Learner System," IEEE Systems, Man, and Cybernetics Magazine, vol. 1, no. 1, pp. 6-8,
2015.
[241] M. S. Fifer et al., "Simultaneous Neural Control of Simple Reaching and Grasping with
the Modular Prosthetic Limb using Intracranial EEG," 2014.
[242] D. P. McMullen et al., "Demonstration of a Semi-Autonomous Hybrid Brain–Machine
Interface Using Human Intracranial EEG, Eye Tracking, and Computer Vision to Control
a Robotic Upper Limb Prosthetic," IEEE Transactions on Neural Systems and
Rehabilitation Engineering, vol. 22, no. 4, pp. 784-796, 2014.
[243] R. Leeb, R. Chavarriaga, S. Perdikis, I. Iturrate, and J. d. R. Millán, "Moving Brain-
Controlled Devices Outside the Lab: Principles and Applications," in Recent Progress in
Brain and Cognitive Engineering: Springer, 2015, pp. 73-94.
[244] C. Vidaurre, C. Klauer, T. Schauer, A. Ramos-Murguialday, and K.-R. Müller, "EEG-
based BCI for the linear control of an upper-limb neuroprosthesis," Medical Engineering
& Physics, vol. 38, no. 11, pp. 1195-1204, 2016.
[245] J. P. Cunningham, P. Nuyujukian, V. Gilja, C. A. Chestek, S. I. Ryu, and K. V. Shenoy,
"A closed-loop human simulator for investigating the role of feedback control in brain-
machine interfaces," Journal of neurophysiology, vol. 105, no. 4, pp. 1932-1949, 2011.
[246] M. C. Dadarlat, J. E. O'Doherty, and P. N. Sabes, "A learning-based approach to artificial
sensory feedback leads to optimal integration," Nature neuroscience, vol. 18, no. 1, pp.
138-144, 2015.
[247] S. N. Flesher et al., "Intracortical microstimulation of human somatosensory cortex,"
Science translational medicine, vol. 8, no. 361, pp. 361ra141-361ra141, 2016.
[248] F. D. Broccard et al., "Closed-Loop Brain–Machine–Body Interfaces for Noninvasive
Rehabilitation of Movement Disorders," Annals of biomedical engineering, pp. 1-21,
2014.
[249] S. Suresh, Y. Liu, and R. C.-H. Yeow, "Development of a Wearable
Electroencephalographic Device for Anxiety Monitoring," Journal of Medical Devices,
vol. 9, no. 3, p. 030917, 2015.
[250] J. Saab, B. Battes, and M. Grosse-Wentrup, Simultaneous EEG recordings with dry and
wet electrodes in motor-imagery. na, 2011.
[251] T. O. Zander et al., "A dry EEG-system for scientific research and brain–computer
interfaces," Frontiers in neuroscience, vol. 5, p. 53, 2011.
[252] T. R. Mullen et al., "Real-time neuroimaging and cognitive monitoring using wearable
dry EEG," IEEE Transactions on Biomedical Engineering, vol. 62, no. 11, pp. 2553-
2567, 2015.
[253] Y. Chen et al., "A high-security EEG-based login system with RSVP stimuli and dry
electrodes," IEEE Transactions on Information Forensics and Security, vol. 11, no. 12,
pp. 2635-2647, 2016.
[254] M. Ordikhani-Seyedlar, M. A. Lebedev, H. B. Sorensen, and S. Puthusserypady,
"Neurofeedback therapy for enhancing visual attention: state-of-the-art and challenges,"
Frontiers in Neuroscience, vol. 10, 2016.
[255] S. Wyckoff and N. Birbaumer, "Neurofeedback and Brain–Computer Interfaces," The
Handbook of Behavioral Medicine, pp. 275-312, 2014.
[256] R. Abiri, J. McBride, X. Zhao, and Y. Jiang, "A real-time brainwave based neuro-
feedback system for cognitive enhancement," in ASME 2015 Dynamic Systems and
Control Conference (Columbus, OH), 2015.
[257] N. J. Steiner, E. C. Frenette, K. M. Rene, R. T. Brennan, and E. C. Perrin, "In-school
neurofeedback training for ADHD: sustained improvements from a randomized control
trial," Pediatrics, peds.2013-2059, 2014.
[258] N. J. Steiner, E. C. Frenette, K. M. Rene, R. T. Brennan, and E. C. Perrin,
"Neurofeedback and cognitive attention training for children with attention-deficit
hyperactivity disorder in schools," Journal of Developmental & Behavioral Pediatrics,
vol. 35, no. 1, pp. 18-27, 2014.
[259] R. Abiri, X. Zhao, and Y. Jiang, "A Real Time EEG-Based Neurofeedback platform for
Attention Training," in Biomedical Engineering Society Annual Meeting (BMES 2016),
2016.
[260] Y. Jiang, R. Abiri, and X. Zhao, "Tuning Up the Old Brain with New Tricks: Attention
Training via Neurofeedback," Frontiers in aging neuroscience, vol. 9, 2017.
[261] D. S. Bassett and A. N. Khambhati, "A network engineering perspective on probing and
perturbing cognition with neurofeedback," Annals of the New York Academy of Sciences,
2017.
[262] R. Abiri, X. Zhao, and Y. Jiang, "Controlling gestures of a social robot in a brain
machine interface platform," in 6th International Brain-Computer Interface Meeting
(2016 BCI), 2016, p. 122.
[263] R. Abiri, S. Borhani, X. Zhao, and Y. Jiang, "Real-Time Neurofeedback for Attention
Training: Brainwave-Based Brain Computer Interface," in Organization for Human
Brain Mapping (OHBM 2017), 2017.