A Brain Computer Interface (BCI) is a system that allows you to control machines directly with your brain activity, rather than through intermediary interfaces like a mouse, keyboard, touchscreen or voice. EMOTIV technology converts brain waves into digital signals that can be used to control an endless number of digital outputs such as games, IoT devices, communication devices and audio/visual content. EmotivBCI is a desktop application for Mac and Windows that allows you to view and train the EMOTIV data streams used for BCI. These data streams are the following:
Mental Commands allow active control with specific thoughts that you have trained the system to recognize. You can train up to 4 commands per training profile with an unlimited number of profiles.
Performance Metrics allow passive and continuous control based on your real time cognitive state, including measurements of focus, excitement, interest, engagement, stress and relaxation. No training is required.
We also offer two additional data streams that are not from the brain directly, but can be applied to enhance your BCI system.
Facial Expressions trigger events with movements of your facial and eye muscles. No training is required, but you can train some expressions to make them more accurate and personalized.
Motion Sensors capture your head movements which you can apply to move a mouse or trigger other commands.
The personalized training profiles that you create in EmotivBCI can be used with our Node-RED toolbox to control a wide array of outputs including social media, IoT devices and robotics. Your BCI profile can also be used in other third party applications connected to EMOTIV via our Cortex service.
This document is intended to help you to get started with EmotivBCI and to understand how to use the various features that it offers.
If you have any queries beyond the scope of this document, please contact us through our online support.
Upon opening EmotivBCI, you will be asked to log in via the EMOTIV Launcher. You can log in using your EmotivID or your Facebook account.
Note: You need to be online the first time you log in to EMOTIV Launcher so that the application can authenticate your credentials.
If you cannot remember your password, click on “Forgot Password” under the login fields in the EMOTIV Launcher to reset your password and follow the instructions provided.
Learn more about the system requirements for EmotivBCI.
To view your account information or change your password, visit your Account page on our website. To access this via the application:
Click on the setting icon in the top right hand corner of the application
Click on “Account” in the dropdown menu
To logout:
Open the EMOTIV Launcher application
Click on the Settings icon at the top right of the EMOTIV Launcher
Click on “Log out”
Click on the “Yes” button
Note: Logging out of the EMOTIV Launcher will log you out from all EMOTIV Applications
EmotivBCI is compatible with EMOTIV EPOC, EPOC+, EPOC X, INSIGHT, and EPOC Flex (partial support, see Connecting EPOC Flex for more details) headsets. Before you connect your headset to EmotivBCI, make sure your device’s battery is charged.
If your headset is paired via more than one connection type at the same time, connection priority follows the two cases below:
Case 1: If there are no prior connections, the priority for the different types is as follows: USB cable connection -> USB dongle connection -> BTLE connection
Case 2: If there is a prior connection, a new incoming connection will be queued for about 30 seconds until the first connection disconnects
Also in this section:
If you do not have an EmotivID, click on “Create Account” in the EMOTIV Launcher and follow the instructions on our website.
EMOTIV Launcher
EMOTIV Launcher is our single, centralized installer to download all EMOTIV applications. You can choose to download our entire application suite or select the application you wish to install. Please note that Cortex service is essential for both EmotivPRO and EmotivBCI to operate successfully. You can create a new account or manage your passwords using the EMOTIV Launcher.
To install EmotivBCI on Windows:
Use the computer that you want to install EmotivBCI on
Login at www.emotiv.com
Go to the “MyAccount” page at www.emotiv.com/my-account/
Click on “Downloads” in the menu
Find Emotiv-Installer-Win in the list of available downloads and click on “Download” next to the listing
Open downloaded Emotiv-Installer-Win-x64.exe
Install to the default C:\Program Files (x86) or specify the path where you would like to install Emotiv applications. Click Continue.
Select the Applications you want to install. Please note that Cortex Service is essential for successful operation of EmotivBCI
Accept the User License Agreement and click Continue
Wait for the installer to download required files and install them
Click close to exit the installer.
To install EmotivBCI on Mac:
Use the computer that you want to install EmotivBCI on
Login at www.emotiv.com
Go to the “MyAccount” page at www.emotiv.com/my-account/
Click on “Downloads” in the menu
Find Emotiv-Installer-Mac in the list of available downloads and click on “Download” next to the listing
Open the downloaded Emotiv-Installer-Mac.dmg (Note: You may need to allow installing apps from unidentified developers in order to run the installer. Right click on the installer and select Open from the right-click menu. You will then be prompted with a pop up asking for your permission to open the app; click Open.)
Install to the default User/Applications/EmotivApps or specify the path where you would like to install the Emotiv applications. Click Continue.
Select the Applications you want to install. Please note that Cortex Service is essential for successful operation of EmotivBCI
Accept the User License Agreement and click Continue
Wait for the installer to download required files and install them
Click close to exit the installer.
Updating EmotivBCI:
We highly recommend keeping your app up to date with the latest version so that you can benefit from new features and bug fixes and so that you can maximize compatibility between various EMOTIV apps that you may have.
Our auto-updater tool (which is a part of EMOTIV Launcher) will let you know when a new update is available. Please follow the instructions in the updater tool when these updates are available.
If you’re in an environment with high levels of Bluetooth interference, we recommend using a USB receiver dongle to pair with your headset. To pair your headset with a USB receiver dongle:
Plug in the USB receiver dongle to your computer’s USB port. The LED above the power symbol should start blinking (about once a second) to show that it is broadcasting.
Turn your headset on. When you turn your headset on, a second LED will light up, and the LED above the power symbol will become fainter and flicker more rapidly. When your USB receiver dongle does this, it is paired with your headset.
You can now connect your headset to EmotivBCI.
If you experience any issues when pairing your EEG headset with a USB receiver dongle, please refer to your headset's user manual.
You can use EmotivBCI offline. However, if you are offline, your training profiles will not be synced to other devices and will not be updated in the cloud for access by other applications (e.g. the BCI Node-RED toolkit). To keep your profile up to date for use with other applications or devices, we recommend using EmotivBCI while connected online.
To connect your EEG headset to EmotivBCI via Bluetooth (BLE 4.0) on Mac OS:
Go to Bluetooth settings and turn Bluetooth on.
Open EMOTIV Launcher, click on the Bluetooth icon and follow the steps.
Turn your headset on.
Open EmotivBCI.
Click on the Connect Headset button at the top of the screen.
Choose your headset and click on Connect.
Follow the fitting instructions for your headset.
Once you’ve fitted your headset, click on Close.
Your headset is now connected to EmotivBCI.
There are two states for connecting your EMOTIV headset: "unpaired" and "paired". If you are having trouble connecting in the "unpaired" state, you should try pairing the device to your computer.
Unpaired mode
Go to Settings and turn Bluetooth On.
Turn your headset on.
Open EmotivBCI.
Click on the Connect Headset button at the top of the screen.
Choose your headset and click on Connect.
Follow the fitting instructions for your headset.
Once you've fitted your headset, click on Close.
Your headset is now connected to EmotivBCI.
Paired mode
If you're having trouble connecting your headset, you should try pairing it to your computer:
Go to Settings and then Bluetooth and other devices.
Turn your headset on.
Turn Bluetooth On and click Add Bluetooth or other device.
Your headset should now appear on the list of available devices.
To disconnect your headset from EmotivBCI:
Unplug the dongle or turn off the headset using the power button on the device.
Disconnecting the device may take up to 30 seconds.
EmotivBCI can now work with an EPOC Flex headset, but this requires a specific EEG sensor mapping that matches the EPOC+ or EPOC X layout. You can set this up with the following steps:
Connect your EPOC Flex with your machine (via Bluetooth or dongle)
Open EmotivBCI and click on the "Connect" button:
A popup will appear and you can click on the "Open EMOTIV Launcher" button to open the EMOTIV Launcher
Connect your EPOC Flex on the EMOTIV Launcher
Create a new EEG sensor mapping with a template (EPOC X Base Configuration)
Apply the new EEG sensor mapping
Go back to EmotivBCI and connect the EPOC Flex headset as usual.
Accurate EEG data collection depends on good sensor contact with the scalp and good EEG signal quality.
The contact quality map shows you a visual representation of the current contact quality at each individual sensor on your headset. Each sensor’s contact quality status can be seen in real time, so you can adjust them (if needed) to achieve 100% contact quality.
The contact quality statuses are:
Green - good
Orange - moderate
Red - poor
Black - very poor
Please note: The contact quality map for EPOC Flex will only show the sensors that have been configured.
EEG Quality (EQ) has been built to give you an easy way to assess whether you are recording high quality brain signals or not. Basically, it is a scoring system for the quality of the signal in each sensor. Each sensor is given a black, red, orange, light green or dark green colour to indicate the quality of the EEG signal for that specific sensor.
Overall EEG Quality as displayed in your EMOTIV application is presented as a score of 0-100. This number makes it easy to understand the quality of your signal at a glance. How do we calculate it? See below.
Contact Quality is a simple measurement of the impedance of the channel. While this measurement is used widely in neuroscience and medical settings, it is really only a useful reading to a trained EEG professional and does not give the full picture of the quality of the signal. Signals that have low impedance are not necessarily brain signals - you might have found that sitting your headset on the table can produce a false positive, leading the user to falsely believe they are recording high quality brain data.
EEG Quality is a machine-learning trained algorithm that automatically determines the quality of the signal based on multiple metrics. Each of these metrics is important in assessing whether the recorded data accurately captures the underlying brain signal. These metrics include:
Contact Quality (CQ) (as above) - an impedance measurement that indicates the quality of the electrical signal passing through the sensors and the reference. It can be one of 4 values: 0 - very bad (black), 1 - bad (red), 2 - ok (orange) and 4 - good (green).
ML Signal Quality (SQ): A machine learning algorithm trained on high quality EEG recordings that were assessed and collected by the EMOTIV Research team. The Signal Quality algorithm assesses data over 2 seconds and is scored 0 - very bad (black), 1 - bad (red), 3 (light green) and 4 - good (dark green)
Signal Magnitude Quality (SMQ): A measure of the signal amplitude. Sometimes, the above metrics i.e. CQ, SQ are good but the signal amplitude is very small and therefore small power fluctuations in the FFT would be undetectable. Often, this is due to poor sensor hydration or poor scalp contact. SMQ gives scores for the average signal magnitude within a 2 second epoch. The scores can be 0 - very low (black), 1 - low (red), 4 - ok (green).
Each of these metrics is important in determining the quality of the signal, therefore the sensor colour will not be dark green unless all of these metrics have a “good” score of 4. The “EQ Map” UI in your EMOTIV software displays the minimum score of any of these metrics. The resulting EQ Score for each channel can be 0, 1, 2, 3 or 4, corresponding to the colours black, red, orange, light green and dark green, respectively.
Overall EQ is calculated and displayed as a number out of 100 so you can see your EEG Quality for all channels at a glance. Overall EQ is calculated as the sum of the scores of the worst 3 channels divided by the maximum possible sum for those 3 channels (12), expressed as a percentage. The resulting formula is ((min1 + min2 + min3) / 12) * 100.
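As a worked illustration of this calculation only (not EMOTIV's internal implementation), the snippet below takes hypothetical per-channel EQ scores of 0-4 and derives the Overall EQ from the three worst channels:
import java.util.Arrays;

public class OverallEq {
    // Illustrative only: Overall EQ from per-channel EQ scores (each 0-4).
    static long overallEq(int[] channelScores) {
        int[] sorted = channelScores.clone();
        Arrays.sort(sorted); // ascending: the worst channels come first
        int worstSum = sorted[0] + sorted[1] + sorted[2];
        return Math.round(100.0 * worstSum / 12.0); // 12 = maximum possible sum for 3 channels
    }

    public static void main(String[] args) {
        int[] scores = {4, 4, 3, 4, 1}; // e.g. a 5-channel headset with one poor sensor
        System.out.println("Overall EQ: " + overallEq(scores)); // (1 + 3 + 4) / 12 * 100 = 67
    }
}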
When connecting your EEG headset to EmotivBCI, the Device Fitting screen will show you how to fit your headset.
If you’re connecting an EPOC Flex headset to EmotivBCI, please refer to the Assembly and Fitting guide which can be accessed via the link on the Device Fitting screen.
You can find the EEG quality indicator at the top of the EmotivBCI home screen, next to your headset’s name. The EEG quality indicator shows you, in real time, the overall EEG quality averaged across all the sensors on your headset. Red indicates poor EEG quality, orange indicates average EEG quality, and green indicates good EEG quality.
You can find the battery level indicator at the top of the EmotivBCI home screen, next to your headset’s name. The battery level indicator shows you the percentage of battery life remaining in your EEG headset.
You can navigate through the main features of EmotivBCI using the left side menu. This menu is always visible on high-resolution screens. On lower-resolution screens, you can access this menu when needed by clicking on the menu icon in the top left corner of the application.
With this menu you can access the following features which will display in the main application display area:
Mental Commands: train and test Mental Commands (discrete thoughts) that can later be used to control various things around you
Facial Expressions: train and test Facial Expression recognition which can later be used as triggers to control devices around you
Performance metrics: view real time data streams for EMOTIV Performance Metrics
Motion Sensors: view a real time data stream from your headset’s 9-axis motion sensors
Your ability to speak multiple languages, play a musical instrument, your level of education and your occupation can dramatically change how your brain behaves. When you first log in to the app you will see a form asking you to provide some demographic details about yourself. Filling in this form will help us improve our detections and provide you with the best results.
You can configure your EPOC+ and EPOC X's EEG sample rate, EEG resolution, motion data sample rate, and motion data resolution through EmotivBCI.
There are no configuration settings for Emotiv’s EPOC, EPOC Flex, and Insight headsets.
To configure your EPOC+ or EPOC X:
Connect your EPOC+ or EPOC X headset to your computer using the USB cable that came with it.
Open the EPOC+ or EPOC X configuration settings by clicking on the headset name at the top of the EmotivBCI home screen.
Select your configuration. Your headset will update as you make your selections.
Unplug your EPOC+ or EPOC X device from your computer
Please note:
If your EPOC+ or EPOC X headset is connected to your computer via Bluetooth, then EEG can only run at 128 Hz and motion data can only run at 64 Hz.
If you view the configuration menu during real-time streaming, your headset settings will be shown even when your headset is not plugged in to your computer.
If you want to showcase EmotivBCI to a friend or colleague, it is extremely important that you do not let them use one of your existing training profiles OR create a new profile under your user account, as this will result in data pollution and reduce the accuracy of the detections for your account. We created Guest mode specifically for these cases.
To use Guest mode, click “Use guest mode” on the “Select training profile” screen.
Guest mode will create a disposable profile that will remain active until the next time you access the app or choose a different profile.
Guest profile data is not stored in your EmotivID account. If you would like to save training profiles for another person, you can create and sign into the application with a different EmotivID.
Training data sets for mental commands and facial expressions are stored in training profiles. From EmotivBCI v3.6.5, headset types with different EEG channel layouts have separate training data sets and therefore separate training profiles.
Training profiles are attached to your EmotivID and your EmotivID account can store multiple training profiles. You may want multiple training profiles to experiment with different training strategies or to customize for different applications.
To get the best results, please ensure that you create a new profile every time you try a different training strategy.
When you train mental commands or facial expressions in the EmotivBCI application, you will be asked to select a training profile, or to create a new one if you haven't created one before.
Note: When you run BCI applications or integrations, such as those created by Node-RED, you will be asked to select which training profile you want to use.
If you upgrade from EmotivBCI before v3.6.5, your existing training profiles will become read only. See Readonly training profile.
Before creating a new training profile, you must connect the EMOTIV headset you are going to train with.
After your headset is connected, click “Create new profile” on the “Select training profile” screen and enter a profile name in the pop up screen before clicking “Create”.
To edit the name of a training profile, hover over the profile you would like to rename, click on the "pencil" icon and change the name of the profile in the pop up. Click “Save” to save your changes.
To change the training profile you are using, open the user dropdown menu at the top right of the application header and click on “Training profiles” to access the “Select training profile” screen.
To delete a training profile, hover over the profile you would like to delete, click on the "trash" icon and click “Yes, delete” in the confirmation pop up.
Since EmotivBCI 3.6.8, you can convert a readonly profile to training mode so that you can train it, but only with an EPOC or Flex headset.
Please note that after converting to training mode, these profiles are written in the new data structure and will no longer work with versions of EmotivBCI before 3.6.5.
Our algorithm for Mental Commands works using pattern recognition and requires that you train at least one Neutral state and one Command state. The system learns to recognize the patterns of brain activity that are related to your Command state compared to your Neutral state. Though you can start using a profile after training a Command only one time, we highly recommend repeated trainings for each Command and for your Neutral state. The more training you do, the better the system will be able to detect the pattern of brain activity associated with your Command thought AND the better you will get at learning how to recreate that thought in your mind.
We have associated the Mental Commands inside EmotivBCI with movements of a cube, including Push, Pull, Left, Right, Up, Down, Rotate and Disappear. This way you can practice using your Commands inside the app and get feedback on whether they are working. Outside of the application, however, you can associate these commands with any digital output you choose, even if it is not related to the movement of an object. For example, your “Up” command can be used to turn on the smart lights in your home using Node-RED.
Before you can train any commands you need to train a Neutral state. Your Neutral brain activity will be used as a contrast to your brain activity during your command training. What is most important is that you do not think about any of your Command thoughts during your Neutral training. Other than that, you can relax and let your mind wander. For advice on training, visit our Tips and Tricks.
To train Neutral, click on the Train button that appears to the right of the Neutral box. You will then be asked to hold a Neutral state for 8 seconds. During this time the cube will not move, so do not think about any of your Command thoughts during these 8 seconds.
If you upgraded from a version of EmotivBCI before v3.6.5, note that we changed the format of training profiles, so existing training profiles (i.e. created before v3.6.5) are no longer fully supported. We call these readonly profiles: they can only be used in Live Mode and cannot be used in training mode. If you still want to use such a profile in training mode, see how to convert a readonly training profile to training mode.
These training profiles only support headsets with 14 EEG channels (i.e. EPOC X and Flex with EPOC X config mapping).
In this section:
Once you have trained Neutral you are ready to train your first Command. Commands are thoughts that you will recreate in your mind when you want to trigger the action associated with it. Your goal is to perform thoughts that are:
reproducible, so that you can create the same “brain state” as required, and
separable (if your profile contains more than one Command), so that the system’s machine learning algorithms can easily recognize each action and distinguish them from each other.
For advice on training, visit our Tips and Tricks. Once you have decided on the thought you will use for this action, you are ready to begin.
To train your first Command, click on the unlocked command box and select the type of command you would like to train. Once you have selected your preferred Command, click “Train”. You will then be asked to hold the Command thought in your mind for 8 seconds. During this time the cube will move according to the action you selected.
If you get distracted during any training, you can click on "Cancel" under the timer to stop the training session. No data will be added to your profile in this case.
You have the choice to Accept or Reject each training that you do. Accepted trainings will be added into your current Training Profile and the profile will be immediately updated. Rejected trainings will be discarded and not added to your profile.
The first two trainings of each profile will seed the foundation of your profile. If you feel you are adequately able to hold your Command thought in your mind during each of these two trainings, select Accept. If not, you can select Reject and try again.
To add a new Command to your Training Profile, click on the unlocked command box and select the type of command you would like to train. Once you have selected your preferred Command, click “Train”. You can add up to four Commands for every Training Profile.
WARNING: Adding more Commands to your profile makes controlling the Commands much more difficult. We do not recommend trying out more Commands until you are comfortable with the number you are currently working with. Learn more about working with multiple Commands in our Tips and Tricks.
Every time you Accept a command and it gets added into your profile, this command will increase its level by one. Command levels appear beside the command box on the Mental Commands screen and are equal to the amount of Accepted trainings.
The Brain Space Diagram gives you feedback on the quality of your Training Profile and can guide your training process. In the Brain Space Diagram, colored dots represent your Neutral state (pink) and each of your trained Commands for the current Training Profile. The further apart the dots are on the diagram, the better separated out the Commands are in your Training Profile and the easier you are likely to find it to trigger them independently. The closer they are to each other, the more similar they are and the harder you will find to trigger them in isolation.
If there are two dots quite close to each other in the diagram, we recommend doing more training sessions for these Commands (including Neutral). If you are struggling to separate out the dots on the diagram after repeated trainings, we recommend deleting troublesome Commands or starting a new profile to try a new strategy. See our Tips and Tricks for help.
The Brain Space Diagram will be updated once you click "Accept" after completing a Command training.
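The diagram itself is generated by EmotivBCI, but the underlying idea of separation can be illustrated with a simple distance calculation. The sketch below is a conceptual illustration only (not EMOTIV's algorithm, and the 2-D coordinates are made up): it computes pairwise Euclidean distances between points standing in for Neutral and two Commands, where larger distances correspond to dots that sit further apart on the diagram:
public class BrainSpaceSeparation {
    // Conceptual illustration: distance between hypothetical "brain space" points.
    static double distance(double[] a, double[] b) {
        double dx = a[0] - b[0], dy = a[1] - b[1];
        return Math.sqrt(dx * dx + dy * dy);
    }

    public static void main(String[] args) {
        double[] neutral = {0.0, 0.0}; // hypothetical coordinates
        double[] push = {0.8, 0.1};
        double[] lift = {0.7, 0.2};
        System.out.printf("Neutral vs Push: %.2f%n", distance(neutral, push));
        System.out.printf("Neutral vs Lift: %.2f%n", distance(neutral, lift));
        // Push and Lift sit close together: they may be hard to trigger
        // independently and could benefit from more training.
        System.out.printf("Push vs Lift: %.2f%n", distance(push, lift));
    }
}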
If you would like to delete all of your training data for a single Command you can do so by selecting a command you want to delete and clicking on the icon.
When asked, confirm your action by clicking “Yes, Delete Command” in the pop up.
After two command trainings, the app provides feedback on each training to help you identify a good training from a bad training and inform your decision as to Accept or Reject it. This feedback is displayed on a 1-100 scale and is based on how consistent the training you just did is with previous trainings in your profile.
The GOAL marker (at 75) indicates where a good training would land. Any training above the GOAL is very likely to improve your profile. Any training significantly below the GOAL is likely to reduce the quality of your profile. If you repeatedly get low scores, you may want to start again with another strategy. A different thought may be easier for you to reproduce consistently in your mind. See our Tips and Tricks for more help.
Choosing your thought: The thought that you train on and use for your Mental Commands can be anything. They can be literal (i.e. you can try and focus on pushing the virtual box) or they can be as abstract as you like (i.e. where push is associated with visualizing a scene or counting backwards from 500 in steps of 7). The possibilities are endless. Different strategies work best for different people, so try a few out.
If you are training a profile with one command, you want to make that one command as strong and distinct as possible. One way to achieve this is to use something that is multi-modal - i.e. something that contains different sensory and kinematic (related to movements of your muscles) components all together. If you have a strong disposition toward any of these modalities (e.g. you are a musician and so can easily imagine auditory sounds), you may find focusing on this single modality works best for you.
If you are training a profile with multiple commands, you may find you get best results if each of your commands uses a single and different sensory or kinematic modality (e.g. one that is visual, one that is auditory and one that is kinematic). What is most important is that they are distinct from each other and you are able to recreate them accurately in your mind repeatedly.
You may also find that associating different hand gestures or postures with a Command can help to better reproduce them.
Multiple commands: The more Commands that are in a profile the harder it is to trigger them independently. For best results, we recommend becoming confident with one command before trying a profile with two, and for becoming confident with two commands before trying a profile with three, etc.
Training sequence: If you are training a profile with one command, we recommend that you alternate training of the Command with training of the Neutral state. This will provide the best contrast for your profile.
If you plan to develop a profile with multiple commands, we recommend adding in all of the Commands at an early stage in the training process, rather than perfecting one Command before adding in the others. This way you can, from the beginning, ensure that the Commands are well separated from each other and work well together. You may find that cycling through them in sequence for training gives you the best result.
Words of encouragement: Controlling machines with your mind is hard. Do not be discouraged if you are not able to master mind control right away. Being able to recreate a thought in your mind at will is something that takes practice for most of us to learn. It is like learning to generate the particular patterns of brain activity needed to walk or talk. Practice certainly does help, and you will likely find that with repeated trainings, triggering a command at will becomes much easier.
The ability to clear your mind of distracting thoughts while focusing on a particular one can be a great way to train your brain to have better focus in other situations. Indeed, some people use Mental Commands training to challenge and train their attentional control generally.
If you would like to temporarily turn off one or more of the commands within your Training Profile, you can do so by toggling the switch to the left of those commands.
Turning a command OFF will adjust your Training Profile to only include the remaining commands. This will usually result in increased performance for the active commands. Use this feature if you want to use a subset of commands within a profile.
Toggle the switch again to turn the command back ON.
Note 1: You must have one command active at all times.
Note 2: Deactivating a command in EmotivBCI will deactivate it for other applications running on your computer (e.g. Node-RED).
If you would like to try out the Commands you have trained already and see how those perform you can do so by accessing Live Mode. To access Live Mode simply click on the Live Mode button on the Mental Commands screen. Once inside Live Mode, the cube will respond to your brain activity in real time, based on your current Training Profile.
In Live Mode you will also see a power meter showing the real time detection of commands in your Profile. Since this meter does not include the animations and physics that the cube does, it is a more quantitative representation of your performance.
You can choose to do your Mental Commands training in two different modes:
Animated mode animates the cube the same way every time according to the action being trained ("Live training feedback" toggle OFF).
Live mode moves the cube during the training session according to how well you are activating that command. ("Live training feedback" toggle ON). This mode can be accessed once you have trained a command.
Use the "Live training feedback" toggle on the main training screen to switch between these two modes.
As well as detecting electrical activity from your brain, EMOTIV headsets also detect activity from your facial and eye muscles since these emit electrical signals as well. Rather than isolating and discarding the signals from these muscles, we put them to use to identify facial expressions and eye movements that you can use as control inputs as well.
Every time you train a Facial Expression, this expression will increase its level by one. Expression levels appear beside the expression box on the Facial Expressions screen and are equal to the amount of trainings completed.
If you would like to adjust the sensitivity of individual Mental Commands in Live Mode you can do so by clicking on 'settings' icon in the Live Mode and adjusting the sliders against individual commands.
Moving the slider to the right (10) will increase the sensitivity of the command making it easier to trigger. Moving the slider to the left (1) will make the command harder to trigger.
On top of the default detections for Facial Expressions, you can train a subset of these expressions - Frown, Clench, Smile, Surprise - to personalize them for you. In most cases, training your own expressions rather than using the default detections will give you more precise control.
Before you can train any Facial Expressions you need to train a Neutral expression. Your Neutral training will be used as a baseline to contrast with your expression training. During this Neutral training, the most important thing is to relax the muscles of your face. You can let your mind wander in any way.
To train a Neutral expression, select Neutral and click on the Train button that appears to the right of the command box. You will then be asked to hold a Neutral state for 8 seconds. During this time you will see a face with no expression.
If you would like to try out the expressions you have trained already and see how those perform you can do so by accessing Live Mode. To access Live Mode simply click on the Live Mode button on the Facial Expressions screen. Once inside Live Mode, the face will respond to your brain activity in real time, based on your current Training Profile.
If you would like to adjust the sensitivity of individual facial expressions you can do so by clicking on 'settings' icon in the Live Mode and adjusting the sliders against individual expressions.
Moving the slider to the right (10) will increase the sensitivity of the command making it easier to trigger. Moving the slider to the left (1) will make the command harder to trigger.
Once you have trained Neutral you are ready to train your first Facial Expression. For best results, you want to hold that expression for the entire duration of the training.
To train a Facial Expression, click on Train next to the expression you would like to train. You will then be asked to hold that expression for 8 seconds. During this time you will see a face with that expression.
EmotivBCI comes with a set of default detections for Facial Expressions that can be used “out of the box”. You can try those immediately by accessing Live Mode on the Facial Expressions screen.
If you would like to use the default detections supplied with EmotivBCI instead of your trained expressions you can do so by un-checking the “Use training detections” box on the sensitivity pop up in Live Mode.
If you would like to delete your training data for a single expression you can do so by selecting an expression you want to delete and clicking on the “trash can” icon. When asked, confirm your action by clicking “Yes, delete expression” in the pop up.
Performance Metrics are shown at 0.1 Hz, which is the rate at which the Performance Metrics API outputs with a BASIC license for use with Node-RED or another third party application. The X-axis is the number of samples.
The Performance Metrics view displays the results of our EMOTIV Performance Metrics algorithms for cognitive states. Select “PERFORMANCE METRICS” in the left side menu to access this view. This view allows you to test out and see how EMOTIV’s Performance Metrics work, so that you can more effectively apply them in any BCI integration.
Performance Metrics data is displayed in the application on a scaled axis from 0 to 100. The graph shows historical data while the number on the left side shows the current value of each metric.
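As a minimal sketch of how this display works conceptually (not EMOTIV code; it assumes metric values arrive already scaled 0-100 at roughly 0.1 Hz, i.e. one sample every 10 seconds), the snippet below keeps a rolling history for the graph and reports the latest value as the current reading:
import java.util.ArrayDeque;

public class MetricHistory {
    static final int MAX_SAMPLES = 60;          // keep the last 10 minutes of history
    static final double SAMPLE_PERIOD_S = 10.0; // 0.1 Hz

    private final ArrayDeque<Double> history = new ArrayDeque<>();

    void addSample(double value) {
        if (history.size() == MAX_SAMPLES) {
            history.removeFirst(); // drop the oldest sample
        }
        history.addLast(value);
    }

    double currentValue() {
        return history.isEmpty() ? 0.0 : history.getLast();
    }

    public static void main(String[] args) {
        MetricHistory focus = new MetricHistory();
        focus.addSample(42.0); // hypothetical Focus values
        focus.addSample(55.0);
        System.out.printf("Current Focus: %.0f (history spans %.0f s)%n",
                focus.currentValue(), focus.history.size() * SAMPLE_PERIOD_S);
    }
}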
You can turn individual Performance Metrics on and off in the view to focus on the data you are most interested in.
To do this:
Click on the eye icon in the top right corner of the view
Deselect any data stream that you do not want to see
Select any data stream that you want to see
Click outside of the popup to close and return to the main view
Motion sensor graphs for Insight and EPOC+ headsets include 10-axis motion data:
Quaternion 0, 1, 2, 3 - quaternion data in X, Y and Z directions and absolute quaternion values
AccX, AccY and AccZ - accelerometer in X, Y and Z directions
MagX, MagY and MagZ - magnetometer in X, Y and Z directions
Motion sensor graphs for older Insight and Epoc+ headsets (firmware < 0x633) include 9-axis motion data:
GyroX, GyroY and GyroZ - gyroscope in X, Y and Z directions
AccX, AccY and AccZ - accelerometer in X, Y and Z directions
MagX, MagY and MagZ - magnetometer in X, Y and Z directions
Motion sensors display data concerning your headset’s position and orientation using a combination of absolute orientation (magnetometer), acceleration (accelerometer) and rotation vectors (quaternion) data in a ten channel time series graph. Select “MOTION SENSORS” in the left side menu to access this view.
Alternatively, for your EPOC+ devices, you can choose to view a 3D visualisation of the motion sensor represented by a cube. The 'cube' represents accurate rotation of the device along the X, Y and Z axes. The user can 'Calibrate' the cube by turning the headset on and clicking the 'Calibrate' button, which sets the cube to Position 0. The 'mirrored' display is checked by default so that the pink marker points towards the user when the headset is worn correctly. If the mirrored option is unchecked, the pink pointer faces away from the user and the blue marker points towards the user.
The 'Rotation' gauge on the left represents cube rotation about the Y axis and can also be thought of as the pointing direction; at Position 0 the arrow is vertical. The 'Roll' gauge in the middle represents rotation of the cube about the Z axis (roll left or right), shown as seen from the top, with the pink and blue dots on the gauge staying in fixed positions; Position 0 is when the line is horizontal on the gauge. The 'Pitch' gauge on the right represents the cube's rotation about the X axis (tilt up or down); Position 0 is when the arrow is horizontal on the gauge.
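For reference, the gauges are driven by the kind of quaternion-to-angle conversion sketched below. This is a generic, textbook conversion (not EMOTIV's implementation), and the headset's axis assignment for Rotation, Roll and Pitch may differ from the convention used here:
public class QuaternionToEuler {
    public static void main(String[] args) {
        // Hypothetical unit quaternion sample (w, x, y, z), e.g. from the Quaternion 0-3 channels:
        // a 30-degree rotation about the Y axis.
        double w = 0.966, x = 0.0, y = 0.259, z = 0.0;

        double roll = Math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y));
        double sinp = 2 * (w * y - z * x);
        double pitch = Math.asin(Math.max(-1.0, Math.min(1.0, sinp))); // clamp for numerical safety
        double yaw = Math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z));

        System.out.printf("roll %.1f deg, pitch %.1f deg, yaw %.1f deg%n",
                Math.toDegrees(roll), Math.toDegrees(pitch), Math.toDegrees(yaw));
    }
}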
For older EPOC+ headsets (firmware < 0x633), motion sensors display data concerning your headset's position and orientation using rotation (gyroscope), acceleration (accelerometer) and absolute orientation (magnetometer) measurements.
Note: EPOC Flex does not currently support motion data.
You can toggle all Performance Metrics on and off in the Performance Metrics view. To do this:
Click on the eye icon in the top right corner of the view
Click on Toggle All button in the pop up
Click outside of the popup to close and return to the main view
The resolution of Motion Sensor data for EPOC is 128 Hz and 12 bits, and for INSIGHT headsets it is 64 Hz and 14 bits. The resolution for EPOC+ and EPOC X is 16 bits, and the rate can be 32 or 64 Hz (or OFF) depending on the headset’s configuration and whether or not it is connected via Bluetooth. When EPOC+ or EPOC X is connected via Bluetooth, the rate is limited to 64 Hz. For EPOC Flex, the resolution is 8 bits and the sampling rate is 16 Hz. EPOC headsets include GyroX and GyroY sensors only. The X-axis is samples.
You can turn individual Motion Sensor streams on and off in the view to focus on the data you are most interested in. To do this:
Click on the eye icon in the top right corner of the view
Deselect any data stream that you do not want to see
Select any data stream that you want to see
Click outside of the popup to close and return to the main view
You can toggle all Motion Sensor streams on and off in the Motion Sensor view. To do this:
Click on the eye icon in the top right corner of the view
Click on Toggle All button in the pop up
Click outside of the popup to close and return to the main view
Open Sound Control (OSC) is a universal communications protocol optimized for modern networking to enable connections between computers and other multimedia devices. EMOTIV’s BCI-OSC enables EMOTIV Brainwear® and Virtual Brainwear® to interface with a wider variety of OSC compatible hardware and software applications. OSC allows you to send mental commands, facial expressions, or performance metrics to an external, network-connected device.
Though we highly recommend using our online installer that keeps you up to date on the latest versions of our applications, you can contact Support for offline version-locked downloads of particular releases.
Release notes history:
EmotivBCI 2.0 is now integrated with our latest Cortex 2.0 increasing reliability, performance and overall stability of the product
BCI is now available as part of our single, centralized installer EMOTIV App, which allows you to download our entire EMOTIV application suite and manage licenses more effectively
The Contact Quality warning pop up lets users know if their headset's CQ is below acceptable thresholds. Higher contact quality ensures better data and more robust training
Performance Metrics include 6 metrics:
Focus is a measure of fixed attention to one specific task. Focus measures the depth of attention as well as the frequency that attention switches between tasks. A high level of task switching is an indication of poor focus and distraction.
Engagement is experienced as alertness and the conscious direction of attention towards task-relevant stimuli. It measures the level of immersion in the moment and is a mixture of attention and concentration and contrasts with boredom. Engagement is characterized by increased physiological arousal and beta waves along with attenuated alpha waves. The greater the attention, focus and workload, the greater the output score reported by the detection.
Interest is the degree of attraction or aversion to the current stimuli, environment or activity and is commonly referred to as Valence. Low interest scores indicate a strong aversion to the task, high interest indicates a strong affinity with the task while mid-range scores indicate you neither like nor dislike the activity.
Excitement is an awareness or feeling of physiological arousal with a positive value. It is characterized by activation in the sympathetic nervous system which results in a range of physiological responses including pupil dilation, eye widening, sweat gland stimulation, heart rate and muscle tension increases, blood diversion, and digestive inhibition. In general, the greater the increase in physiological arousal the greater the output score for the detection. The Excitement detection is tuned to provide output scores that reflect short-term changes in excitement over time periods as short as several seconds.
Stress is a measure of comfort with the current challenge. High stress can result from an inability to complete a difficult task, feeling overwhelmed and fearing negative consequences for failing to satisfy the task requirements. Generally a low to moderate level of stress can improve productivity, whereas a higher level tends to be destructive and can have long term consequences for health and wellbeing.
Relaxation is a measure of the ability to switch off and recover from intense concentration. Trained meditators can score extremely high relaxation scores.
EmotivBCI now supports OSC, the Open Sound Control module
EMOTIV’s BCI-OSC enables you to connect your EMOTIV Brainwear® and convert multiple data streams to a universal OSC format — giving you real-time control of multimedia processors.
Performance enhancement and minor bug fixes.
You can train Facial Expressions at any time by selecting “Train” next to that expression.
This release contains a very big improvement to training profiles: a training profile is now specific to a headset type. As a result, all existing profiles (i.e. created before version 3.6.5) will become readonly and will only work with EPOC X and Flex. See more detail here.
Allow users to convert readonly training profile to training mode
Once you subscribe to the OSC module, the tab is enabled in BCI. The following example demonstrates a connection of EMOTIV BCI with MaxMSP, which is a visual programming language for music and multimedia.
Connect a simulated device (create one in EMOTIV Launcher if you don't have one; see Creating a virtual brainwear) or an OSC-compatible EMOTIV Brainwear®
Choose a training profile to connect to the external device.
Select Sending mode: Unicast to Self
Set the IP: 127.0.0.1
Set the Port: 8000
Choose the Data stream you want to connect: Facial expressions, Mental Commands, or Performance Metrics
Click Start
Open Max MSP, go to File > Package Manager and install CNMAT Externals
Go to https://github.com/Emotiv/opensoundcontrol/tree/master and check the table with OSC Address Patterns
Create (replicate) the nodes below and change OSC-route according to whichever OSC Pattern you wish to address (in the example image, Facial expressions/Smile) - check table in the previous step for the addresses.
Open Processing and go to Sketch > Import Library… > Add Library , search and install oscP5
Open a new File.
Import oscP5 into the code and initialize an instance listening on the port your OSC messages are sent to (8500 in this example). Example code (copy and paste into Processing):
import oscP5.*;
//OSC receive
OscP5 oscP5;
// This value is set by the OSC event handler
float importedValue = 0;
float radius;
void setup() {
size(1200,1000);
// Initialize an instance listening to port 8500
oscP5 = new OscP5(this,8500);
}
void draw() {
background (0);
// Scale up imported value
radius = importedValue * 1000;
// Display circle at location vector
stroke(255);
strokeWeight(2);
fill(255);
ellipse(500,500, radius, radius);
println(radius);
}
void oscEvent(OscMessage theOscMessage) {
float value = theOscMessage.get(0).floatValue();
importedValue = value;
}
Click the Play button and watch the graphics change according to Smile. importedValue is associated with the circle radius.
Open any example code in File > Examples.
Associate importedValue with any float variable from any Library to play around. Be sure to:
Import oscP5;
import oscP5.*;
//OSC receive
OscP5 oscP5;
Initialize importedValue (before void setup);
// This value is set by the OSC event handler
float importedValue = 0;
Initialize oscP5 (place it inside void setup);
// Initialize an instance listening to port 8500
oscP5 = new OscP5(this,8500);
Associate the event with the variable importedValue (place it after void draw);
void oscEvent(OscMessage theOscMessage) {
float value = theOscMessage.get(0).floatValue();
importedValue = value;
}