g.BCIsys: Specs & Features

Complete research and development system for data acquisition, analysis, classification and neurofeedback.

g.BCIsys - g.tec's Brain-Computer Interface research environment

g.tec provides complete MATLAB-based research and development systems, including all hardware and software components needed for data acquisition, real-time and off-line data analysis, data classification and neurofeedback. A BCI system can be built with g.MOBIlab+, g.USBamp-RESEARCH, g.HIamp or g.Nautilus. g.MOBIlab+ is a portable device with up to 8 EEG channels and wireless signal transmission. g.USBamp-RESEARCH is available with 16-64 EEG channels and transmits the data over USB to the PC or notebook. g.HIamp acquires 64-256 channels over USB. The g.Nautilus wireless EEG is available with 8-64 channels.

With the High-Speed Online Processing software package for SIMULINK, you can read the biosignal data directly into SIMULINK. SIMULINK blocks are used to visualize and store the data. Parameter extraction and classification can be performed with standard SIMULINK blocks, the g.RTanalyze library or self-written S-functions. After the EEG data acquisition, the data can be analyzed with g.BSanalyze, the EEG analysis and classification toolbox.
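The kind of band-power feature extraction performed by a g.RTanalyze block or a custom S-function can be sketched in plain Python. This is an illustrative stand-in only; in the real system these steps run as SIMULINK blocks.

```python
import math

def bandpower(signal, fs, f_lo, f_hi):
    """Mean power of the DFT bins between f_lo and f_hi (Hz).

    Pure-Python stand-in for a band-power feature block; fs is the
    sampling rate of the acquired biosignal.
    """
    n = len(signal)
    total, bins = 0.0, 0
    for k in range(1, n // 2):
        if f_lo <= k * fs / n <= f_hi:
            # naive DFT of bin k (a real implementation would use an FFT)
            re = sum(x * math.cos(2 * math.pi * k * i / n)
                     for i, x in enumerate(signal))
            im = sum(-x * math.sin(2 * math.pi * k * i / n)
                     for i, x in enumerate(signal))
            total += (re * re + im * im) / (n * n)
            bins += 1
    return total / bins if bins else 0.0
```

For motor imagery, such features are typically computed over short sliding windows in, e.g., the alpha (8-12 Hz) and beta bands.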

With ready-to-use BCI sample applications, you can develop state-of-the-art BCI experiments within a few hours. g.tec started developing BCI systems more than 15 years ago, so all important BCI functions are included in the package and can easily be used and modified. See some BCI-related videos here!

Product Highlights

  • Complete BCI research system for EEG and ECoG
  • Ready-to-go paradigms for spelling, robot and cursor control
  • Seamless integration of real-time experiments and off-line analysis
  • Runs with g.MOBIlab+, g.USBamp-RESEARCH, g.HIamp or g.Nautilus technology
  • Open source paradigms let you make adaptations and develop applications easily
  • MATLAB/Simulink Rapid Prototyping environment speeds up development times from months to days
  • BCI technology proven by hundreds of subjects and labs
  • Zero class enabled for SSVEP, P300 and motor imagery
  • The only environment that supports all BCI approaches (P300, SSVEP/SSSEP, Motor Imagery, slow waves)
  • Recommended setup plan for a fully equipped BCI lab available
  • Integrates invasive and non-invasive stimulation for closed-loop experiments
  • Multi-device feature to record from multiple subjects or different g.tec biosignal amplifiers

More information

Customer adapted solutions

g.tec provides complete BCI solutions that are flexible enough to meet the needs of different users. Systems range from 8 to 256 channels. Customers can choose active or passive electrode systems, and many additional accessories, sensors, consumables and other items are available to help you conduct top-quality experiments across a wide range of users, environments and task demands. Tell our sales people what you need, and they will provide you with a tailored solution.

Brain-Computer Interface (BCI)

A BCI provides a new communication channel between the human brain and a computer. Mental activity involves electrical activity, and these electrophysiological signals can be detected with techniques like the Electroencephalogram (EEG) or Electrocorticogram (ECoG). The BCI system detects such changes and transforms them into control signals, which can be used for moving objects, writing letters, opening doors, changing TV channels, controlling devices and other everyday household activities. This helps people with limited mobility increase their independence and enables completely paralyzed patients who suffer from disorders of consciousness (DOC), severe brain injuries or locked-in syndrome to communicate with their environment. Furthermore, it can be used as a tool to assess consciousness and perception in these patients. BCIs are also used for motor rehabilitation after stroke and for brain mapping procedures.

BCI Publications

To support your start into the fascinating world of Brain-Computer Interface research, see some literature here:  Publications

The international BCI Award

Endowed with 6,000 USD, the prize is an accolade to recognize outstanding and innovative research done in the field of Brain-Computer Interfaces. More information at www.bci-award.com.


"The g.tec brain-computer interface environment allows my lab to rapidly realize new applications." - Prof. Nima Mesgarani, Columbia University, USA

"It is an inspiring tool for researchers who want to go several steps further in BCI studies." - Asst. Prof. N. Firat Özkan, Eskisehir Osmangazi University, TR

"Thanks to the easy integration of g.tec amplifiers into Simulink we are able to analyse users' physiological signals without additional implementation efforts." - Augusto Garcia MSc, Technical University of Darmstadt, DE

Motor Imagery

The subject imagines performing an action, like squeezing a ball. The EEG data are classified online, and the result is graphically presented to the subject as a horizontal bar on the screen that moves right if right hand motor imagery is detected or moves left if left hand motor imagery is detected. The continuous feedback helps the subject learn to produce motor imagery activity that leads to correct classification. To improve performance, the classifier should be updated after some successful sessions. Offline analysis of the recorded data supports feature optimization.
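The online classification step just described can be sketched as a two-feature linear discriminant whose signed output drives the feedback bar. This is a minimal pure-Python illustration, assuming left/right band-power features; the shipped system trains its classifier in g.BSanalyze and applies it in SIMULINK.

```python
def lda_train(X_left, X_right):
    """Fit a two-feature linear discriminant; returns weights w and bias b.

    X_left / X_right are lists of feature pairs (e.g. alpha and beta
    band power; the feature names are illustrative) from left- and
    right-hand imagery trials. Assumes a shared class covariance.
    """
    def mean(vs):
        return [sum(v[i] for v in vs) / len(vs) for i in range(2)]
    m0, m1 = mean(X_left), mean(X_right)
    # pooled 2x2 covariance
    c = [[0.0, 0.0], [0.0, 0.0]]
    for X, m in ((X_left, m0), (X_right, m1)):
        for x in X:
            d = [x[0] - m[0], x[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    c[i][j] += d[i] * d[j]
    dof = len(X_left) + len(X_right) - 2
    c = [[v / dof for v in row] for row in c]
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    inv = [[c[1][1] / det, -c[0][1] / det],
           [-c[1][0] / det, c[0][0] / det]]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    mid = [(m0[0] + m1[0]) / 2, (m0[1] + m1[1]) / 2]
    b = -(w[0] * mid[0] + w[1] * mid[1])
    return w, b

def feedback(w, b, features):
    """Signed discriminant value driving the feedback bar:
    negative moves the bar left, positive moves it right."""
    return w[0] * features[0] + w[1] * features[1] + b
```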

Motor Rehabilitation System

One of the most common types of Brain-Computer Interface (BCI) systems relies on motor imagery (MI). The user is asked to imagine moving either the right or left hand. This produces specific patterns of brain activity in the EEG signal, which an artificial classifier can interpret to detect which hand the user imagined moving. This approach has been used for a wide variety of communication and control purposes, such as spelling, navigation through a virtual environment, or controlling a cursor, wheelchair, orthosis, or prosthesis.
In the last few years, however, a totally novel and promising application for MI-based BCIs has gained great attention. Several recent articles have shown that MI-based BCIs can induce neural plasticity and thus serve as an important tool to enhance motor rehabilitation for stroke patients. In other words, the overall goal of the BCI system is not communication, but improved stroke recovery. Furthermore, other work has shown that this rehabilitation can be even more effective when combined with immersive graphical environments that can help users interact effectively and naturally with the BCI system. Immersive BCI stroke rehabilitation is an ongoing research effort in numerous American and European research projects, many of which involve g.tec.

g.REHAbci - Motor Rehabilitation with Virtual Limbs

Neurofeedback is critical in a MI-based BCI. Rehabilitation is most effective when users get immersive feedback that relates to the activities they imagine or perform. For example, if people imagine grasping an object with their left hand, then an image of a grasping hand can help users visualize this activity. If a stroke patient keeps trying to imagine or perform the same movement, while receiving feedback that helps to guide this movement, then users might regain the ability to grasp, or at least recover partial grasp function.

Recently, g.tec developed a full research package for stroke rehabilitation. The system consists of a 64-channel cap with active EEG electrodes that are connected to the g.HIamp biosignal amplifier. To train the BCI system, the user imagines left and right hand movements. Common Spatial Patterns (CSPs) are then calculated from the 64 channels, weighting each electrode according to its importance. This electrode selection is done fully automatically and includes algorithms to improve the signal-to-noise ratio. Furthermore, a linear discriminant analysis is trained to distinguish left vs. right hand movements. When this training is finished, which typically takes less than an hour, the patient can control virtual hands that are projected in a highly immersive 3D environment using g.VRsys. Smaller setups can be realized with computer screens or head-mounted devices. The g.HIsys development environment also contains a block that allows you to trigger and tune an FES stimulator in real-time to optimize the neurorehabilitation procedure. The stimulator supports the rehabilitation of lower and upper limbs with biphasic current stimulation.
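The CSP step can be illustrated in the two-channel case, where the generalized eigenproblem C_left w = lam * (C_left + C_right) w has a closed-form solution. This is a sketch of the principle only; the shipped system solves the problem for up to 64 channels with a full eigensolver.

```python
def csp_2ch(C_left, C_right):
    """Common Spatial Patterns for two channels.

    C_left and C_right are 2x2 class covariance matrices. Returns
    (eigenvalue, filter) pairs, largest eigenvalue first: the first
    filter maximises the left-class share of the variance.
    """
    Cc = [[C_left[i][j] + C_right[i][j] for j in range(2)] for i in range(2)]
    # det(C_left - lam * Cc) = 0 expands to a quadratic in lam
    a = Cc[0][0] * Cc[1][1] - Cc[0][1] * Cc[1][0]
    b = -(C_left[0][0] * Cc[1][1] + Cc[0][0] * C_left[1][1]
          - C_left[0][1] * Cc[1][0] - Cc[0][1] * C_left[1][0])
    c = C_left[0][0] * C_left[1][1] - C_left[0][1] * C_left[1][0]
    disc = (b * b - 4 * a * c) ** 0.5
    out = []
    for lam in sorted([(-b + disc) / (2 * a), (-b - disc) / (2 * a)],
                      reverse=True):
        # eigenvector of (C_left - lam * Cc), taken from a non-zero row
        m00 = C_left[0][0] - lam * Cc[0][0]
        m01 = C_left[0][1] - lam * Cc[0][1]
        if abs(m00) + abs(m01) > 1e-12:
            w = [-m01, m00]
        else:
            m10 = C_left[1][0] - lam * Cc[1][0]
            m11 = C_left[1][1] - lam * Cc[1][1]
            w = [m11, -m10]
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        out.append((lam, [w[0] / norm, w[1] / norm]))
    return out
```

With one channel dominated by left-class variance and the other by right-class variance, the two filters recover exactly those channels.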

As with all g.tec BCI systems, the BCI stroke rehabilitation system relies on well-known software platforms such as MATLAB/Simulink, which can easily be interfaced with components from other sources. For more information, including a list of references or technical details, please contact g.tec.

Motor Rehabilitation with Robotic Devices

Exercising motor imagery (MI) is known to be an effective therapy in stroke rehabilitation, even if no feedback about the performance is given to the user. Providing additional real-time feedback can elicit Hebbian plasticity, which increases cortical reorganization and could improve functional recovery. Here, the MI-based Brain-Computer Interface (BCI) is linked to a rehabilitation robot (Amadeo, tyromotion GmbH, Austria), giving motor and haptic feedback to the user. If a correct pattern of right-hand MI is detected, the robot performs a complete movement (flexion and extension) of the hand, thus giving online feedback.

BCI Award and Stroke Rehabilitation

Much of this work is summarized in a recently published roadmap, which was developed over two years by a consortium of different groups. This roadmap lists BCI for rehabilitation as one of the two emerging disruptive technologies that could dramatically change BCI research; it is available at future-bnci.org. In the past few years, around 10% of BCI Award submissions have focused on stroke rehabilitation.

Ping-Pong Game

Everybody knows the famous Ping-Pong game that was played in the seventies on TV sets. In this example, two persons are connected to the BCI system and can control the paddle with motor imagery. The paddle moves upwards via left hand movement imagination and downwards via right hand movement imagination. The algorithm extracts EEG bandpower features in the alpha and beta ranges of two EEG channels per person. Therefore, in total, 4 EEG channels are analyzed and classified.


High Gamma Activity

While most BCIs rely on the EEG, some newer work has drawn attention to BCIs based on ECoG. ECoG-based systems have numerous advantages over EEG systems, including (i) higher spatial resolution, (ii) higher frequency range, (iii) fewer artifacts, and (iv) no need to prepare users for each session of BCI use, which usually requires scraping the skin and applying electrode gel. Recent research has repeatedly demonstrated that ECoG can outperform comparable EEG methods because of these advantages.

Scientific work has shown that ECoG methods can not only improve BCIs but also help us address fundamental questions in neuroscience. A few efforts have sought to map “eloquent cortex” with ECoG. That is, scientists have studied language areas of the brain while people say different words or phonemes. The results revealed far more information than EEG-based methods and have inspired new ECoG BCIs that are impossible with EEG BCIs. Other work has explored the brain activity associated with movement. This has been very well studied with the EEG, leading to the well-known dominant paradigm that real and imagined movement affects activity in the 8-12 Hz range. ECoG research showed that this is only part of the picture: movement also affects a higher frequency band, around 70-200 Hz, which cannot be detected with scalp EEG. This higher frequency band is more focal and could lead to more precise and accurate BCIs than EEG methods could ever deliver.

This picture (courtesy of Dr. Kai Miller) shows high frequency band activity recorded by ECoG, which cannot be detected with EEG.


The P300 is another type of brain activity that can be detected with the EEG. The P300 is a brainwave component that occurs after a stimulus that is both important and relatively rare. In the EEG signal, the P300 appears as a positive wave 300 ms after stimulus onset. The electrodes are placed over the posterior scalp.

P300 Spelling

The P300 paradigm presents, for example, 36 letters in a 6 x 6 matrix on the computer monitor. Each letter (or row or column of letters) flashes in a random order, and the subject silently counts each flash of the letter that he or she wants to communicate. As soon as the corresponding letter flashes, a P300 component is produced in the brain. The algorithms analyze the EEG data and select the letter with the largest P300 component, and this letter is written onto the screen. Normally, between 2 and 15 flashes per letter are required for high accuracy. The number depends on many factors, including the electrodes and their scalp positions, the data processing parameters, and the amplitude of the subject's P300 brainwave.
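Assuming the EEG has already been cut into post-stimulus epochs per flashed item, the selection step can be sketched as follows. This is a simplification: the shipped system classifies spatiotemporal features with a trained classifier such as LDA rather than comparing raw mean amplitudes.

```python
def select_target(epochs_by_item, fs, t_lo=0.25, t_hi=0.45):
    """Return the item whose averaged epoch has the largest mean
    amplitude in the P300 window (t_lo..t_hi seconds after the flash).

    epochs_by_item maps each item (e.g. a letter) to a list of
    equal-length post-stimulus epochs (lists of samples); fs is the
    sampling rate in Hz.
    """
    def p300_score(epochs):
        n = len(epochs[0])
        # average the epochs sample by sample, then average the window
        avg = [sum(e[i] for e in epochs) / len(epochs) for i in range(n)]
        win = avg[int(t_lo * fs):int(t_hi * fs)]
        return sum(win) / len(win)
    return max(epochs_by_item, key=lambda item: p300_score(epochs_by_item[item]))
```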

P300 Smart Home Control

The BCI was connected to a Virtual Reality (VR) system. The virtual 3D representation of the smart home had different control elements (TV, music, windows, heating system, phone) and allowed the subjects to move through the apartment. Users could perform tasks like playing music, watching TV, opening doors, or moving around. Therefore, seven control masks were created: a light mask, a music mask, a phone mask, a temperature mask, a TV mask, a move mask and a "go to" mask. The control mask for the TV is shown.

P300 Second Life Control

g.tec implemented a BCI system based on the P300 brainwave. Different symbols are arranged on a computer screen and are highlighted in a random order. If the subject silently counts one specific flashing symbol, the P300 should be elicited, and the BCI system can recognize this P300 and therefore the symbol. To control Second Life, different masks (GUIs with icons) were created for moving around, chatting, or other tasks specialized to each user's wishes.

Hyperscanning - Connecting Minds

Many futurists believe that people in the distant future will use advanced technology to work together more directly, something like a “hive mind”. People could use technology to help them not just work together but also think together, accomplishing goals more quickly and effectively. That future may not be so distant. Recently, the P300 speller was used for a demonstration called “Hyperscanning” that represents an important step toward direct cooperation through thought alone. Today, several different groups have EEG-based P300 spellers that can identify targets reliably with about 3 flashes per letter. But despite very extensive effort from groups around the world, faster communication has not been possible without neurosurgery, since brainwave activity from one flash is usually too noisy for accurate classification.

Recently, eight people worked together to spell “Merry Christmas” through the P300 speller with only one flash per letter. They spelled all 14 characters without a single mistake. Hence, by combining the brainwave signals across eight people, the system managed to substantially improve communication speed and accuracy. This approach could be used for cooperative control in many different applications. People might work together to play games or draw paintings, or could cooperate on other tasks like making music, voting or otherwise making decisions, or solving problems. Someday, users might put their heads together for the most direct “meeting of the minds” ever. Watch the video of the hyperscanning BCI experiment at g.tec: Hyperscanning - the g.tec way of wishing “Merry Christmas”.

Vibro-tactile Stimulation

P300 BCIs based on visual stimuli do not work with patients who have lost their vision. Auditory paradigms can be implemented instead, using a frequent stimulus with one frequency and an infrequent stimulus with another frequency. The user is asked to count how many times the infrequent stimulus occurs. As with the visual P300 speller, the infrequent stimuli produce a P300 response in the EEG. The same principle can be used for vibrotactile stimulation if, e.g., the right hand is frequently stimulated and the left hand is infrequently stimulated. The EEG will also exhibit a P300 if the user is paying attention to the infrequent stimuli. This auditory and vibrotactile setup can assess whether the patient is able to follow instructions and experimental procedures. To answer yes/no questions, it is necessary to extend the vibrotactile setup to 3 stimulators: one stimulator applies the frequent stimuli, and 2 stimulators apply the infrequent stimuli. The user can concentrate on one of the infrequent stimulators to say (in this case) yes or no. Typically, evoked potentials are calculated by averaging over the frequent and infrequent stimuli separately. A statistical analysis helps to visualize statistically significant differences, which is especially important for patient data collected in field settings, which are frequently noisy.
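The averaging step can be sketched like this. The difference wave is illustrative only; the shipped analysis adds a proper statistical test (e.g. a pointwise t-test) to flag significant samples.

```python
def grand_average(epochs):
    """Average a list of equal-length epochs sample by sample."""
    n = len(epochs[0])
    return [sum(e[i] for e in epochs) / len(epochs) for i in range(n)]

def difference_wave(frequent_epochs, infrequent_epochs):
    """Infrequent-minus-frequent evoked potential.

    If the user attends to the infrequent stimuli, the P300 appears as
    a positive deflection around 300 ms in this difference.
    """
    f = grand_average(frequent_epochs)
    i = grand_average(infrequent_epochs)
    return [b - a for a, b in zip(f, i)]
```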

Avatar control

Avatar control has been developed through the research project VERE (Virtual Embodiment and Robotic Re-Embodiment). The VERE project is concerned with embodiment of people in surrogate bodies so that they have the illusion that the surrogate body is their own body – and that they can move and control it as if it were their own. There are two types of embodiment: (i) robotic embodiment and (ii) virtual embodiment. In the first type, the person is embodied in a remote physical robotic device, which they control through a BCI. For example, a patient confined to a wheelchair or bed, who is unable to physically move, may nevertheless re-enter the world actively and physically through such remote embodiment. In the second type, the person is embodied in a virtual avatar; the VERE project uses the intendiX ACTOR protocol to access the BCI output from within Unity to control both the virtual and robotic avatars.

The BCI is part of the intention recognition and inference component of the embodiment station. The intention recognition and inference unit takes inputs from fMRI, EEG and other physiological sensors, together with access to a knowledge base, and creates a control signal that takes body movements and facial movements into account. This output is used to control the virtual representation of the avatar in Unity and to control the robotic avatar. The user gets feedback showing the scene and the BCI control via an HMD or a display. The BCI overlay, for example, allows users to embed the BCI stimuli and feedback within video streams recorded by the robot and within the virtual environment of the user's avatar. The user is situated inside the embodiment station, which also provides visual, auditory and tactile stimuli. The setup can also be used for invasive recordings with the electrocorticogram (ECoG). Avatar control is promising from a market perspective because it could be used in rehabilitation systems, such as motor imagery training with stroke patients.

To run these experiments, g.HIsys and g.VRsys are required.

Code-based BCI

BCI systems can also use pseudo-random stimulation sequences on a screen (code-based BCI). Such a system can be used to control a robotic device. In this case, the BCI controls were overlaid on the video that showed a robot performing certain tasks.

The user was seated in front of a computer monitor and was connected with active EEG electrodes to a biosignal amplifier. The amplifier sent the EEG data to the BCI system, which allowed the subject to control a robotic device (e-puck) in real-time. The robotic device was located beside the subject on the floor, and its movement was observed with a tracking camera that recorded x- and y-positions on the tracking system computer (EthoVision, Noldus, The Netherlands). Additionally, the robotic movements were captured with a feedback camera that passed the video image to the computer monitor in front of the subject (Technical University of Munich, Germany), which showed the experimental paradigm together with the BCI controls that the subject used to control the robotic device. The code-based BCI system reached a very high on-line accuracy, which is very promising for real-time control applications where a continuous control signal is needed.

The code-based BCI principle is available in g.HIsys as an add-on toolbox, g.BCI_CVEP. The toolbox allows you to analyze EEG data in real-time and to provide code-flickering icons to remote screens via the SOCI module. This lets you integrate control icons into external applications that are, e.g., programmed in Unity. All parameter estimation and classification algorithms are integrated in this toolbox so you can quickly develop your own application.
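The cVEP principle behind the toolbox can be sketched as template matching: every icon flickers with the same pseudo-random code, delayed by a fixed lag per icon, and classification picks the lag whose template correlates best with the EEG. This is illustrative only; g.BCI_CVEP builds subject-specific templates from training data rather than using the raw code.

```python
def circular_shift(seq, k):
    """Rotate a sequence right by k samples."""
    k %= len(seq)
    return seq if k == 0 else seq[-k:] + seq[:-k]

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def cvep_classify(eeg, code, n_targets, shift):
    """Each target uses the same code, delayed by `shift` samples per
    target index; return the index whose template fits the EEG best."""
    scores = [corr(eeg, circular_shift(code, t * shift))
              for t in range(n_targets)]
    return max(range(n_targets), key=lambda t: scores[t])
```

Codes with sharp autocorrelation peaks (m-sequences) make the lags easy to tell apart, which is why code-based BCIs use pseudo-random stimulation sequences.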

Steady State Visual Evoked Potential (SSVEP)

Steady state visual evoked potential (SSVEP)-based BCIs use several stationary oscillating light sources (e.g. flickering LEDs or phase-reversing checkerboards), each of which oscillates at a unique frequency. When a person gazes at one of these lights, or even just focuses attention on it, the EEG activity over the occipital lobe shows an increase in power at the corresponding frequency.

With four choices, anyone could easily move a robot forwards, backwards, to the left and to the right. Hence, in our SSVEP BCI, we have four lights. (Of course, SSVEP BCIs have been developed with more or less than four lights, depending mainly on how many commands are required.) All the user has to do now is to look at one specific flickering light (for example, the light that is assigned to the "move forward" command). Our algorithms determine which EEG frequency component(s) are higher than normal, which reveals which light the user was observing and thus which movement command the user wanted to send. This system also uses a "no-control" state. When the user does not look at any oscillating light, the robot doesn't move.
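The detection step, including the no-control state, can be sketched with the Goertzel algorithm, which evaluates the power of a single frequency component. The frequencies and threshold below are placeholders; the real system derives them from a calibration run.

```python
import math

def goertzel_power(signal, fs, f):
    """Normalized power of one frequency component (Goertzel algorithm)."""
    n = len(signal)
    k = round(f * n / fs)          # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s0 = s1 = 0.0
    for x in signal:
        s0, s1 = x + coeff * s0 - s1, s0
    return (s0 * s0 + s1 * s1 - coeff * s0 * s1) / (n * n)

def ssvep_decide(eeg, fs, freqs, threshold):
    """Index of the attended flicker frequency, or None for the
    no-control state when no component exceeds the threshold."""
    powers = [goertzel_power(eeg, fs, f) for f in freqs]
    best = max(range(len(freqs)), key=lambda i: powers[i])
    return best if powers[best] >= threshold else None
```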

intendiX BCI System

The intendiX BCI system was designed to be operated by caregivers or the patient's family at home. It consists of active EEG electrodes, a biosignal amplifier and a computer running the software. This allows the user to communicate with his or her environment by processing P300 evoked potentials from the EEG data in real-time. The software shows an alphabet, numbers or icons on the computer screen. The characters are then highlighted in a random order while the user concentrates on the specific character he or she wants to select. By using the extendiX package, the system can be configured with respect to the user's needs. Furthermore, extendiX allows interaction with connected devices. This setup provides a powerful communication channel that allows the user to easily interact with the environment or nearby devices such as a TV, switches, central heating or computer programs.

ACTOR protocol

The ACTOR protocol allows users to interface external applications easily with a BCI system. If an application understands the protocol, it can simply be plugged into the BCI system. The protocol allows users to implement smart home control, spelling, painting, exoskeleton control, VR control, robotic control, game control, and other applications still in development.

For complex control tasks, the BCI Application ConTrol and Online Reconfiguration (ACTOR) protocol is provided. The ACTOR protocol uses eXtensible Markup Language (XML) formatted message strings to exchange information between the BCI and the attached system. Whenever the BCI system is started, it broadcasts a dedicated hello message to identify the available and active applications. As soon as the BCI has detected an external application, it requests the list of commands and services available from that client. The BCI acknowledges the received list of commands, services and actions and reports whether it was able to process them successfully.

This allows you to easily configure your BCI system according to your applications via UDP or from definition files, either at start-up or during operation, which makes the system very flexible. The BCI system also sends standard XML commands to the external applications, e.g. for switching on the light in a smart home environment. Any external application that understands the ACTOR protocol can simply be plugged into the BCI system. The ACTOR protocol is already used in many EC research projects, including Brainable, Backhome, VERE, and ALIAS.
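A round trip of such an XML command can be illustrated with Python's XML tooling. Note that the element and attribute names below are invented placeholders for illustration, not the actual ACTOR message schema.

```python
import xml.etree.ElementTree as ET

def build_command(target, action):
    """Serialize a command as an XML string.

    The <actor>/<command> names and the `target` attribute are
    hypothetical stand-ins for the real ACTOR schema.
    """
    msg = ET.Element("actor")
    cmd = ET.SubElement(msg, "command", {"target": target})
    cmd.text = action
    return ET.tostring(msg, encoding="unicode")

def parse_command(xml_string):
    """Recover (target, action) from a message produced above."""
    cmd = ET.fromstring(xml_string).find("command")
    return cmd.get("target"), cmd.text
```

Messages like these would then be exchanged over UDP between the BCI and the application, as described above.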

By combining the ACTOR protocol with the SOCI system, the BCI can be fully embedded within and controlled by a large variety of user applications, and configured dynamically by each of them. The ACTOR protocol is designed to empower users to communicate and interact with their environment, controlling various applications, services and devices with one single BCI device.

SOCI system

The intendiX SOCI system (Screen Overlay Control Interface module) is especially useful for virtual reality (VR) and similar applications, where merging the BCI controls with the application's native interface is essential for an optimal user experience. Using SOCI, the intendiX platform can be configured to remotely display its stimuli and feedback on various devices and systems. The intendiX SOCI can be embedded in host applications to directly interact with BCI controls inside the displayed scene. It generates cVEP and SSVEP stimuli and supports single symbol, row/column and random patterns for P300 stimulation.

Through dedicated interfaces, it is possible to define and replay custom patterns such as the scanning cursors used by the g.EOGEMGcontrol application. Besides the basic highlighting and colour inversion stimulus types, the SOCI system can use a predefined set of colour images, for example images of famous faces, as stimuli; this is used by intendiX and g.BCI_SOCI to implement the face speller.

Hybrid BCIs

Hybrid BCIs combine different input signals to provide more flexible and effective control. g.HIsys supports (i) mouse control, (ii) EMG 1D and 2D control, (iii) EOG 1D control and (iv) eye-tracker control, as well as the standard BCI signals.

EMG and EOG are recorded via the biosignal amplifier and are analyzed with g.RTanalyze to generate the control signals, while the mouse and the eye-tracker use external devices that are interfaced with g.HIsys. The combination of these input signals makes it possible to use a BCI system for a larger patient group and to make the system faster and more reliable.


g.EOGEMGcontrol provides a set of BCI-type models that use eye motion (EOG) signals or muscular contraction (EMG) signals to select individual symbols, initiate commands and control external devices.

Cortico-Cortical Evoked Potentials (CCEP)

To record CCEPs, subdural ECoG grids are implanted directly on the cortex of the dominant hemisphere. Conventional electrical cortical stimulation mapping is then used to identify, e.g., Broca's area. Bipolar stimulation of Broca's area then leads to CCEPs over the motor cortex and over the auditory cortex. Electrode channels showing an EP over the motor cortex indicate the mouth region required to say words and sentences. EPs over the auditory cortex indicate electrode positions representing the regions responsible for hearing and for understanding, e.g., questions (Wernicke's area, the receptive language area). In summary, the CCEP procedure allows an entire functional cortical network to be found rapidly, which provides important information for neurosurgical and BCI applications.

Fusiform Face Area

High-gamma activity makes it possible to map the temporal base of the cortex, which is useful for finding the fusiform face area, a region responsible for identifying faces; nearby regions code colors, shapes, characters, etc. This region can also be used for real-time decoding. In this example, Dr. Ogawa (Asahikawa Medical University, Japan) is showing different faces, Kanji characters and Arabic characters to a patient with an ECoG implant (high-resolution ECoG on the temporal base of the left and right hemispheres). The patient just observes the images, and the BCI system decodes the high-gamma activity of the ECoG electrodes in real-time and is able to identify which image the patient is seeing. This also works if the patient sees real faces or his own face in a mirror.

g.UDPinterface - Data Exchange between two running Simulink models

The exchange of data between different computer systems is important for many applications. The g.UDPinterface for MATLAB/Simulink provides ready-to-use Simulink blocks and MATLAB functions to transmit data from a biosignal recording device to other applications like a Virtual Reality system or another MATLAB instance on another PC. The g.UDPinterface can be used to exchange data between 2 Simulink applications running on two different PCs or notebooks.

Product Highlights:

  • Exchange data between MATLAB/Simulink on two PCs over a standard network connection
  • PCs are just connected with a normal network cable for data exchange
  • The Simulink blocks can be used per drag-and-drop
  • Fast data exchange with response time < 1ms
  • Allows you to interface MATLAB with other software packages


On PC #1, this model reads biosignal data from the g.MOBIlab+ device into Simulink and passes the data to the UDP block. The UDP block sends the data over the network connection to PC #2. The block also receives data from PC #2 and visualizes the data in the Scope block on channel 1. The second channel of the Scope shows the data acquired with g.USBamp.
On PC #2, this Simulink model reads biosignal data into Simulink, converts the data to double precision and sends the data with the UDP block to PC #1. The UDP block also receives data from PC #1 and visualizes the data with the Scope block. The second Scope channel displays the signal amplified with g.MOBIlab+ on channel 1.
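The same pattern, blocks of samples packed into UDP datagrams, can be sketched in Python. The g.UDPinterface wire format is not documented here, so the little-endian double packing below is an assumption made for illustration.

```python
import socket
import struct

def send_samples(sock, addr, samples):
    """Pack one block of samples as little-endian doubles and send it
    as a single UDP datagram to addr = (host, port)."""
    sock.sendto(struct.pack("<%dd" % len(samples), *samples), addr)

def recv_samples(sock, n_samples):
    """Receive one datagram and unpack it back into a list of floats."""
    data, _ = sock.recvfrom(8 * n_samples)
    return list(struct.unpack("<%dd" % n_samples, data))
```

UDP keeps the round-trip latency low (the product highlight above quotes a response time below 1 ms), at the cost of no delivery guarantee, which is acceptable for streaming biosignal blocks.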

Invasive and Non-Invasive Stimulation

For many BCI experiments, it is essential to control an electrical stimulator in real-time, depending on the analysis results of the EEG, ECoG or spike data read in by g.HIamp, g.USBamp or g.Nautilus. g.tec provides an intracranial stimulator with 80 V compliance voltage for electrical stimulation of the cortex with ECoG grids or strips, or for depth electrodes. This can be used, e.g., to stimulate the sensory cortex to add a touch sensation when robotic devices are controlled with the BCI, or for deep brain stimulation experiments, e.g. with Parkinson patients. Stimulation parameters such as onset time or stimulation current can be set and triggered in real-time from Simulink. Non-invasive stimulation is used, e.g., to stimulate hand movements when the person imagines a hand movement. In this case, a functional electrical stimulation (FES) device is used with surface stimulation electrodes. The g.Estim FES Simulink interface allows you to set all required parameters of the FES stimulator and to trigger it from Simulink. Importantly, the stimulator also measures the stimulation current to make sure that the requested current can actually be delivered (depending on the electrode impedance).
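The relationship between compliance voltage, electrode impedance and deliverable current follows from Ohm's law. The check below is a simplification that treats the electrode-tissue interface as a purely resistive load, which real interfaces are not.

```python
def max_current_ma(compliance_voltage_v, impedance_ohm):
    """Upper bound on the stimulation current in mA (Ohm's law)."""
    return 1000.0 * compliance_voltage_v / impedance_ohm

def current_deliverable(requested_ma, compliance_voltage_v, impedance_ohm):
    """True if the requested current fits within the compliance voltage.

    This is why the stimulator measures the actual current: a high
    electrode impedance can silently cap the delivered current.
    """
    return requested_ma <= max_current_ma(compliance_voltage_v, impedance_ohm)
```

With the 80 V compliance voltage mentioned above, a 10 kOhm electrode impedance caps the current at 8 mA.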

Connecting your clinical system to the g.tec Brain-Computer Interface

To facilitate Brain-Computer Interface research in the clinical environment, many of our customers want to use their clinical EEG/ECoG system in parallel with the BCI system. g.tec offers various solutions to connect clinical EEG machines to the g.tec BCI system; two examples are given below. g.tec's employees are prepared to find the best solution for existing systems together with our customers. Please contact us for an individual setup.

Depending on the clinical EEG/ECoG machine, two different connection options are possible:

i) The jackbox of the clinical system exposes the analog EEG/ECoG signals via an additional D-SUB type connector. Here, the 1.5 mm safety plugs coming from the EEG cap/ECoG grid are connected to the jackbox of the clinical machine. The g.tec connection cable then links this parallel D-SUB type connector to the multi-pole input sockets of up to 4 g.USBamp systems or one g.HIamp.

ii) The jackbox of the clinical system has no option for an additional connection. In this case, the g.tec 64-channel breakout box can be connected to the clinical system in place of the original jackbox. Here, the 1.5 mm safety plugs coming from the EEG cap/ECoG grid are connected to the g.tec breakout box, which is connected on one side to the clinical system and on the other side to the multi-pole input sockets of up to 4 g.USBamps or one g.HIamp.

Example 1: On the clinical system, the jackbox on the patient side has a parallel output option for EEG/ECoG signals, e.g. via D-SUB type connectors:

Solution: Connect the parallel output on the jackbox via the g.tec D-SUB type connection cable to the BCI system.

Supported systems come from: BIOLOGIC, STELLATE, XLTEK

Example 2: On the clinical system, there is no parallel output option on the jackbox available:

Detailed picture of the splitter boxes and connection cables

Solution: Connect the 1.5mm safety plugs coming from the EEG cap/ECoG grid to the g.tec splitter box and connect the box to the clinical system as well as to the BCI system using the g.tec connection cables.

Supported systems come from: NIHON KOHDEN

A recommended setup plan for an ECoG research lab is available.

Unity toolbox

The Unity toolbox allows you to control Virtual Reality 3D content with g.BCIsys in real time. g.tec provides games (Mastermind, Spacetraveller, Avatar) that can be controlled with P300, SSVEP or motor imagery based BCIs. The BCI Simulink models communicate with Unity in real time. Besides the games, g.tec also provides a full-body 3D g.Avatar whose limbs can be controlled with the BCI system. The toolbox can also interact with the SOCI module in order to overlay graphical content with P300, SSVEP, cVEP or motor imagery controls, so the user can operate the BCI inside a highly immersive VR environment. The toolbox is available for the Oculus Rift HMD. Mastermind uses the SOCI toolbox and allows P300 control, Spacetraveller uses the cVEP toolbox and allows direct control, and g.Avatar uses the CSP toolbox and allows control with the motor imagery BCI.

Available configurations

Complete Solutions

product no.: 6029 read more g.BCIsys16USB ERD, SSVEP, P300 — complete BCI research system, g.USBamp-RESEARCH with 16 channels, includes all hardware and software for motor imagery/ERD, SSVEP-BCI and P300 spelling examples, PC with ready-to-go installation
product no.: 6023 read more g.BCIsys16USB: complete BCI-research system, PC included — with g.USBamp-RESEARCH, 16 channels, highspeed online processing, data analysis & classification, ERD/motor imagery examples, PC with ready-to-go installation
product no.: 6031 read more g.BCIsys32USB ERD, SSVEP, P300 — complete BCI research system, g.USBamp-RESEARCH with 32 channels, includes all hardware and software for motor imagery/ERD, SSVEP-BCI and P300 spelling examples, PC with ready-to-go installation
product no.: 6024 read more g.BCIsys32USB: complete BCI-research system, PC version — with g.USBamp-RESEARCH, 32 channels, highspeed online processing, data analysis & classification, ERD/motor imagery examples, PC with ready-to-go installation
product no.: 6021 read more g.BCIsys64USB: complete BCI-research system, PC version — with g.USBamp-RESEARCH, 64 channels, highspeed online processing, data analysis & classification, ERD/motor imagery examples, PC with ready-to-go installation
product no.: 6006 read more g.BCIsys8MOBIlab+: BCI research system, 8 EEG, NB included — with g.MOBIlab+ 8 channel EEG version, highspeed online processing, data analysis & classification, ERD/motor imagery examples, notebook with ready-to-go installation
product no.: 6011 read more g.BCIsys8MOBIlab+: P300, 8 EEG, NB included — complete BCI research system, g.MOBIlab+ 8 channel EEG version, highspeed online processing, data analysis & classification, includes all hardware and software for P300 spelling, notebook with ready-to-go installation
product no.: 6014 read more g.BCIsys8MOBIlab+: SSVEP, P300, NB included — complete BCI research system, g.MOBIlab+ 8 channel EEG version, highspeed online processing, data analysis & classification, includes all hardware and software for P300 spelling and SSVEP robot control, notebook with ready-to-go installation
product no.: 6007 read more g.BCIsysMOBIlab+: BCI research system, multi-purpose, NB included — with g.MOBIlab+ multi-purpose version, highspeed online processing, data analysis & classification, examples for ERD/motor imagery BCI, notebook with ready-to-go installation
product no.: 6028 read more g.tec BCI2000 bundle offer with g.USBamp-RESEARCH, NB included — consisting of: g.USBamp-RESEARCH (16 channel biosignal amplifier, with power supply); water-proof heavy duty case; USB cable; fully equipped business notebook; C API and BCI2000 driver, BCI2000 driver package; bundle offer (0216R+3003+0263a)
product no.: 6012 read more g.tec BCI2000 bundle offer with g.MOBIlab+, NB included — complete BCI system with g.MOBIlab+ 8 channel EEG version, with drivers and BCI2000 software package, notebook with ready-to-go installation
product no.: 6032 read more RehaBCI, PC included — 32 channels; consisting of: g.BCIsys32USB; g.VRsys; g.GAMMAbundle for g.USBamp CSP; g.UDPinterface; g.USBamp common spatial patterns model; 3D human avatar (g.AVATAR)
product no.: 8030 read more RehaBCI for g.HIamp — g.HIamp-RESEARCH 80; g.SCARABEO 64 bundle; g.HEADbox - active; g.HIamp SIMULINK HIGH-SPEED ONLINE Processing; g.RTanalyze [education price]; g.BSanalyze [education price]: Base Version, EEG-toolbox, Classify-toolbox; g.HIamp common spatial patterns; Unity toolbox; g.UDP interface; business PC; bundle offer (7001R, 1098, 7005, 0260D, 0111, 0101, 0102, 0105, 0142b, 0303, 0264, 3001A)
product no.: 8006 read more g.HIamp package BCI — upgrade for BCI research consisting of: g.HIamp SIMULINK HIGH-SPEED ONLINE Processing; g.HIamp P300 + Ping Pong + SSVEP model; g.HIamp common spatial patterns; g.RTanalyze; g.BSanalyze (Base + EEG + Classify)


product no.: 0111 read more g.RTanalyze: Specs & Features — real-time biosignal processing blockset under SIMULINK; real-time algorithms
product no.: 0101 read more g.BSanalyze: Specs & Features — advanced biosignal data processing toolbox; multi-modal Off-line Biosignal Analysis under MATLAB®
product no.: 0264 read more g.UDPinterface — data exchange with network connection between Simulink/MATLAB on different PCs (e.g. BCI, VR, XVR, ...); single place licence; prerequisite MATLAB for OS English Win 32/64, SIMULINK
product no.: 0148 read more ACTOR BCI - Application ConTrol and Online Reconfiguration (ACTOR) protocol — Simulink model with matrix interface that can be remotely updated or configured with configuration files; sends commands to external devices
product no.: 0147 read more hybrid BCI model — SSVEP and P300 hybrid based control; prerequisite SIMULINK HIGH-SPEED ONLINE Processing, g.RTanalyze
product no.: 0149 read more EMG/EOG/mouse control — Simulink model to control the matrix interface with EMG, EOG or a mouse
product no.: 0146 read more hyperscanning BCI model — multi-user P300 and Motor Imagery based control; prerequisite SIMULINK HIGH-SPEED ONLINE Processing, g.RTanalyze, g.BSanalyze Base, EEG and Classify Toolbox
product no.: 0144 read more g.VIBROTACTILEp300 Model — 2-, 3- and 8-channel vibrotactile P300 based BCI control; prerequisite: SIMULINK HIGH SPEED ONLINE Processing, g.BSanalyze Base, EEG and Classify Toolbox
product no.: 0260d read more Highspeed Online Processing Under Simulink for g.HIamp — SIMULINK driver and blockset modules; data processing with maximum system speed
product no.: 0260a read more Highspeed Online Processing Under Simulink for g.USBamp — SIMULINK driver and blockset modules; data processing with maximum system speed
product no.: 0260E read more Highspeed Online Processing Under Simulink for g.Nautilus — SIMULINK driver and blockset modules; data processing with maximum system speed
product no.: 5012a read more Highspeed Online Processing Under Simulink for g.MOBIlab+ — SIMULINK driver and blockset modules; data processing with maximum system speed
product no.: 0139 read more g.USBamp P300 model — 8-channel P300 based speller; prerequisite: SIMULINK HIGH SPEED ONLINE PROCESSING (0260a)
product no.: 0140a read more g.USBamp Ping Pong model — 2 subject and 4-channel motor imagery based game; prerequisite: SIMULINK HIGH SPEED ONLINE Processing (0260a)
product no.: 0141a read more g.USBamp SSVEP BCI model — 8 channel SSVEP based control
product no.: 1303a read more SSVEP model and Hardware for g.USBamp — bundle for SSVEP based robot control; consists of g.USBamp SSVEP model, g.SSVEPbox, g.STIMbox
product no.: 0142 read more g.USBamp common spatial patterns — Simulink model to calculate CSPs for 2 or 3 classes; variable amount of electrodes, tutorial
product no.: 0139b read more g.MOBIlab+ P300 model — 8-channel P300 based speller; prerequisite: SIMULINK HIGH-SPEED ONLINE Processing
product no.: 1303b read more SSVEP model and Hardware for g.MOBIlab+ — bundle for SSVEP based robot control; consists of g.MOBIlab+ SSVEP model, g.SSVEPbox, g.STIMbox
product no.: 0141b read more g.MOBIlab+ SSVEP BCI model — 8 channel SSVEP based control
product no.: 1303E read more SSVEP model and Hardware for g.Nautilus — bundle for SSVEP based robot control; consists of g.Nautilus SSVEP model, g.SSVEPbox, g.STIMbox
product no.: 0140d read more g.HIamp Ping Pong model — 2 subject and 4-channel motor imagery based game; prerequisite: SIMULINK HIGH-SPEED ONLINE Processing (0260d)
product no.: 0141d read more g.HIamp SSVEP BCI model — 8 channel SSVEP based control; prerequisite SIMULINK HIGH-SPEED ONLINE Processing for g.HIamp, g.RTanalyze
product no.: 1303d read more SSVEP model and Hardware for g.HIamp — bundle for SSVEP based robot control; consists of g.HIamp SSVEP model, g.SSVEPbox, g.STIMbox
product no.: 0142b read more g.HIamp common spatial patterns — Simulink model to calculate CSPs for 2 or 3 classes; variable number of electrodes, tutorial
product no.: 0380 read more g.AVATAR — 3D human avatar for rehabilitation applications
product no.: 0137 read more g.BCI SOCI model — Screen Overlay Control Interface module; can be used especially for virtual reality (VR) applications and remote control of devices to provide the standard user interface by directly embedding the BCI stimuli; generates cVEP or SSVEP stimuli and supports single-symbol and row/column P300 stimulation; single place licence; prerequisite MATLAB R2014a for OS English Win 64 (Windows 7), SIMULINK HIGH-SPEED ONLINE Processing, g.BSanalyze Base, EEG and Classify Toolbox
product no.: 0139E read more g.Nautilus P300 BCI model — 8-channel P300 based speller; prerequisite: SIMULINK HIGH SPEED ONLINE PROCESSING (0260e)
product no.: 0141E read more g.Nautilus SSVEP BCI model — 8 channel SSVEP based control
product no.: 0142E read more g.Nautilus common spatial patterns BCI model — Simulink model to calculate CSPs for 2 / 3 classes, tutorial; prerequisite: SIMULINK HIGH-SPEED ONLINE Processing (0260E), g.BSanalyze Base, EEG and Classify Toolbox (0153); special g.Nautilus electrode montage necessary!
product no.: 0139d read more g.HIamp P300 model — 8-channel P300 based speller
product no.: 0112 read more g.MONcreator standalone — The g.MONcreator allows defining electrode positions, names and constellations for source derivations. Supports import of scanned electrode positions from various systems (e.g. Polhemus Patriot Digitizer or NDI Krios)
product no.: 0136 read more g.BCI CVEP model — Code-based BCI model; BCI systems can also use pseudo-random stimulation sequences on a screen (code-based BCI); prerequisite SIMULINK HIGH-SPEED ONLINE Processing, g.BSanalyze Base, EEG and Classify Toolbox
product no.: 0155 read more g.Estim FES Simulink Interface — provides a graphical interface to the g.Estim FES hardware, which can be used under Simulink to specify the properties of the electrical stimulator; intended to be used for research applications only; requires g.tec Highspeed Online Processing, MATLAB, Simulink and Signal Processing Toolbox.
product no.: 0154 read more g.CSP recoveriX extension — extends the g.CSP Simulink model by the activation of g.Estim FES in the desired mode PRACTICE or REHABILITATION (EEG controlled); requires g.tec Highspeed Online Processing, MATLAB, Simulink and Signal Processing Toolbox.

See some related products


read more Software options — choose the best software components for your system

Hardware and Accessories

read more g.USBamp-RESEARCH: Specs & Features — g.tec's high performance biosignal amplifier, acquisition and processing system
read more g.HIamp: Specs & Features — multi-channel biosignal amplifier
read more g.HIamp-RESEARCH: Specs & Features — multi-channel biosignal amplifier
read more g.MOBIlab+: Specs & Features — mobile biosignal acquisition and processing with a PC or notebook
read more g.Nautilus: Specs & Features — wireless EEG system with active electrodes
read more g.Nautilus fNIRS: Specs & Features — highly portable EEG and fNIRS in one device
read more g.Nautilus-PRO: Specs & Features — wireless EEG system with active electrodes
read more g.VRsys: Specs & Features — Virtual Reality software package
product no.: 7020 read more g.LABboy — mobile trolley for recording and data processing system with isolation transformer

Electrodes and Sensors

read more g.GAMMAsys: Specs & Features — active electrode system for EEG/EMG/EOG/ECG
read more g.SAHARA: Specs & Features — active dry EEG electrode system
read more g.Electrodes: Specs & Features — various active and passive electrodes for electrophysiological recordings
read more g.Sensors: Specs & Features — various sensors for physiological and physical signals

Related Media and Documents


gCSPbci_TwoClasses — 28/11/2018 — 2.57 MB
gCSPbci_ThreeClasses — 28/11/2018 — 2.84 MB


P300Speller — 11/09/2018 — 2.31 MB



gPingPong — 11/09/2018 — 739.72 kB


SSVEP BCI — 11/09/2018 — 1.41 MB

Product Manuals/Handbooks


gRTanalyze — 25/09/2018 — 2.46 MB


g.tec Lectures Introduction — 26/11/2015 — 142.03 kB
g.tec Material Use Agreement — 26/11/2015 — 88.98 kB
Lecture 2: BCI — 26/11/2015 — 4.23 MB


Shielded EEG lab — 26/08/2011 — 151.27 kB


g.BCI_CVEP — 11/09/2018 — 2.04 MB


g.BCI_SOCI — 11/09/2018 — 2.11 MB


g.EMGEOGcontrol — 11/09/2018 — 1.17 MB


g.HybridBCI — 11/09/2018 — 1.14 MB


gUDPinterface — 11/09/2018 — 837.24 kB


g.VIBROTACTILEp300 — 17/12/2018 — 1.90 MB

g.EstimFES Simulink Interface

g.HIsys-g.EstimFES-User-Manual — 19/09/2019 — 851.61 kB


gHIampHS — 11/09/2018 — 1.71 MB


gMOBIlabHS — 11/09/2018 — 1.31 MB


gNautilusHS — 11/09/2018 — 1.85 MB
gNautilus_Read_before_Installation — 11/09/2018 — 330.69 kB


gUSBampHS — 11/09/2018 — 1.77 MB

g.HIsys Highspeed Online Processing for SIMULINK

gHIsysLibraryDescription — 11/09/2018 — 2.76 MB


gHyperscanningP300 — 28/11/2018 — 2.36 MB
gHyperscanningMI — 28/11/2018 — 2.03 MB


gRehaBCI — 05/10/2015 — 1.36 MB


gMONcreator — 20/12/2018 — 1.60 MB


gVRsys — 05/10/2015 — 2.28 MB