Researchers have created biosensor technology that lets users operate equipment, such as robots and machines, using only their thoughts.

A new study describing the technology has just been published in the peer-reviewed journal ACS Applied Nano Materials. It shows that the graphene sensors developed at UTS are highly conductive, easy to use, and robust.

The hexagon-shaped sensors are placed on the back of the head, where they pick up brainwaves from the visual cortex, and they can withstand challenging work environments. The user wears an augmented reality headset that displays flickering white squares. When the operator concentrates on a particular square, the biosensor detects the corresponding brain waves, and a decoder translates the signal into commands. The Australian Army recently demonstrated the technology: soldiers used the brain-machine interface to control a Ghost Robotics four-legged robot. The device allows hands-free control of the robotic dog with up to 94% accuracy.
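The decoding step described above can be illustrated with a minimal sketch. In a steady-state visually evoked potential (SSVEP) interface, each square flickers at a distinct frequency, and attending to a square boosts occipital EEG power at that frequency; a simple decoder just asks which candidate frequency dominates the spectrum. The function name, frequencies, and synthetic signal below are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

def classify_ssvep(eeg, fs, flicker_freqs):
    """Guess which flickering target the user is attending to by
    finding the candidate flicker frequency with the strongest
    spectral power in a single occipital EEG channel."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2       # power spectrum
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)  # frequency axis in Hz
    # Power at the FFT bin nearest each candidate flicker frequency
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in flicker_freqs]
    return flicker_freqs[int(np.argmax(powers))]

# Synthetic example: 2 s of "EEG" dominated by a 12 Hz response plus noise.
np.random.seed(0)
fs = 256
t = np.arange(0, 2, 1.0 / fs)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.randn(len(t))
print(classify_ssvep(eeg, fs, [8, 10, 12, 15]))  # prints 12
```

A real decoder would typically combine several channels and use more robust statistics (e.g. canonical correlation analysis), but the frequency-matching idea is the same.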

Objective

For brain-machine interfaces (BMIs) to be used on a large scale, there must be accurate and reliable dry sensors for electroencephalography (EEG). But dry sensors consistently perform worse than wet Ag/AgCl sensors, which are considered the gold standard. The performance gap is even more apparent when monitoring signals from hairy, curved areas of the scalp, which requires bulky and uncomfortable acicular (needle-like) sensors.

This work shows that subnanometer-thick epitaxial graphene can be used to make three-dimensional micropatterned sensors that pick up the EEG signal from the complex occipital region of the scalp. The occipital region, home to the brain's visual cortex, is critical for BMIs based on the standard steady-state visually evoked potential paradigm. Furthermore, the patterned epitaxial graphene sensors make good contact with the skin and have low impedance, so they achieve signal-to-noise ratios comparable to those of wet sensors. Using these sensors, the researchers demonstrated hands-free control of a four-legged robot driven directly by brain activity.

Furthermore, the researchers fabricated small micropatterned epitaxial graphene (EG) EEG sensors on silicon carbide on silicon. On average, the graphene layer that touches the skin is less than a nanometer thick. They developed 10 μm-deep structures with varying shapes and packing factors for use on the back of the head.

Conclusion

The researchers found that the graphene area affects sensor contact impedance on the flat skin of the forehead. However, this relationship breaks down when the sensors are placed on the occipital area. Although the goal of this study was not to arrive at a final design, the researchers noted that an ideal design must balance total graphene area against other factors, such as compatibility with hair and adequate contact pressure.

Using the hexagonal structure design (HPEG), the researchers recorded the EEG signal from the occipital area of a person with 5-mm-long hair. At 50 Hz, these sensors had a low average impedance of 155 ± 10 kΩ and a good signal-to-noise ratio of up to 25 ± 5 dB, very close to the gold standard.
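To put the roughly 25 dB figure in perspective: decibels express a power ratio on a logarithmic scale, so 25 dB means the signal power is about 316 times the noise power. A minimal sketch of the conversion (the function name is illustrative):

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels, from a power ratio."""
    return 10 * math.log10(signal_power / noise_power)

# 25 dB corresponds to a signal-to-noise power ratio of ~316:1.
print(round(10 ** (25 / 10)))     # prints 316
print(round(snr_db(316, 1), 1))   # prints 25.0
```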

Lastly, the researchers demonstrated a complete BMI system that controlled a four-legged robot with 94% accuracy, using the SSVEP paradigm and an eight-channel HPEG sensor array. They note that monitoring EEG from the back of the head (occipital region) with dry sensors is very difficult and that, for a given design, the observed variability comes from where the sensor is placed, not from the properties of the individual sensor. Wet sensors are less susceptible to this variability. Although matching the real-world performance of wet Ag/AgCl sensors with dry sensors remains a challenge, the researchers regard these three-dimensional micropatterned EG sensors as a concrete step towards that goal.
