No, this isn’t an episode of Black Mirror. Brain hackers are real.
Speaking at the University of Washington, electrical engineering researcher Tamara Bonaci described an experiment in which subliminal images were periodically displayed during a simple video game. The images, logos of fast-food restaurants and car companies, were each shown for only a few milliseconds at a time, too briefly for the player to consciously notice them or be disrupted by them.
Data through reaction
Using electrodes attached to the player's head to record electroencephalography (EEG) signals, the researchers could gauge how the player reacted to the subliminal images and infer the player's thoughts and feelings about the things depicted.
In the experiment, the players wore a complicated system of electrodes, but Bonaci says it would not be hard for similar data to be pulled from a VR headset, smart watch or other wearable tech. The process could also be used to sense reactions to things other than just brands, such as religious beliefs, political leanings, medical conditions, and prejudices.
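To give a sense of how a reaction can be pulled from noisy brain signals, here is a minimal sketch in Python. It is a hypothetical illustration, not the researchers' actual pipeline: EEG responses to a stimulus are commonly analyzed as event-related potentials (ERPs) by averaging many short epochs time-locked to the stimulus, since a familiar or salient image tends to evoke a larger deflection around 300 ms after onset. The data below is simulated, and the sampling rate, amplitudes, and window are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the sketch is reproducible

fs = 250                        # assumed sampling rate in Hz
t = np.arange(0, 0.8, 1 / fs)   # one 800 ms epoch after stimulus onset

def simulate_epochs(n_trials, bump_amp):
    """Simulate noisy single-trial EEG epochs with a bump near 300 ms.

    The bump stands in for the response a recognized image might evoke;
    each trial buries it in much larger random noise.
    """
    erp = bump_amp * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    noise = rng.normal(0.0, 5.0, size=(n_trials, t.size))
    return erp + noise

familiar = simulate_epochs(200, bump_amp=4.0)    # e.g. a logo the player knows
unfamiliar = simulate_epochs(200, bump_amp=0.5)  # e.g. an unknown logo

# Averaging across trials suppresses the random noise and reveals the ERP;
# the mean amplitude in a window around 300 ms then separates the two cases.
window = (t >= 0.25) & (t <= 0.45)
score_familiar = familiar.mean(axis=0)[window].mean()
score_unfamiliar = unfamiliar.mean(axis=0)[window].mean()

print(score_familiar > score_unfamiliar)  # familiar stimuli score higher
```

Single trials are far too noisy to read anything from, which is why an attack like the one described would flash the same images repeatedly over many minutes of play.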
Real-life applications
Science aside, the idea is relatively simple. On one level, hackers could insert images like these into a game or app and record your brain’s unintentional response to them, perhaps gaining insight into which brands you’re familiar with or which images you have a strong reaction to.
It could be used to determine which bank someone uses, where they are planning to travel, or perhaps even where their home is.
There are also advertising applications, providing even more specific information about potential consumers than existing data ever could. Imagine a political candidate wants to only advertise to a small group of people they know feel negatively about them. Images related to the candidate or campaign could be flashed subliminally in a VR headset and reactions could be pulled. The candidate could then just target a specific group who had negative physical reactions.
Similar methods could be used to gauge social or political stances on issues, determine who might have a certain physical ailment, or simply build a database of consumers with negative feelings towards a brand, who could then be targeted with heavier advertising.
Don’t freak out just yet
At the moment, the method is a few steps away from "mind reading," since it is not always easy to tell what a given reaction or signal means. However, the mere fact that people's responses to images can be captured without their knowing they are providing data is a huge, and potentially dangerous, breakthrough.
Bonaci says there is no evidence yet of brain hacking of this nature happening in the real world. The scariest part, though, is that when it does arrive, we may not even know it.