This NewScientist article (16 July 2022, paywall) was, at first, bewildering, and then bemusing:
Artificial intelligence can use your brainwaves to see around corners. The technique, called “ghost imaging”, can reconstruct the basic details of objects hidden from view by analysing how the brain processes barely visible reflections on a wall.
Ghost imaging has been used before to reveal objects hidden around corners and normally relies on using video recordings of faint reflections cast by an object onto a nearby wall. Daniele Faccio and Gao Wang at the University of Glasgow, UK, have now replaced the video component with electroencephalography (EEG) brain scans.
In their experiment, a single person wearing an EEG headset connected to a computer stands in front of a white wall and next to a wall painted grey, which obscures the view of an object and a projector. This projector is controlled by the computer and casts a series of special patterns onto the object.
Some of this patterned light reflects off the object and hits the white wall or diffuses through the room. The person can’t see the object in the reflections. However, a ghost-imaging machine-learning algorithm can build a basic 16-by-16 pixel image of the object using the EEG data. [“AI can use your brainwaves to see things that you can’t,” Karmela Padavic-Callaghan]
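For context, here is a minimal sketch of what conventional computational ghost imaging does with those projected patterns, assuming random binary illumination and a single "bucket" detector that records total reflected intensity per pattern. The Glasgow work swaps that detector for EEG responses and a machine-learning reconstruction, which is not shown here; the pattern count and noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
size = 16                        # 16-by-16 pixels, as in the article
n_patterns = 4000                # number of projected patterns (assumed)

# Hidden object: a simple "T" shape the reconstruction must recover.
obj = np.zeros((size, size))
obj[3, 3:13] = 1.0
obj[3:13, 7] = 1.0

# Random binary patterns cast by the projector.
patterns = rng.integers(0, 2, size=(n_patterns, size, size)).astype(float)

# Bucket measurements: total light reflected by the object for each pattern,
# with a little noise standing in for an imperfect detector.
bucket = (patterns * obj).sum(axis=(1, 2))
bucket += rng.normal(scale=0.5, size=n_patterns)

# Correlation reconstruction: weight each pattern by its mean-subtracted
# bucket signal and average. Object pixels correlate with bright
# measurements and emerge from the average.
weights = bucket - bucket.mean()
image = (weights[:, None, None] * patterns).mean(axis=0)

# Crude visual check in the terminal.
for row in image:
    print("".join("#" if v > image.mean() else "." for v in row))
```

The point is that the "camera" never images the object directly; it only correlates a one-dimensional signal with the known patterns, which is why a noisy proxy like EEG can, in principle, stand in for it.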
It’s a bit astounding, and then a bit mundane: as data processors, we take in data that reaches our cerebral cortex but that we either don’t or can’t use, and of which we probably have no awareness.
There’s something distinctly ghost-like about it, and yet something so dull as to prompt a “so what?”, even though it’s being used to recover a reality that our unaided senses can’t.
An odd melange of opposites.