“The Cougar 20-H is a robot with the ability to hear breath through walls. With funding from the U.S. military the robot is equipped with a panel of ultra-wideband radio frequency sensors, remote control functionality, and integrated cameras for day and night time visibility.
The robot is so sensitive that it can not only detect motion through walls but, to ensure no one goes unnoticed, can also detect the breathing of a stationary person.”
This body of work uses computer-generated images to challenge the notion of autonomous thought within digital manifestations of the future.
One may believe that autonomous thought exists within OpenAI’s machine learning model GPT-2. This A.I. was tasked with, and succeeded in, writing an entirely new play. While this may seem like a success for free-thinking machines, the reality requires a bit more inspection. To create this ‘new’ play, the A.I. first had to scan and analyze millions of other dramatic works in order to produce its one original. What looked like free, creative, and autonomous thought is merely a copy of human creativity. Within each example of an A.I. exhibiting what seems to be true creativity, there is always the caveat of the corpus of content it first needed to gather and train on to complete the feat. The machine is only able to act upon the information it is provided by the human, and the society, who made it.
To demonstrate this often blurred line between free action and human coding within the machine, I examined the obscure notion of robot breath. We can see robot vision in self-driving cars; we can feel robot touch while riding an escalator or wearing haptic feedback technology; but robot breath still eludes us. It seems that the breath is the last refuge of the human that the machine cannot replicate.
Through the lens of robot breath, the images of this project flip the reality of autonomous action back and forth between the miraculous and the coincidental.