Blade Runner Is Now
May 6, 2015
Abe Davis and his team at MIT have done something truly remarkable.
They have created an algorithm that makes something truly unique possible.
Thanks to these awesome people, humans can now take a short video of any object in a room, using an ordinary video camera (even one recording inside a soundproof room), and do something crazy with it.
To us, the video camera records the seemingly stationary bliss of everyday objects: solemn, meditating houseplants, plastic bags, or earbuds.
The neat thing is, as Abe explains, the video camera is actually recording information being ‘saved’ by these objects as they are affected by other things in the environment.
Any force that causes a vibration in another object is, in essence, trapping that information within the vibration itself.
If we later watch that same video of an ordinary household object, without any sound at all, we can reverse-engineer which sounds were occurring in the environment at the time.
WE CAN NOW CONVERT RECORDED VIBRATIONS BACK INTO THEIR ORIGINAL SOUND WAVES, AND LISTEN TO WHAT HAPPENED IN THE IMMEDIATE ENVIRONMENT AROUND THE OBJECT DURING THE RECORDING.
EVERY PHYSICAL OBJECT IN OUR UNIVERSE IS A MICROPHONE.
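The conversion step can be sketched in a few lines of Python. This is a toy illustration, not the MIT team's actual pipeline: the function name, the frame rate, and the assumption that a per-frame displacement signal has already been extracted from the video are all mine.

```python
import numpy as np

def vibrations_to_audio(displacements, fps=2200):
    """Turn a per-frame displacement signal (recovered from video)
    back into a playable audio waveform.

    `displacements` is a 1-D array of the object's tiny frame-to-frame
    motion, in arbitrary units. `fps` is the camera's frame rate, which
    becomes the audio sample rate (high frame rates are needed to
    cover the audible range).
    """
    x = np.asarray(displacements, dtype=float)
    x = x - x.mean()          # remove the offset of the object's rest position
    peak = np.max(np.abs(x))
    if peak > 0:
        x = x / peak          # normalize to [-1, 1] for audio playback
    return x, fps

# Toy example: a 440 Hz "sound" hidden in the object's motion.
t = np.arange(0, 0.5, 1 / 2200)
motion = 0.003 * np.sin(2 * np.pi * 440 * t) + 0.1  # tiny vibration plus offset
audio, rate = vibrations_to_audio(motion, fps=2200)
```

The resulting array could be written out as a WAV file and played back; the recovered tone would be the same 440 Hz buried in the motion.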
With this algorithm and an ordinary video camera, you could, from your own kitchen, videotape a flower in a vase on your neighbor’s dining room table during a dinner party.
When you later played back the video of that flower, in the privacy of your home, you would be able to hear every conversation that took place during the party.
You could video record the tiniest vibrations of the earbuds plugged into your child’s iPhone and play back what song they were listening to.
Accurately enough, in fact, that even Shazam could recognize it.
In one experiment, a person shouted a message at an empty bag of chips, and the vibrations in the bag (really just captured sound waves) could be translated back into the shouted message.
This essentially means all physical objects are capable of having a ‘memory’ of some kind, and that science and math are teaching us more about the phenomenon of synesthesia.
We also now have the ability to video record an object as it exists normally in its environment, like a towel hanging on a laundry line.
That sounds pretty normal so far.
The cool part is, if we record the video for a longer period (30-60 seconds), we can then accurately predict, using computer modelling, how that object will behave in real life.
This includes how it bends and moves, its texture, and how it holds up to applied forces in the actual physical world.
Some examples: how a rubber band may stretch, how a bowl of perfect powder will react to snowboarders (or how many it would take to cause an avalanche), or whether that shoddy bridge on the Road to Hana in Maui (with a beautiful bamboo forest and waterfall on the other side) will handle the weight of your rental car.
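One ingredient behind this kind of prediction is estimating an object's natural vibration frequencies from its motion in video. Here is a minimal sketch of that idea; the function name and the simulated towel signal are made up for illustration, and a real system would analyze many pixels and modes, not one signal.

```python
import numpy as np

def dominant_mode_hz(motion, fps):
    """Estimate an object's strongest vibration frequency from a
    motion signal extracted from ordinary video, via the FFT."""
    x = np.asarray(motion, dtype=float)
    x = x - x.mean()                         # drop the rest position
    spectrum = np.abs(np.fft.rfft(x))        # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1 / fps)
    return freqs[np.argmax(spectrum)]        # strongest frequency, in Hz

# Toy example: a towel swaying at ~2 Hz, "filmed" at 30 fps for 60 s.
fps = 30
t = np.arange(0, 60, 1 / fps)
rng = np.random.default_rng(0)
sway = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(t.size)
est = dominant_mode_hz(sway, fps)            # recovers a frequency near 2 Hz
```

Once the dominant modes are known, a model can simulate how the object would respond to new, imagined forces, which is what makes the predictions above conceivable.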
All of this information will soon be at our real-time video fingertips.
Pretty cool, eh?
Blade Runner is now.
Simon Trepel, MD
Simon Trepel, MD FRCPC, is a practicing Child and Adolescent Psychiatrist in Winnipeg, Canada. He is an Assistant Professor at the University of Manitoba, in the Faculty of Medicine, and the Co-founder of the GDAAY Clinic. He is, more importantly, the proud Father of 2 beautiful Daughters. He writes in his spare time about things he knows something about, and occasionally about things he doesn’t; like Yoga, and Italian flavored coffees. He was not referring to coffee that tastes like an Italian person.
Check out his Blog, called Simon Says Psych Stuff, at