Get a coffee and enjoy the journey below.
My approach to teleportation is a series of steps drawn from multiple disciplines: neuroscience, AI, and quantum mechanics. Each step is backed by evidence pieced together from experiments done by researchers globally and published in some of the most respected, leading journals (to avoid “fake news”). A cursory glance will give an overview of how teleportation is possible, but the best way to understand the possibility is, of course, to read the detailed evidence behind each step.
The definition: Teleportation, or tele-transportation, is the theoretical transfer of matter or energy from one point to another without traversing the physical space between them.
The steps below would let a human experience a different geographic location without traversing the physical space in between.
The steps: the picture below gives the overall schematic.
Step 1: Transfer the brain/mind, not the body
The human brain doesn't need the body and can be trained to experience its bodily sensations without it. Experiments have shown that the brain can directly control machines, without requiring the body as the middleman. The brain can also be triggered to sense and feel: for example, the proper electrical signals can make the brain feel a texture without it actually being touched. Below is a picture of a now-famous paraplegic who kicked a football (at the Brazil World Cup) with his mind and felt the kicking feedback.
Brain-computer interface example: electroencephalography (EEG) can extract signals from the brain
TED talks (Brain-to-brain communication has arrived. How we did it | Miguel Nicolelis; A monkey that controls a robot with its thoughts. No, really; How to control someone else’s arm with your brain)
Nature: Building an organic computing device with multiple interconnected brains
Neuroscientists demonstrate operation of the first network of brains (brainet) in both primates and rodents
Direct Brain-to-Brain Communication in Humans — University of Washington
Team links two human brains for question-and-answer experiment
Brain-Machine Interfaces: From Basic Science to Neuroprostheses and Neurorehabilitation — Physiol. Rev. 97: 767–837, 2017. (NCBI-NIH)
Inferences from step: The brain can perform actions without the body. The brain can be triggered to sense feedback, like texture, without the body. Teleporting just the brain is enough to experience a teleport. Signals can be extracted from the brain to prepare for the teleport (step 2).
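As a toy illustration of the signal-extraction idea, here is a minimal sketch in plain Python (no EEG hardware or library assumed): it computes frequency-band power from a synthetic one-second trace, the kind of feature EEG-based brain-computer interfaces commonly start from. The sampling rate and band limits are illustrative assumptions.

```python
import math

def dft_power(signal, fs):
    """Naive DFT: return (frequency, power) pairs for a real signal."""
    n = len(signal)
    spectrum = []
    for k in range(n // 2):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        spectrum.append((k * fs / n, (re * re + im * im) / n))
    return spectrum

def band_power(spectrum, lo, hi):
    """Total power of the bins whose frequency falls in [lo, hi) Hz."""
    return sum(p for f, p in spectrum if lo <= f < hi)

# Synthetic one-second "EEG" trace: a 10 Hz alpha rhythm sampled at 128 Hz.
fs = 128
trace = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
spectrum = dft_power(trace, fs)

alpha = band_power(spectrum, 8, 13)   # alpha band, 8-13 Hz
beta = band_power(spectrum, 13, 30)   # beta band, 13-30 Hz
print(alpha > beta)  # the alpha band dominates -> True
```

A real BCI would add filtering, artifact rejection, and a classifier on top of such features, but the principle is the same: the brain's electrical activity becomes numbers a machine can act on.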
Step 2: Model the prepared brain in code
Certain senses of the brain can be modeled in a machine learning system, for example vision (using deep neural networks). Advances from face/object recognition to self-driving cars have shown that what the brain perceives can be modeled (in this case, vision). Today only single-domain senses have been modeled, and research is heading toward multi-domain artificial general intelligence. However, specific instances of modeling a human have been demonstrated by Hanson Robotics using multiple humanoid robots. SOPHIA has appeared onstage as a panel member and presenter at high-level conferences; she even became the first robot to obtain citizenship of a country (Saudi Arabia). BINA48 is a humanoid robot modeled after Bina Aspen through more than one hundred hours spent compiling her memories, feelings, and beliefs; BINA48 engages in conversation with other humans. MindCloud™ by Hanson Robotics is a cloud-based AI that enables large-scale cloud control of their robots, as well as deep-learning analytics on the rich social data gathered from millions of interactions with them.
Sophia giving an interview
Bina48 in a conference
Bina48 and Bina Aspen
TED Talks (Exploring digital immortality | Bruce Duncan & Bina48 | TEDxMadrid)
Video (Interview With The Lifelike Hot Robot Named Sophia — CNBC, Robot Meets Self Driving Car — Sophia by Hanson & Jack by Audi)
Inferences from step: Deep learning (AI) is increasingly used to model the human brain and its senses. Single-domain sensing, like seeing, touching, and detecting objects without physically moving near them, is ripe for teleportation. The model of the brain is ready for the teleport (step 3).
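As a toy stand-in for the deep networks this step describes, a single artificial neuron can learn a two-pixel “sensing” task in plain Python. This is a minimal sketch of the modeling idea, not a brain model; the data and learning rate are made up for illustration.

```python
# Toy "sensing" task: classify two-pixel brightness patterns as
# bright (1) or dark (0) with a single artificial neuron.
data = [([0.9, 0.8], 1), ([0.1, 0.2], 0), ([0.8, 0.7], 1), ([0.2, 0.1], 0)]
w = [0.0, 0.0]   # weights
b = 0.0          # bias

def predict(x):
    s = w[0] * x[0] + w[1] * x[1] + b
    return 1 if s > 0 else 0

# Perceptron learning rule: nudge weights toward misclassified samples.
for _ in range(20):
    for x, y in data:
        err = y - predict(x)
        w[0] += 0.1 * err * x[0]
        w[1] += 0.1 * err * x[1]
        b += 0.1 * err

print(all(predict(x) == y for x, y in data))  # -> True
```

A modern vision network is this same idea stacked millions of neurons deep, which is why "what the brain perceives" has become something software can approximate.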
Step 3: Upload the model to a quantum computer (not an ordinary computer, to ensure quantum entanglement)
The model can be uploaded to a quantum computer to utilize the quantum-mechanical principle of entanglement. When two or more particles/QuBits (in quantum computing, a quantum bit, or QuBit, is the unit of quantum information: the quantum analogue of the classical binary bit) are entangled, their quantum states are interdependent, no matter how far apart they are. In effect, they act as a single quantum object, described by a single wave function. When a particle is entangled with another, everything that happens to one affects the other.
Quantum computers exist, can be bought today, and are evolving rapidly. They do need complex engineering, including freezing temperatures, but companies like Microsoft, Google, and IBM have shown the complete quantum computing stack, including the ability to code algorithms.
We are at a stage where neural networks are being modeled under various names, like quantum neural networks, which can perform machine learning tasks such as image classification.
A quantum computer… big, and it looks like a legacy mainframe, BUT it is real and it exists
A quantum computer can be bought today (price approx. $15M)… e.g. Google uses D-Wave
Video Talks (Transforming Machine Learning and Optimization Through Quantum Computing by Microsoft, Quantum computing explained with a deck of cards | Dario Gil, IBM Research)
D-Wave is now shipping its new $15 million, 10-foot tall quantum computer
Deploying a quantum annealing processor to detect tree cover in aerial imagery of California
A Quantum Recurrent Neural Network — arXiv:1710.03599
Quantum computing available online today (QNNCloud from NTT, Quantum Development Kit by Microsoft, IBM’s Quantum Experience Online)
Snapshot of code from Microsoft’s quantum development kit
Inferences from step: The deep learning model of the brain (the “original model”) can be simulated on a quantum computer using QuBits (instead of classical bits, and thereby neurons), and once quantized, it can leverage the quantum property of entanglement to teleport (step 4).
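The entanglement these QuBits rely on can be simulated classically at a tiny scale. The sketch below (plain Python, no quantum hardware or SDK assumed) prepares the two-QuBit Bell state with a Hadamard and a CNOT gate and checks that measurements of the two QuBits always agree:

```python
import math
import random

random.seed(1)
h = 1 / math.sqrt(2)

# Two-QuBit state vector over the basis |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

# Hadamard on the first QuBit: |00> -> (|00> + |10>) / sqrt(2)
state = [h * (state[0] + state[2]), h * (state[1] + state[3]),
         h * (state[0] - state[2]), h * (state[1] - state[3])]

# CNOT (first QuBit controls the second): swap the |10> and |11> amplitudes.
state[2], state[3] = state[3], state[2]
# state is now the Bell state (|00> + |11>) / sqrt(2)

def measure(state):
    """Sample one basis state; return (bit of QuBit 1, bit of QuBit 2)."""
    r, acc = random.random(), 0.0
    for i, amp in enumerate(state):
        acc += amp * amp
        if r < acc:
            return i >> 1, i & 1
    return 1, 1  # guard against floating-point round-off

samples = [measure(state) for _ in range(100)]
print(all(a == b for a, b in samples))  # the two outcomes always agree -> True
```

The perfect correlation between the two measured bits, no matter how the pair is physically separated, is the property the remaining steps build on.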
Step 4: Quantum-entangled bits replicated at a distance
Quantum entanglement enables QuBits to be separated over large distances without losing entanglement. Pairs of entangled QuBits can be created and then physically separated. The QuBit-based model on the quantum computer can therefore have entangled companions (the “entangled model”) located at a distance from the “original model” without physically traversing the distance. Experiments have shown up to 1,200 km of separation while maintaining entanglement of QuBits. Using principles of quantum entanglement, a team has successfully transferred a monochrome bitmap from one location to another without physically transmitting it: teleportation of the image.
Direct counterfactual communication via quantum Zeno effect — National Academy Of Sciences
Satellite-based entanglement distribution over 1200 kilometers —Science
Ground-to-satellite quantum teleportation — arXiv:1707.00934
Quantum teleportation across a metropolitan fibre network — Nature
Quantum teleportation with independent sources and prior entanglement distribution over a network — Nature
Inferences from step: The brain-model (“original model”) QuBits can now be separated into two quantum-entangled models. We would then have the “original model”, near the human, and the “entangled model”, which can be at a distance but reflects the “original model”. The “entangled model” can sense its surroundings at that distance (step 5).
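The standard quantum teleportation protocol behind these experiments can be traced in a small state-vector simulation (plain Python; a toy simulator, not a real quantum stack). One detail worth noting: the protocol also requires two measurement bits to be sent classically before the distant QuBit can be corrected into the original state.

```python
import math
import random

def apply_1q(state, gate, q, n=3):
    """Apply a 2x2 gate to QuBit q (QuBit 0 is the most significant bit)."""
    new, shift = state[:], n - 1 - q
    for i in range(len(state)):
        if (i >> shift) & 1 == 0:
            j = i | (1 << shift)
            a, b = state[i], state[j]
            new[i] = gate[0][0] * a + gate[0][1] * b
            new[j] = gate[1][0] * a + gate[1][1] * b
    return new

def apply_cnot(state, ctrl, targ, n=3):
    """Flip QuBit targ wherever QuBit ctrl is 1."""
    cs, ts = n - 1 - ctrl, n - 1 - targ
    return [state[i ^ (1 << ts)] if (i >> cs) & 1 else state[i]
            for i in range(len(state))]

def measure_qubit(state, q, n=3):
    """Measure QuBit q, collapse the state, return (bit, new state)."""
    shift = n - 1 - q
    p1 = sum(a * a for i, a in enumerate(state) if (i >> shift) & 1)
    bit = 1 if random.random() < p1 else 0
    norm = math.sqrt(p1 if bit else 1 - p1)
    return bit, [a / norm if (i >> shift) & 1 == bit else 0.0
                 for i, a in enumerate(state)]

h = 1 / math.sqrt(2)
H = [[h, h], [h, -h]]
X = [[0.0, 1.0], [1.0, 0.0]]
Z = [[1.0, 0.0], [0.0, -1.0]]

random.seed(7)
alpha, beta = 0.6, 0.8            # the "message": 0.6|0> + 0.8|1> on QuBit 0
state = [0.0] * 8
state[0b000], state[0b100] = alpha, beta

state = apply_1q(state, H, 1)     # entangle QuBits 1 and 2 (the shared pair)
state = apply_cnot(state, 1, 2)
state = apply_cnot(state, 0, 1)   # Bell measurement on QuBits 0 and 1
state = apply_1q(state, H, 0)
m0, state = measure_qubit(state, 0)
m1, state = measure_qubit(state, 1)
# The two bits m0, m1 must travel classically to the distant side.
if m1:
    state = apply_1q(state, X, 2)
if m0:
    state = apply_1q(state, Z, 2)

# QuBit 2 now holds 0.6|0> + 0.8|1>: the state has been teleported.
p1 = sum(a * a for i, a in enumerate(state) if i & 1)
print(round(p1, 6))  # probability of |1> on QuBit 2 -> 0.64
```

Whichever pair of measurement outcomes occurs, the correction step always recovers the original amplitudes on the distant QuBit; that is the sense in which the state, not any particle, makes the trip.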
Step 5: The entangled model senses the environment (step 1, but on the distant side)
Leverage step 1 on the “entangled model” on the distant quantum computer and transmit the experience QuBits back. The “entangled model” is similar to the “original model”: it can perform classification, like the tree example in step 3, and transfer the result of its sensing to the “original model”, like the monochrome bitmap in step 4.
Sources: Same as Step 4
Inferences from step: The “entangled model” will perform the sensing action, like seeing and classifying objects (e.g. the tree classification cited in the sources of step 3), and this will be reflected in the “original model” due to entanglement. The entangled “original model” experiences what the “entangled model” senses (step 6).
Step 6: Locally, the original (but entangled) model feels the experiences… trigger the human brain… teleportation complete
The experiences transferred back to the “original model” can be used to trigger the brain as in step 1: convert the information into electrical signals and trigger the appropriate brain functions, as described in the details of step 1 (feeling the football kick).
Sources: Same as Step 1, plus some additions below
In a First, Pitt-UPMC Team Help Paralyzed Man Feel Again Through a Mind-Controlled Robotic Arm — University of Pittsburgh
Intracortical microstimulation of human somatosensory cortex — Science
Inferences from step: You experience what the “entangled model” sensed at a great distance without ever moving from your location. You have teleported.
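Putting the six steps together, the flow can be summarized as a hypothetical pipeline. Every function below is a placeholder standing in for an entire research area, not an existing API; the sketch only shows how the steps hand data to one another.

```python
# Every function here is a hypothetical placeholder, not an existing API.

def read_brain_signals():                 # Step 1: BCI readout (e.g. EEG)
    return "look-at-tree"

def encode_model(signal):                 # Step 2: deep-learning model of the sense
    return {"intent": signal}

def upload_to_quantum(model):             # Step 3: express the model in QuBits
    return {"qubits": model}

def entangle_and_separate(qmodel):        # Step 4: entangled copy at a distance
    local, remote = qmodel, dict(qmodel)
    return local, remote

def remote_sense(remote):                 # Step 5: the distant model senses
    remote["observation"] = "tree detected"
    return remote

def reflect_entanglement(local, remote):  # entanglement mirrors the result back
    local["observation"] = remote["observation"]
    return local

def stimulate_brain(local):               # Step 6: trigger the brain, as in step 1
    return "user feels: " + local["observation"]

signal = read_brain_signals()
local, remote = entangle_and_separate(upload_to_quantum(encode_model(signal)))
local = reflect_entanglement(local, remote_sense(remote))
experience = stimulate_brain(local)
print(experience)  # -> user feels: tree detected
```

Each placeholder corresponds to one of the capability gaps discussed next; the pipeline only works end to end when all of them mature together.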
Each of the above steps has been achieved, but to varying degrees. Teleportation is feasible, but we need to wait for these capabilities to converge to the same level so there is no bottleneck in the process. Single-domain to multi-domain AI, operations on hundreds of QuBits, less invasive brain-stimulation techniques, and quantum teleportation with higher accuracy under more relaxed environmental conditions are a few of the technical barriers that still need to fall.
Didn’t the coffee help you traverse this journey? “Teleportation will change how journeys happen… it’s only the destination, with no journey in between.”