Adventures of AI: For The Love of Machines, The Future And Everything In Between


In this issue of Adventures of AI, we will talk about artificial emotional intelligence, teaching robots the meaning of touch and turning ourselves into giant living batteries. Trust me. It’s not as crazy as it sounds. But first, we will talk about a brand new revolution taking place very quietly in the world of robotics. Soon our robots might not have any electronics at all and instead will function via a complex system of valves, hydraulics and air pressure. Here’s the entire story.

Robots That Don’t Need Electronics

Michael T. Tolley, a mechanical engineering professor at the University of California San Diego, has created a robot prototype built entirely from affordable pneumatic components: tubes and soft valves. The best part is that the entire system runs on pressurised air and requires no electricity to function.

The prototype resembles a four-legged creature, with each leg mounted at 45 degrees. The robot also carries non-electronic sensors: fluid-filled bubbles that inflate and deflate to signal an action.

Tolley and his team built on earlier work by other scientists on the same subject, adding the components needed to enable functions like walking. Their report was published in Science Robotics on 17 February 2021.

The electronics-free soft robot works in the following manner. A valve switch flips the direction of the limbs between clockwise and counter-clockwise, and each leg is driven by pneumatic cylindrical chambers that change the limb's position whenever they are pressurised.

In total, three pneumatic chambers control the four limbs, each placed at a 45-degree angle. Cyclically increasing and decreasing the pressure moves the limbs forward in sequence and enables walking.
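The timing logic of such a pneumatic oscillator can be illustrated with a toy software model. To be clear, this is a hypothetical sketch: the real robot does this entirely with soft valves and air, and the chamber-to-limb mapping below is invented for illustration.

```python
# Toy model of a three-chamber pneumatic oscillator driving four limbs.
# The real robot implements this with soft valves and air, not software;
# the chamber-to-limb mapping here is an invented illustration.

def chamber_states(step, n_chambers=3):
    """Return which chambers are pressurised at a given time step.

    The chambers are inflated one after another in a repeating cycle,
    like a ring oscillator made of air valves.
    """
    return [int(step % n_chambers == i) for i in range(n_chambers)]

def limb_positions(states):
    """Map chamber pressures to limb displacement (1 = swung forward).

    Each limb is driven by a combination of chambers, so limbs sharing
    a chamber move together as the pressure wave travels around.
    """
    a, b, c = states
    return {"front_left": a, "rear_right": a,
            "front_right": b, "rear_left": int(b or c)}

for t in range(6):
    s = chamber_states(t)
    print(t, s, limb_positions(s))
```

Stepping through the loop shows the pressure wave circulating through the three chambers, moving the limbs in a repeating gait.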

Researchers claim that the new findings could alter the anatomy of future robots. Their functional movement and locomotion could be taken care of by a pneumatic system; meanwhile, other complex tasks like pattern recognition would be reserved for a processing chip.

Turning Ourselves Into Giant Batteries

In our previous “Adventures of AI” session, we talked about how an artificial electronic skin can sense a change in touch, temperature and physical stress.

The same researcher has co-authored a brand new paper detailing how humans can be used to charge electronic devices. Don't worry, I am not describing the plot of The Matrix. The devices I am referring to are small fitness wristbands, smartwatches and the like.

Senior author of the paper, Jianliang Xiao, detailed how he and his team plan to harvest a human’s heat and convert it into electricity.

He detailed the use of tiny thermoelectric generators to convert body heat into a voltage that recharges batteries. The devices can range from rings and bracelets to any other accessory that sits in contact with your skin. According to Xiao, the main motivation behind building human-powered devices is eliminating the battery altogether.

Batteries cost a lot and take up a good chunk of space in wearable devices. Removing them opens up a whole new world of possibilities.

Xiao explained how his battery-free tech works by referencing the common activity of jogging. Whenever a human goes for a run, they generate heat, and the surrounding air eventually cools their body. The wearable captures some of this escaping heat and turns it into electricity for the device.
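To get a feel for the scale of such harvesting, here is a rough back-of-envelope estimate using the standard thermoelectric (Seebeck) relations. Every number below is an assumed, illustrative value, not a figure from Xiao's paper.

```python
# Back-of-envelope estimate of thermoelectric harvesting from skin.
# All numbers are illustrative assumptions, not values from Xiao's paper.

seebeck_v_per_k = 0.2e-3   # assumed Seebeck coefficient per couple, V/K
n_couples = 100            # assumed number of thermoelectric couples
delta_t = 3.0              # assumed skin-to-air temperature difference, K
internal_r = 50.0          # assumed internal resistance of the module, ohms

open_circuit_v = seebeck_v_per_k * n_couples * delta_t
# Maximum power transfer: a matched load dissipates V^2 / (4R).
max_power_w = open_circuit_v ** 2 / (4 * internal_r)

print(f"open-circuit voltage: {open_circuit_v * 1000:.1f} mV")
print(f"peak harvested power: {max_power_w * 1e6:.1f} microwatts")
```

With these assumed numbers the module yields tens of microwatts, which hints at why low-power wearables like fitness bands, rather than phones, are the natural target.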

Scalability of power is not an issue either: Xiao claims he can add additional tiny generators to boost power generation. What's really interesting is that these devices are akin to biological tissue, as they can heal themselves when broken. One merely has to stitch the pieces together, and the base material will regenerate.

Such a battery-less watch with regenerative capabilities could appear on the market within 10 years, according to Xiao.

Artificial Emotional Intelligence

As 5G nears mainstream adoption, scientists are working together to build a hyper-connected world capable of interacting with humans. A major part of this new world would be understanding human emotions: guiding people through rough moments and celebrating joyful ones.

The latest study from Prof. Hyunbum Kim of Incheon National University in Korea details the many ways in which 5G technology can enable machines to understand human emotions and respond accordingly. The team's interconnected AI-based system for recognising human emotions is called 5G-I-VEmoSYS.

The system uses bodily movements to understand what a human is going through. A major application of such a system is in driver-assist platforms: the AI can sense stress, anger, sadness or any other emotion in the driver and respond accordingly. If the driver is so overwhelmed with anger that they could crash the car, the system would respond by calling emergency services for help.

The emotion data is processed by a sub-system called Artificial Intelligence Virtual Emotion Flow, or AI-VEmoFlow. It processes each detected emotion and builds a virtual map that can be used for threat and crime detection.
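The idea of a virtual emotion map can be sketched in a few lines of code. This is a toy illustration in the spirit of the description above; the grid size, emotion labels and threat rule are all invented, not taken from the 5G-I-VEmoSYS design.

```python
# Toy sketch of a "virtual emotion map": aggregate located emotion
# reports into grid cells and flag clusters of negative emotions.
# The grid size, labels and threat rule are invented for illustration.

from collections import defaultdict

def build_emotion_map(reports):
    """Aggregate (x, y, emotion) reports into counts per 10x10 grid cell."""
    grid = defaultdict(lambda: defaultdict(int))
    for x, y, emotion in reports:
        grid[(x // 10, y // 10)][emotion] += 1
    return grid

def threat_cells(grid, threshold=3):
    """Flag cells where several negative emotions cluster together."""
    negative = {"anger", "fear"}
    return [cell for cell, counts in grid.items()
            if sum(counts[e] for e in negative) >= threshold]

reports = [(12, 14, "anger"), (13, 15, "fear"), (11, 18, "anger"),
           (55, 60, "joy")]
grid = build_emotion_map(reports)
print(threat_cells(grid))  # the cluster of negative reports is flagged
```

The point of the sketch is the aggregation step: individual emotion readings only become useful for threat detection once they are pooled over space.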

According to Prof. Kim, the latest study is merely a prototype, and more research will be needed to predict human emotions accurately. “Only then will it enable people to have safer and more convenient lives in the advanced smart cities of the future,” Kim concluded.

Robots To Respond To Human Touch

An upcoming AI technology dubbed ShadowSense will teach robots to identify human touch by looking at individuals' shadows. The study is spearheaded by lead author Yuan Hu, a doctoral student, and senior author Guy Hoffman, associate professor in the Sibley School of Mechanical and Aerospace Engineering at Cornell University.

ShadowSense was initially developed to help the public during an evacuation. It differs from conventional technology in that it observes only a person's shadow rather than their face. ShadowSense could allow a robot to safely lead passengers down a hallway filled with smoke and dust and escort them to safety.

According to Hu, “By placing a camera inside the robot, we can infer how the person is touching it and what the person’s intent is just by looking at the shadow images.”

The robot can presently detect six types of touch: hugging, pointing, palm touch, touching with two hands, punching and not touching at all.
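How a classifier might sort shadows into those six classes can be sketched with a toy rule-based stand-in. The actual ShadowSense system trains a neural network on real shadow images captured by the internal camera; the features and thresholds below are entirely invented for illustration.

```python
# Toy stand-in for a shadow-based touch classifier.
# ShadowSense itself trains a neural network on real shadow images;
# the features and thresholds below are invented for illustration.

TOUCH_CLASSES = ["no touch", "pointing", "palm touch",
                 "two hands", "hugging", "punching"]

def classify_shadow(area_frac, n_blobs, motion_speed):
    """Classify a touch from crude shadow features.

    area_frac:    fraction of the image covered by shadow (0..1)
    n_blobs:      number of separate shadow regions
    motion_speed: how fast the shadow grew, in area fraction per second
    """
    if area_frac < 0.05:
        return "no touch"
    if motion_speed > 0.5:
        return "punching"          # fast, sudden contact
    if n_blobs >= 2:
        return "two hands"
    if area_frac > 0.6:
        return "hugging"           # shadow covers most of the surface
    if area_frac < 0.15:
        return "pointing"          # small fingertip-sized shadow
    return "palm touch"

print(classify_shadow(0.10, 1, 0.1))  # small single blob -> pointing
print(classify_shadow(0.70, 1, 0.1))  # large slow shadow -> hugging
```

Even this crude sketch shows why the approach is privacy-friendly: the decision depends only on the shape and motion of a silhouette, never on an image of the person.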

Additionally, the robots can be trained to respond to different touches as well. These robots offer a “privacy-friendly” means of understanding human emotions, as they don't require high-definition pictures of individuals to do their processing.

The robot world continues to become more human as we speak. In a hyper-connected world, the need to understand human emotions is often left behind. However, thanks to our ever-evolving knowledge of technology, none of us will have to end up alone, at least not in the traditional sense.

Do catch us in our next issue, where Adventures of AI will explore another set of new and exciting stories.

Assistant Editor at Exhibit Magazine. A tech and auto journalist who likes to reverse engineer anything he can get his hands on. He writes about everything technical under the sun, ranging from smartphones and laptops to micro-controllers in Tesla batteries.
