Emotional intelligence – sounds strange in the context of robotics. However, I believe it is a crucial topic for developers who want their robots to be integrated into society. Many think of robots as bland tools meant to fulfill one task: a robotic lawn mower, a robot arm in an industrial setting, or even an autonomous car – all things intended to do their one thing, and to do it well. While I think these inventions have their place and bring value to this world, my intuition is that robots could be so much more.
The fascinating thing about machines is that they are not bound by embryogenesis – the process by which biological organisms are built from a single cell that divides until the complete organism emerges. Nor are they bound by a direct evolutionary path, like how apes can only evolve into ape-like creatures within limited timespans (see the cool Tree of Life Explorer for a nice visualization). This means a far broader space of possible configurations exists for synthetic creations than for the ones we currently call “living.”
However, the machines we build currently lack many seemingly basic features that animals possess. Even taking dogs as an example, they exhibit more complex social and emotional behavior than any machine on the market. While I don’t believe we can currently recreate a dog’s level of intelligence in this regard, I strongly think we should at least try to better incorporate the emotional aspect into our agents.
In this post, I would like to share my thoughts on why emotional intelligence is crucial if robotics wants to take its next step toward a solar-punk future. I will outline the hurdles and pitfalls of such implementations and then give explicit steps on how to add basic emotional awareness to the Roboost project. Remember, this post is strongly influenced by the excellent book Emotionally Intelligent Design by Pamela Pavliscak, which I think is a must-read for anyone in the robotics or design field.
A Crazy But Important Tangent:
We have something precious on this little planet of ours. As far as we know, there are no living organisms anywhere else in the universe; from what we have observed, we sit at an extreme peak of complexity within this vast cosmos. Assembly theory describes this as a long temporal chain of events that needed to happen to get to “us.” So far, life may have happened only once. That gives us the responsibility to preserve this life and complexity.
While it is crucial to take care of the existing lifeforms on this planet, machines might add another completely different sort of life to the table. Maybe one could even say we should turn the evolutionary tree into a forest. I would hope for my kids to live in a world where machines are not only tools but a new form of living entity in symbiosis with humans.
To come back from this tangent, while we are far from understanding or even implementing artificial consciousness, I think the tools are there to dip our toes into exploring how it would look if our robots were (nicely designed) conscious agents. Through the use of emotional intelligence, we could start mimicking consciousness in a more meaningful way. Once there is a better understanding and definition of consciousness, this mimicking might turn into actual artificial consciousness – perhaps even life.
Why Should We Care?
In today’s fast-paced world, the integration of emotionally intelligent robots into our daily lives might seem like a futuristic fantasy, but it’s a vision worth pursuing. The potential benefits extend far beyond mere convenience; these robots could transform our relationship with technology, making it more personal, supportive, and enriching.
To better communicate this vision, I asked ChatGPT to generate a suitable short story for me. After some tweaks and adjustments, this is the result:
Imagine a vibrant solar-punk city where technology and nature coexist harmoniously. The city is filled with lush greenery, powered by renewable energy, and bustling with life. Amidst this urban utopia, we follow Alex, a young professional, as they go about their day, accompanied by their trusted robotic companion, Robi.
As the sun rises, Alex wakes up to the gentle humming of solar panels outside their window. Robi waits patiently by the door, its sleek, agile frame reflecting the morning light. Its sensor-filled dome scans the room, and expressive LED lights pulse softly, reflecting a calm and welcoming blue.
“Good morning, Robi,” Alex says, stretching. Robi’s LEDs brighten, and it emits a cheerful sound, mimicking a chirping bird.
Alex heads to the kitchen, and Robi follows, its movements fluid and purposeful. While Alex prepares breakfast, Robi assists by carrying items to the table with its small, claw-like appendage, carefully navigating the space with curiosity-driven exploration. Its LEDs flash green, indicating its eagerness to help.
After breakfast, Alex and Robi head out for a morning jog. The city’s paths are lined with trees and flowers, and other residents greet them warmly. Robi keeps pace with Alex, adjusting its speed and style to match their rhythm. When they encounter a neighbor, Robi’s LEDs change to a friendly yellow, and it emits a soft chime, acknowledging the person.
During their run, Alex trips over a tree root. Robi immediately switches to a concerned red, rushing to Alex’s side. It gently nudges them, emitting soothing sounds. Its sensors prepare to inform emergency services, but Alex quickly reassures Robi that they’re okay. “I’m alright, Robi,” Alex says with a smile, and Robi’s LEDs gradually return to a calming blue, relieved by Alex’s assurance.
Later, at work, Robi carries Alex’s heavy bags and essentials, making the commute effortless. While Alex’s day is filled with meetings and deadlines, Robi waits outside the office, its LEDs indicating a calm standby mode. When Alex emerges, tired but satisfied, Robi’s LEDs turn a celebratory green, and it performs a small dance, sharing in Alex’s success.
In the evening, Alex and Robi head to the community garden. Robi helps carry tools and water plants, its LEDs reflecting its enjoyment of the task. As the sun sets, Alex sits on a bench, and Robi nestles beside them, its LEDs displaying a warm, golden glow.
“Thank you, Robi,” Alex says, patting the robot’s dome. Robi’s LEDs shift to a soft pink, and it emits a gentle, contented hum.
In this imagined world, Robi isn’t just a tool but a companion, sharing in Alex’s daily life, joys, and challenges. This bond highlights the profound potential of emotionally intelligent robots. They can make our lives more connected, supportive, and enriched, transforming our relationship with technology into one of mutual understanding and care. Note that the goal is not to replace human connection, but to rethink and expand our possible relationship with technology.
Incorporating emotional intelligence into robots can bring about amazing advantages:
- Enhanced Interaction: Robots like Robi can provide companionship, assistance, and emotional support, making technology more user-friendly and engaging.
- Increased Safety: With the ability to understand and respond to human emotions (and other human states, such as fatigue or distress), robots can offer timely assistance and improve safety in critical situations.
- Improved Well-being: Emotional intelligence in robots might help reduce stress and loneliness, offering a sense of companionship and understanding.
- Personalized Experiences: Robots that learn and adapt to their owners’ preferences and behaviours can provide highly personalized interactions. Or why stop there? Imagine a robot that has learned from your ancestors. I think it would be amazing to have this “undying” bridge between generations – a robot specific to your family.
What Are The Challenges?
Designing autonomous robots with emotional intelligence is a big task, with several challenges. These challenges can be grouped into technical, regulatory, and social aspects.
- Technical Challenges:
- Designing from Scratch: Building an emotionally intelligent robot from the ground up is a massive undertaking. Sure, we have existing hardware platforms like Boston Dynamics’ Spot, but adding the necessary “brain power” for emotional intelligence is a whole new ball game. For example, teaching Spot to be friendly, curious, caring, and happy needs advanced algorithms and a ton of data.
- Safety and Autonomy: Ensuring robots are safe in public spaces is crucial. Industrial robots are powerful and operate in controlled environments. Bringing this power to public spaces involves stringent safety measures. The robot must navigate independently, learn behaviours, and interact safely with people and objects.
- Engaging Behavior: Making a robot’s behaviour interesting and engaging is another challenge. This includes autonomous exploration, behaviour learning, and person-specific interactions—basically modelling relationships with humans, animals, and objects. The good news is that there are already existing algorithms that fulfil some of these requirements, but combining them into a seamless package is the trick.
- Creative Input: Character designers and animation artists, who create lifelike and emotionally engaging characters, can offer valuable insights for designing emotionally intelligent robots. Their expertise can make robots more relatable and engaging.
- Regulatory Challenges:
- Public Safety and Usefulness – The Trilemma: Creating a robot that is useful, autonomous, and safe in public spaces is tough. Powerful robots need to perform useful tasks like carrying items, but such power usually limits them to industrial settings due to safety concerns. Meeting safety standards and getting approvals is essential. It’s the classic trilemma: useful, autonomous, and safe—pick two.
- Ethical Issues: Adding emotional intelligence to robots raises ethical questions. How should robots act in sensitive situations? What data should they collect, and how should it be used? Ensuring ethical design and operation is key to gaining public trust.
- Social Challenges
- Public Perception: Changing how people view robots, from simple tools to emotional companions, is a major challenge. People need to trust and feel comfortable around these machines. Demonstrating their usefulness, safety, and emotional capabilities through real-world examples is important.
- Cost and Accessibility: Developing emotionally intelligent robots is expensive. Making these robots affordable and accessible to everyone is crucial for widespread use. This includes lowering production costs, improving battery life, and ensuring they can work in different environments.
- Hardware is HARD: Hardware startups are always difficult. There’s a reason it’s called HARDware. Developing robots involves numerous iterations, high costs, and complex logistics. It’s a tough field, but the potential rewards are huge.
What Would Incorporating Emotionally Intelligent Design Look Like In The Roboost Project?
To establish an emotional bond with a robot, several key points need to be tackled:
- Two-way Communication:
- Sound, touch, movement, and gesture are essential channels for this communication. Through these, the robot needs to display behaviors such as conciliation (an interest in general pro-social engagement with humans), curiosity (an interest in specific human actions), caring (wanting humans to feel good), and celebration (sharing and reflecting human emotions to a certain degree).
- Behavioural Display:
- Robots like Spot show that zoomorphizing the design can create an immediate connection, but more than mechanical design is needed for lasting relationships. As Pamela Pavliscak notes, “The cult of convenience fails to acknowledge that difficulty is core to human experience,” suggesting that robots should sometimes exhibit a degree of autonomy and curiosity to develop character and depth in interactions.
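To make the display channels above concrete, here is a minimal sketch of how emotional states could be mapped to LED colors and sound cues, mirroring how Robi communicates in the story. The `EmotionalState` enum, the specific RGB values, and the sound-cue names are all hypothetical placeholders, not part of any existing Roboost API:

```python
from enum import Enum

class EmotionalState(Enum):
    """Core states the robot can display, as described above."""
    CALM = "calm"
    CURIOUS = "curious"
    CARING = "caring"
    CELEBRATORY = "celebratory"
    CONCERNED = "concerned"

# Hypothetical mapping from state to an RGB LED color and a sound cue.
DISPLAY_MAP = {
    EmotionalState.CALM:        ((0, 80, 255),   "soft_hum"),
    EmotionalState.CURIOUS:     ((0, 255, 80),   "chirp"),
    EmotionalState.CARING:      ((255, 150, 180), "gentle_tone"),
    EmotionalState.CELEBRATORY: ((80, 255, 0),   "fanfare"),
    EmotionalState.CONCERNED:   ((255, 0, 0),    "alert_tone"),
}

def express(state: EmotionalState) -> tuple:
    """Return the (color, sound) pair the robot should output for a state."""
    return DISPLAY_MAP[state]
```

On real hardware, `express` would feed an LED driver and a speaker node rather than return a tuple, but the idea of a single, consistent state-to-display mapping carries over.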
Learning Behaviors
- Reinforcement Learning:
- Implementing reinforcement learning (RL) can enable Roboost to learn behaviours over time. The NVIDIA Isaac Lab is an excellent platform for training RL models that can be integrated with ROS2. Isaac Lab offers a simulation environment where robots can be trained to exhibit desired behaviours through trial and error.
- Behavior Trees:
- The learned behaviours can be structured using behaviour trees. The BehaviorTree.ROS2 library is compatible with ROS2 and provides a way to organize and manage complex robot behaviours that can adapt and respond to various stimuli.
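To illustrate the trial-and-error idea behind reinforcement learning without the full Isaac Lab stack, here is a toy sketch of tabular, bandit-style learning: the robot learns which greeting style each (entirely made-up) user responds to best. The states, actions, and reward model are invented for the example and have nothing to do with Isaac Lab's actual training pipeline:

```python
import random

STATES = ["alex", "neighbor"]
ACTIONS = ["chirp", "chime", "dance"]

# Hidden reward model used only to simulate feedback (unknown to the learner).
TRUE_REWARD = {
    ("alex", "chirp"): 1.0, ("alex", "chime"): 0.2, ("alex", "dance"): 0.5,
    ("neighbor", "chirp"): 0.1, ("neighbor", "chime"): 1.0, ("neighbor", "dance"): 0.3,
}

def train(episodes=2000, alpha=0.1, epsilon=0.2, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = rng.choice(STATES)
        # Epsilon-greedy: mostly exploit the best known action, sometimes explore.
        if rng.random() < epsilon:
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: q[(s, act)])
        r = TRUE_REWARD[(s, a)] + rng.gauss(0, 0.05)  # noisy feedback
        q[(s, a)] += alpha * (r - q[(s, a)])  # incremental value update
    return q

q = train()
```

After enough episodes, the learned values approach the hidden rewards, so the robot "discovers" the preferred greeting per user. A real Isaac Lab setup replaces this table with a neural policy and the reward dictionary with physics-simulated interactions, but the learning loop is conceptually the same.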
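To show what a behavior tree actually does, here is a minimal hand-rolled sketch in Python. The real implementation would use BehaviorTree.ROS2, and real trees also have a RUNNING status for long-lived actions, which this sketch omits; the example tree (comfort a distressed user, otherwise explore) is hypothetical:

```python
SUCCESS, FAILURE = "success", "failure"

class Sequence:
    """Ticks children in order; fails on the first failing child."""
    def __init__(self, *children): self.children = children
    def tick(self, ctx):
        for child in self.children:
            if child.tick(ctx) == FAILURE:
                return FAILURE
        return SUCCESS

class Fallback:
    """Ticks children in order; succeeds on the first succeeding child."""
    def __init__(self, *children): self.children = children
    def tick(self, ctx):
        for child in self.children:
            if child.tick(ctx) == SUCCESS:
                return SUCCESS
        return FAILURE

class Condition:
    def __init__(self, fn): self.fn = fn
    def tick(self, ctx): return SUCCESS if self.fn(ctx) else FAILURE

class Action:
    def __init__(self, fn): self.fn = fn
    def tick(self, ctx): self.fn(ctx); return SUCCESS

# Hypothetical tree: if the user seems distressed, comfort them;
# otherwise fall back to curious exploration.
tree = Fallback(
    Sequence(
        Condition(lambda ctx: ctx["user_emotion"] == "distressed"),
        Action(lambda ctx: ctx.setdefault("log", []).append("comfort")),
    ),
    Action(lambda ctx: ctx.setdefault("log", []).append("explore")),
)

ctx = {"user_emotion": "distressed"}
tree.tick(ctx)  # ctx["log"] == ["comfort"]
```

The appeal for our purposes is that learned or hand-authored behaviors become small leaf nodes, and the emotional "character" of the robot emerges from how the composites prioritize them.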
Emotional Expression and Interaction
- Emotional Detection:
- For emotional detection and expression, the OpenVINO Toolkit can be utilized. OpenVINO provides robust tools for emotion recognition, allowing Roboost to detect human emotions and respond appropriately. This can enhance the robot’s ability to engage in meaningful interactions by recognizing and adapting to the emotional states of its users.
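On top of whatever model does the recognition, there still needs to be a small policy layer that turns detected emotions into robot responses. Here is a sketch of that layer with the detector mocked out as a probability vector; the emotion classes, the response policy, and the confidence threshold are all assumptions for illustration:

```python
# The detector (e.g. an emotion-recognition model behind OpenVINO or
# TensorRT) would normally produce these class probabilities from a
# camera frame; here we just take them as input.
EMOTIONS = ["neutral", "happy", "sad", "surprised", "angry"]

# Hypothetical policy: which display state the robot adopts per detected emotion.
RESPONSE_POLICY = {
    "neutral": "calm", "happy": "celebratory", "sad": "caring",
    "surprised": "curious", "angry": "conciliatory",
}

def respond(probabilities, threshold=0.5):
    """Pick a response only if the detector is confident enough."""
    best = max(range(len(EMOTIONS)), key=lambda i: probabilities[i])
    if probabilities[best] < threshold:
        return "calm"  # default when the detection is uncertain
    return RESPONSE_POLICY[EMOTIONS[best]]
```

Defaulting to a calm state under low confidence matters here: a robot that misreads emotions and reacts strongly would undermine exactly the trust we are trying to build.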
So far, I’ve played with most of these tools, but not in depth. I hope to follow up soon with posts that dive deeper into the specific implementations and integrations.
Further Resources
I have summarized all my notes on the book in a PDF which you can download here. Also, I really recommend looking into the following videos:
That’s it, goodbye and thanks for the fish!
Nevermind. As the OpenVINO Toolkit is optimized only for Intel hardware, I will opt for TensorRT along with some of the open-source ONNX models instead!