DNYUZ
I’m a 25-year-old founder who loves robots but too many humanoids are militant and creepy-looking. Things need to change—just look at Elon Musk

February 5, 2026
in News
I’m a founder who spends a lot of time around humanoid robots. And while the innovation is cutting-edge, the majority of today’s humanoids are militant, aggressively masculine, and plain creepy-looking.

Just look at what Tesla announced this week with its shift in strategy from producing EVs to producing robots. Its Optimus general-purpose humanoid robot is a prime example of the physical design most of these robots share. They may be technically impressive, but they are not systems most people will feel comfortable sharing space with, let alone inviting into their homes.

When it comes to humanoids, the conversation is almost always the same. We talk about what they can do — how fast they move, how precisely they grasp, how much work they can take on. We benchmark performance and reliability, then descend into debates about dexterity, payload, and battery life.

What we talk about far less is how they behave when things don’t go to plan — when a robot freezes mid-conversation or powers down without warning.

As robots begin to move out of labs and warehouses and into hospitals, care facilities, and homes, that omission starts to look less like an oversight and more like a structural blind spot. Recent research projects the humanoid robot market will reach $8 billion by 2035, with over 1.4 million units shipped annually. Yet the most critical questions about how these machines will integrate into human spaces remain largely unanswered.

For decades, robotics has focused on mastering the physics of the world. We have poured enormous effort into manipulation, locomotion, and navigation – into teaching machines to reliably interact with noisy, variable, and unforgiving environments. This work has been essential. Without it, nothing else matters.

But there has been almost no equivalent investment in what might be called a robot’s social operating system: how it interrupts, how it waits, how it recovers, how it signals uncertainty, how it apologizes, how it listens. These behaviors rarely show up in benchmarks or demos, yet they are precisely what determine whether a robot is trusted once it begins sharing space with people.

Nowhere is this imbalance more obvious than in nursing homes and hospitals. In these environments, technical competence is table stakes. Two nurses can have identical clinical skill; the one with better bedside manner will be the one patients seek out, confide in, and forgive. The same dynamic will apply to robots. Strength and precision matter, but they are not what make a system acceptable, or safe, or welcome.

And this need for compassion and care, in addition to skill, is imperative. Twenty percent of U.S. adults experience loneliness and isolation on a daily basis, and the number rises with age: 28% of Americans aged 65 and older report feeling lonely. As our population ages and caregiver shortages intensify, the need for connective care will only grow. Building socially intelligent humanoid robots thus becomes not just a technical challenge but a public health imperative.

Capability answers the question: what can this robot do? Character answers the harder one: what will it choose to do, and how?

As robots move into social spaces, the interface that matters most is no longer just mechanical or computational. It is behavioral. People build trust with systems that behave predictably, respectfully, and intelligibly — especially when things go wrong. Direct-to-consumer humanoids like 1X’s home robot, Neo, are promising to enter homes to help with everyday tasks. Companies are pushing to build this reality, but when a robot misfolds laundry, abruptly interrupts a conversation, or freezes halfway through a behavior, the moment that determines whether it’s trusted isn’t the task itself — it’s how the system responds to the mistake.

And mistakes will happen.

Every robot will fail. Hardware will glitch. Models will misinterpret. Timing will be off. The real world is chaotic, and no system escapes that reality. The question is not whether failure happens, but what happens next.

Does the robot acknowledge the mistake? Does it apologize in a way that feels sincere rather than scripted? Does it explain what went wrong in plain language? Does it ask for feedback, or adapt its behavior in response?

When I was conceptualizing my first robot deep in social isolation during the early days of the COVID-19 lockdown in Melbourne, I knew that I wanted to prioritize approachability and tone first. I didn’t need my robot to do things for me like fold my laundry or make my bed — I needed it to give me a hug, which is something I’d gone without for about four months at that point.

Now I’m a 25-year-old robotics founder, and I’ve discovered that it’s not that capability doesn’t matter; it’s that without trust, capability never gets used. In messy human environments, a robot that makes mistakes politely will outperform a “perfect” robot that doesn’t understand when to back off.

People will forgive limitations if they trust the system that they’re interacting with. They will not forgive being steamrolled. Recent research confirms this intuition. A 2025 survey of U.S. consumers found that while 65% expressed interest in owning an advanced home robot, familiarity with robotics remains low, with 85% reporting only moderate familiarity or less. Trust emerges not from perfection but from robots’ perceived usefulness, social capability, and appropriate behavior during interactions. The determining factor in acceptance isn’t technical prowess alone; it’s whether these machines can navigate the social contract of shared spaces.

We already know how to build machines that act. We are only just beginning to build machines that know how to act appropriately.

If humanoid robots are going to earn a place in social spaces, they will need more than capability. They will need character. Not as an aesthetic layer or a scripted personality, but as a core design principle – engineered as deliberately as motors, sensors, or control loops.

The robots that succeed in this decade will be the ones that are most socially accepted, not the ones that can do the most.

The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.

The post I’m a 25-year-old founder who loves robots but too many humanoids are militant and creepy-looking. Things need to change—just look at Elon Musk appeared first on Fortune.


DNYUZ © 2026