What we’re watching next in humanoids
By Sophia Chen
Humanoids stole the Tokyo stage at TechCrunch’s SusHi Tech 2026 showcase.
TechCrunch is using its Tokyo visit to stage a wider discussion about robotics alongside AI, resilience, and entertainment. The event promises live demos of humanoid robots, with sessions on autonomous driving software, cyber defense, and the social impact of AI on music and anime. Demonstration footage from prior labs has often highlighted capability bursts (quick grabs, a lifelike gait, flashes of perception) but rarely a plain statement of reliability. The technical hype is real; the question is whether the demos translate into real-world usefulness.
The organizers' framing makes clear that the demonstrations will run in a controlled environment rather than as field deployments. The absence of disclosed specifications (degrees of freedom, payload capacity, battery chemistry, run time, and charging cycles) repeats a familiar pattern: show-stopping visuals at a conference, with the gritty hardware data left for later. In practice, that means the public takeaway will be about potential rather than guaranteed performance. Progress in humanoid robotics remains incremental and highly context-dependent; capability in a demo arena does not automatically map to factory floors, hospital wards, or disaster zones.
Compared with previous-generation demos, SusHi Tech 2026 leans into cross-domain integration: not just motion, but perception, autonomy, and social relevance working together. The four tech domains (AI, Robotics, Resilience, and Entertainment) trace a broader narrative arc: robots that can collaborate with software ecosystems, endure edge-case scenarios, and resonate culturally. That is a subtle but meaningful shift from "look, it can walk" to "it can reason, adapt, and fit into human-driven workflows." The inevitable tension remains: until hardware specs and performance benchmarks are published, the improvements are best described as qualitative gains (better integration, smoother interfaces, more coherent autonomy) rather than a quantified leap in capability.
For researchers and investors, a few critical takeaways emerge. First, the absence of disclosed DOF, payloads, and endurance data makes it hard to judge real utility. Second, the emphasis on demos in a tech-forward event is a classic “demo reel vs. reality” moment: compelling to watch, less compelling when you’re sizing a deployment. Third, if this year’s path holds, expect more enterprise-oriented narratives: robots that can slot into existing software stacks, share data responsibly, and operate under explicit safety constraints. That progress matters, even if the first public wins are still measured in seconds of deft manipulation rather than hours of nonstop operation.
What to watch next, then, is concrete spec sheets, independent lab validation, and a clearer story on power and endurance. The industry has learned that hype deflates quickly once you ask for repeatable, field-ready performance data.