WEDNESDAY, FEBRUARY 25, 2026
Humanoids · 3 min read

AI² Robotics Raises $145M for AlphaBot

By Sophia Chen

AI2 Robotics raises Series B funding to advance AlphaBot, embodied AI


AI² Robotics just raised CN¥1.2 billion ($144.7 million) to turn AlphaBot into a universal robot. The company disclosed a Series B round aimed at accelerating its embodied AI platform and moving AlphaBot toward broader, general-purpose use.

Based in Shenzhen and founded in 2023 by Dr. Yangdong Eric Guo, AI² Robotics positions AlphaBot as more than a toy: a combined hardware-and-AI stack designed for real-time interaction and task reasoning. The company's distinctive angle centers on what it calls GOVLA (Global and Omni-body Vision-Language-Action): a single model and software layer intended to coordinate perception, language understanding, and action across the robot's entire body. The project rests on a "data closed-loop + scenario compounding" approach, which the startup argues yields faster generalization across tasks and environments than traditional, piecemeal learning pipelines.
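GOVLA's internals have not been published, but the unified vision-language-action idea can be illustrated in the abstract. The sketch below is purely hypothetical (the class and field names are invented for illustration, not taken from AI² Robotics): one policy object maps a whole-body observation plus a language instruction directly to a whole-body action, instead of routing perception, language, and control through separate subsystems.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    """Multi-camera frames plus whole-body proprioception (hypothetical schema)."""
    images: List[bytes]           # raw frames from head/wrist/torso cameras
    joint_positions: List[float]  # current joint state across the whole body

@dataclass
class WholeBodyAction:
    """One action chunk covering arms, hands, torso, and base together."""
    joint_targets: List[float]
    gripper_commands: List[float]

class UnifiedVLAPolicy:
    """A single policy maps (observation, instruction) -> whole-body action.
    A real VLA model would run a vision-language backbone here; this stub
    returns a hold-position action so the interface is runnable."""
    def act(self, obs: Observation, instruction: str) -> WholeBodyAction:
        # Placeholder inference: hold the current pose, keep gripper open.
        return WholeBodyAction(
            joint_targets=list(obs.joint_positions),
            gripper_commands=[0.0],
        )
```

The contrast with older modular stacks is in the signature: there is no separate perception output or symbolic plan handed between modules, only one learned mapping from raw observation and text to motor targets.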

The funding coincides with claims that AlphaBot is already deployed in real-world settings, including retail and public-service applications. That would place the project above pure lab demos, though specifics about scale, task complexity, and reliability remain sparse in public disclosures. The company frames AlphaBot 2 as the vehicle for its broader ambition: a general-purpose humanoid that can be affordable, serviceable, and broadly useful—akin to how smartphones and smart cars refocused consumer tech markets.

Crucially, the company has not published AlphaBot's mechanical specifics in the public briefings accompanying the funding announcement. Degree-of-freedom (DOF) counts and payload capacity, the basic metrics that tell you how many independent joints the robot can articulate and how much weight its arms or grippers can lift, are not disclosed in the cited materials. Without those numbers, assessing manipulation capability, object-handling robustness, or human-robot interaction constraints remains speculative. Likewise, power source, runtime, and charging requirements are not public, leaving questions about daily operating cycles, recharging logistics, and field maintenance unanswered. This lack of published hardware specs is a notable gap for engineering teams evaluating the platform for production use.

From a tech-readiness perspective, the trajectory described by AI² Robotics—funding to push beyond “embodied AI” research into deployable, general-purpose capabilities—suggests a move past purely demonstrator scale. The claim of existing deployments in retail and public service hints at a TRL somewhere between controlled-environment trials and limited field pilots, rather than a fully market-ready commercial product with broad fulfillment capabilities. In other words: solid prototypes exist, but mass adoption hinges on dramatic improvements in reliability, safety, and serviceability under diverse tasks and environments.

Compared to AlphaBot's prior generation, the emphasis on GOVLA and the data closed-loop approach signals a software-centric upgrade path. If AlphaBot 2 indeed leverages a unified vision-language-action stack across the whole body, it could offer more fluid task planning and multi-modal interaction than earlier humanoids that treated perception, reasoning, and motor control as siloed subsystems. The claim of "full-space understanding" and a scalable platform for general-purpose use is compelling, but the proof will be in repeatable field performance, not in a glossy demo reel.

Power and runtime remain critical unknowns. For service roles in retail or public settings, you'd expect multi-hour operation on a single charge, with rapid battery-swap options and safe charging between shifts. Without disclosed numbers, teams evaluating AlphaBot must plan for contingencies: slower task throughput in busy environments, more frequent maintenance on mechanical joints, and stricter safety measures around human-robot interaction. The interplay between AI capabilities and hardware durability, not just the elegance of a model architecture like GOVLA, will be the deciding factor for field deployment.
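Since none of AlphaBot's power figures are public, any runtime planning has to be back-of-envelope. The sketch below shows the arithmetic with entirely hypothetical numbers (the 1,000 Wh pack, 300 W average draw, and 80% usable-capacity fraction are illustrative assumptions, not disclosed specs):

```python
def estimated_runtime_hours(battery_wh: float, avg_draw_w: float,
                            usable_fraction: float = 0.8) -> float:
    """Back-of-envelope runtime: usable stored energy divided by average draw.

    usable_fraction discounts capacity reserved to protect battery health;
    all inputs here are hypothetical placeholders, not AlphaBot specs.
    """
    return (battery_wh * usable_fraction) / avg_draw_w

# Hypothetical example: a 1,000 Wh pack at a 300 W average draw
# yields roughly 2.7 hours between charges.
print(round(estimated_runtime_hours(1000, 300), 1))  # 2.7
```

Even this crude estimate shows why disclosed battery and draw figures matter: at service-robot power levels, a shift-length deployment likely requires mid-shift charging or pack swaps.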

Practitioner insights

  • The GOVLA stack promises broad coordination across perception, language, and action, but hardware integration quality will determine whether this translates to reliable task execution in cluttered spaces.
  • The data closed-loop approach can accelerate learning, but success hinges on data quality, scenario diversity, and safeguards against biased or brittle policies in real-world tasks.
  • Without published DOF/payload and power specs, end-user teams should request detailed hardware matrices and runtime plans before integrating AlphaBot into production workflows.
  • The yuan-denominated Series B aligns with a broader industry trend: investors backing embodied-AI platforms that pair software intelligence with manufacturable hardware, rather than demo-only or gimmick robotics.
  • If AlphaBot can demonstrate durable performance with clear hardware specs, the funding round could foreshadow a new generation of practical humanoids. Until then, it’s a credible bet with significant caveats—the classic demo-versus-reality tension that haunts every humanoid pitch deck.
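The "request detailed hardware matrices" point above can be made concrete. A minimal sketch of such a due-diligence checklist follows; the field names are invented examples of specs a team might demand, not a list AI² Robotics has published:

```python
from dataclasses import dataclass, fields
from typing import List, Optional

@dataclass
class HardwareMatrix:
    """Specs an evaluation team might require before a pilot; None marks
    a figure the vendor has not yet disclosed. Field names are illustrative."""
    dof_count: Optional[int] = None
    payload_kg: Optional[float] = None
    battery_wh: Optional[float] = None
    runtime_hours: Optional[float] = None
    charge_time_hours: Optional[float] = None

def undisclosed(spec: HardwareMatrix) -> List[str]:
    """Return every spec still missing from the vendor's disclosure."""
    return [f.name for f in fields(spec) if getattr(spec, f.name) is None]

# With nothing published, every field comes back flagged:
print(undisclosed(HardwareMatrix()))
```

Gating integration work on an empty `undisclosed()` list is one simple way to keep a pilot from starting on marketing claims alone.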

Sources

  • AI2 Robotics raises Series B funding to advance AlphaBot, embodied AI
