Figure AI CEO Brett Adcock Teases 'Hark': A 'Bespoke Natural Language' Interface for AI
Brett Adcock, the CEO of humanoid robotics company Figure, has offered a first glimpse of a new project called Hark. The preview came via a social media post, which described Hark as "a new interface to artificial intelligence" and "a series of bespoke natural language…"
The post is a teaser, cutting off mid-sentence, and provides no concrete technical specifications, release timeline, or detailed use cases. The core claim is that Hark represents a novel interface layer built around natural language.
What Happened
Adcock's post consists of a brief text statement and a short, cryptic video clip. The video shows a dark, minimalist interface with a waveform visualization and the word "Hark" displayed. The description frames it as a new paradigm for interacting with AI systems, emphasizing customization ("bespoke") and natural language as the primary modality.
Context
Figure AI is primarily known for developing the Figure 01 general-purpose humanoid robot, designed to perform tasks in logistics, manufacturing, and retail. The company has positioned itself as a frontrunner in embodied AI, securing a landmark $675 million funding round in February 2024 led by Microsoft, OpenAI, NVIDIA, and Jeff Bezos' Bezos Expeditions. A key part of Figure's strategy is its partnership with OpenAI to develop next-generation AI models for humanoid robots.
The tease of Hark suggests Figure is investing not only in the physical robot and its core AI models but also in the human-machine interface (HMI) that will allow users to command and collaborate with these systems. A robust, intuitive natural language interface is critical for the adoption of complex robots in real-world environments where traditional programming or joystick control is impractical.
gentic.news Analysis
This teaser, while light on details, is a strategic signal in the rapidly evolving humanoid robotics space. Figure's massive February funding round, which we covered in detail, was a clear inflection point, valuing the company at $2.6 billion and aligning it closely with OpenAI's AI capabilities. The announcement of Hark follows that momentum and indicates Figure is building a full-stack solution: hardware (Figure 01), AI brain (via OpenAI), and now a dedicated interface layer.
The focus on "bespoke natural language" is particularly noteworthy. It implies the interface is not a generic chatbot but is likely being specifically engineered for task-oriented dialogue in operational contexts—think a warehouse manager saying, "Hark, have Figure 01 unload the pallet from bay three and inspect for damaged boxes." This aligns with the industry trend toward multimodal foundation models that translate high-level language commands into sequences of actionable steps for a robot, a domain where companies like Covariant and Google DeepMind (with its RT-2 model) are also active.
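To make the idea concrete: nothing about Hark's architecture is public, but a language-to-task layer of the kind described above typically maps a free-form command to a structured sequence of robot primitives. The sketch below is purely illustrative—`RobotTask`, `plan_from_command`, and the action names are hypothetical stand-ins, with a toy rule-based parser where a production system would use an LLM-backed planner.

```python
import re
from dataclasses import dataclass, field

# Hypothetical sketch: Hark's actual design is not public.
# A language-to-task layer turns a free-form command into a
# structured sequence of primitives the robot can execute.

@dataclass
class RobotTask:
    action: str                       # primitive action name
    params: dict = field(default_factory=dict)

def plan_from_command(command: str) -> list[RobotTask]:
    """Toy rule-based planner standing in for an LLM-backed one."""
    text = command.lower()
    tasks: list[RobotTask] = []
    if "unload" in text and "pallet" in text:
        # Extract the bay reference if one is mentioned
        m = re.search(r"bay (\w+)", text)
        target = f"bay {m.group(1)}" if m else "unknown"
        tasks.append(RobotTask("navigate", {"target": target}))
        tasks.append(RobotTask("unload_pallet"))
    if "inspect" in text:
        tasks.append(RobotTask("inspect", {"criteria": "damaged boxes"}))
    return tasks

plan = plan_from_command(
    "Hark, have Figure 01 unload the pallet from bay three "
    "and inspect for damaged boxes."
)
for step in plan:
    print(step.action, step.params)
```

The hard problems noted later in this piece—reliability, latency, and breadth of instructions—live precisely in replacing the brittle keyword rules above with a model that generalizes.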
However, the tease raises immediate questions for practitioners. Is Hark an internal tool for Figure 01, or a standalone product? Does it represent a novel model architecture, or is it a sophisticated prompting/UI layer built atop OpenAI's existing models (like GPT-4)? The use of "bespoke" could hint at fine-tuning on proprietary robotics data. Until Figure releases technical details or benchmarks, Hark remains a compelling but unproven concept. Its success will hinge on reliability, latency, and the breadth of complex instructions it can accurately parse and execute—the perennial challenges of robotics AI.
Frequently Asked Questions
What is Figure AI's Hark?
Hark is a teased project from Figure AI, described by CEO Brett Adcock as "a new interface to artificial intelligence" and "a series of bespoke natural language...". It appears to be a natural language interface system, potentially designed for controlling or interacting with robotic systems like the Figure 01 humanoid.
How does Hark relate to the Figure 01 robot?
While not explicitly confirmed, the logical connection is that Hark is the natural language interface being developed for the Figure 01 robot. Figure's core mission is to deploy humanoids in the workforce, and a conversational interface would be essential for non-expert users to give the robots instructions. Hark likely represents the software layer that translates user commands into actionable tasks for the robot.
Who are Figure AI's main competitors in robotics AI interfaces?
Figure operates in the competitive humanoid robotics space alongside companies like Boston Dynamics (now Hyundai-owned, maker of Atlas), Tesla (Optimus), and Agility Robotics (Digit). In the realm of AI-powered robotics interfaces and models, key players include OpenAI (Figure's partner), Google DeepMind (with its RT-2 and RT-X models), and Covariant, which builds AI-powered robot control systems for logistics. The race is to create the most reliable, generalizable "brain" and interface for robots.
When will more details about Hark be released?
There is no official timeline. The initial announcement was a teaser on social media. More substantive details, such as a technical paper, a live demonstration, or integration news with Figure 01, will likely follow in the coming weeks or months as part of Figure's ongoing product development and public communication strategy.