
Groundlight Unveils Open-Source ROS Package, Revolutionizing Embodied AI for Robotics


Business Wire

New tool empowers robot builders to easily integrate visual intelligence into robotic systems, combining machine learning efficiency with human-level reliability.

SEATTLE: Groundlight, a pioneer in visual AI solutions, today announced the release of its open-source ROS package, accelerating the development of embodied AI in robotics. This innovative tool enables ROS2 developers to effortlessly incorporate advanced computer vision capabilities into their projects. By merging machine learning with real-time human oversight, Groundlight's ROS package makes robots more perceptive and adaptable to real-world environments. The open-source package is available here.

The classic computer vision (CV) workflow has long been a bottleneck in developing robust robotic systems. The conventional approach is time-consuming and labor-intensive: gathering a comprehensive dataset, meticulously labeling each image, training a model, evaluating its performance, and then iteratively refining the dataset and model to handle edge cases. This can take months for each use case. Even after all that work, when robots encounter situations outside their training set they can act unpredictably, even dangerously, and fixing that requires developers to redo much of the model development process.

Groundlight's open-source ROS package revolutionizes this approach by offering fast, customized edge models that run locally, tailored to each robot's specific needs. Backed by automatic cloud training and 24/7 human oversight, robots simply pause and await human guidance when faced with unfamiliar situations, enabling real-time adaptation to unexpected scenarios. Human-verified responses typically arrive in under a minute and are instantly trained back into the model and pushed down to the edge, improving safety and reliability while dramatically speeding up the development process.

"Our ROS package gives reliable eyes to embodied AI systems," said Leo Dirac, CTO of Groundlight. "Modern LLMs are too slow and expensive for direct robotic control, and often fail at simple visual tasks. We combine fast edge models with human oversight, enabling robots to see and understand their environment efficiently and reliably."

The Groundlight ROS package allows developers to ask binary questions about images in natural language. Queries are first processed by the current ML model, with high-confidence answers provided immediately. Low-confidence cases are escalated to human reviewers for real-time responses. This human-in-the-loop approach ensures reliability while continuously improving the underlying ML model without manual retraining cycles.
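The announcement does not show the package's interface, but as a rough illustration of the ask-and-escalate flow it describes, the sketch below uses Groundlight's general-purpose Python SDK; the detector name, query text, image path, and confidence threshold are hypothetical examples, not taken from the announcement.

    # Illustrative sketch only: the binary-question workflow via Groundlight's Python SDK.
    # Detector name, query text, image path, and threshold are hypothetical.
    from groundlight import Groundlight

    gl = Groundlight()  # reads the GROUNDLIGHT_API_TOKEN environment variable

    # A yes/no question about an image, phrased in natural language.
    detector = gl.get_or_create_detector(
        name="conveyor-clear",                 # hypothetical detector name
        query="Is the conveyor belt clear?",   # hypothetical query
        confidence_threshold=0.9,              # below this, escalate to a human reviewer
    )

    # High-confidence ML answers return immediately; low-confidence queries are
    # escalated, and the call waits (up to a timeout) for a more confident answer.
    iq = gl.ask_confident(detector=detector, image="camera_frame.jpg")
    print(iq.result.label, iq.result.confidence)

In the ROS context, the announcement describes the same pattern: rather than acting on an uncertain answer, the robot pauses until a human-verified response arrives, and that response is fed back into the model.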

Robotics pioneer Sarah Osentoski, Ph.D., commented, "Groundlight's ROS package is a game changer for teams building robotic systems for unstructured environments. It makes human fallback simple, and automatically incorporates exception handling into ML models, improving efficiency over time."

This release marks a significant milestone in robotics and computer vision. By combining machine learning speed with the reliability of human oversight, Groundlight enables developers to more easily create intelligent, adaptive robotic systems. Whether for industrial automation, research, or innovative applications, this node paves the way for the next generation of visually aware robots.

Groundlight is a leading innovator in visual AI solutions, dedicated to making computer vision more accessible and reliable for robotics and automation applications. By combining cutting-edge machine learning with human intelligence, Groundlight empowers developers to create smarter, more adaptable systems that thrive in real-world environments.

Source: Business Wire
