Advanced analog in-memory computing technology delivers 200+ TOPS of AI compute power in a highly optimized package
SANTA CLARA, Calif.: EnCharge AI today announced the EnCharge EN100, the industry's first AI accelerator built on precise and scalable analog in-memory computing. Designed to bring advanced AI capabilities to laptops, workstations, and edge devices, EN100 leverages transformational efficiency to deliver 200+ TOPS of total compute power within the power constraints of edge and client platforms such as laptops.
“EN100 represents a fundamental shift in AI computing architecture, rooted in hardware and software innovations that have been de-risked through fundamental research spanning multiple generations of silicon development,” said Naveen Verma, CEO at EnCharge AI. “These innovations are now being made available as products for the industry to use, as scalable, programmable AI inference solutions that break through the energy efficiency limits of today’s digital solutions. This means advanced, secure, and personalized AI can run locally, without relying on cloud infrastructure. We hope this will radically expand what you can do with AI.”
Previously, the models driving the next generation of the AI economy, multimodal and reasoning systems, required massive data center processing power. The cost, latency, and security drawbacks of that cloud dependency put countless AI applications out of reach.
EN100 shatters these limitations. By fundamentally reshaping where AI inference happens, it lets developers deploy sophisticated, secure, personalized applications locally.
This breakthrough enables organizations to rapidly integrate advanced capabilities into existing products, democratizing powerful AI technologies and bringing high-performance inference directly to end users.
EN100, the first chip in the EnCharge EN series, features an optimized architecture that processes AI tasks efficiently while minimizing energy use. Available in two form factors – M.2 for laptops and PCIe for workstations – EN100 is engineered to transform on-device AI capabilities.
EnCharge AI's comprehensive software suite delivers full platform support across the evolving model landscape with maximum efficiency. This purpose-built ecosystem combines specialized optimization tools, high-performance compilation, and extensive development resources—all supporting popular frameworks like PyTorch and TensorFlow.
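The EnCharge toolchain itself is not detailed in this release. As a rough illustration of what framework support typically looks like in practice, the sketch below exports a standard PyTorch model to ONNX, a common hand-off format that edge-accelerator compilers consume; the model choice, file name, and input shape are illustrative assumptions, not EnCharge APIs.

```python
# Illustrative only: a generic PyTorch-to-ONNX export, the kind of hand-off an
# edge-accelerator compiler typically ingests. No EnCharge-specific APIs shown.
import torch
import torchvision

# Any torch.nn.Module works; a small off-the-shelf vision model keeps the example short.
model = torchvision.models.mobilenet_v3_small(weights=None).eval()

# A dummy input fixes the exported graph's input shape (batch=1, RGB, 224x224).
dummy_input = torch.randn(1, 3, 224, 224)

# Export a static graph that a downstream accelerator toolchain can optimize and quantize.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",            # hypothetical output path
    input_names=["input"],
    output_names=["logits"],
    opset_version=17,
)
```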
Compared to competing solutions, EN100 demonstrates up to ~20x better performance per watt across various AI workloads. With up to 128GB of high-density LPDDR memory and bandwidth reaching 272 GB/s, EN100 efficiently handles sophisticated AI tasks, such as generative language models and real-time computer vision, that typically require specialized data center hardware. The programmability of EN100 ensures optimized performance of AI models today and the ability to adapt for the AI models of tomorrow.
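One back-of-the-envelope way to read the 272 GB/s figure: autoregressive decoding of a generative language model is typically memory-bandwidth bound, since roughly all of the model's weights must be streamed for each generated token. The sketch below computes that ceiling for an assumed model size and quantization; these assumptions are illustrative, not EN100 benchmarks.

```python
# Rough, illustrative ceiling on token throughput for a bandwidth-bound decoder.
# Assumption (not an EN100 benchmark): a 7B-parameter model quantized to 4 bits.
bandwidth_gb_s = 272          # stated EN100 memory bandwidth
params_billions = 7           # assumed model size
bytes_per_param = 0.5         # 4-bit weights

weight_gb = params_billions * bytes_per_param          # ~3.5 GB of weights
tokens_per_s_ceiling = bandwidth_gb_s / weight_gb      # weights streamed once per token

print(f"~{tokens_per_s_ceiling:.0f} tokens/s upper bound (weights-only, single stream)")
# ~78 tokens/s; real throughput is lower once KV-cache traffic and compute are counted.
```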
"The real magic of EN100 is that it makes transformative efficiency for AI inference easily accessible to our partners, which can be used to help them achieve their ambitious AI roadmaps," says Ram Rangarajan, Senior Vice President of Product and Strategy at EnCharge AI. "For client platforms, EN100 can bring sophisticated AI capabilities on device, enabling a new generation of intelligent applications that are not only faster and more responsive but also more secure and personalized."
Early access partners have already begun working closely with EnCharge AI to map out how EN100 will deliver transformative AI experiences, such as always-on multimodal AI agents and enhanced gaming applications that render realistic environments in real time.
While the first round of EN100's Early Access Program is currently full, interested developers and OEMs can sign up at env1.enchargeai.com for the upcoming Round 2 of the Early Access Program, an opportunity to gain a competitive advantage by being among the first to leverage EN100's capabilities in commercial applications.
About EnCharge AI
EnCharge AI is the leader in advanced AI compute solutions for deployments from edge to cloud. EnCharge's robust and scalable next-generation in-memory computing technology provides orders-of-magnitude higher compute efficiency and density than today's best-in-class solutions. These high-performance solutions make the immense potential of AI accessible at scale in power-, size-, and weight-constrained applications. EnCharge AI launched in 2022 and is led by veteran technologists with backgrounds in semiconductor design and AI systems. For more information about EnCharge AI, please visit https://enchargeai.com/.
Source: Business Wire