### Cadence Design Systems Expands AI Collaborations at CadenceLIVE Event
Cadence Design Systems announced two significant AI collaborations during its CadenceLIVE event: an expanded partnership with Nvidia and new integrations with Google Cloud. Both moves aim to strengthen Cadence's capabilities in AI-powered physical simulation and robotics development.
### Collaborating with Nvidia for Advanced Simulation
The partnership with Nvidia seeks to merge AI with physics-based simulation and accelerated computing, particularly for robotic systems and system-level design. The integration aims to refine modeling and deployment strategies within semiconductors and large-scale AI infrastructure. Nvidia calls this approach "physical AI," highlighting its relevance in robotics and system integration.
Cadence is set to integrate its multi-physics simulation and system design tools with Nvidia’s CUDA-X libraries and AI models, in addition to the Omniverse-based simulation environment. These powerful tools allow engineers to model thermal and mechanical interactions, significantly improving assessments of system behaviors under real-world conditions. This collaboration extends beyond just chip design, encompassing critical infrastructure components like networking and power systems. By simulating potential system behaviors prior to physical deployment, engineers can ensure optimal performance across various interconnected systems.
### Advancements in Robotics Development
Furthering this collaboration, Cadence’s physics engines — adept at modeling real-world material interactions — will be linked with Nvidia’s AI models. These models are pivotal in training AI-driven robotic systems in simulated environments, thereby minimizing the need for extensive real-world data collection.
During the event, Nvidia CEO Jensen Huang emphasized the collaboration, stating, "We're working with you in the board on robotic systems." The significance of simulation-driven training cannot be overstated: by using physics-based models to generate datasets, training becomes more accurate, which directly improves the resulting AI models. Cadence CEO Anirudh Devgan pointed out, "The more accurate (generated training data) is, the better the model will be."
Industrial robotics companies are already leveraging Nvidia’s Isaac simulation frameworks and Omniverse-based digital twin tools. Major players, such as ABB Robotics, FANUC, YASKAWA, and KUKA, are incorporating these tools into their workflows for virtual commissioning, which allows them to test production systems digitally before the physical rollout.
### AI Agent for Chip Design Automation via Google Cloud
In a separate initiative, Cadence introduced an AI agent designed to streamline late-stage chip design tasks. This agent focuses primarily on translating circuit designs into their silicon implementations, enhancing productivity in physical layout processes. This tool builds upon an earlier agent focused on front-end chip design, which established circuit designs through code-like descriptions.
This enhanced design workflow is available through Google Cloud, marking a significant shift towards cloud-based electronic design automation tools. The integration combines Cadence’s electronic design automation (EDA) tools with Google’s Gemini models, promoting automated design and verification workflows. This cloud deployment enables teams to execute complex workloads without being tethered to on-premise compute infrastructure.
Cadence’s ChipStack AI Super Agent platform uses model-based reasoning to coordinate tasks across multiple design stages. It interprets design requirements and can autonomously execute the necessary tasks; Cadence reports productivity gains of up to 10x in early deployments.
### Virtual Testing with Digital Twin Models
Both collaborations prioritize simulation tools for validating systems in virtual environments before physical deployment. Digital twin models allow engineers to thoroughly test design trade-offs, evaluate performance scenarios, and optimize configurations in software. This method not only mitigates risk but also enhances design accuracy significantly.
Given the high costs and complexities associated with large-scale data center infrastructure, the ability to rely on virtual simulations can transform conventional trial-and-error methods. This transition is critical in expediting the design and verification processes while reducing overall complexity.
### Launch of Open-Source Quantum AI Models
In an additional announcement, Nvidia unveiled a suite of open-source quantum AI models, known as NVIDIA Ising. Named after a mathematical framework that represents interactions in physical systems, these models aim to support quantum processor calibration and error correction. Nvidia claims these models can deliver up to 2.5 times faster performance and three times higher accuracy in processes related to error correction.
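Product claims aside, the Ising model itself is a standard mathematical framework: spins on a lattice whose energy depends on pairwise nearest-neighbor interactions and an external field. A minimal sketch of the classical Ising energy is shown below (illustrative only; the function name and lattice setup are assumptions for this example, not part of Nvidia's models):

```python
import numpy as np

def ising_energy(spins, J=1.0, h=0.0):
    """Energy of a 2D Ising configuration with periodic boundaries:
    E = -J * sum(s_i * s_j over neighbor pairs) - h * sum(s_i)."""
    # np.roll pairs each spin with its neighbor along each axis,
    # counting every periodic bond exactly once per axis.
    interaction = (np.sum(spins * np.roll(spins, 1, axis=0))
                   + np.sum(spins * np.roll(spins, 1, axis=1)))
    return -J * interaction - h * np.sum(spins)

# Fully aligned 2x2 lattice: 8 periodic bonds, so E = -8J
aligned = np.ones((2, 2))
print(ising_energy(aligned))  # -8.0
```

Lower energy corresponds to more aligned configurations, which is why frameworks of this form are useful for modeling interacting physical systems such as qubit calibration landscapes.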
Huang stated, “AI is essential to making quantum computing practical.” The introduction of Ising positions AI as the control plane—or operating system—of quantum machines, thereby enhancing the reliability and scalability of quantum-GPU systems.

