
Presentation
Physical AI Twins – with Simulation, Sensors, and Data towards Autonomy
Innovation and thought leadership
Artificial intelligence is rapidly evolving from digital decision support and semantic content generation towards the physical space, enabling further process automation and the integration of autonomous systems that interact with real-world environments.
Realizing this potential at scale, however, remains challenging due to brownfield constraints: mission-critical systems designed to operate reliably for decades, limited incentives to modify validated processes, and fragmented shopfloor data that has accumulated into complex, heterogeneous IT/OT architectures. The next layer of Agentic and Physical AI, particularly at the edge, will further increase complexity by raising the requirements for clean, reliable, and verifiable data inputs.
Physical AI is grounded in high-fidelity industrial data, deriving real-time insights from a variety of machine parameters, sensors, robots, and other (agentic) systems. Beyond an interoperable infrastructure with data suitable for simulation and adaptive learning, the combination of software and hardware leaves room for further development along production and operation processes, primarily at the microchip level and in its applications.
This visionary session outlines how Human-Machine Collaboration can deliver measurable benefits when solutions are engineered with both technical rigor and commercial viability. It highlights an operating model of new work based on multi-sided ecosystems that combine partner solutions, shared platforms, and agile collaboration, leading to the co-creation of new value chains and shared revenue models. Drawing on proven implementations from adjacent industries, the presentation discusses key challenges for upcoming autonomous use cases and explores how the semiconductor industry can position itself to capture new value in the emerging era of Physical AI.
