Event Summary: Robotics – Shaping the Future with Intelligence and Connectivity
30 June 2025 | The Bradfield Centre, Cambridge
Hosted jointly by the CW Future Devices and Technologies Group (#CWFDT) and the CW Academic and Industry (#CWAcademic) Group, this afternoon event brought together a distinguished line-up of researchers, technologists and innovators to explore the future of robotics. With the field advancing rapidly, the sessions highlighted how robots are evolving from tools into intelligent collaborators across sectors such as healthcare, agriculture and manufacturing.

Welcome Address
Michaela Eschbach, CEO of Cambridge Wireless, opened the event and welcomed participants to the first joint Special Interest Group meeting between the Future Devices and Technologies and Academic and Industry groups. She highlighted the importance of collaboration across academia and industry to drive meaningful innovation in robotics.
Session One: Robotics in Action and at Scale
Chaired by David Roberts from the GSMA, the first session focused on the role of robots in cooperative systems, industrial collaboration and agricultural challenges.
Professor Amanda Prorok from the University of Cambridge presented her work on synthesising policies for multi-robot systems. She explained how Graph Neural Networks (GNNs) can help model agent interaction strategies in order to deliver coordinated behaviours. Her team’s research uses simulation environments to train models that are then transferable to physical systems. She gave examples of cooperative perception, human-led navigation using wireless sensor nodes and downwash-aware flight with quadrotors. She noted the importance of scalability, policy robustness and managing lifelong planning for robot fleets.
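The core idea behind GNN-based multi-robot policies is that each robot computes its action from its own observations plus messages aggregated from neighbours within communication range, so the same learned policy scales to fleets of any size. The sketch below is a minimal, hypothetical illustration of that message-passing pattern in NumPy, not Professor Prorok's actual system; the weights, feature choices and function names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def comm_graph(positions, radius):
    """Adjacency matrix: agents within `radius` of each other can exchange messages."""
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    adj = (dist < radius) & ~np.eye(len(positions), dtype=bool)
    return adj.astype(float)

def gnn_policy(features, adj, w_self, w_nbr):
    """One round of message passing: each agent combines its own features
    with the mean of its neighbours' features, then maps the result through
    a shared nonlinearity to produce a 2-D action (e.g. a velocity command)."""
    deg = adj.sum(axis=1, keepdims=True)
    nbr_mean = np.divide(adj @ features, deg,
                         out=np.zeros_like(features), where=deg > 0)
    return np.tanh(features @ w_self + nbr_mean @ w_nbr)

# Toy fleet: 5 agents, using 2-D positions directly as input features.
positions = rng.uniform(0, 10, size=(5, 2))
adj = comm_graph(positions, radius=5.0)
w_self = rng.normal(size=(2, 2)) * 0.1  # illustrative weights; in practice
w_nbr = rng.normal(size=(2, 2)) * 0.1   # these are trained in simulation
actions = gnn_policy(positions, adj, w_self, w_nbr)
```

Because the same `w_self` and `w_nbr` are shared across all agents, the policy's size is independent of the fleet size, which is what makes the sim-to-real transfer and scalability properties discussed in the talk possible.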
Azmat Hossain of Extend Robotics addressed the need for adaptable robotic solutions in real-world industrial settings. He highlighted the limitations of traditional factory automation, especially in variable and dynamic environments such as electric vehicle assembly. Azmat introduced the Extend AMAS system, a virtual reality platform that enables operators to train and control robots from day one. Using simulation and real-world data, these systems support remote connectivity and learning at scale. He explained that robots are expected to work alongside humans, with intuitive interfaces allowing remote supervision and retraining. The Leyland Trucks example showed how digital twins are already supporting production environments. The company’s long-term vision is to use simulation-based automation to reduce operational costs, improve reliability and address global labour shortages.
Tom Dean from Dogtooth Technologies gave a focused look at the complexities of applying robotics to fruit picking, particularly strawberries. He described the challenges of generating high-quality 3D ground truth data for training computer vision systems and selecting the right feedback mechanisms to ensure precision in picking. The talk reflected broader issues faced by robotics in natural, unstructured environments where conditions vary frequently.
Break

Before the first session began, participants had the opportunity to engage with a live demonstration by Extend Robotics, showcasing the company's intuitive VR control system and human-in-the-loop robotics platform. The demo attracted considerable interest, with many attendees exploring the system first-hand, and it remained a lively focal point during the mid-event coffee break, sparking conversations around real-world deployment and usability.
Session Two: Autonomy in Medical and Human-Centric Robotics
The second session was chaired by Professor Kevin Morris from the University of Leeds and focused on the application of robotics in healthcare and the future of human-machine interaction.
Dr Dominic Jones, also from the University of Leeds, presented his research on autonomy in minimally invasive surgical robotics. He outlined how autonomy can enhance the performance of surgeons during complex procedures. Examples included autonomous marking of tumour margins and support during liver dissection. Using systems like the da Vinci Surgical System as a baseline, he explored the challenges of tracking soft tissue variation and the ethical implications of introducing partial autonomy in surgical arms. The question of responsibility in case of failure was raised, underlining the importance of robust control and accountability in clinical robotics.
Dr Ali Shafti from Cambridge Consultants concluded the presentations with a compelling discussion on Human-Machine Understanding (HMU). He explained that the industry is moving beyond simply automating tasks to building robotic teammates that can interpret and respond to human intent. Ali described a roadmap from basic interaction models, such as presence detection and reactive assistance, to full collaboration on shared goals. He outlined how Human-Machine Understanding models interpret signals such as gaze, heart rate and motion to infer mental state, workload and preferences. His examples included gaze tracking to estimate cognitive load and the use of behavioural models to simulate human navigation decisions. He emphasised the importance of building robots that can adapt to cultural and neurodiverse needs, especially in critical environments like surgery. The goal is to move towards intelligent systems that do not just respond to instructions but truly collaborate with people.
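To make the gaze-based workload idea concrete, the toy sketch below shows one simple way such an estimate could work: under high cognitive load, gaze often narrows onto a small region (attentional tunnelling), so low gaze dispersion can be read as a crude workload proxy. This is a hypothetical illustration only; the thresholds, the `workload_from_gaze` function and the dispersion metric are invented here and are not from the talk or any Cambridge Consultants system.

```python
import numpy as np

def workload_from_gaze(gaze_xy, low=1.0, high=3.0):
    """Crude workload proxy from gaze-point dispersion.

    gaze_xy: array of shape (n_samples, 2) of gaze coordinates.
    Narrow, fixated gaze (small dispersion) is treated as a sign of
    higher load; wide scanning as lower load. Thresholds are arbitrary
    and would need calibration per task and per person.
    """
    dispersion = float(np.std(gaze_xy, axis=0).mean())
    if dispersion < low:
        return "high", dispersion
    if dispersion < high:
        return "medium", dispersion
    return "low", dispersion

# Fixated gaze: points clustered tightly around one target.
focused = np.array([[0.10, 0.00], [0.00, 0.10], [0.05, 0.05]])
# Scanning gaze: points spread widely across the scene.
scanning = np.random.default_rng(1).uniform(-10, 10, size=(50, 2))

print(workload_from_gaze(focused)[0])   # narrow gaze -> "high"
print(workload_from_gaze(scanning)[0])  # wide scanning -> "low"
```

A real HMU system would fuse several such signals (gaze, heart rate, motion) with a learned model rather than fixed thresholds, but the same structure applies: raw signals in, an inferred mental-state estimate out, which the robot can then use to adapt its behaviour.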
Panel Discussion

All speakers joined a panel to answer audience questions and expand on key topics. One question explored how robots could be used to train humans, prompting a comparison with flight simulators for pilot instruction. There was also discussion on the use of multiple cameras in surgical settings, where physical limitations and patient safety make minimal invasiveness a priority. A question about multi-agent communication led to insights into the challenges of synchronising data streams and control loops in swarm robotics. Another point raised was the difficulty of transferring agricultural data from remote locations to processing centres for real-time analysis.
Closing Remarks
The event demonstrated that robotics is undergoing a fundamental transformation. Robots are no longer confined to repetitive tasks in static environments. With the integration of artificial intelligence, advanced sensing and reliable connectivity, they are becoming sophisticated partners in complex human activities. From the operating theatre to the factory floor and into the field, the future of robotics lies not in replacement but in collaboration. This event highlighted the importance of cross-sector dialogue and set the stage for ongoing innovation in the field.