On September 14, The Autonomous Main Event 2023 brought together over 500 industry leaders and experts, representing 230+ global companies. Together, they explored collaborative solutions to tackle safety challenges, cut through the hype, and unleash the full potential of autonomous driving.

The Autonomous has established itself as a global platform for collaboration across the autonomous vehicle industry. The initiative’s goal is to solve the industry’s biggest challenges (safety, consumer acceptance and regulation) by facilitating collaboration. More than 500 executives and experts from over 230 companies and organisations met at The Autonomous Main Event 2023 to discuss how to move beyond the hype around autonomous mobility towards real-world deployment of autonomous vehicles.


The Autonomous plays a crucial role in this transition as the enabler of collaborative action.

‘The safety challenges facing our industry continually evolve,’ said Ricky Hudi, Chairman of The Autonomous.

‘It would be negligent to believe that any one company can meet all these challenges with in-house capabilities alone – it needs the skills and expertise of a complete ecosystem. Before, the industry was pursuing too many different approaches to the various problems. It is much quicker and more effective to collaborate. This is the reason why the industry has got together behind the platform offered by The Autonomous.’

Philip Schreiner, Head of The Autonomous, highlighted the initiative’s significant achievements since 2022.

‘Away from events such as today, the innovation stream is running strongly,’ he said. ‘Three groups have already produced important findings: the Working Group on Safety and Architecture, the Expert Circle on Safety and AI, and the Expert Circle on Safety and Regulation are setting the strategic agenda for the industry’s approach to safety.’

The event was inspired by a growing sense of momentum. Projections about the growth of the autonomous driving market are hard to make, but McKinsey forecasts that autonomous driving could create between $300bn and $400bn in revenue by 2035.

As Dirk Linzmeier, CEO of TTTech Auto and a key supporter of the initiative, said:

‘We are moving beyond the hype, and reality counts more than vision: robotaxis are on the streets of San Francisco, Beijing, and Shanghai, and the first L3 autonomous systems have been launched. Real progress is being made, step-by-step, and I truly believe in the future of autonomous driving.’

Drawing on industry-wide expertise

The industry’s commitment to The Autonomous’ vision of collaboration was clearly evident as the Main Event drew together partner companies from around the world. They included TTTech, Audi, BASELABS, BMW, CoreAVI, DENSO, FDTech, Green Hills Software, Infineon, Kontrol, MathWorks, McKinsey, Mercedes-Benz, and NXP Semiconductors, as well as various academic institutions.

One architecture to rule them all?

One important question facing the leading experts at the event was the best approach for designing a common software architecture for autonomous vehicles (AVs). Georg Kopetz, CEO of TTTech Group, compared the opportunity in AVs to that of the mobile phone.

‘Manufacturers benefit from the existence of a single platform – the Android™ operating environment – for smart devices. In AVs, we need the same,’ he remarked.


Maria Anhalt, CEO of Elektrobit, agreed that the consumer model of separating the hardware from the operating system and services, and then running an app marketplace on top of the platform, was likely to be as appealing to car users as it was to mobile phone users.

But, she said, ‘the biggest challenge is not the technology, it’s the collaboration that is needed to make standardized platforms on which services and apps can be monetized.’

So how is a huge industry, made up of rival manufacturers as well as many disruptive start-ups, to agree on a single basis for the AV’s software?


As Professor Törngren of KTH Royal Institute of Technology remarked in a nod to The Lord of the Rings, ‘There is currently not one architecture to rule them all.’

But the value of a common architecture was explained by Stefan Poledna, CTO of TTTech Auto. He said:

‘It can be tempting for car manufacturers with a large base of legacy code to try to incrementally enhance it as they move up the levels of autonomy from L1 upwards. This is a mistake. Safety is best served by a top-down, not a bottom-up approach. You get safety right by getting the culture and the architecture right.’

Bernhard Augustin of Audi expanded on this point, saying: ‘Every autonomous system has its weaknesses, so you have to build in a feedback loop back to the architecture. You need to be ready to update the architecture to rectify the problems you find in testing.’


Phil Koopman of Carnegie Mellon University also reminded delegates that the safety of an AV’s architecture could not be assured just by designing safety into the vehicle.

‘What happens if your state-of-the-art, high-definition navigation system drives your AV over a bridge that is no longer there because it just fell down?’ he asked. ‘The AV is part of a system, and safety requires all parts of the system, including the road infrastructure, to support its safe operation.’

The discussion about safety and architecture was complemented by a presentation from the Safety and Architecture Working Group, which described the main outcomes of its Digest Report, published online.


One of the Working Group Members, Udo Dannebaum of Infineon, said: ‘We studied various candidate architectures, from simple monolithic up to a fully distributed safety mechanism. We found that asymmetric architectures are clearly better suited to autonomous driving, as independently developed, complementary channels can compensate for each other’s weaknesses.’
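The channel complementarity Dannebaum describes can be sketched in miniature. The following toy Python example is an illustration only, not the Working Group’s actual design: a complex ‘performance’ channel proposes a driving command, and an independently developed, deliberately simple ‘monitor’ channel can veto it and force a safe fallback. All names, thresholds and the scene representation are invented for illustration.

```python
"""Toy sketch of an asymmetric two-channel safety architecture."""
from dataclasses import dataclass


@dataclass
class Command:
    steering: float      # radians, positive = left
    acceleration: float  # m/s^2, negative = braking


def performance_channel(scene: dict) -> Command:
    # Stand-in for a complex ML-based planner; assumed to be
    # powerful but not individually trustworthy.
    return Command(steering=-0.5 * scene.get("lane_offset", 0.0),
                   acceleration=1.0)


def monitor_channel(scene: dict, cmd: Command) -> bool:
    # Independently developed, much simpler plausibility check:
    # reject any command that accelerates toward a known obstacle.
    if not scene.get("clear", True) and cmd.acceleration > 0:
        return False
    return True


def safe_fallback() -> Command:
    # Minimal-risk manoeuvre: brake, hold the lane.
    return Command(steering=0.0, acceleration=-3.0)


def arbitrate(scene: dict) -> Command:
    # The simple channel compensates for the complex channel's
    # weaknesses, and vice versa: neither must be perfect alone.
    cmd = performance_channel(scene)
    return cmd if monitor_channel(scene, cmd) else safe_fallback()
```

The asymmetry is the point: the monitor does not duplicate the planner’s capability, it only needs to catch the planner’s unsafe outputs, which is why independently developed channels can be simpler than a second full planner.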


The Expert Circle on Safety and Regulation also published its report in September 2023, the result of a year of diligent work by its members. Benedikt Wolfers of law firm PSWP explained that in some respects, regulation has run ahead of the state of autonomous driving technology, and that this has given rise to an imbalance in the exposure to legal risk.

He said:

‘Regulators of vehicles only recognise two actors: the vehicle manufacturer and the driver. But the providers of software systems generate an enormous amount of the value in an autonomous vehicle, yet the OEM alone is left to carry the regulatory responsibility. So the legislation is not aligned with the business reality.’

The Autonomous Main Event also introduced the newest Expert Circle, on the Safety of Embedded AI, which is led by Jürgen Schäfer and Thomas Schneid of Infineon.

A new way to understand the car’s ‘brain’

The role that AI has to play in AVs was picked up by Lars Reger, CTO of NXP Semiconductors. He drew an analogy between the human brain and the car’s intelligent systems: the brain has different regions for connecting to nerves, for managing basic life functions, and for high-level thinking and awareness.


Lars Reger said: ‘This brain model shows that you need different types of intelligence in AVs, and not only the ultra-powerful processor for AI functions. In fact, most AV accidents happen when the sensors are not properly activated because of a fault in the “spinal cord”, nothing to do with the AI processor!’

The vehicle’s brain has come under the spotlight because of the move to the concept of a ‘software-defined vehicle’. How does this concept help to advance the adoption of autonomous driving?

Alejandro Vukotich of Qualcomm was clear about its benefits. ‘The question is how we can react fast to changes in demand and in the operating environment,’ he said. ‘You need software and an architecture that are flexible, so that you can react to new information and new challenges.’


Simon Fürst of BMW Group highlighted the risk of inefficiency and high costs arising from the effort to respond to new demands. He said: ‘As AV systems are continually updated, there will be a need to re-use software as much as possible – re-use is a key success factor.’

Evidence of helpful network effects in the automotive domain

As the operation of robotaxis in San Francisco and in China shows, commercial fleet operators have been the first to deploy fully autonomous vehicles for ordinary use, not only for road testing. This is revealing interesting outcomes in the real world. Sascha Meyer of MOIA described how autonomous driving could breathe life into the market for ride pooling.

‘Today, ride pooling struggles because of the difficulty in recruiting drivers, but AVs eliminate this problem. What we then see is that AVs give the flexibility to remodel demand patterns. With AVs, you can build much more dense networks, and this then transforms the customer experience of ride pooling, because there are fewer and shorter detours.’

Robotaxi operations in cities such as San Francisco are also providing evidence of what happens when the interaction of AVs with other road users results in harm. Legal experts were unanimous in their opinion about where liability resides. Benedikt Wolfers of law firm PSWP drew the distinction between the human driver and the computer driver. When a car is in autonomous mode, such as an L3 vehicle on the highway, the computer driver is in control.

‘The law should assess the computer driver against the standard of what a competent human driver would do in any given circumstance,’ Wolfers said. ‘And the OEM would have responsibility for the behaviour of its computer drivers. If the system is in control and the vehicle is in a collision, it should be clear that the human driver is not liable, either criminally or in civil law.’

Jürgen Gleichmann of Mercedes-Benz agreed that the industry needed clarity from lawmakers, adding that harmonization was just as important as clarity. ‘At the regulatory level, I wish for much more harmonization globally, not only of homologation, but also of the laws that govern traffic and road usage,’ he explained.

If AV manufacturers are to be liable, they will need to make vehicles that can see road scenes – but will they ever be able to ‘understand’ what is happening around them? The basic requirement is for systems that adequately perceive the environment.


Johann Jungwirth of Mobileye described how its system for L2+ vehicles requires just 11 vision cameras. He said: ‘Since the driver is responsible, redundancy is not required. But if the computer driver is responsible, redundancy is a must – you would need a vision + radar system as the primary sensing modality, with a LiDAR + radar back-up system for redundancy.’
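The switch-over Jungwirth describes can be illustrated with a minimal sketch. This is a hypothetical example, not Mobileye’s implementation: the sensor names, health flags and hand-over logic are invented. The point is only that when the computer driver is responsible, a degraded primary channel (vision + radar) hands over to an independent backup channel (LiDAR + radar) rather than failing outright.

```python
"""Toy sketch of redundant sensing channels with fallback."""


def perceive(vision_ok: bool, radar_ok: bool, lidar_ok: bool) -> str:
    # Primary sensing modality: vision + radar.
    primary_healthy = vision_ok and radar_ok
    # Independent backup modality: LiDAR + radar.
    backup_healthy = lidar_ok and radar_ok
    if primary_healthy:
        return "primary: vision + radar"
    if backup_healthy:
        return "backup: LiDAR + radar"
    # Neither channel is trustworthy: execute a minimal-risk manoeuvre.
    return "minimal-risk manoeuvre"
```

For example, a camera fault alone degrades the system to the LiDAR + radar backup, while the loss of both channels forces the vehicle into a minimal-risk manoeuvre instead of continuing blind.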

The experts at the Main Event concurred that generative AI offers exciting potential to improve AVs’ understanding of their environment. According to Ralf Herrtwich of Nvidia, ‘the traditional wisdom in system development is that you develop a number of simulation environments on which to train the machine, and scroll through them all in turn. What we now know is that generative AI can learn how to train itself, and decide which simulations it needs to learn from.’
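The self-directed training loop Herrtwich describes resembles active learning, which the following toy Python sketch illustrates. It is an invented example, not Nvidia’s system: instead of scrolling through every simulation in a fixed order, the loop scores candidate scenarios by how uncertain the model currently is about them and trains on the most uncertain one first. The scenario list and the uncertainty measure are assumptions for illustration.

```python
"""Toy sketch of uncertainty-driven simulation selection."""


def model_uncertainty(scenario: str, seen: dict) -> float:
    # Stand-in metric: uncertainty decays each time we train on a
    # scenario (a real system might use e.g. ensemble disagreement).
    return 1.0 / (1 + seen.get(scenario, 0))


def train_step(scenarios: list, seen: dict) -> str:
    # Pick the scenario the model is currently least sure about.
    target = max(scenarios, key=lambda s: model_uncertainty(s, seen))
    seen[target] = seen.get(target, 0) + 1
    return target


scenarios = ["rain at night", "jaywalking pedestrian", "highway merge"]
seen = {}
order = [train_step(scenarios, seen) for _ in range(4)]
```

Rather than cycling through all simulations equally, the loop revisits a scenario only once the others have been covered, concentrating training effort where the model is weakest.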


In the face of public concern about the potential of generative AI software such as ChatGPT to make mistakes, Sara Schneider of Aurora stressed the need to define the purpose of training and testing AVs’ scene understanding as being the safety of all road users.

‘We can increase public trust by being more transparent about how we do testing, and by being more collaborative with our regulators,’ she asserted.

Becoming ready for autonomous driving in the real world

Leading experts speaking at The Autonomous Main Event agreed on the importance of industry collaboration to make fast progress on safety and regulation. Thanks to the platform provided by The Autonomous, and the thoughtful reports of its Working Groups and Expert Circles, the industry is now moving beyond the hype and unlocking the potential of autonomous driving.

As Philip Schreiner, Head of The Autonomous, told delegates, ‘We are well beyond the phase of inflated expectations in the Gartner hype cycle. But many safety challenges are still ahead. We are now in a decisive phase when reality counts more than vision. The industry must work as one to make fast progress, and at its heart will be The Autonomous, which has become the global platform for the kind of collaboration we need.’


By Iulia Juchert
