The central body control module was the first ECU to become a hub or gateway, managing multiple basic functions related to doors, windows, seats, lighting and air conditioning. Today, however, infotainment is the vehicle domain undergoing the most radical consolidation. Cockpit domain controllers are expected to quickly penetrate both premium and entry market segments globally, aggregating into one powerful silver box the computing and memory capabilities that used to be distributed around the car in stand-alone modules. This reduces harness and packaging weight and at the same time simplifies the electronics bill of materials by eliminating redundant common integrated circuits such as microcontrollers, analog interfaces, transceivers and power discretes. Security can also benefit from standard communication protocols and a single robust master brain, but the growing software complexity intrinsic to these multipurpose systems will cause real struggles in the industry. In particular, the leading car manufacturers at the top of the automotive supply chain will have to go through a fundamental corporate restructuring to adapt to this new paradigm and remain competitive against disruptive electric-car startups.
ADAS and infotainment overlap
ADAS and infotainment have always been closely related, since the instrument cluster is an in-vehicle information module that displays safety-related information to the driver. With the cockpit domain controller, compelling new camera-based applications such as automatic parking or driver monitoring can be enabled by the available infotainment processor. Modern chipsets originally designed for mobile phones by companies like Qualcomm, MediaTek or Samsung are already conquering the infotainment market after passing automotive-grade qualification. They often include neural processing units that can enhance intuitive infotainment experiences as well as intelligent ADAS applications in the cockpit. Thanks to software separation via virtualization (hypervisors), the safety integrity level and low latency required by real-time ADAS applications can be guaranteed without noticeably compromising the less critical infotainment performance.
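The separation idea can be illustrated as a static resource split: the hypervisor pins the real-time ADAS guest to dedicated CPU cores and reserved memory, while the infotainment guest runs best-effort on the remainder. The following minimal sketch uses purely hypothetical core counts, memory sizes and partition names, not the configuration of any specific hypervisor:

```python
# Illustrative static hypervisor partitioning (all values hypothetical):
# the safety-critical ADAS guest gets dedicated cores and reserved RAM,
# the infotainment guest runs best-effort on whatever remains.
TOTAL_CORES = 8
TOTAL_RAM_MB = 8192

partitions = {
    "adas_rtos":    {"cores": {0, 1},             "ram_mb": 2048, "priority": "real-time"},
    "infotainment": {"cores": {2, 3, 4, 5, 6, 7}, "ram_mb": 6144, "priority": "best-effort"},
}

def validate(partitions, total_cores, total_ram_mb):
    """Check that the static split is free of resource overlap."""
    all_cores = [c for p in partitions.values() for c in p["cores"]]
    assert len(all_cores) == len(set(all_cores)), "core shared between guests"
    assert set(all_cores) <= set(range(total_cores)), "unknown core assigned"
    assert sum(p["ram_mb"] for p in partitions.values()) <= total_ram_mb
    return True

validate(partitions, TOTAL_CORES, TOTAL_RAM_MB)  # passes: guests are isolated
```

Because no core or memory region is shared, a crash or overload in the infotainment guest cannot steal cycles from the ADAS guest, which is what makes the safety integrity argument tractable.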
The following picture shows the current state of ECU consolidation in the cockpit. Stand-alone ECUs or silver boxes (in green), sometimes with integrated displays or cameras, become simple peripheral modules managed by the cockpit domain controller:
Domain fusion and zonal architectures
Although the Tesla Model 3 autopilot and media controller might not be a true domain fusion module but rather a clever co-package design using water cooling, some OEMs are already considering entirely merging the ADAS and infotainment domain controllers. Leading tier 1s have also envisioned a “scalable autonomous domain controller for Level 2 and higher automated driving that is integrated with the cockpit domain controller for seamless interaction with the driver”, as Visteon explained. Semiconductor supplier Nvidia, increasingly shifting its focus to software, has developed an integrated platform that combines autonomous driving and intelligent cockpit software stacks running on a single real-time operating system.
It makes sense to share the hardware resources both domains need: for example, inertial sensors, memory for high-definition maps and neural networks, and chips for vehicle-to-vehicle communications and positioning. Nevertheless, the overall bill of materials will not shrink dramatically, since these semiconductor components must now comply with higher reliability and safety standards than infotainment-only applications previously required.
In this centralized topology, sensors (lidar, radar, camera, ultrasonic, proximity, ambient light, temperature, pressure) and actuators (motors, displays, loudspeakers) at the edge are relatively commoditized and dumb. Raw sensor data is sent to the main computer over a time-sensitive gigabit Ethernet backbone to enable the best data fusion algorithms. In a more hybrid approach, peripherals from all domains are distributed in a scalable zonal architecture, with cross-domain ECUs in each vehicle partition performing sensor data aggregation to reduce the bandwidth. Redundancy for ADAS applications is essential at the initial sensory perception phase, and possibly also for decision making. Having two separate central computers instead of just one could therefore be beneficial from a safety point of view: if the ADAS domain controller fails, the cockpit domain controller could act as an emergency backup.
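The bandwidth argument behind zonal aggregation can be made concrete with a toy example: instead of forwarding every raw sample over the backbone, a zone ECU condenses one measurement cycle into a compact summary. The frame layout and sample values below are invented for illustration, not any real in-vehicle protocol:

```python
# Toy sketch of zonal sensor aggregation (all numbers hypothetical): a zone
# ECU condenses raw ultrasonic readings into a compact min/max/mean summary
# before forwarding them over the backbone, trading fidelity for bandwidth.
import struct

def raw_frame(readings):
    # Centralized approach: ship every 16-bit sample to the main computer.
    return struct.pack(f"<{len(readings)}H", *readings)

def aggregated_frame(readings):
    # Zonal approach: forward only min / max / integer mean per cycle.
    mean = sum(readings) // len(readings)
    return struct.pack("<3H", min(readings), max(readings), mean)

samples = [512, 498, 530, 525, 501, 515, 508, 520]  # one cycle, one sensor
print(len(raw_frame(samples)), "bytes raw vs",
      len(aggregated_frame(samples)), "bytes aggregated")
# 16 bytes raw vs 6 bytes aggregated
```

Multiplied across dozens of sensors and kilohertz sampling rates, this kind of edge pre-processing is what keeps the backbone load manageable, at the cost of losing the raw data that central fusion algorithms would prefer.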
Vehicle functions in the cloud
The trend seems to be merging more and more domains, ultimately aiming for a “server on wheels” by 2030. The autonomous car of the future could become a mobile platform with two central computers, one for ADAS/safety/chassis/powertrain and one for infotainment/connectivity/body/comfort, receiving inputs from multiple sensors or IoT devices in a service-oriented architecture. Thanks to 5G infrastructure, fast and ubiquitous access to the cloud would not only improve the performance of local functions using live online data but also enable reliable on-demand activation and remote execution of cloud-based vehicle functions, substantially reducing the embedded software processing and storage needs in the car.
It will be crucial to reuse software blocks and decouple them from the hardware, adopting network protocols and strategies from the information technology sector. However, effectively dealing with the strict safety requirements and cybersecurity threats facing connected and highly autonomous cars is a huge challenge, and some argue that the automotive world will always remain intrinsically different from the consumer electronics and telecommunications industries.
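The hardware decoupling idea can be sketched in a few lines: application code talks to an abstract device interface, so swapping the silicon vendor only means replacing the driver class underneath. The class and method names here are invented for illustration:

```python
# Sketch of decoupling software from hardware behind an abstract interface
# (names are illustrative): the application layer depends only on a generic
# Camera API, so a different SoC vendor just means a different driver class.
from abc import ABC, abstractmethod

class Camera(ABC):
    @abstractmethod
    def grab_frame(self) -> bytes: ...

class VendorACamera(Camera):
    def grab_frame(self) -> bytes:
        return b"\x00" * 4  # stand-in for a vendor-specific capture call

def driver_present(cam: Camera) -> bool:
    """Hardware-agnostic application code: works with any Camera driver."""
    return len(cam.grab_frame()) > 0

print(driver_present(VendorACamera()))  # True
```

This is essentially the pattern behind automotive software abstraction layers: the reusable block above the interface survives a hardware change untouched.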