Semiconductors Set to Drive Cars
Advanced driver assistance systems (ADAS) are growing at much higher rates than the rest of the automotive component industry, reaching $2 billion in 2016. Over the next 15 years, further functionality and features will ultimately drive the car itself: so-called autonomous driving. The trend also runs through the implementation of data-fusion modules, driven by the need for more "sensing" redundancy and more stringent functional safety requirements (ISO 26262 and ASIL compliance).
Today, semiconductor revenue for front-view camera applications is growing particularly quickly. Its cost position, demand from customers, and encouragement from regional mandates and safety regulations are among the major drivers. As a result, revenue from front-view camera modules is forecast to reach $650 million in 2020, up from $118 million in 2013, a 27.6% CAGR over that timeframe. In addition, radar and LIDAR modules will add $490 million by 2020, up from $177 million in 2013 (a 15.7% CAGR).
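The growth rates quoted above follow the standard compound-annual-growth-rate formula. A minimal sketch, using only the dollar figures cited in this article (the `cagr` helper is illustrative, not from any particular library):

```python
def cagr(start, end, years):
    """Compound annual growth rate between two revenue figures over `years` years."""
    return (end / start) ** (1 / years) - 1

# Front-view camera semiconductor revenue: $118M (2013) -> $650M (2020), 7 years
print(round(cagr(118, 650, 7) * 100, 1))  # 27.6

# Radar and LIDAR modules: $177M (2013) -> $490M (2020), 7 years
print(round(cagr(177, 490, 7) * 100, 1))  # 15.7
```

Both results reproduce the percentages stated above for the 2013-2020 window.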
IHS expects LIDAR modules to play a major role in implementing active control applications in more sophisticated autonomous driving systems, which many OEMs target for 2020-2025, depending on the "flavour" of vehicle autonomy pursued.
However, accurate detection of the surroundings might be insufficient for autonomous driving systems. Additional systems may be needed to monitor the state of the driver and passengers; it is for this reason that driver monitoring and a road frustration index would make autonomous driving more reliable and safer, relying even more on prevention.
A Dynamic Future Market
An important point is that driver assistance is rapidly transitioning from passive systems (warning only) to active ones (active control). Semiconductor revenue for active systems is expected to hit $800 million in 2020, growing at a 25% CAGR from $190 million in 2013. Semiconductor revenue for passive systems is expected to grow more slowly, reaching $1.8 billion in 2020 from $953 million in 2013, a 9% CAGR.
Nonetheless, we are still far from an autonomous vehicle. The term "autonomous" is very attractive but quite generic, and it spans different levels of automation, in which drivers have different roles and responsibilities.
Vehicle autonomy ranges from self-parking and platooning functionality, through self-driving in limited traffic conditions with the driver still maintaining control, up to "fully" autonomous vehicles.
Fully autonomous vehicles are, however, not expected to hit the market before the 2030-2035 timeframe; but they will arrive.
Independently of technologies and electronics, IHS believes that the next 20 years will in fact be required to prepare an entire ecosystem for autonomous vehicles. Mandates and regulations will need to be harmonized across countries, the surrounding infrastructure will need to become part of a vehicle-friendly ecosystem, and the legal implications and liability questions raised by autonomous vehicles will need to be closely considered and regulated.
Furthermore, OEMs will need to build, essentially from scratch, a database of possible situations and test patterns to work with. System and vehicle testing will need to be bullet-proof before autonomous vehicles launch on the market. To this extent, testing, and the effort invested in design for testability, will probably claim a growing share of production costs, in addition to the increasing amount of electronics and software.
ADAS Acceptance and Trend
Today's acceptance and diffusion of several ADAS technologies is already reflected in the market trend toward ADAS multi-function modules, which optimize hardware across several ADAS functionalities. Multi-function systems have been steadily rising for the past couple of years. NCAP regulations have driven car makers to offer more ADAS functions, such as LDW+TSR or LDW+FCW, in order to achieve a 5-star rating.
Since 2012, OEMs' front-view camera modules have implemented at least two applications; only a negligible number of car makers have implemented a single ADAS application with a front-view camera module. The different types of multi-function systems implemented with a front-view camera module are:
- Stand-alone Application: A single application implemented by a front camera module, such as LDW, LKA, or HBC.
- Basic Multi-function Application: A combination of two functions implemented by a front camera module, such as LDW+FCW, LDW+HBC, or LDW+TSR.
- Mid-range Multi-function Application: A combination of three or four functions implemented by a front camera module.
- High-end Multi-function Application: A combination of five or more functions implemented by a front camera module.
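The four tiers above can be expressed as a simple mapping from the number of functions a module implements to its category. A minimal sketch; the function name is hypothetical and the tier labels simply mirror the list above:

```python
def camera_module_tier(num_functions: int) -> str:
    """Classify a front-view camera module by how many ADAS functions it implements."""
    if num_functions < 1:
        raise ValueError("a module must implement at least one function")
    if num_functions == 1:
        return "Stand-alone"              # e.g. LDW only
    if num_functions == 2:
        return "Basic multi-function"     # e.g. LDW+FCW
    if num_functions <= 4:
        return "Mid-range multi-function"
    return "High-end multi-function"

print(camera_module_tier(2))  # Basic multi-function
```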
In the end, the multi-function system approach reflects nothing other than the evergreen trend that has accompanied electronics and semiconductors since their origins: integration and the consequent cost reduction.
On the semiconductor side, integration affects micro-component ICs (controllers and processors) and sensors, in an effort to optimize cost and space when implementing ADAS applications. Both of these semiconductor categories account for the major share of semiconductor revenue for ADAS. This optimization and integration trend has an obvious impact on the overall system's bill of materials.
Keeping the above factors in mind, many technology makers have been trying to achieve cost-effective, viable solutions using different methods:
- High functional integration in a single hardware platform
- High reliance on software for system scalability, yielding high flexibility
- Key requirement: flexible system architecture and high computational power
- Hardware scalability
- System adjustment based on features' attach rate
- Easy decoupling between safety-critical and non-safety-critical hardware
- Easier implementation of redundancy schemes
From the above, the ADAS market emerges as highly dynamic and far from steady state. All the factors influencing this industry, and their consequences for society, make ADAS one of the most attractive and, at the same time, most challenging segments in the automotive space, one that has the potential to reshape the way we drive and live.