The automotive revolution begins with Imagination

Innovating the future of automotive technology and autonomous vehicles with PowerVR Automotive.

Making the future of automotive a reality

The automotive industry is over 100 years old and is a vast global enterprise, with an annual market worth trillions of dollars. Today, in the face of major technological disruption, the industry is undergoing rapid change. To counter rising pollution, high accident rates and congestion, the car industry and governments are looking for technological solutions.

To curb pollution, many countries are looking to move to electric drivetrains by 2040 or earlier, as well as to fuel cell technologies with ultra-low or zero emissions. Automated driving promises to drastically reduce the 1.2 million road deaths that occur worldwide each year, while fully integrated traffic systems in smart cities, along with changes to the model of car ownership, will create more free-flowing roads.

Imagination embedded GPUs offer market-leading performance in a low-power, small-footprint design, bolstered by support for the latest industry APIs. This combination makes its IP ideally suited to the high-performance requirements the automotive industry now demands, even across multiple screens. Its GPUs also deliver the GFLOPS of processing power required for tomorrow’s compute applications inspired by advanced driver assistance systems (ADAS).

Automotive infotainment and cluster

Inside the car today, there is a shift from analogue dashboards to fully digital displays featuring multiple large, high-resolution screens. These will combine the cluster, head-up display (HUD), infotainment, navigation and ADAS into a single complete system – all driven off a single GPU.

In vehicles of the near future, the human-machine interface (HMI) will be more than just dials and digital readouts. As driver assistance systems become increasingly prevalent, there will be a need for more audio-visual cues to inform the driver of what the car is about to do and of hazards ahead. The HUD will become more common, keeping the driver’s eyes on the road at all times. Ensuring that the most relevant information is always highlighted to the driver will require advanced, high-performance 3D graphics.

In-car entertainment

The infotainment system is taking over from traditional audio/video systems. The modern in-car entertainment system features a graphically rich user interface, enabling the driver to interact quickly with services such as DAB radio, the internet, navigation and music, and delivering multi-source data and camera information to the vehicle occupants.

Imagination has long been a leader in the infotainment space. Many DAB radio SoCs are based on Ensigma IP, and thanks to many automotive applications processor partners, Imagination IP is present in tens of millions of cars.

Navigation

As part of the infotainment system, navigation requires outstanding GPU pixel performance to deliver a smooth experience. The unique hardware virtualisation feature of PowerVR GPUs also enables the optimal use of a single GPU for multiple functions within the dashboard systems, offering robustness and reliability, without compromising performance.

Functional Safety and Virtualization

Functional safety is an essential requirement for many automotive applications that demand a high level of reliability, and Imagination is investing in the broadest functional safety solutions in the industry. PowerVR GPUs make it easier for SoC manufacturers to achieve the Automotive Safety Integrity Levels (ASIL) classified by the ISO 26262 standard for automotive safety.

Virtualization is a unique PowerVR feature that enables critical and non-critical systems, such as ADAS and infotainment, to be run on a single GPU simultaneously, without loss of performance, while remaining completely secure. It will also enable the secure deployment of apps and services to the car, providing car manufacturers with the flexibility and confidence to deploy value-added services.
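The scheduling side of this idea can be illustrated in a few lines. The following is a toy Python sketch, not PowerVR's actual mechanism: real hardware virtualization enforces isolation and priority in silicon, whereas here a hypothetical `schedule` function simply gives a safety-critical context first claim on every GPU time slice ahead of a best-effort infotainment context.

```python
from collections import deque

# Toy model of prioritised time-slicing between two virtualised GPU
# contexts: safety-critical work (e.g. the cluster) always pre-empts
# best-effort work (e.g. infotainment). Names and the scheduler itself
# are illustrative assumptions only.
def schedule(slices, critical, best_effort):
    """Drain the critical queue first; fill spare slices with best-effort work."""
    critical, best_effort = deque(critical), deque(best_effort)
    order = []
    for _ in range(slices):
        if critical:
            order.append(critical.popleft())
        elif best_effort:
            order.append(best_effort.popleft())
    return order

run = schedule(4, ["cluster-frame-1", "cluster-frame-2"],
                  ["infotainment-a", "infotainment-b"])
# cluster frames are always serviced before infotainment work
```

The point of the sketch is the invariant, not the code: however many contexts share the GPU, the critical one never waits behind a non-critical one.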


Advanced driver-assistance systems (ADAS)

Most new cars will have some form of ADAS. There are two broad types of ADAS function. Passive or informational functions alert the driver to issues or situations, either visually on the cluster or centre console, or audibly; these include lane departure warning, blind spot detection, reversing cameras and surround view. Active ADAS functions automatically take control of some of the car’s systems, such as braking and steering; for example, automatic emergency braking will apply the brakes when the driver has ignored a previous warning. ADAS functions can be implemented using the compute pipeline of a GPU or a dedicated neural network accelerator.

Lane departure warning

Lane departure warning and lane keep assist are two different use cases. The former warns the driver, via an audio-visual indicator, that they are drifting out of their lane. Lane keep assist, on the other hand, momentarily takes control of the steering to bring the car safely back into its lane.
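The passive/active split described above can be sketched in a few lines of Python. Everything here is a hypothetical illustration: the thresholds, the function names and the proportional steering correction are placeholders, not values or logic from any production system.

```python
# Contrast between lane departure warning (passive: inform only) and
# lane keep assist (active: steer). All thresholds are hypothetical.
WARN_THRESHOLD_M = 0.3    # lateral drift before an audio-visual warning
ASSIST_THRESHOLD_M = 0.5  # drift before the car steers itself back

def lane_departure_warning(offset_m):
    """Passive ADAS: return a driver alert, or None if within lane."""
    if abs(offset_m) > WARN_THRESHOLD_M:
        return "chime + cluster icon"
    return None

def lane_keep_assist(offset_m):
    """Active ADAS: return a corrective steering torque opposing the drift."""
    if abs(offset_m) > ASSIST_THRESHOLD_M:
        return -0.5 * offset_m  # simple proportional correction
    return 0.0
```

For example, a 0.4 m drift would trigger only the warning, while a 0.6 m drift would also produce a corrective steering output.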

Blind spot detect

Blind spot detection covers the gap between the rear-view mirror and the door mirrors, warning the driver either visually (via a light on the door mirror) or audibly that the path into the adjacent lane is not clear. It uses either dedicated cameras or the surround view cameras that may already be in place.

Reversing camera

Reversing cameras are now mandated on all new cars in the US by the National Highway Traffic Safety Administration (NHTSA). The camera can be standalone or part of a surround view camera system.

Surround view

Surround view is becoming the number one ADAS feature added to a car package. It uses a minimum of four cameras, nominally of 2K resolution, whose feeds are stitched together on the GPU to give a 360° view from above the car. This is ideal for parking and for checking for pedestrians.
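The stitching step can be sketched with NumPy: each camera feed is warped onto a common ground-plane canvas with a homography, and the warps are composited into one top-down image. This is a simplified CPU sketch under assumed, placeholder homographies; a real system runs this per pixel on the GPU with calibrated lens models and seam blending.

```python
import numpy as np

def warp_to_ground(image, H, out_shape):
    """Inverse-map each ground-plane canvas pixel through homography H
    into the source camera image (nearest-neighbour sampling)."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = H @ pts
    src = src / src[2]                 # perspective divide
    sx, sy = src[0].round().astype(int), src[1].round().astype(int)
    canvas = np.zeros(out_shape)
    flat = canvas.ravel()
    valid = (0 <= sx) & (sx < image.shape[1]) & (0 <= sy) & (sy < image.shape[0])
    flat[valid] = image[sy[valid], sx[valid]]
    return canvas

def stitch(images, homographies, out_shape):
    """Composite the warped camera feeds; real systems blend the seams,
    here the last non-zero warp simply wins."""
    canvas = np.zeros(out_shape)
    for img, H in zip(images, homographies):
        warped = warp_to_ground(img, H, out_shape)
        canvas = np.where(warped > 0, warped, canvas)
    return canvas
```

With four calibrated homographies (one per camera) the same `stitch` call would produce the 360° bird's-eye composite described above.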

Automotive AI and Neural Networks

As we move along the curve towards fully autonomous driving, a broad rule of thumb says that at least ten times the processing power is required to move from one level to the next. An ADAS function can generally be run 10-20 times faster, and at lower power, on an embedded GPU than on a CPU, thanks to the GPU’s inherent ability to perform large numbers of multiply-accumulate (MAC) operations at a lower clock frequency. Dedicated hardware such as a neural network accelerator is an order of magnitude faster still at compute-based ADAS tasks, with even more modest power consumption.
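The rule of thumb above compounds quickly, which a few lines of arithmetic make concrete. The Level-1 baseline figure below is a hypothetical placeholder, not a measured requirement; only the ten-fold scaling per level comes from the text.

```python
# Back-of-envelope illustration of the "~10x per autonomy level" rule
# of thumb. The baseline compute budget is an assumed placeholder.
BASELINE_GFLOPS = 100   # hypothetical compute budget at Level 1
SCALE_PER_LEVEL = 10    # rule of thumb: 10x per level

def required_gflops(level):
    """Rough compute estimate for autonomy levels 1-5 under the 10x rule."""
    return BASELINE_GFLOPS * SCALE_PER_LEVEL ** (level - 1)

for level in range(1, 6):
    print(f"Level {level}: ~{required_gflops(level):,} GFLOPS")
```

Under that assumption, full autonomy at Level 5 would need four orders of magnitude more compute than the Level-1 baseline, which is why the text points from CPUs to GPUs and on to dedicated neural network accelerators.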