Behind the wheel of a self-driving car

The year is 2032. Former LAPD sergeant John Spartan wakes up after more than three decades in cryogenic sleep to find that everything has changed, including driving.

Almost all the cars in the 1993 movie Demolition Man were self-driving, but of course they were concept cars. Today, more than ten leading car makers (including Audi, BMW, GM, Tesla Motors, Volkswagen, and Volvo) are working on self-driving cars. In addition, Google has developed its own custom vehicle, which has recently racked up over 1 million miles (“the equivalent of 75 years of typical U.S. adult driving”) without any major incidents.

[Image: the self-driving cars of Demolition Man]

Looking at these impressive achievements, many are asking a pertinent question: how long will it be before self-driving cars are a common sight on our roads?

Imagination Technologies is uniquely positioned at the start of a complex automotive food chain that includes semiconductor vendors, system integrators, software developers and car makers. All of these companies are working closely together on the underlying ADAS (advanced driver assistance systems) technologies that will steer the market toward the first commercially available driverless vehicles.

There are currently three main approaches to implementing ADAS functionality inside the car.

One direction relies on storing huge amounts of map data which the car then uses to navigate a given environment; think of this method as a train riding on a set of invisible tracks. An example of this approach is the Google driverless car, which essentially navigates using a set of pre-recorded, street-level HD maps and comparatively little real-time sensing. In this instance, the car relies heavily on high-speed connectivity and sensor aggregation, maintaining a constant link to a cloud infrastructure that supplies the navigational coordinates.

[Image: the Google self-driving car]
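To make the “invisible tracks” analogy concrete, here is a minimal Python sketch of the map-matching idea: a noisy position estimate is snapped to the nearest point of a pre-recorded route. The `localise` function and the sample waypoints are purely illustrative assumptions; Google's actual localization stack fuses lidar, radar and far denser map data.

```python
import numpy as np

# Pre-recorded route: each row is (x, y) in metres, e.g. extracted
# from an HD map survey. A real system would store dense, lane-level
# map features rather than this sparse, hypothetical polyline.
route = np.array([[0.0, 0.0], [5.0, 0.1], [10.0, 0.3], [15.0, 0.2]])

def localise(position_fix, map_points):
    """Snap a noisy position estimate to the nearest pre-recorded
    map point -- the 'invisible tracks' idea in its simplest form."""
    dists = np.linalg.norm(map_points - position_fix, axis=1)
    idx = int(np.argmin(dists))
    return idx, map_points[idx]

# A noisy fix a little off the surveyed line:
idx, snapped = localise(np.array([9.4, 0.9]), route)
print(f"nearest map point #{idx} at {snapped}")  # -> point #2 at [10. 0.3]
```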

In contrast, another technique relies primarily on computer vision processing, with little to no use of pre-recorded maps. This approach more closely replicates a human driver: the car makes real-time decisions based on the rich sensor data and high-performance processors on board. Vehicles in this category typically carry multiple cameras for a wide field of view and use specialized high-performance, low-power chips offering supercomputer-class processing power to run ADAS software and hardware algorithms.
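For a flavour of what such vision processing involves, below is a highly simplified lane-marking detector written with OpenCV, a library commonly used for prototyping these pipelines. This is only a sketch of the classic edges-plus-Hough-transform approach, not the proprietary algorithms running on production ADAS silicon; the thresholds and the synthetic test frame are illustrative choices.

```python
import numpy as np
import cv2  # OpenCV

def detect_lane_segments(frame_bgr):
    """Classic lane-marking detector: greyscale -> blur -> Canny edges
    -> probabilistic Hough transform. Production stacks are far more
    elaborate, but the stages are representative."""
    grey = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(grey, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)
    # Keep only reasonably long, roughly continuous segments:
    return cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                           threshold=40, minLineLength=60, maxLineGap=10)

# Synthetic test frame: a dark road with one white lane marking.
frame = np.zeros((240, 320, 3), dtype=np.uint8)
cv2.line(frame, (60, 230), (160, 40), (255, 255, 255), 4)
segments = detect_lane_segments(frame)
print(0 if segments is None else len(segments), "segment(s) found")
```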

A company pioneering this approach is Mobileye, with its powerful and energy-efficient EyeQ SoCs designed specifically for autonomous cars; for example, a MIPS-based Mobileye EyeQ3 SoC handles the processing required for the highway autopilot feature recently enabled on Tesla cars. Here, Mobileye uses a multicore, multi-threaded MIPS I-class CPU to handle the data streams coming from the multiple cameras installed on the vehicle.

In the block diagram below, the quad-core, quad-threaded interAptiv CPU inside the EyeQ4 SoC acts as the brain of the chip, directing the flow of information from the cameras and other sensors to the 2.5 TFLOPS of processing muscle represented by the VLIW blocks on the right.

[Image: Mobileye EyeQ4 architecture with the MIPS interAptiv M5150 CPU]
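As a conceptual model of that dispatcher role, the Python sketch below has one “CPU” thread distributing camera frames round-robin across queues served by worker threads standing in for the VLIW vision cores. This only illustrates the pattern; the actual EyeQ4 firmware, interfaces and scheduling are Mobileye's own and are not public, and real silicon would use DMA and hardware queues rather than software threads.

```python
import queue
import threading

NUM_ACCELERATORS = 4  # stand-ins for the VLIW vision cores

# One work queue per accelerator, mirroring the CPU-as-dispatcher
# role the interAptiv plays in the block diagram.
work_queues = [queue.Queue() for _ in range(NUM_ACCELERATORS)]

def accelerator(core_id, q):
    while True:
        frame = q.get()
        if frame is None:          # shutdown sentinel
            break
        # ... run the vision kernel on `frame` here ...
        print(f"core {core_id} processed camera frame {frame}")
        q.task_done()

workers = [threading.Thread(target=accelerator, args=(i, q))
           for i, q in enumerate(work_queues)]
for w in workers:
    w.start()

# The dispatcher thread round-robins incoming camera frames:
for frame_id in range(8):
    work_queues[frame_id % NUM_ACCELERATORS].put(frame_id)

for q in work_queues:
    q.put(None)                    # tell each worker to exit
for w in workers:
    w.join()
```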

Finally, a third trend involves reusing general-purpose SoC processors for autonomous driving. For cars that already come equipped with an embedded GPU for infotainment purposes, developers can take advantage of the compute resources of the graphics engine to run algorithms that recognize and track lanes, pedestrians, vehicles, building facades, parking spaces and more. For example, Luxoft’s Computer Vision and Augmented Reality solution uses our PowerVR Imaging Framework and additional software to implement ADAS functionality. The framework is optimized for embedded PowerVR-based hardware, helping developers implement a wide range of in-car autonomous driving functionality quickly and cost-efficiently.

[Image: PowerVR Imaging Framework ADAS demo]
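The workloads being offloaded here are typically per-pixel image kernels, which map naturally onto a GPU's parallel compute cores. As an illustration of the kind of arithmetic involved (this is not the PowerVR Imaging Framework API, which we are not reproducing here), the sketch below implements a Sobel edge filter as a plain NumPy reference; every output pixel depends only on a small neighbourhood, so all pixels can be computed independently and in parallel.

```python
import numpy as np

def sobel_magnitude(grey):
    """Reference Sobel edge filter: the kind of embarrassingly
    parallel, per-pixel kernel that a GPU compute framework would
    accelerate. This NumPy version just shows the arithmetic."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32)
    ky = kx.T
    h, w = grey.shape
    gx = np.zeros((h - 2, w - 2), dtype=np.float32)
    gy = np.zeros_like(gx)
    # Each output pixel depends only on its 3x3 neighbourhood:
    for dy in range(3):
        for dx in range(3):
            win = grey[dy:dy + h - 2, dx:dx + w - 2].astype(np.float32)
            gx += kx[dy, dx] * win
            gy += ky[dy, dx] * win
    return np.sqrt(gx * gx + gy * gy)

img = np.zeros((8, 8), dtype=np.uint8)
img[:, 4:] = 255                       # vertical step edge
print(sobel_magnitude(img).max())      # strong response at the edge
```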

To address the requirements of next-generation automotive applications, Imagination recently released PowerVR Series7, a family of GPUs that implements 10-bit color support and hardware virtualization, two important features for ADAS developers. Having 10-bit color depth inside the GPU enables a car maker to build a high-precision capture-to-display pipeline that spans the image sensor, the video encoder/decoder and the graphics engine.

[Image: the PowerVR vision platform]

This 10-bit capable multimedia subsystem can detect lanes, traffic signs and other road markings more accurately than traditional 8-bit solutions. Hardware virtualization, in turn, offers the ability to run two or three applications in separate containers reliably and securely. For example, an SoC combining a MIPS Warrior CPU and a PowerVR Series7 GPU can execute both the Rightware Kanzi UI infotainment cluster and the Luxoft ADAS solution concurrently on the same chip, providing best-in-class performance and reliability for both.
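A quick back-of-the-envelope illustration of why the extra two bits matter: 10-bit quantization provides four times as many levels as 8-bit, so a faint luminance gradient (a worn lane marking at dusk, say) that collapses into one or two 8-bit values remains distinguishable. The signal values in the NumPy snippet below are invented purely to make the point.

```python
import numpy as np

# A faint luminance ramp spanning a narrow slice of the sensor's
# full scale, normalised to 0.0 .. 1.0 (illustrative values):
signal = np.linspace(0.500, 0.504, 8)

as_8bit = np.round(signal * 255).astype(np.uint16)
as_10bit = np.round(signal * 1023).astype(np.uint16)

print("8-bit :", np.unique(as_8bit).size, "distinct level(s)")   # -> 2
print("10-bit:", np.unique(as_10bit).size, "distinct level(s)")  # -> 5
```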

No matter the approach, self-driving vehicles represent a bright future for automotive companies, sensor providers and the semiconductor industry in general. These cars benefit from advanced SoCs and high-performance microcontrollers featuring hardware memory management, multi-threading and virtualization, capabilities that allow OEMs to implement more sophisticated software applications, including model-based process control, artificial intelligence and advanced vision computing.

I’m really excited to see Imagination and our partners playing a central role in developing hardware and software solutions for autonomous driving in 2016 and beyond.

For more news and updates on MIPS, follow us on Twitter (@ImaginationTech, @MIPSGuru, @MIPSdev), LinkedIn, Facebook and Google+.
