Imagination Technologies is a company with a vision to create fantastic products for innovators, for those who are looking to change the world. As such, we are always thinking ahead to see how we can best deliver that future – a future that’s bright, bold and empowering. At the core of this vision are our employees, who are key in bringing this future to fruition and have an interesting take on the world.

We therefore present the second in a series of interviews with some of our key thinkers, where we’ll learn a little about their background, what makes them tick and then find out what they see coming down the road.

This time we chat to Chris Longstaff, Senior Director of Product & Technology Marketing for PowerVR.

To start us off, Chris, I’d like to find out what makes you tick. How did you get into this industry, and where did your original interest in technology stem from?

“I’ve always been in electronics. I started as a hardware design engineer doing FPGAs for a broadcast video company and then moved into more of a customer-facing role. I then took the leap to Field Application Engineer (FAE), combining customer-facing skills with engineering.

As a child, I had an Acorn Electron and enjoyed playing around on that – not just games but also hacking and coding, so that’s where it stemmed from. I remember plugging it into the television and being impressed with 640 x 200 resolution in two colours! One of the games, I believe it was Exile, did speech synthesis with some very clever modulation – it was cool.

While the electrical side was interesting to me, shortly after university I came across a role that involved video – and I loved it. I love visual things, where you can see the output and the fruits of your labour.

My role at Imagination is a perfect fit since I get to work with the technologies that power some of the world’s most visually interesting products.”

Go here to read the first interview in our Visionary series.

Let’s talk about the future. What trends are we going to see in the short to medium term?

“As we saw at CES this year, drones are really starting to make an impact. People laughed at Amazon’s video of delivery drones, and I’m not convinced that we’ll get to the stage where they are used for mainstream package delivery. However, you could certainly see them used in remote regions for delivery – indeed, we’ve already seen a pizza delivery service via drone come online in New Zealand. In addition, medical drones are already helping save lives.

In a city, we could see them being used in a strict flight path for deliveries between two points. I don’t think they will be dropping into every house, but there is scope for them to be used across a busy city where traffic issues mean you cannot get there with conventional transport.

Then there’s entertainment. When you’re watching a football game and you’ve got the wires across the stadium for the cameras – that’s a natural area where a drone would work. We’ve started to see an interest in ‘free viewpoint TV’, where you could say, ‘I’d like to see that from a different angle’, and extrapolate more information about where the ball is and things like that. Obviously, there are safety issues to be addressed – you’ve got to think about a drone running out of power and dropping into a crowd. However, the benefit would be a free point of view – you can go where you like.

And many sports such as downhill skiing, rallying, and motor racing don’t have these restrictions in terms of needing to fly over crowds or participants. Rather than needing a very expensive helicopter in the air you could use a drone, or even better, several drones.”

What technologies do we need to enable this drone future to happen?

“One of the areas I think we’re still behind on is that most drones are fairly dumb and rely on a human for control.

Leading companies such as DJI are starting to include vision capabilities, with obstacle avoidance technology providing some level of autonomy. This enables the drones to estimate how high they are and to compensate automatically for weather. I think this first level of autonomy will become standard through a combination of GPS and camera-based vision.

The other trend is augmented reality (AR) – I think it will be massive. If you say AR today, people think of Pokemon Go, which is arguably AR but is a limited use case. I think we’re going to see more mixed reality, where you’re mixing real and virtual worlds together.

Today, if you go to a popular tourist attraction such as the Empire State Building you can download an app, scan your smartphone across the horizon and it will recognise landmarks such as the Chrysler Building in the distance. That sort of thing is only going to continue and get better and faster.

The limitation with AR is the huge amount of battery power it requires, as it’s powering all the smartphone’s systems at once. As an industry, we need to enable devices to run much more efficiently.”

Tell me more about how you think AR and mixed reality will touch society.

“A professional use case would be in surgery. Convolutional Neural Networks (CNNs) are very good at spotting and recognising patterns. While a trained doctor may miss something, a CNN engine could rapidly analyse whatever they’re looking at and highlight suggestions on a screen, essentially using machine learning to offer a second opinion: ‘Have you looked at this? According to my intelligent learning, this area here looks like it could be a cancerous growth.’

Or it could help doctors train, by explaining exactly what a doctor should be seeing and how they should be cutting. You can also start to look at more sophisticated use cases, such as replacing an organ with an artificial one. The AR system could visualise it for the doctor – overlaying it in the real world.

Ray tracing could help the images to be more natural with the correct lighting and the correct shadows.”
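To make the ‘second opinion’ idea above a little more concrete, here is a minimal, illustrative sketch in Python using PyTorch. The tiny untrained network, the fake scan and the 0.9 threshold are placeholders of our own invention, not part of any Imagination product or shipping medical system; a real tool would use a trained, clinically validated model.

```python
import torch
import torch.nn as nn

# Illustrative-only toy network: a few convolution layers mapping a
# single-channel scan to a per-pixel "suspicion" score in [0, 1].
class TinyScreeningCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1),  # one score per pixel
        )

    def forward(self, x):
        return torch.sigmoid(self.features(x))

model = TinyScreeningCNN().eval()   # untrained: for illustration only

# A fake 256 x 256 single-channel "scan"; a real system would feed in
# calibrated imaging data.
scan = torch.rand(1, 1, 256, 256)

with torch.no_grad():
    heatmap = model(scan)[0, 0]     # (256, 256) suspicion map

# Regions above a confidence threshold would be highlighted in the AR
# overlay for the clinician to review – a suggestion, not a diagnosis.
flagged = (heatmap > 0.9).nonzero()
print(f"{flagged.shape[0]} pixels flagged for review")
```

The point of the sketch is the shape of the pipeline rather than the model itself: image in, per-region confidence map out, with anything above a threshold surfaced to the clinician as a prompt to look again.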

What’s your estimate of the timescales for this sort of thing?

“Starting with drones, I think we’ll see collision avoidance as standard over the next five years, but fully autonomous flight will be further out. A lot of that will be down to regulation. It’s the same as for autonomous cars – there’s a huge amount of regulation that will have to be created and tested.

In terms of AR, today we already have elements of machine learning available – in the camera, the mobile phone and the apps – but we’ll increasingly see more dedicated hardware for vision processing, to further improve performance and reduce power consumption.”

How would these trends affect the world in 2045?

“2045 is a long way out, but I fully expect that by then we will have fully autonomous drones being used for multiple applications – certainly for delivering food from your local restaurant. I did see a personal drone that someone could get in and fly – a quadcopter. It would make my trip to work a lot easier, but obviously there are a significant number of safety issues around that! Aside from drones, there could even be jetpacks. Yes, seriously! I think vision processing and AR overlaid inside the helmet – Iron Man style – would be needed to make it a reality. And it would be cool!”

What technology does Imagination have that can help bring these things into being? 

“In terms of drones, we have our IP for the camera pipeline where we take data from the camera and make it available at high quality for the vision processor. We also have vision processing technologies. Today, the GPU is a very flexible compute engine and APIs like OpenVX and OpenCL enable us to use the GPU as a high-quality vision-processing engine. Of course, as an IP company, we are always looking at further optimisations for vision processing, and we will see new cores introduced down the line. Security will be an issue in many of the applications we have talked about, and Imagination’s class-leading solutions in the guise of the OmniShield platform for both PowerVR and MIPS IP cores will be critical.”
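As a rough, editorial illustration of using the GPU as a flexible vision engine through an open API, here is a minimal sketch that runs a Sobel edge filter in OpenCL from Python via pyopencl. It is a generic example of our own, not PowerVR-specific code or an OpenVX graph, and it assumes pyopencl is installed with at least one OpenCL device available.

```python
import numpy as np
import pyopencl as cl

# Set up an OpenCL context and command queue on whatever device is available.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

# A dummy 640 x 480 greyscale frame standing in for camera-pipeline output.
height, width = 480, 640
frame = np.random.rand(height, width).astype(np.float32)
edges = np.empty_like(frame)

mf = cl.mem_flags
src_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=frame)
dst_buf = cl.Buffer(ctx, mf.WRITE_ONLY, edges.nbytes)

# One work-item per pixel computes the Sobel gradient magnitude.
kernel_src = """
__kernel void sobel(__global const float *src, __global float *dst,
                    const int width, const int height)
{
    int x = get_global_id(0);
    int y = get_global_id(1);
    if (x <= 0 || y <= 0 || x >= width - 1 || y >= height - 1) {
        if (x < width && y < height) dst[y * width + x] = 0.0f;
        return;
    }
    float gx = src[(y - 1) * width + (x + 1)] + 2.0f * src[y * width + (x + 1)]
             + src[(y + 1) * width + (x + 1)]
             - src[(y - 1) * width + (x - 1)] - 2.0f * src[y * width + (x - 1)]
             - src[(y + 1) * width + (x - 1)];
    float gy = src[(y + 1) * width + (x - 1)] + 2.0f * src[(y + 1) * width + x]
             + src[(y + 1) * width + (x + 1)]
             - src[(y - 1) * width + (x - 1)] - 2.0f * src[(y - 1) * width + x]
             - src[(y - 1) * width + (x + 1)];
    dst[y * width + x] = sqrt(gx * gx + gy * gy);
}
"""

prg = cl.Program(ctx, kernel_src).build()
prg.sobel(queue, (width, height), None, src_buf, dst_buf,
          np.int32(width), np.int32(height))
cl.enqueue_copy(queue, edges, dst_buf)

print("edge magnitude range:", float(edges.min()), float(edges.max()))
```

In practice a production vision stack would chain stages like this (or their OpenVX equivalents) and keep intermediate buffers resident on the GPU to avoid costly copies, which is where much of the performance and power saving comes from.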

Where do you see yourself fitting in?

“Personally, I’m very focused on making sure that our roadmap is very strong for the core markets and applications that we want to serve. I find it quite exciting helping define the products and making sure that we are balancing realistic timescales for projects against market requirements.”

What do you think is unique about Imagination in how it approaches developing technology?

“At Imagination, we take a market-led approach. Sometimes you see companies that try to shoehorn in solutions that are not suitable, but here we are brutally honest in the way we look at every market to assess whether it is a good fit for Imagination and our IP. That’s good news for customers, as they can be confident they are getting the best solution for their market.

“In terms of the core technologies we have, we are extremely strong. We have a strong leadership position in graphics and we are confident that this will continue, but I’m also excited about our ISP and vision capabilities, particularly around GPU compute, and I look forward to seeing how this will develop.”
