Named one of USA Today’s 10 best automotive shows, CES is at the center of making cars safer and drivers more connected. But for all the new features that surprised and delighted the crowd at this year’s show, much of what was revealed only hints at the starring role the automotive sector will one day play in the tech business.

Consider the following: as of 2014, there were an estimated 1.2 billion cars in the world, and this figure was growing at a rate of 50 million per year. That puts the number of cars at roughly one-third the number of Internet users at the time. According to one study published in 2017, the new economy spurred by technological advances in the automotive world will represent a $7 trillion opportunity by 2050.

Optimism about the future viability of autonomous cars is the main driver behind this expected explosion in the automotive technology sector. Fully automated driving still requires highly advanced processing and other technical capabilities that remain a few years out of reach; but as autonomous driving technology improves and drops in cost, its importance will become impossible to miss. Aside from representing a huge market in their own right, autonomous cars will likely have spillover effects that open new markets in state infrastructure, logistics, delivery, building construction, mass transportation, and entertainment, along with other effects we can’t yet imagine.

With so much at stake, CPU manufacturers and others are competing to create a de facto chip standard for autonomous cars, much as the x86 chip has been the standard for PCs. Until fairly recently, one processor manufacturer in particular, NVIDIA, held pole position in the race to supply the chips for future automotive tech. NVIDIA has powered all of Tesla’s vehicles since 2015, and the graphics-chip manufacturer has also forged alliances with Audi and more than 220 other partners that can enable and implement its automotive technologies.

Intel Muscles Its Way to the Front

NVIDIA’s position in the field is now getting a serious challenge from Intel. Over the past couple of years, Intel has spent more than $36 billion on acquisitions that apply directly or indirectly to the autonomous driving sector. (Compare that to the current valuation of Nissan, which is about $38 billion.[1]) The acquired companies have proven expertise in designing application-specific integrated circuits (ASICs) for vision processing or machine learning (ML); one of them, Mobileye, is considered a leader in advanced driver assistance systems (ADAS) and autonomous driving technologies. With this newly acquired expertise and intellectual property, Intel is no longer relying only on its x86 processors for autonomous driving. It has taken a heterogeneous approach to its automotive chipset.

Intel has also been collaborating with other big players in the industry. In 2016, Intel announced it had entered into a new self-driving car alliance with BMW and Israeli automotive technology innovator Mobileye (which it later acquired in 2017). The goal of this automotive-technology alliance is to develop an autonomous driving platform and software development kit (SDK) based on Intel and Mobileye technologies, which can then be used across many different vehicles. Since its creation in 2016, the alliance has expanded to include partners such as Continental, Delphi, Fiat Chrysler Automobiles (FCA), and the Google subsidiary Waymo (which has logged 5 million self-driving miles to date).

One early result of this alliance: Mobileye, now part of Intel, introduced the first autonomous car in its 100-vehicle test fleet during CEO Brian Krzanich’s CES keynote address in Las Vegas. Intel also plans to put a fully self-driving car on the market by 2021.

Intel’s Vision for Self-Driving Cars

The Intel vision for autonomous driving can be described as holistic. It’s compute- and data-driven, and it spans the entire distance from the automobile to the data center, including the wireless technologies and carriers in between.

Here are a few of the components that Intel (including its Mobileye subsidiary) is contributing to its autonomous driving platform:

  • Mobileye EyeQ5 system-on-a-chip (SoC): The EyeQ5 is Mobileye’s fifth-generation SoC for automated driving, providing the vision-sensing and processing capabilities needed for fully autonomous vehicles. The EyeQ5 SoC will first be released in 2018, launching with four OEMs later in the year and with 12 OEMs in 2019. Intel claims a key feature of this chip is its extremely high ratio of deep-learning tera operations per second (TOPS) per watt. That ratio matters because autonomous driving requires high-performance computing without an extensive cooling system; more operations per watt means less heat to dissipate.
  • An automotive SDK: The SDK helps developers create functions that take advantage of computer vision, deep learning, performance libraries, and other features.
  • Automotive 5G platform: Intel has made clear that it sees high-speed wireless connectivity as important for self-driving cars. 5G wireless will enable high-definition map downloads in real time, over-the-air firmware updates, and, perhaps most importantly, sensor-data uploads from cars to data centers for ML. The wireless technology will also be used to talk to other cars and municipal traffic systems. This 5G network platform will be the industry’s first that is specifically created for automobiles.
  • Data-center support: According to Intel, the single most important factor driving the autonomous future is data—how to process, manage, move, share, store, analyze, and learn from it. In Intel’s vision, self-driving cars will count on data centers to store, manage, and process huge volumes of data. ML, especially, will be performed in the data center using Intel architecture–based compute, storage, and networking resources, with results sent back to the car wirelessly via 5G and a software-defined network. More specifically, hardware based on Intel Xeon processors, Intel Xeon Phi processors, and other high-end x86 processors is expected to deliver the high-performance processing needed for artificial intelligence (AI) and other compute-intensive workloads on the back end.
  • On-board ADAS: On-board Intel and Mobileye technologies will use data delivered from the data center to improve driving and navigation, in addition to offering entertainment. Thirty new design wins from 27 separate automakers in 2017 will help ensure that Mobileye technology–powered ADAS solutions will grow substantially beyond the 24 million vehicles already on the road today. A diverse chipset will be used to support these functions.
  • Mobileye Road Experience Management (REM) Mapping: This year, Mobileye will begin collecting REM data, enabled by software embedded on EyeQ4. To this end, in April 2018, Intel announced two new collaborations for Mobileye: one with a large Chinese automaker, SAIC Motor, to develop L3/L4/L5 autonomous vehicles and the other with NavInfo to bring REM mapping to China. Specifically, the purpose of the partnership with SAIC and NavInfo is to use Mobileye’s REM technology to generate a RoadBook in China that is integrated and aligned with NavInfo’s mapping solutions.
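To make the performance-per-watt claim above concrete, here is a back-of-the-envelope sketch. All chip figures in it are hypothetical placeholders for illustration, not published EyeQ5 specifications:

```python
# Back-of-the-envelope comparison of deep-learning efficiency (TOPS/W).
# The chip numbers below are hypothetical, not published EyeQ5 specs.

def tops_per_watt(tops: float, watts: float) -> float:
    """Deep-learning tera operations per second delivered per watt."""
    return tops / watts

# Two hypothetical automotive SoCs with identical raw throughput:
chip_a = tops_per_watt(tops=24.0, watts=10.0)  # efficient design
chip_b = tops_per_watt(tops=24.0, watts=60.0)  # power-hungry design

# For the same perception workload, the less efficient chip dissipates
# six times the heat, which is exactly what forces bulky active cooling
# into a vehicle. A higher TOPS/W ratio shrinks that cooling burden.
print(chip_a, chip_b)  # 2.4 0.4
```

The design point is that for a fixed compute budget, efficiency (TOPS/W), not peak throughput, determines whether a chip can live in a car’s thermal envelope.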

The first use of RoadBook will be as a valuable input to the L2+ and L3 systems that several automakers plan to launch in 2019. The L2+ category is made possible by RoadBook’s low-cost, low-bandwidth footprint: map data is crowdsourced through front-facing cameras on millions of L1/L2 ADAS vehicles at only 10 kilobytes of data per kilometer of driving. That footprint enables a major leap in both lateral (lane-keeping support) and longitudinal (adaptive cruise control) control features at an affordable cost.
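The 10 KB/km figure implies a very modest data burden per vehicle. A rough estimate, in which the fleet size and annual mileage are illustrative assumptions rather than Mobileye figures, might run:

```python
# Rough estimate of crowdsourced REM data volume at 10 KB per km driven.
# Fleet size and annual mileage are illustrative assumptions only.

KB_PER_KM = 10
vehicles = 1_000_000          # hypothetical contributing L1/L2 fleet
km_per_vehicle_year = 15_000  # hypothetical annual distance per vehicle

per_vehicle_mb = KB_PER_KM * km_per_vehicle_year / 1_000  # MB per year
fleet_tb = vehicles * per_vehicle_mb / 1_000_000          # TB per year

print(f"{per_vehicle_mb:.0f} MB per vehicle per year")  # 150 MB
print(f"{fleet_tb:.0f} TB across the fleet per year")   # 150 TB
```

Even under these generous assumptions, each car uploads on the order of a music album’s worth of data per year, which is why a crowdsourced map can ride on ordinary cellular connections.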

In January 2018, Intel also announced Mobileye’s plan to use ride-share fleets to help map cities by mounting Mobileye 8 Connect, Mobileye’s next-generation aftermarket collision-avoidance system powered by EyeQ4 technology, on cabs and ride-share vehicles.

Overall, Mobileye has design wins for advanced L2+ and L3 autonomous systems with 11 automakers that collectively represent more than 50 percent of the auto industry. These include designs launching this year and in 2019.

We’ll Keep an Eye on Intel and Mobileye

Intel’s vision for autonomous driving is inextricably linked to ML, AI, and the data center. Here at Prowess, we have in-house experience with ML and AI, and we regularly produce content that draws on our up-to-date knowledge in these areas. Because we’re in tune with developments in ML and AI, we’re watching the autonomous driving space closely. This is the type of thing we do.

To keep up with more emerging trends and technologies, follow Prowess on our blog, Twitter, and LinkedIn.

[1] Forbes. “Nissan Motor on the Forbes Best Employers for Diversity List.” May 2017. www.forbes.com/companies/nissan-motor/.