This article is original content from AI Front.
With Uber on one side and Volkswagen on the other, Jensen Huang leads 320 partner companies in a bid to unify the autonomous driving arena


Translator | Vincent Debra


Editor | Emily

CES 2018 kicked off in Las Vegas at 8:00 PM on January 7th. Nvidia, which has risen rapidly in recent years to become a giant in consumer electronics, stole the show early on. Huang wore his trademark black leather jacket, as he has done at every Nvidia event.


New products: Boss Huang drops another “nuclear bomb”

Since it was the opening keynote, it had to grab headlines, and Jensen Huang, dubbed the “director of the nuclear bomb factory” by Chinese netizens, was not about to miss the chance to flex his muscles.

First, Huang announced the world’s first autonomous machine processor, Nvidia DRIVE Xavier, an AI supercomputing chip for driverless cars and, by Nvidia’s account, the most complex and largest SoC ever built.

DRIVE Xavier measures 350 mm², packs 9 billion transistors, and delivers 30 trillion deep learning operations per second while drawing only 30 watts, making it 15 times more energy-efficient than its predecessor. It took 2,000 Nvidia engineers four years and $2 billion to develop.

Xavier integrates a custom 8-core CPU, a new 512-core Volta GPU, a new deep learning accelerator, a new computer vision accelerator, and a new 8K HDR video processor.

Huang mentioned both Baidu and the Chinese market in his speech, saying that China is a huge market and that all systems need to be localized for it. Baidu is reportedly equipping each of its self-driving cars with DRIVE Xavier.

In addition to Drive Xavier, Huang also released two autonomous driving platforms:

DRIVE IX, an intelligent-experience software development kit that uses sensors inside and outside the car to provide AI-assisted functions for drivers and passengers;

DRIVE AR, an autonomous driving application platform developed by Nvidia. It combs through fused data from sensors such as surround cameras, radar, and lidar to perceive the outside world, and multiple deep learning and computer vision algorithms then provide the variety and redundancy that L4 and L5 autonomous driving requires, completing localization and path planning.
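The redundancy claim can be made concrete with a small illustration. The sketch below is purely hypothetical Python, not Nvidia’s DRIVE API: it shows one simple way detections from camera, radar, and lidar could be grouped and cross-checked so that an object is trusted only when more than one sensing modality agrees.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str      # "camera", "radar", or "lidar"
    x: float         # position ahead of the car, metres
    y: float         # lateral offset, metres

def fuse(detections, match_radius=1.5):
    """Group detections from different sensors that refer to the same object.

    A purely illustrative nearest-neighbour grouping; real stacks use
    probabilistic trackers (e.g. Kalman filters) over time.
    """
    clusters = []
    for det in detections:
        for cluster in clusters:
            ref = cluster[0]
            if abs(det.x - ref.x) < match_radius and abs(det.y - ref.y) < match_radius:
                cluster.append(det)
                break
        else:
            clusters.append([det])
    return clusters

def confirmed_objects(clusters, min_sensors=2):
    """Keep only objects seen by at least `min_sensors` modalities (redundancy)."""
    return [c for c in clusters if len({d.sensor for d in c}) >= min_sensors]

if __name__ == "__main__":
    frame = [
        Detection("camera", 30.2, 0.1),
        Detection("lidar", 30.0, 0.0),
        Detection("radar", 29.8, -0.2),
        Detection("camera", 55.0, 3.0),   # seen by only one sensor -> unconfirmed
    ]
    objs = confirmed_objects(fuse(frame))
    print(f"{len(objs)} object(s) confirmed by multiple sensors")
```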

In addition, Huang announced an important partnership: Nvidia will join forces with Uber to develop self-driving Uber vehicles, and he also unveiled a new AI computing platform called DRIVE Pegasus. Each Pegasus reportedly contains two Xavier SoCs and two of Nvidia’s next-generation GPUs, and is capable of 320 trillion operations per second. One or two Pegasus units can supply the computing needed for an L5 fully autonomous car and keep a robotaxi running smoothly.

Pegasus
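As a rough sanity check on the figures quoted above, and assuming the throughput not supplied by the two Xavier SoCs is split evenly between the two discrete GPUs (an assumption of this article, not an Nvidia-published breakdown), the arithmetic works out as follows:

```python
# Hypothetical breakdown of Pegasus's quoted 320 trillion ops/s.
xavier_tops = 30          # per Xavier SoC, as quoted above
pegasus_total_tops = 320  # total, as quoted above

gpu_share = pegasus_total_tops - 2 * xavier_tops      # 260 left over for the GPUs
per_gpu_tops = gpu_share / 2                          # assumed even split
print(per_gpu_tops)  # -> 130.0
```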

It emerged at this year’s CES that Uber has been using Nvidia GPU computing since its first test fleet of Volvo XC90 SUVs, and it currently uses high-performance Nvidia processors to run deep neural networks in its self-driving ride-hailing cars and self-driving trucks. Beyond Uber, 25 other companies are using Nvidia’s technology to develop fully autonomous vehicles.

Huang also announced Nvidia’s DRIVE functional safety architecture, which integrates software, algorithms, system design, and other technologies to improve the overall safety of autonomous vehicles.


Highlights from Jensen Huang’s Keynote (not a full transcript)

8:30 – Recaps Volta, the world’s most complex processor, with 21 billion transistors and 125 teraflops of deep learning performance in a roughly 250-watt part; V100-based systems now power the machines at the top of the Top500 list of the world’s fastest supercomputers.

8:42 — Presents the deep learning platform and the company’s research in the following areas:

  • Using AI to predict colors in ray-traced images
  • Using AI to create audio-driven facial animations
  • Using generative adversarial networks (GANs) to synthesize human images and generate virtual imagery (a minimal GAN sketch follows this list)
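For readers unfamiliar with GANs, here is a deliberately tiny, self-contained sketch of the generator-versus-discriminator training loop, written in PyTorch on a toy one-dimensional distribution. It only illustrates the technique named in the bullet above; Nvidia’s face-synthesis work uses far larger image models.

```python
import torch
import torch.nn as nn

# Toy GAN: the generator learns to mimic samples from N(4, 1.5).
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                 # noise -> fake sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())   # sample -> P(real)

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 1.5 + 4.0          # samples from the "true" distribution
    noise = torch.randn(64, 8)
    fake = G(noise)

    # Discriminator: label real samples 1, generated samples 0.
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    loss_d.backward()
    opt_d.step()

    # Generator: try to make the discriminator output 1 on fakes.
    opt_g.zero_grad()
    loss_g = bce(D(G(noise)), torch.ones(64, 1))
    loss_g.backward()
    opt_g.step()

print("mean of generated samples:", G(torch.randn(1000, 8)).mean().item())  # should approach 4.0
```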

Composing music with AI: to celebrate the latest Star Wars movie, Nvidia worked with Disney to create a Star Wars-themed GPU and, to mark the occasion, used AI to compose music in the style of John Williams.

8:50 — AI is going to revolutionize driving, and driving needs a revolution.

There are 82 million car accidents every year, with 1.3 million fatalities and more than $500 billion in resulting costs. Autonomous vehicles (AVs) could be the biggest change in the automotive industry in the next 100 years.

In the future there will be billions more people on the planet, which could mean a threefold increase in traffic and social problems such as a lack of parking spaces.

AVs will revolutionize transportation. In the United States, 3.5 million truck drivers move 10.5 billion tons of goods worth some $75 billion, and the country already faces a shortfall of 50,000 truck drivers.

Solving AV requires solving the problem from the ground up. Unlike game consoles, supercomputers, or personal computers, building a self-driving supercomputer is a huge challenge, because it can never be allowed to fail and it is extraordinarily complex.

So today we are announcing that DRIVE Xavier, the world’s first autonomous machine processor, will ship this quarter. It is by far the most complex and largest system-on-a-chip to date, taking 8,000 engineering years to integrate its 9 billion transistors and $2 billion to develop.

8:56 — How do you build a computer that keeps performing safely when errors occur? That problem, Huang says, is solved through redundancy and diversity.
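A hedged illustration of what redundancy and diversity can mean in practice (not Nvidia’s implementation): compute the same quantity with two independently designed methods, accept the result only when they agree within a tolerance, and otherwise degrade to a safe fallback.

```python
def distance_from_lidar(points):
    """Diverse method #1: nearest lidar return ahead of the car (toy stand-in)."""
    return min(points)

def distance_from_camera(bbox_height_px, focal_px=1000.0, object_height_m=1.5):
    """Diverse method #2: pinhole-camera estimate from a bounding-box height."""
    return focal_px * object_height_m / bbox_height_px

def checked_distance(points, bbox_height_px, tolerance_m=2.0):
    """Accept the estimate only if the two diverse methods agree (redundancy)."""
    a = distance_from_lidar(points)
    b = distance_from_camera(bbox_height_px)
    if abs(a - b) <= tolerance_m:
        return (a + b) / 2.0
    raise RuntimeError("estimators disagree: degrade to a safe fallback behaviour")

print(checked_distance(points=[31.0, 42.5, 80.3], bbox_height_px=50))  # camera gives 30.0 m
```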

DRIVE Xavier has an 8-core CPU, a Volta GPU with 512 CUDA cores and 20 tensor cores, and can process 1.5 gigapixels of camera data per second.

Huang showed the earlier DRIVE PX 2, which used four chips to deliver 24 trillion deep learning operations per second while consuming around 300 watts of power.

The next-generation Xavier delivers 30 trillion operations per second from just 30 watts of power, and will drive the next generation of vehicles.

He pointed out that developers already building on DRIVE PX 2 will be able to develop on Xavier as well.

9:00 — Demonstrates DRIVE Xavier running the complete DriveWorks self-driving software and the DRIVE stack.

9:06 — Baidu and ZF will use DRIVE Xavier in their autonomous vehicles in China. Given that China is the world’s largest market, every car needs to be adapted to the needs of the Chinese market.

9:06 — Next up is NVIDIA DRIVE Pegasus. Designed for robotaxis, it can deliver 320 trillion operations per second of AI inference while drawing just 400 watts.

Huang showed the Pegasus board, which carries two Xavier SoCs and two GPUs; a single board is enough to power a driverless vehicle, replacing what would otherwise be a supercomputer.

I’m pleased to announce a partnership with Aurora and Uber to develop driverless taxis.

9:19 — Safety is critical. So how do we deal with the probability of an error?

The complexity of the whole system is unprecedented.

This requires a holistic plan. Our goal is to achieve the highest level of safety, the ISO 26262 ASIL rating. The SoC has tracing capabilities that can track an error back to its source.

To do this, we work with BlackBerry QNX and TTTech to build our products into their software.

Since it isn’t practical to do all of this testing in the real world, we model the tests in a virtual environment.

With NVIDIA AutoSIM we have done billions of kilometers of testing. Using a DGX supercomputer and an Nvidia simulator, an engineer named Mark demonstrated how the system handles dangerous situations, such as another driver making an unsafe approach toward the car.
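The scenario format of Nvidia’s simulator is not described here, so the sketch below is only a schematic of the idea: a hypothetical `drive_policy` stub is exercised across randomized named scenarios while virtual kilometres and failures are tallied.

```python
import random

def drive_policy(scenario, seed):
    """Hypothetical stand-in for the driving stack under test.

    Returns (kilometres driven, collision occurred).
    """
    rng = random.Random(hash((scenario, seed)))
    km = rng.uniform(0.5, 5.0)
    collision = rng.random() < 0.001      # assumed failure rate, for illustration only
    return km, collision

SCENARIOS = ["cut_in", "night_rain", "pedestrian_crossing", "highway_merge"]

total_km, failures = 0.0, 0
for seed in range(100_000):                       # scale this up toward "billions of km"
    scenario = random.choice(SCENARIOS)
    km, crashed = drive_policy(scenario, seed)
    total_km += km
    failures += crashed

print(f"simulated {total_km:,.0f} km, {failures} failures")
```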

9:28 — Next comes a pre-recorded video, with the system capturing different scenarios and ready to re-run them whenever a new software stack is available.

Functional safety is one of the most important things we work on. All of our software stacks will be ISO certified, and we are working to be the first to have a neural network ISO certified.

We want to turn the whole car into an autonomous machine, to turn the whole car into an AI with sensors all around it. You need AI inside the car, because the cloud is not context-aware.

In the future, all vehicles will be autonomous: roughly 100 million cars, millions of robotaxis, and hundreds of thousands of trucks are produced every year. Most importantly, AI will define the driving experience.

We have created an SDK and a platform that let developers build surround perception, speech recognition and synthesis, natural language processing, eye tracking, head-position tracking, and gesture recognition.

You can tell it to open a window, and it will know which window to open.

With all of these capabilities, the car can track where I am looking and where I am headed. We can push cars to the next level.
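The “open a window” example above can be sketched as the combination of two signals: a spoken command carries the intent, and gaze or head-pose direction disambiguates which window is meant. Every name in the following snippet is hypothetical; none of it comes from Nvidia’s DRIVE IX SDK.

```python
WINDOWS = {"front_left": -60, "front_right": 60, "rear_left": -120, "rear_right": 120}  # gaze angle, degrees

def parse_intent(utterance):
    """Hypothetical speech front-end: map an utterance to an action label."""
    text = utterance.lower()
    return "open_window" if "open" in text and "window" in text else None

def resolve_window(gaze_angle_deg):
    """Pick the window whose direction is closest to where the speaker is looking."""
    return min(WINDOWS, key=lambda w: abs(WINDOWS[w] - gaze_angle_deg))

def handle(utterance, gaze_angle_deg):
    if parse_intent(utterance) == "open_window":
        target = resolve_window(gaze_angle_deg)
        return f"opening {target} window"
    return "no action"

print(handle("Could you open the window?", gaze_angle_deg=55))  # -> opening front_right window
```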

Today we announced a new platform called NVIDIA DRIVE AR, an augmented-reality software stack that highlights the objects the car sees.

9:31 — We’re going to show you something really cool. Justin, an engineer from Nvidia, will demonstrate the virtual reality environment.

The holographic screen showed Justin and a $1.4 million LaFerrari supercar in a test based entirely on augmented reality.

In the future, when you glance down you will get direct feedback from the car, and the system will be able to identify important objects and drive safely.

All of these basic technologies can be implemented with our new SDK.

9:44 — I would say that no single car company is likely to bring AI to the rest of the world on its own; it will more likely take a hundred of them. What we want to do is revolutionize the way people drive, the way goods move, and how we interact with cars.

No other company has changed the way we think about cars more than Volkswagen.

We are working with VW, the world’s largest car company, to inject AI into the future.

Then Dr. Herbert Diess, Volkswagen’s chief executive, took the stage. After Jensen, in his black leather jacket, embraced Herbert, in a blue blazer and dark blue trousers, Huang recalled their first meeting, when he showed Herbert Nvidia’s image-recognition AI and Herbert asked the deep neural network to identify a dachshund. It failed.

“That was the end of the meeting,” Huang said with a smile.

Nvidia has come a long way since then.

‘I think cars will become even more important, even more exciting and sexy than they are now,’ Dr. Diess said, though cars are also associated with pollution, accidents, and other negative headlines. In the future, cars will be more affordable, safer, and greener.

Dr Diess is sceptical of the idea that robotaxis will replace private cars, because everyone wants their own space. “That’s why I think personal cars have a bright future.”

‘The way cars were designed in the past was very complicated,’ Mr. Huang said. ‘Cars were heavy, and safety was paramount.’ But now cars are moving toward electrification without internal combustion, toward AI, and toward connectivity, and they are constantly being updated. These three elements fundamentally change everything.

Asked by Huang how the cars of the future will change, Dr Diess said they will still need to be safe, but will also carry far more technology.

‘In the future, we will form a triangle of cooperation, with more collaboration in development and in the later stages of the process,’ Mr. Huang said. One challenge is how to bring AI technology to millions of cars.

On the holographic screen, Huang showed off a retro green minibus, followed by their new electric vehicle, the Volkswagen ID.Buzz, which is also green.

Many people would like to see the classic microbus brought back to life, and Huang said Nvidia has done just that in VR.


Expanding partnerships and seizing the market: is Huang out to dominate the autonomous driving arena?

CES has only just begun, and Huang has already stolen the limelight, a reminder of the strength of this “nuclear bomb factory.” Attentive readers will have noticed that over the past two years Huang has been building out his autonomous driving portfolio at an almost frantic pace, releasing new autonomous driving products on average every three months or less. Is Huang trying to dominate the world of autonomous driving?

Recall that Nvidia was founded as a semiconductor company designing primarily graphics processors. More than two decades later, it has become one of the leaders in smart chips, especially in autonomous driving, where, thanks to its computing power, its chips and autonomous driving solutions account for half the market.

At CES, Huang said that more than 320 companies have partnered with Nvidia in the autonomous driving field, up by more than 100 from last year’s 200-plus, including well-known automakers such as Tesla. The new partnership with Uber is another big step toward dominating the autonomous driving landscape.

As is well known, Uber has been aggressively pursuing autonomous driving in recent years, pouring money and people into robotaxi research. Since 2015, Uber has run dozens of autonomous driving (L3-L4) tests in the United States; its self-driving cars have covered more than 2 million miles and completed more than 50,000 trips. Its self-driving plans, however, have been slowed by the recent legal battle with rival Waymo over allegedly leaked technology.

As the cooperation with Nvidia deepens, Uber’s autonomous driving research should pick up speed again. Nvidia provides the technical soil for companies like Uber, while Uber and others feed Nvidia enormous amounts of data; with so many parties working together, autonomous driving technology seems bound to leap forward. Looking back at China, domestic autonomous driving companies led by Baidu have also received help from, or reached cooperation with, Nvidia from various angles, and the Nvidia-Baidu partnership is even actively “opening privileges” for China’s autonomous driving. Beyond technology companies, Nvidia also announced a partnership with Volkswagen at CES. With the world’s largest carmaker now on board, this powerful combination looks poised to sweep through the world’s autonomous driving players.

“Autonomous vehicles will be on the road within four years,” Huang has said. Looking at Nvidia’s moves in autonomous driving in recent years, that promise may not be far from being realized. What will the autonomous driving landscape look like under Nvidia’s leadership? That is something to look forward to.


One side drops nukes, the other ships patches: is Intel really not in a hurry?

Nvidia’s “nukes” keep coming, but the recent situation at Intel, the other chip giant, is a little sad. On the Internet, Nvidia and Intel each have a nickname: the nuke factory and the toothpaste factory.

The nicknames trace back to how quickly, and how well, each company ships new chips. On speed, Nvidia releases new products on average every three to five months, whereas Intel’s most recent notable announcement was Nervana last September. On quality, Nvidia’s new products are “nuclear bombs” that multiply computing power, while Intel’s releases always seem underwhelming, never going far enough, and never as gleeful as Nvidia’s claims of “the most powerful on earth” or “the fastest ever.”

A few days ago, though, something happened that makes people feel for Intel: its CPUs were found to contain serious vulnerabilities exploitable via “side-channel analysis” (the flaws now known as Meltdown and Spectre), putting the company in a difficult position, and its subsequent public relations response, widely seen as blame-shifting, has angered netizens. The chip giant is now busy patching everywhere, which makes you wonder: what has Intel been doing these past few years?

Intel, the world’s largest semiconductor company, invested heavily in new microprocessor designs and nurtured the burgeoning PC industry in the 1990s. During this period Intel became the leading supplier of PC microprocessors and, like Microsoft, positioned itself to dominate the PC industry with aggressive and sometimes controversial marketing. Millward Brown Optimor’s 2007 ranking of the world’s most powerful brands showed Intel falling 10 places in brand value, from 15th to 25th. Its main competitors are AMD, Nvidia, and Samsung.

Even amid the rising tide of artificial intelligence in recent years, and despite launching AI products such as the deep learning framework BigDL and the Nervana AI chip, Intel’s overall posture has seemed lukewarm, never in much of a hurry.

On March 13, 2017, Intel agreed to buy Mobileye, an Israeli self-driving technology company, for $15.3 billion. Just when everyone thought Intel was finally hitting its stride in autonomous driving, it went quiet again, and Mobileye has released few new products since the acquisition, as if infected by Intel’s unhurried temperament. Meanwhile Nvidia has been aggressively expanding its partnership network, adding nearly 300 autonomous driving partners in 2017 alone, and now, with Volkswagen, the world’s largest automaker, on board, Intel and Nvidia seem to be drifting further apart in autonomous driving.
