Brian Krzanich Shares New Details on Advances in Autonomous Driving and the Future of Artificial Intelligence


In the opening keynote for CES 2018, Intel CEO Brian Krzanich highlighted how data is transforming the world around us and driving the next great wave of technology innovation, from autonomous driving to artificial intelligence (AI) to virtual reality (VR) and other forms of media.

In autonomous driving, Krzanich showed Intel’s first autonomous vehicle in its 100-car test fleet; disclosed that BMW, Nissan, and Volkswagen are moving their Mobileye-based mapping design wins to actual deployments; and announced new collaborations with SAIC Motor and NavInfo to extend crowdsourced map building to China. Turning to the future of AI, Krzanich announced a partnership with Ferrari North America to use Intel’s AI technologies to apply data from the racetrack to enhance the experience for fans and drivers. In immersive media, he introduced the newly established Intel Studios and announced that Paramount Pictures will be the first major Hollywood studio to explore this technology in tandem with Intel for the next generation of visual storytelling.

“Data is going to introduce social and economic changes that we see perhaps once or twice in a century,” Krzanich said. “We not only find data everywhere today, but it will be the creative force behind the innovations of the future. Data is going to redefine how we experience life – in our work, in our homes, how we travel, and how we enjoy sports and entertainment.”

In autonomous driving, Krzanich announced that, starting this year, 2 million vehicles from BMW, Nissan and Volkswagen will use Mobileye Road Experience Management (REM) technology to crowdsource data for building and rapidly updating low-cost, scalable, high-definition maps.

For the China market, Krzanich disclosed two partnerships with leading automotive manufacturer SAIC Motor and digital mapping company NavInfo. In addition, SAIC will develop Level 3, 4 and 5 cars in China based on Mobileye technology. Levels are assigned based on capacity for autonomy – a Level 4 vehicle can drive itself without human interaction in most conditions, and a Level 5 vehicle can drive itself without human interaction on any road.

Krzanich also disclosed details for the company’s new automated driving platform, which combines automotive-grade Intel Atom processors with Mobileye EyeQ5 chips to deliver a platform with scalability and versatility for L3 (Level 3) to L5 (Level 5) autonomous driving.

Beyond autonomous cars, Krzanich also demonstrated the Volocopter, a fully electric, vertical takeoff and landing aircraft designed for passenger transport. The Intel Flight Control Technology used in the Volocopter is based on the intelligence found in the Intel Falcon 8+ drone used for inspection, surveying and mapping, showing the powerful intersection of data and autonomous technology.

Addressing artificial intelligence, Krzanich showcased how companies are using Intel’s technology to transform their businesses through AI. He announced Intel is partnering with Ferrari North America to bring the power of AI to the Ferrari Challenge North America Series, which will take place on six courses in the U.S. this year. The Ferrari Challenge broadcast will use the processing power of Intel Xeon Scalable processors and the neon™ framework for deep learning not only to transcode, identify objects and events, and stream the experience to viewers online, but also to mine the resulting data for further insights for drivers and fans.

Looking ahead to the future of computing, Krzanich noted Intel’s promising research into neuromorphic computing, a new type of computing architecture that mimics the way brains observe, learn and understand. Intel’s neuromorphic research prototype chip (“Loihi”) is now fully functioning and will be shared with research partners this year.

Krzanich also announced the next milestone in Intel’s efforts to develop a quantum computing system. Intel shipped its first 49-qubit quantum computing test chip (“Tangle Lake”) to research partner QuTech. The chip is named after a chain of lakes in Alaska, a nod to the extreme cold temperatures and the entangled state that quantum bits (or “qubits”) require to function.

Achieving a 49-qubit test chip is an important milestone because it will allow researchers to assess and improve error correction techniques and simulate computational problems.

In his keynote, Krzanich predicted that quantum computing will solve problems that today might take our best supercomputers months or years to resolve, such as drug development, financial modeling and climate forecasting. While quantum computing has the potential to solve problems conventional computers can’t handle, the field is still nascent.

The need to scale to greater numbers of working qubits is why Intel, in addition to investing in superconducting qubits, is also researching another type called spin qubits in silicon. Spin qubits could have a scaling advantage because they are much smaller than superconducting qubits. A spin qubit resembles a single-electron transistor, which is similar in many ways to conventional transistors and potentially able to be manufactured with comparable processes. In fact, Intel has already invented a spin qubit fabrication flow on its 300mm process technology.

Expanding on Intel’s neuromorphic research, Krzanich described how this brain-inspired computing paradigm could unlock exponential gains in performance and power efficiency for the future of artificial intelligence.

Intel Labs has developed a neuromorphic research chip, code-named “Loihi,” which includes digital circuits that mimic the brain’s basic operation. Loihi combines training and inference on a single chip with the goal of making machine learning more power efficient.

Neuromorphic chips could ultimately be used anywhere real-world data needs to be processed in evolving real-time environments. For example, these chips could enable smarter security cameras and smart-city infrastructure designed for real-time communication with autonomous vehicles.

In the first half of this year, Intel plans to share the Loihi test chip with leading university and research institutions while applying it to more complex data sets and problems.

In addition to enabling AI and the autonomous future, Krzanich discussed how data can transform other everyday experiences, such as entertainment and media. He announced the debut of Intel Studios, a newly constructed, state-of-the-art studio dedicated to the production of large-scale, volumetric content – using Intel True View technology – that will create new forms of visual storytelling with and without VR. Intel Studios features the world’s largest volumetric video stage and a post-production and control facility. Paramount Pictures is the first major Hollywood studio to explore this technology in tandem with Intel.

In sports, Krzanich announced that Intel will enable the largest-scale virtual reality event to date at the Olympic Winter Games PyeongChang 2018 using Intel True VR technology. Intel, together with the official Rights Holding Broadcasters, will capture a record 30 Olympic events, with both live and video-on-demand content available. This marks the first-ever live virtual reality broadcast of the Olympic Winter Games and will be available in the U.S. via a forthcoming NBC Sports VR app.

Finally, Intel achieved a new Guinness World Records title for the most UAVs airborne simultaneously from a single computer indoors when Krzanich presented a spectacular indoor light show performed by 100 Intel Shooting Star Mini drones, or UAVs (unmanned aerial vehicles).



Copyright © 2018 NEURALSCULPT.COM