Intel and Mobileye to test 100-car autonomous vehicle fleet on the streets of Jerusalem

The technology verifies the safety of the system's design without requiring billions of miles driven by unproven vehicles on public roads.

By Autocar Pro News Desk, 17 May 2018

Autonomous driving technology from Intel and Mobileye is being tested on the roads of Jerusalem to demonstrate that the Responsibility-Sensitive Safety (RSS) model increases safety and to integrate key learnings into their products and customer projects. In the coming months, the fleet will expand to the U.S. and other regions. 

Amnon Shashua, a computer science professor at the Hebrew University of Jerusalem, senior vice president at Intel Corporation, and co-founder and CTO of Mobileye, an Intel company, says the project's target is a vehicle that gets from point A to point B faster, more smoothly and less expensively than a human-driven vehicle; can operate in any geography; and achieves a verifiable, transparent safety improvement over a human driver, all without billions of miles of validation testing on public roads.

The Mobileye team must strike a tricky balance: the vehicle should drive with a human-like style (so as not to surprise other drivers) but without making human errors. To achieve this, the Mobileye AV fleet separates the system that proposes driving actions from the system that approves (or rejects) them. Both systems are fully operational in the current fleet.


To prove that an end-to-end solution can be built on camera data alone, the team has installed eight cameras providing a long-range 360-degree view, plus four cameras for parking. The system can detect road users, drivable paths and the semantic meaning of traffic signs and lights; create HD maps in real time; manoeuvre with centimetre-level accuracy; and perform path planning and vehicle control. Shashua explains that RSS formalises the common-sense principles of safe driving into a set of mathematical formulas a machine can understand (safe following and merging distances, right of way, and caution around occluded objects, for example). If the AI-based software proposes an action that would violate one of these principles, the RSS layer rejects the decision.

Put simply, the AI-based driving policy is how the AV gets from point A to point B; RSS is what prevents the AV from causing dangerous situations along the way.
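To make the idea concrete, here is a sketch of the safe longitudinal following distance from the published RSS formulation. The parameter values (response time, acceleration and braking limits) are illustrative assumptions, not Mobileye's production settings.

```python
def rss_safe_longitudinal_distance(v_rear, v_front, rho=1.0,
                                   a_max_accel=3.0, a_min_brake=4.0,
                                   a_max_brake=8.0):
    """Minimum safe gap (m) behind a lead vehicle per the RSS model.

    v_rear, v_front: speeds in m/s; rho: response time in s;
    accelerations in m/s^2 (illustrative values, not Mobileye's).
    """
    # Worst case for the rear car: it accelerates for the whole response
    # time, then brakes only at its minimum (comfortable) rate...
    v_after_response = v_rear + rho * a_max_accel
    rear_travel = (v_rear * rho
                   + 0.5 * a_max_accel * rho ** 2
                   + v_after_response ** 2 / (2 * a_min_brake))
    # ...while the front car brakes as hard as physically possible.
    front_travel = v_front ** 2 / (2 * a_max_brake)
    return max(0.0, rear_travel - front_travel)

# An action proposed by the driving policy would be rejected if it
# shrinks the gap below this bound.
gap = rss_safe_longitudinal_distance(v_rear=20.0, v_front=20.0)
```

Even at equal speeds the bound is non-zero, because the rear vehicle must budget for its own response lag and its gentler braking capability.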

Incorporating multiple independent camera sensors

True redundancy, demonstrated in this camera-only phase, means a sensing system consisting of multiple independently engineered sensing systems, each of which can support fully autonomous driving on its own. This is in contrast to fusing raw sensor data from multiple sources early in the process, which in practice yields a single sensing system. True redundancy provides two major advantages. First, the amount of data required to validate the perception system is massively lower (the square root of 1 billion hours versus 1 billion hours). Second, if one of the independent systems fails, the vehicle can continue operating safely, whereas a vehicle with a low-level fused system must cease driving immediately. A useful analogy for the fused system is a string of Christmas tree lights, where the entire string fails when one bulb burns out.

Incorporating the computing hardware for level 5 autonomy

The end-to-end computer system in the AV fleet is powered by four Mobileye EyeQ4 SoCs. An EyeQ4 delivers 2.5 tera operations per second (TOPS) for deep networks with an 8-bit representation, while running at 6 watts. Launched in 2018, the EyeQ4 is Mobileye's latest SoC; this year will see four production launches, with an additional 12 slated for 2019. The Mobileye EyeQ5 is being developed to support fully autonomous driving: at 24 TOPS, it is roughly 10 times more powerful than an EyeQ4. For production, Mobileye and Intel plan for three EyeQ5s to power Level 4 and Level 5 autonomous vehicles.
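The quoted figures can be sanity-checked with some quick arithmetic (the article gives no power figure for the EyeQ5, so only throughput is compared):

```python
# Chip figures as quoted in the article.
eyeq4_tops = 2.5    # tera operations/s, 8-bit deep networks
eyeq4_watts = 6.0   # power draw of one EyeQ4
eyeq5_tops = 24.0

speedup = eyeq5_tops / eyeq4_tops      # 9.6x, i.e. "roughly 10 times"
fleet_tops = 4 * eyeq4_tops            # current fleet: four EyeQ4s = 10 TOPS
production_tops = 3 * eyeq5_tops       # planned: three EyeQ5s = 72 TOPS
perf_per_watt = eyeq4_tops / eyeq4_watts   # ~0.42 TOPS/W for the EyeQ4
```

So the planned three-EyeQ5 production configuration offers roughly seven times the compute of today's four-EyeQ4 fleet vehicles.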
