The United Arab Emirates is a federation of seven constituent monarchies: the Emirates of Abu Dhabi, Ajman, Dubai, Fujairah, Ras al-Khaimah, Sharjah, and Umm al-Quwain.
The United Arab Emirates has just granted a nationwide license to China’s WeRide to operate its robotaxis and other autonomous vehicles on the country’s roads, making the UAE the first country to license self-driving cars nationwide.
WeRide’s primary focus is Level 4 autonomous driving technology: highly automated driving in which the vehicle can operate without human intervention under defined conditions or within specific geographic areas. The company aims to provide safe and efficient transportation by leveraging artificial intelligence, deep learning, sensor fusion, and other advanced technologies.
The company has made significant progress in developing and deploying autonomous vehicles. It has conducted extensive testing and validation of its self-driving technology on public roads in China, accumulating millions of kilometers in real-world driving scenarios, and has collaborated with automakers, technology partners, and government agencies to advance autonomous driving and explore new business models.
Beyond the core technology, WeRide has launched robotaxi services in select Chinese cities, letting users hail a self-driving vehicle through a mobile app and experience autonomous transportation firsthand. The company is working to expand these services and make autonomous mobility accessible to more people.
Level 4 (L4) autonomy has several defining characteristics.
Limited operational conditions: L4 autonomous vehicles operate only within a defined set of conditions and areas, typically specific geographic locations, weather conditions, or road types. Outside of these predefined conditions, the vehicle may require human intervention or be unable to operate (a minimal sketch of this gating logic appears after these characteristics).
High-level perception and decision-making: L4 autonomous vehicles incorporate advanced sensor systems, such as lidar, radar, and cameras, to perceive and understand their surroundings. These sensors enable the vehicle to detect and track objects, recognize traffic signs and signals, and navigate complex road scenarios.
Real-time mapping and localization: L4 autonomous vehicles often rely on highly detailed and up-to-date maps for precise navigation. These maps provide essential information about road geometry, lane markings, traffic flow, and other relevant details. Simultaneously, the vehicle uses various localization techniques, such as GPS and sensor fusion, to accurately determine its position within the mapped environment.
Redundancy and fail-safe mechanisms: L4 autonomous driving technology includes redundant systems and fail-safe mechanisms to ensure safety. Redundancy means duplicate sensors, controllers, and other critical components that can take over if a primary system fails; fail-safe mechanisms let the vehicle respond appropriately and bring itself to a safe state if a critical failure is detected.
Human override capability: While L4 autonomous vehicles can operate without human intervention in their designated operational conditions, they may still allow human passengers to take control if necessary. For example, if the vehicle encounters a situation it cannot handle or if the human occupants desire manual control, they can override the autonomous system and drive the vehicle manually.
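WeRide has not disclosed its internal safety logic, but the interplay of these characteristics can be illustrated with a small sketch. The Python below is hypothetical (all names are assumptions, not WeRide code): it shows how a vehicle might gate autonomous operation on its operational design domain (ODD), fall back to a minimal-risk stop on a critical fault, and yield to a human override.

```python
from dataclasses import dataclass
from enum import Enum, auto

class DrivingMode(Enum):
    AUTONOMOUS = auto()
    MINIMAL_RISK_STOP = auto()   # fail-safe: pull over / stop safely
    MANUAL = auto()              # a human has taken control

@dataclass
class VehicleState:
    inside_geofence: bool        # within the licensed operating area
    weather_ok: bool             # e.g. no sandstorm or heavy rain
    road_type_supported: bool    # e.g. mapped urban roads only
    critical_fault: bool         # redundant-system health check failed
    human_override_requested: bool

def select_mode(state: VehicleState) -> DrivingMode:
    """Decide the driving mode for the next control cycle (illustrative only)."""
    # Human override always wins: occupants can take manual control.
    if state.human_override_requested:
        return DrivingMode.MANUAL
    # A critical failure triggers the fail-safe behavior.
    if state.critical_fault:
        return DrivingMode.MINIMAL_RISK_STOP
    # Autonomous operation is allowed only inside the ODD.
    inside_odd = (state.inside_geofence
                  and state.weather_ok
                  and state.road_type_supported)
    return DrivingMode.AUTONOMOUS if inside_odd else DrivingMode.MINIMAL_RISK_STOP

# Example: leaving the geofence forces a minimal-risk stop.
print(select_mode(VehicleState(False, True, True, False, False)))
```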
WeRide Technology
Perception: Perception refers to the ability of an autonomous vehicle to perceive and understand its surroundings using various sensors. These sensors typically include cameras, LiDAR (Light Detection and Ranging), radar, and sometimes ultrasonic sensors. Cameras capture visual information, LiDAR uses laser beams to measure distances, radar detects objects using radio waves, and ultrasonic sensors provide close-range detection.
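As a rough illustration of the perception idea, here is a toy "late fusion" sketch in Python: detections from a camera-based and a LiDAR-based detector are associated by distance and merged. This is a simplification for illustration only, not WeRide’s pipeline.

```python
import math

# Toy detections: (x, y) positions in the vehicle frame, in meters.
camera_detections = [(12.0, 1.5), (30.0, -3.0)]   # from an image-based detector
lidar_detections  = [(12.3, 1.4), (8.0, 5.0)]     # from a LiDAR point-cloud detector

def fuse(camera, lidar, max_dist=1.0):
    """Late fusion: average detections that agree, keep the rest as single-sensor objects."""
    fused, used = [], set()
    for cx, cy in camera:
        # Find the closest unused LiDAR detection within max_dist.
        best, best_d = None, max_dist
        for i, (lx, ly) in enumerate(lidar):
            d = math.hypot(cx - lx, cy - ly)
            if i not in used and d < best_d:
                best, best_d = i, d
        if best is not None:
            lx, ly = lidar[best]
            used.add(best)
            fused.append(((cx + lx) / 2, (cy + ly) / 2))   # agreement: average
        else:
            fused.append((cx, cy))                         # camera-only object
    fused += [p for i, p in enumerate(lidar) if i not in used]  # LiDAR-only objects
    return fused

print(fuse(camera_detections, lidar_detections))
```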
Prediction: Prediction involves anticipating the future behavior of objects and entities around the autonomous vehicle. AI algorithms analyze sensor data to predict the trajectories and intentions of pedestrians, cyclists, and other vehicles, allowing the vehicle to make informed decisions.
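A minimal, hedged example of prediction is constant-velocity extrapolation: assume an object keeps its current speed and heading and project its positions forward. Production systems use learned behavior models, but the sketch shows the basic idea.

```python
def predict_trajectory(position, velocity, horizon_s=3.0, dt=0.5):
    """Extrapolate an object's future positions assuming constant velocity."""
    x, y = position
    vx, vy = velocity
    steps = int(horizon_s / dt)
    return [(x + vx * t * dt, y + vy * t * dt) for t in range(1, steps + 1)]

# A pedestrian 10 m ahead, drifting toward the lane at 1 m/s.
print(predict_trajectory(position=(10.0, 3.0), velocity=(0.0, -1.0)))
```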
Localization: Localization is the process of determining the precise position of the autonomous vehicle within its environment. This is typically achieved using a combination of sensor data, such as GPS (Global Positioning System), IMU (Inertial Measurement Unit), and odometry (based on wheel rotations). Additionally, advanced localization techniques, such as SLAM (Simultaneous Localization and Mapping), can be employed to create and update a map of the vehicle’s surroundings.
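The sketch below illustrates the localization concept with dead reckoning from odometry plus occasional GPS corrections. The fixed blending weight is a stand-in for what a real stack would do with a Kalman or particle filter; it is not WeRide’s implementation.

```python
import math

class SimpleLocalizer:
    """Dead reckoning from wheel odometry, nudged toward GPS fixes when available."""

    def __init__(self, x=0.0, y=0.0, heading=0.0):
        self.x, self.y, self.heading = x, y, heading

    def update_odometry(self, distance, heading_change):
        """Integrate wheel-rotation distance and yaw change (IMU/odometry)."""
        self.heading += heading_change
        self.x += distance * math.cos(self.heading)
        self.y += distance * math.sin(self.heading)

    def update_gps(self, gps_x, gps_y, weight=0.3):
        """Blend in a (noisy) GPS fix; a real stack would use a proper filter."""
        self.x = (1 - weight) * self.x + weight * gps_x
        self.y = (1 - weight) * self.y + weight * gps_y

loc = SimpleLocalizer()
loc.update_odometry(distance=5.0, heading_change=0.0)   # drove 5 m straight
loc.update_gps(4.8, 0.2)                                # GPS roughly agrees
print(round(loc.x, 2), round(loc.y, 2))
```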
Sensor Units: Sensor units in autonomous vehicles consist of a combination of cameras, LiDAR, radar, and other sensors. These units are strategically placed on the vehicle to provide comprehensive coverage of the environment. The specific configuration and placement of these sensors can vary depending on the autonomous vehicle’s design and requirements.
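One way to make the idea of a sensor suite concrete is to describe it as configuration data. The layout below is entirely hypothetical; actual sensor counts, mounts, and ranges differ by vehicle and requirements.

```python
from dataclasses import dataclass

@dataclass
class SensorUnit:
    kind: str            # "camera", "lidar", "radar", "ultrasonic"
    mount: str           # where it sits on the vehicle
    fov_deg: float       # horizontal field of view
    max_range_m: float   # usable detection range

# A hypothetical suite for illustration only.
sensor_suite = [
    SensorUnit("lidar", "roof-center", 360.0, 200.0),
    SensorUnit("camera", "windshield", 120.0, 150.0),
    SensorUnit("radar", "front-bumper", 90.0, 250.0),
    SensorUnit("ultrasonic", "rear-bumper", 70.0, 5.0),
]

# Quick sanity check: the shortest range among forward-facing sensors.
forward = [s for s in sensor_suite if s.mount in ("windshield", "front-bumper")]
print(min(s.max_range_m for s in forward))
```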
Simulation: Simulation plays a crucial role in the development and testing of autonomous driving systems. By creating virtual environments, engineers can assess the performance and behavior of the AI algorithms in various scenarios, including rare and dangerous situations that are difficult to encounter on real roads. Simulations allow for extensive testing and refinement before deploying the technology on public roads.
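A simulation can be as small as a one-dimensional braking scenario swept across speeds that would be dangerous to stage on a real road. The sketch below is illustrative only; industrial simulators model full 3D scenes, sensor noise, and surrounding traffic.

```python
def simulate_braking(ego_speed, pedestrian_distance, reaction_s=0.5, decel=6.0, dt=0.1):
    """Step a 1-D scenario forward and report whether the ego vehicle stops in time."""
    position, speed, t = 0.0, ego_speed, 0.0
    while speed > 0:
        if t >= reaction_s:                       # brakes engage after the reaction time
            speed = max(0.0, speed - decel * dt)
        position += speed * dt
        t += dt
        if position >= pedestrian_distance:       # reached the pedestrian while still moving
            return False
    return True

# Sweep scenarios at several approach speeds (m/s).
for v in (10.0, 15.0, 20.0):
    print(v, simulate_braking(ego_speed=v, pedestrian_distance=30.0))
```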
Remote Assistance: Remote assistance involves human operators providing guidance or taking control of an autonomous vehicle when necessary. This can be done through a remote control center where operators monitor the vehicle’s operations and intervene if needed. Remote assistance is particularly useful during challenging situations or when encountering complex or uncertain scenarios that the AI system may struggle to handle autonomously.
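Remote assistance implies some request/response exchange between the vehicle and an operations center. The message format below is invented for illustration; WeRide has not published its actual protocol, and all field names are assumptions.

```python
import json

def build_assistance_request(vehicle_id, reason, snapshot):
    """Package a remote-assistance request the vehicle could send to an operations center."""
    return json.dumps({
        "vehicle_id": vehicle_id,
        "reason": reason,               # why the autonomy stack is asking for help
        "snapshot": snapshot,           # compact scene summary for the operator
        "requested_action": "advise",   # the operator may advise a path or take control
    })

def handle_operator_reply(reply_json):
    """Apply the operator's decision: a suggested maneuver or a full takeover."""
    reply = json.loads(reply_json)
    if reply["action"] == "take_control":
        return "Operator is driving remotely until the scene is resolved."
    return f"Executing operator-approved maneuver: {reply['maneuver']}"

request = build_assistance_request(
    "weride-001", "unmapped construction zone blocking the lane",
    {"speed_mps": 0.0, "blocked_lanes": [1]})
print(handle_operator_reply('{"action": "advise", "maneuver": "nudge_left_0.5m"}'))
```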