Autonomous vehicle software refers to the use of software in the development of advanced driver assistance systems and autonomous vehicles and driving systems.
Software is often a differentiating factor between one vehicle and the next in terms of capability, performance, and self-driving experience. The software is critical to the vehicle's ability to safely operate advanced driver assistance systems and autonomous driving capabilities. Autonomous vehicle software also makes use of artificial intelligence (AI) to understand the surrounding environment, recognize objects, and classify those objects. The AI also works to predict what will happen next and passes that information on to a decision model to determine which course of action to take.
Autonomous vehicle software has to go through a compute model similar to the one humans go through, also known as a "see-think-do" approach. For humans, this occurs almost without thought. This model begins with perception (seeing or sensing something), followed by evaluating the available options and weighing the potential outcomes. Finally, a decision is made and a course of action is followed. For a compute engine in vehicles, this process uses sensors in the car, including cameras, lidar, and radar, to predict movement paths and evaluate options before issuing an instruction on the course and any potential corrections.
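The "see-think-do" loop above can be sketched in a few lines of Python. This is an illustrative toy, not a real autonomous driving API: the `Detection` fields, the time-to-collision thresholds, and the acceleration values are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str                 # e.g. "vehicle" or "pedestrian"
    distance_m: float         # distance ahead, in meters
    closing_speed_mps: float  # positive when the object is approaching

def see(raw_frames):
    """Perception: turn raw sensor frames into object detections."""
    return [Detection(**frame) for frame in raw_frames]

def think(detections):
    """Evaluation: weigh options by time-to-collision and pick an action."""
    for d in sorted(detections, key=lambda d: d.distance_m):
        if d.closing_speed_mps > 0:
            ttc = d.distance_m / d.closing_speed_mps  # seconds to impact
            if ttc < 2.0:
                return "brake"
            if ttc < 5.0:
                return "slow"
    return "maintain"

def do(action):
    """Action: map the decision onto a longitudinal command (m/s^2)."""
    return {"brake": -4.0, "slow": -1.0, "maintain": 0.0}[action]

# A vehicle 12 m ahead closing at 8 m/s gives 1.5 s to impact.
frames = [{"kind": "vehicle", "distance_m": 12.0, "closing_speed_mps": 8.0}]
accel = do(think(see(frames)))
```

A production pipeline replaces each stage with much heavier machinery (neural perception, trajectory planners, actuator controllers), but the stage boundaries stay the same.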
For an autonomous car to work, it requires sensors, actuators, complex algorithms, machine learning systems, and processors to execute software. The cars then use these to create and maintain a map of their surroundings based on a variety of sensors located around the vehicle. These can include radar sensors to monitor the position of other vehicles; video cameras to detect traffic lights, read road signs, and track other vehicles and pedestrians; lidar for measuring distances, detecting road edges, and identifying lane markings; and ultrasonic sensors in the wheels for detecting curbs or other vehicles when parking. The onboard software then processes this sensory input, plots paths, and controls the acceleration, braking, and steering. Hard-coded rules, obstacle avoidance algorithms, and object recognition help the car follow traffic rules and navigate the road.
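One way to picture how the sensors listed above combine into a map is a small occupancy grid, where each modality raises the confidence that a cell around the vehicle is occupied. The grid size, sensor names, and confidence weights below are assumptions made purely for illustration.

```python
GRID = 10  # a 10 x 10 grid of cells centred on the vehicle

def fuse(detections):
    """Fuse (sensor, x_cell, y_cell) detections into an occupancy grid.

    A cell reported by several modalities (e.g. radar and lidar) ends up
    with a higher occupancy score than one seen by a single sensor.
    """
    weight = {"radar": 0.4, "camera": 0.3, "lidar": 0.5, "ultrasonic": 0.2}
    grid = [[0.0] * GRID for _ in range(GRID)]
    for sensor, x, y in detections:
        grid[y][x] = min(1.0, grid[y][x] + weight[sensor])
    return grid

# Cell (3, 4) is confirmed by two modalities; cell (7, 1) by one.
grid = fuse([("radar", 3, 4), ("lidar", 3, 4), ("camera", 7, 1)])
```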
As the challenges in developing autonomous cars have mounted, and the computation requirements have increased, more technology companies and automotive OEMs have begun working together to develop software "ecosystems" to support the development of autonomous vehicles, create some industry standardization, and work toward making autonomous vehicles safe. This has included developing industry standards for those platforms, allowing for interoperability in the software, and enabling automakers to integrate various pieces of software as necessary based on the needs and expectations for the vehicle in question. This also allows software platforms to integrate machine learning, self-healing maps, artificial intelligence, V2X connectivity, and computer vision capabilities into a platform.
As part of the development of software platforms or "ecosystems" for autonomous vehicles, operating system (OS) platforms have been developed for them. An OS offers a platform on which autonomous services can be integrated, with the OS controlling the car's core capabilities and working to keep passengers and the driving environment safe. To run, these OSs rely on electronic control units (ECUs), which act as distributed brains for the OS and the autonomous vehicle. ECUs are similar to minicomputers, varying in size, purpose, and OS. They can control various vehicle applications, such as steering, navigation, tracking, engine control, steering stability, and active suspension.
As noted above, the amount of computation necessary for an autonomous car can be quite demanding. To help with this compute load, and to increase the safety of autonomous vehicles, connectivity has been suggested as a solution. In this schema, each car becomes an edge-compute platform connected to other cars, similar to an IoT environment, capable of communicating with them about driving conditions, road issues, and other data to help the cars work together. In some future-focused visions, this could include removing lights from streets, as the cars would be able to communicate with each other regarding which car is going when and where, and to monitor for pedestrians. This connectivity goal has been furthered by the onset and promise of 5G networks and data optimization developments.
Connecting autonomous vehicles can also reduce the compute needs of any single car, allowing autonomous vehicles to increase their compute and software performance while offering lower development costs. What can help with connectivity and data transfer speeds is developing frameworks that can be used by various developers and ecosystem architectures to make interoperability between vehicles faster and more futureproof, minimizing overall system complexity and cost.
Neural networks offer vehicle software platforms the ability to detect, recognize, and classify objects. Further, with the integration of computer vision algorithms, they can effectively monitor the white lines of the road. The neural networks used in this case are trained using thousands of driving hours and millions of miles of real and simulated roads. The simulations used are similar to video games, allowing the models to encounter everyday events and unusual occurrences and better preparing the models for real-world driving.
Similarly, convolutional neural networks (CNNs) can detect, classify, and segment (or separate) pavement from road. Alternatively, vehicle platforms can use recurrent neural networks, which are temporally based and tend to include many types of networks that involve loops. Either way, using a neural network of any type requires the compute hardware necessary to run the network's inference, or "computing" what is seen, in a fast, low-latency compute environment for the vehicle to be able to work in the real world and in real time.
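The convolution at the heart of a CNN can be shown in miniature. The 5x5 "image", the edge-detecting kernel, and the stripe pattern below are toy assumptions; a real driving network has millions of learned parameters and runs on dedicated accelerators.

```python
def convolve(image, kernel):
    """Valid 2-D convolution (strictly, cross-correlation) in pure Python."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical-edge kernel responds strongly to lane-marking-like stripes.
edge_kernel = [[1, 0, -1],
               [1, 0, -1],
               [1, 0, -1]]
stripe = [[1, 1, 0, 0, 0]] * 5  # a bright band on the left of a 5x5 patch
response = convolve(stripe, edge_kernel)
strongest = max(abs(v) for row in response for v in row)
```

A trained network stacks many such layers, with the kernel values learned from the driving data rather than written by hand.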
Autonomous vehicle perception sensing collects all data from a vehicle's sensors and processes this data into an understanding of the world around the vehicle, similar to a human driver's use of sight to perceive and understand their driving position. To develop perception sensing, an autonomous vehicle requires vision, radar, and lidar sensing modalities, each of which brings its own strengths and weaknesses. In an overlapping system that uses all of them, data from each sensor feeds into a perception system that can use the different data points to develop a complete picture.
Another important part of perception sensing can be detecting traffic and predicting its future behavior in inclement weather. The sensor arrays and computing in perception sensing can allow the vehicle to detect, track, and predict the movements of objects regardless of weather. This can be especially important in near-zero visibility conditions during winter storms, rainstorms, and fog banks, where radar and lidar can detect objects that optical systems cannot necessarily detect.
A portion of any autonomous vehicle software requires navigation. There are various navigation systems that can be integrated into the software, including GPS and related information from satellites, such as traffic systems, which can create a common information field that cars can be an integrated part of. Further, navigation can be part of the connectivity between vehicles, collecting data from all vehicles to optimize routes and anticipate driver needs, and keeping track of weather forecasts and road reports to ensure the drive can be as safe as possible. Connecting cars can further help them avoid accidents while staying aware of traffic situations. In a navigation stack, the hardware required to connect a vehicle includes a GPS receiver, an inertial measurement unit (IMU), a compass, and a data processing computer, in order to properly connect and integrate the navigation data into the larger autonomous vehicle software.
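The navigation hardware listed above (GPS receiver, IMU, compass) is typically fused in software. The sketch below uses a fixed-gain complementary filter as a stand-in; the gain value and sensor readings are invented, and production stacks generally use Kalman-style filters instead.

```python
import math

def dead_reckon(pos, heading_deg, speed_mps, dt_s):
    """Advance the last known position using compass heading and speed."""
    h = math.radians(heading_deg)
    return (pos[0] + speed_mps * dt_s * math.cos(h),
            pos[1] + speed_mps * dt_s * math.sin(h))

def fuse(gps_pos, imu_pos, gps_gain=0.3):
    """Blend the smooth IMU estimate with GPS to correct slow drift."""
    return tuple(g * gps_gain + i * (1 - gps_gain)
                 for g, i in zip(gps_pos, imu_pos))

# One second of travel at 10 m/s on a 90-degree heading from the origin,
# then blended with a slightly disagreeing GPS fix.
imu_est = dead_reckon((0.0, 0.0), heading_deg=90.0, speed_mps=10.0, dt_s=1.0)
fused = fuse(gps_pos=(0.5, 10.5), imu_pos=imu_est)
```

The design point is the same one the paragraph makes: no single source is trusted alone; the IMU is smooth but drifts, and GPS is noisy but absolute.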
As part of autonomous navigation, especially in cross-country rather than urban environments, the vehicle software requires the components and computing for obstacle detection and terrain classification. This includes using geometric descriptions of the scene and a terrain-typing component of the perceptual system. Detecting obstacles and terrain classes allows an autonomous vehicle to plan its path and choose the most efficient route toward a desired goal. To develop this capability, the autonomous vehicle software requires new sensor processing algorithms developed for cross-country navigation, and sensor systems such as a color stereo camera and a single-axis ladar. Using both systems can increase the potential for obstacle detection, while the single-axis ladar, with an appropriate algorithm, can be used to discriminate between types of terrain.
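The terrain discrimination described above can be caricatured with a single statistic: the spread of height samples along one ladar scan line. The thresholds and labels below are invented for illustration; real terrain typing uses far richer geometric features.

```python
def classify_terrain(heights_m):
    """Classify one ladar scan line by the spread of its height samples."""
    spread = max(heights_m) - min(heights_m)
    if spread < 0.05:
        return "smooth"    # e.g. packed dirt or pavement
    if spread < 0.30:
        return "rough"     # e.g. grass or gravel
    return "obstacle"      # a height step too large to drive over

terrain = classify_terrain([0.01, 0.02, 0.01, 0.03])  # a nearly flat line
```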
Vehicle motion control refers to technologies capable of influencing the longitudinal, lateral, and vertical dynamics of a vehicle. This can include steering, brakes, dampers, and electronic control units, and software is increasingly being integrated into this. Motion control is a necessary part of the autonomous vehicle technology stack, as it allows the automated and autonomous software to control different parts of the vehicle, with intelligent networking across a vehicle allowing it to achieve better driving dynamics, driver safety, and comfort. The software models used with motion control systems can coordinate subsystems to guarantee tracking performance and ensure the autonomous vehicle performs according to a prescribed performance characteristic model.
Motion control is an important component in the "think" and "do" steps of the "see-think-do" model that autonomous software works on. It can maintain a vehicle on a specific path, or adapt the vehicle to specific driving conditions or unforeseen road conditions. These systems and their prescribed performance characteristics are further being developed to ensure driver and passenger comfort. This means the model not only has to avoid collisions, but also needs to do so in a way that provides the driver and passenger with a sense of security and comfort. This also includes developing the estimated tolerance of safety and caution for an autonomous vehicle in a multi-lane highway scenario, where an overly cautious autonomous vehicle driving slower (or exactly to the limit) compared to other vehicles on the roadway can itself cause accidents.
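The comfort constraint described above can be made concrete with a toy path-tracking controller: a proportional term steers back toward the path, and a clamp keeps corrections gentle. The gain and the comfort limit are illustrative assumptions, not values from any real vehicle.

```python
def steering_command(lateral_error_m, k_p=0.5, comfort_limit_rad=0.1):
    """Proportional steering correction, clamped for passenger comfort."""
    raw = -k_p * lateral_error_m  # steer back toward the path
    return max(-comfort_limit_rad, min(comfort_limit_rad, raw))

gentle = steering_command(0.1)   # small error: proportional response
clamped = steering_command(1.0)  # large error: clipped at the comfort limit
```

The clamp is what separates "avoid the collision" from "avoid the collision comfortably": without it, a large path error would produce a jarring correction.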
Another important part of autonomous vehicle systems is the ability of the car to monitor the state of its various parts. Vehicle software can suggest the driver change the oil or filters, but autonomous vehicle software could also monitor the driver's body and state, using, for example, blink rate sensors to ensure the driver does not fall asleep. This can extend to anti-theft systems, which can include authorization processes to make theft more difficult, requiring a key and even a smartphone to ensure the person entering the vehicle is the rightful one. Some systems have gone as far as including retina scanning or fingerprint sensors to identify drivers, and different profiles can be loaded into the car for allowed drivers.
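The eye-monitoring idea mentioned above can be sketched as a check for sustained eye closure in a stream of samples from a blink sensor. The sampling rate, the sample format, and the 1.5-second threshold are assumptions for illustration only.

```python
def longest_closure_s(eye_closed, sample_hz=10):
    """Longest run of consecutive eyes-closed samples, in seconds."""
    longest = run = 0
    for closed in eye_closed:
        run = run + 1 if closed else 0
        longest = max(longest, run)
    return longest / sample_hz

def asleep_warning(eye_closed, limit_s=1.5):
    """Flag the driver when the eyes stay shut longer than the limit."""
    return longest_closure_s(eye_closed) > limit_s

# Two seconds of closed eyes in the middle of the stream trips the alarm.
alarm = asleep_warning([False] * 5 + [True] * 20 + [False] * 5)
```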
As noted above, simulations are used to support autonomous vehicles and autonomous robotic platforms, offering a simulacrum of the autonomous vehicle platform in a lower-stakes environment. Simulating autonomous vehicle software can be done in a physical environment, such as using robotic platforms to simulate how the software will react to different conditions. This can help train the software to better respond to conditions in real-use cases.
Simulators can also be completely software-based, using a virtual reality or video game-like environment and engine to test the autonomous vehicle against various traffic, pedestrian, weather, parking lot, and obstacle conditions, among others. This can help train the autonomous vehicle software without requiring a physical platform. It can also help the vehicle software understand the parameters of a given vehicle platform, and better understand the vehicle in the context of its environment.