With self-driving cars upon us, the future is now.
Tesla Inc. is leading the way with innovative technologies in the automotive industry. For years, the company has been developing Autopilot, a fascinating technology that inspires both hope and fear in drivers.
Get everything you need to know right here.
What is Autopilot?
Tesla’s Autopilot is an advanced driver-assistance system that lets a car handle much of the driving itself. On its own, Autopilot can keep the car centered in its lane, change lanes, maintain distance from the car ahead, self-park, and summon the car from a parking spot or garage.
Tesla positions Autopilot as a step toward full self-driving (FSD), sometimes called a driverless car or an autonomous car. Autopilot combines both hardware and software to make this possible.
Here’s a demonstration of Autopilot. Legally, a person must be in the driver’s seat when Autopilot is on, but that person is not driving.
How Autopilot Works
Under Federal Aviation Administration rules, pilots must monitor an aircraft on autopilot at all times. In a similar vein, Tesla mandates that drivers monitor a car on Autopilot at all times.
Tesla equips its cars with radar, cameras, and ultrasonic sensors, which together give the system a 360-degree view around the vehicle. The software uses this equipment to interpret the car’s surroundings.
The system distinguishes other vehicles, road signs, obstacles, and lane markings, and it can even detect cars and pedestrians through fog, dust, and heavy rain. The hardware processes this information at 200 frames per second.
With this constant influx of information, the software steers the car and adjusts its speed to match the surroundings.
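To make the idea of a perception-to-control loop concrete, here is a toy sketch in Python. This is not Tesla’s software; every name, number, and formula below is an illustrative assumption. It shows a simple proportional controller that nudges the car’s speed each frame to hold a target gap to the vehicle ahead, using the 200-frames-per-second figure cited above as the loop rate.

```python
# Toy sketch of a perception-to-control loop. NOT Tesla's actual code;
# all names and numbers are illustrative assumptions.

FRAME_RATE_HZ = 200                  # the article cites 200 frames per second
FRAME_BUDGET_S = 1 / FRAME_RATE_HZ   # ~5 ms of processing time per frame

def adjust_speed(current_speed_mps, gap_m, target_gap_m=30.0, gain=0.5):
    """Nudge speed toward holding a target gap to the car ahead.

    A simple proportional controller: each frame closes a small fraction
    of the gap error rather than correcting it all at once.
    """
    error = gap_m - target_gap_m                            # positive -> too far, speed up
    new_speed = current_speed_mps + gain * error * FRAME_BUDGET_S
    return max(0.0, new_speed)                              # never command a negative speed

# One simulated second of driving: the car ahead is only 20 m away,
# so the controller gradually eases off the speed.
speed = 25.0  # meters per second, roughly 90 km/h
for _ in range(FRAME_RATE_HZ):
    speed = adjust_speed(speed, gap_m=20.0)
print(round(speed, 2))  # → 20.0
```

A real system would fuse camera, radar, and ultrasonic readings into that `gap_m` estimate and run far more sophisticated planning, but the frame-by-frame rhythm of sense, decide, and adjust is the same shape.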
Anytime a new software update is released, a car with Autopilot receives the update wirelessly.
A Brief History of Autopilot
Autopilot has evolved over the years through a series of updates and adjustments to both its software and hardware. This section touches on some of Autopilot’s notable advancements.
In 2013, Tesla CEO Elon Musk said, “Autopilot is a good thing to have in planes, and we should have it in cars.” The theory was simple: if we use autopilot on planes, we can and should use it on cars.
In late 2014, Tesla offered customers Autopilot for the first time. The Model S and Model X came with a tech package upgrade: a windshield-mounted camera, forward-facing radar, and sensors on the front and rear bumpers.
Together, this package, which Tesla calls Hardware 1, was known as Autopilot. It offered semi-autonomous driving, allowing drivers limited hands-free operation.
In October 2015, Tesla customers experienced the next step in Autopilot development with the release of software version 7.0, which enabled Autopilot and added several features.
However, Tesla later released version 7.1, which removed some of 7.0’s features; the company said this was an effort to discourage risky behavior. Version 7.1 did add a remote parking feature, which could be used without a driver in the vehicle.
In August 2016, Tesla announced Autopilot 8.0, which made a significant shift in software. Instead of using cameras as the primary sensor, Autopilot would now use radar. A November update added two notable safety features:
- Autopilot requires the driver to touch the wheel more often.
- Whenever Autopilot is activated, there is now a more noticeable indication that it’s engaged.
By November 2016, Autopilot had been used to drive over 300 million miles.
On October 19, 2016, Tesla announced that all of its cars would now be built with the hardware needed for full self-driving. Vehicles would come with improved computing and sensing equipment, which Tesla calls Hardware Version 2 (HW2).
This allowed Autopilot to change lanes without needing driver input, transition between freeways, and exit a highway near the driver’s destination.
In February 2017, Autopilot became available for HW2 cars. These updates included auto-steering on local roads and divided highways, as well as adaptive cruise control. In June of the same year, version 8.1 arrived, adding parallel parking and full-speed braking.
Later updates made rides smoother by making acceleration and deceleration less jerky.
In July 2017, Hardware Version 2.5 was released. And as of 2019, Tesla continues to evolve Autopilot with the development of Hardware Version 3.
Public Opinion on Autopilot
As with all new technologies, many questions have been raised over the years by industry experts and concerned citizens.
Some have voiced concerns over the legality of full self-driving. Many question whether Tesla drivers using Autopilot are even driving legally: Autopilot seemingly conflicts with current best practices in driving, such as keeping both hands on the wheel and one foot on a pedal.
Earlier, in the demonstration video, the driver is doing neither.
Tesla spokesperson Alexis Georgeson said there is “nothing in our autopilot system that is in conflict with current regulations.” She went on to clarify a misconception about Autopilot, saying:
“We’re not getting rid of the pilot. This is about releasing the driver from tedious tasks so they can focus and provide better input.”
Despite Tesla’s safety measures, some still ask if self-driving cars inherently promote irresponsible driving.
Another recurring debate revolves around fault in the case of an accident caused by full self-driving. If Autopilot makes a mistake and there is a car wreck, who is responsible?
Drivers are expected to monitor Autopilot, but this complicates investigations and court proceedings. At what point is the driver responsible rather than the software, and can a company like Tesla be held liable?
Individuals, manufacturers, and insurance companies will likely spend the coming years in court working out liability issues around self-driving cars.
Over the years that Tesla’s Autopilot has been in development and use, several accidents and deaths involving it have been recorded. This article notes a few of the more recent incidents at the time of writing.
May 11, 2018, in South Jordan, Utah
During the evening, a Tesla Model S crashed into a fire truck that was stopped at a red light. Autopilot was engaged, and the car was traveling at about 60 miles per hour at the time of impact. The Tesla driver survived with a broken foot.
According to witnesses, the Tesla did not appear to swerve or attempt to brake prior to the collision. Telemetry data revealed that the driver did not touch the wheel for the 80 seconds preceding the crash and did not brake until a fraction of a second before impact.
The driver later admitted she was on her phone at the time. Police cited her for “failure to keep proper lookout.”
March 1, 2019, in Delray Beach, Florida
During the morning, a Tesla Model 3 hit a semi-truck on a highway and underrode its trailer. The driver did not survive. The investigators who analyzed the scene did not fault or cite the semi-truck’s driver.
Ten seconds prior to the collision, the Tesla driver had activated Autopilot. In May 2019, the National Transportation Safety Board (NTSB) determined that neither Autopilot nor the driver attempted an evasive maneuver prior to impact.
Preliminary telemetry showed that the driver’s hands were not on the wheel for approximately 8 seconds leading up to the crash.
August 10, 2019, in Moscow, Russia
During the night, a Tesla Model 3 ran into a parked tow truck and caught fire. At the time, Autopilot was engaged and the car was traveling at the speed limit of 100 km/h.
Fortunately, the driver and his children exited the vehicle in time and escaped with non-life-threatening injuries: the driver had a broken leg and the children suffered bruising.
The driver claimed he was holding the wheel at the time of the crash, but was not paying attention.
Positive Incidents with Autopilot
A noteworthy incident that puts Autopilot in a positive light occurred in July 2016. While driving his Tesla Model X, Joshua Neally suffered a pulmonary embolism, which made it impossible for him to drive safely.
Neally was able to travel most of the way to a hospital using Autopilot, which may well have saved his life.
On July 21, 2016, Elon Musk tweeted that Autopilot saved the life of a pedestrian in D.C., or at least prevented serious injury. The driver reported that a pedestrian stepped out in front of his vehicle while he was distracted. Autopilot instantly braked and prevented the car from hitting the pedestrian.
Musk confirmed this by looking at the vehicle’s logs.
The Future of Autopilot
Autopilot will unquestionably impact greater society in the future. Tesla has repeatedly stated and demonstrated a commitment to developing a safe, legal, effective Autopilot system.
And Tesla is not the only company interested in the technology: Alphabet, Google’s parent company, is also developing self-driving cars through its subsidiary, Waymo.
While Tesla has made significant progress in recent years, there are still technological and legal barriers to overcome.
Here’s Elon Musk on the future of Autopilot: “Full autonomy is really a software limitation. The hardware exists to create full autonomy. So it’s really about developing advanced narrow AI for the car to operate on.”
This astounding statement reveals how close we are to a world full of self-driving cars. The hardware already exists. We just have to write better software.
Are you ready for self-driving cars?