A Tesla driver has been charged after being found asleep at the wheel of his self-driving car speeding along at 93mph while using the ‘autopilot system’ in Canada.
The man’s Model S Tesla was pictured with its seats fully reclined while roaring along near the town of Ponoka, about 60 miles south of Edmonton in Alberta.
Both the driver and another passenger were said to be asleep in the two front seats of the car.
When cops discovered the car traveling at about 86mph, they turned on their emergency flashing lights – only for the Tesla to ‘automatically begin to accelerate’ to 93mph. The speed limit on that stretch of highway is about 68mph.
‘The car appeared to be self-driving, traveling over 140km/h, with both front seats completely reclined and both occupants appearing to be asleep,’ said a statement from the Royal Canadian Mounted Police (RCMP).
It is not clear why the car sped up to 93mph (exactly 150km/h), although it accelerated as the cars ahead of it moved out of the way when the police gave chase.
The driver, 20, from British Columbia, was charged with speeding and given a 24-hour license suspension for fatigue.
He was later also charged with dangerous driving and served a summons to appear in court in December.
RCMP Sgt. Darrin Turnbull told CBC News: ‘Nobody was looking out the windshield to see where the car was going. I’ve been in policing for over 23 years and the majority of that in traffic law enforcement, and I’m speechless.
‘I’ve never, ever seen anything like this before, but of course the technology wasn’t there.’
Tesla Model S sedans have autopilot functions, which include auto-steer as well as ‘traffic-aware’ cruise control. In this case both functions appeared to be in use.
Turnbull added: ‘We believe the vehicle was operating on the autopilot system, which is really just an advanced driver safety system, a driver assist program. You still need to be driving the vehicle.
‘But of course, there are after-market things that can be done to a vehicle against the manufacturer’s recommendations to change or circumvent the safety system.’
Pictured: A Tesla Model S, the same model that was caught speeding down a Canadian highway while its driver was using the auto-pilot features to take a nap (stock image)
The auto-pilot function will steer, accelerate and brake for the car within its lane, according to Tesla’s website, which notes that the driver still needs to be paying attention. The function does ‘not make the vehicle autonomous’, it says.
RCMP superintendent Gary Graham said in the statement: ‘Although manufacturers of new vehicles have built-in safeguards to prevent drivers from taking advantage of the new safety systems in vehicles, those systems are just that – supplemental safety systems.
‘They are not self-driving systems. They still come with the responsibility of driving.’
In all Canadian provinces, using the self-driving feature is illegal without an alert driver present, with the Insurance Corporation of British Columbia (ICBC) stating that a driver is responsible for the vehicle’s actions when driver assistance is turned on.
In July, Tesla CEO Elon Musk said he expects his company’s vehicles to be fully autonomous by the end of the year, saying it was already ‘very close’ to meeting the requirements of ‘level-five’ autonomy, which requires no input from a driver.
Laws on autonomous vehicles in Canada, U.S. and U.K.
Laws regarding self-driving or autonomous vehicles are currently at different stages from country to country.
Canada is yet to adopt any comprehensive federal legislation that targets their use, or the liability issues they raise.
While the use of autonomous vehicles is legal, currently, a human driver is required at all times to be able to take control of the vehicle when alerted to do so.
Therefore, they must be aware and ready to drive the car in any given moment, and all existing laws – such as those regarding the use of a cell phone or remaining awake – still apply.
In the U.S., the legality of operating an autonomous vehicle varies from state to state, although no state has an outright ban on them.
As of 2020, Connecticut, District of Columbia, Illinois, Massachusetts, New Hampshire, New York, and Vermont all require a human operator in the vehicle.
Some states (Florida, Georgia, Nebraska, Nevada, North Carolina, North Dakota, Pennsylvania, and Washington) require a human operator to be present based on the level of the vehicle’s automation.
Some states have no legislation or executive orders regarding autonomous vehicles.
These are Alaska, Kansas, Maryland, Missouri, Montana, New Hampshire, New Jersey, New Mexico, Rhode Island, South Carolina, South Dakota, West Virginia and Wyoming.
Some, such as California, have actively encouraged the testing of autonomous vehicles on the road as far back as 2012.
In the U.K., legislation to regulate the use of automated vehicles is currently under consultation, although this is specifically related to the use of some safety features.
The use of a self-driving auto-pilot system on U.K. roads is illegal, but hands-free driving on some roads could become legal by next year.
This would mean a driver must be ready to take control of the vehicle, but can do things such as checking their phone while the steering assist and cruise control are engaged on highways.
Currently, Teslas have ‘level-two’ autonomy, but Musk claimed that current Teslas on the road can be upgraded to ‘level-five’ with a simple software update.
‘I remain confident that we will have the basic functionality for level five autonomy complete this year. There are no fundamental challenges remaining. There are many small problems,’ he said in July.
‘And then there’s the challenge of solving all those small problems and putting the whole system together.’
But IHS Markit analyst Tim Urquhart said while level-five autonomous driving was the ‘holy grail’ in the industry, ‘Even if Tesla can reliably roll out the technology in a production environment, the regulatory environment in all the major markets is way behind allowing completely autonomous vehicles on the road.’
The incident in Canada in July is not the only example of drivers being caught relying too heavily on Tesla’s self-driving functions, with some cases resulting in accidents.
In January, an Ontario driver was charged with reckless driving after police caught him using both hands to floss his teeth behind the wheel as his vehicle shot down the highway at 83mph.
Two months before, a Tesla was filmed in Richmond driving the wrong way down a road in a parking lot – without a driver.
In the United States, a number of fatal crashes involving the ‘autopilot’ function are being investigated by officials, including one in which a driver was using the feature to play on his smartphone.
Last week, a TikTok video emerged of a Tesla car driving down a California highway on autopilot with nobody in the driving seat as four passengers drank cans of seltzer and sang along to Justin Bieber.
The shocking footage, posted on the TikTok account @BlurrBlake, showed three young men – alongside the unknown person behind the camera – partying inside the vehicle as it flew down the highway.
The car allegedly reached speeds of 60 mph, all with no human driver ready to take over the vehicle, reported TMZ.
As per Tesla’s website, the vehicle’s autopilot system must be used ‘with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment’.
The technology has been connected to four fatal accidents in the U.S.
In March 2019, Florida driver Jeremy Banner, 50, died when his Tesla Model 3 slammed into a trailer truck.
National Transportation Safety Board investigators said Banner turned on the autopilot feature about 10 seconds before the crash, and the autopilot did not execute any evasive maneuvers to avoid the collision.
Three other fatal crashes date back to 2016.