Tesla's autopilot mode is being gleefully abused in brand new ways.

In the parking lot of a climbing gym in Denver, my friend showed off his new Tesla Model X SUV. Shiny, grey, with gull-wing doors. Then, grinning, the computer programmer whipped out his iPhone and showed me video of Tesla's killer app. "I let it drive me around while I'm tripping on acid," he told me.

In the footage, he was in the driver's seat, and he panned over to show his Tesla's steering wheel turning itself as the car rounded the corners of the desert roads near Moab, Utah. On one side, the land dropped away steeply. Death loomed.

(I can't verify for a fact that he was high on LSD. But I know he trips often, and he's that kind of guy.)

His video was more than just a wild performance: it highlights both the promise and the terror of artificial intelligence. As we give over more responsibility to the machines, how do we feel about the humans inside them losing control of themselves?

[Stock image of an LSD tab.]

Teslas can't fully drive themselves — yet. But in autopilot mode, on highways, Teslas can steer themselves to stay in their lanes and brake to avoid the cars in front of them. Autopilot was designed so owners could chill out, sip a cup of coffee. But five years after it was introduced, Tesla's autopilot is being put to uses CEO Elon Musk must have imagined. After all, Musk buys LSD and trips his face off while tweeting (according to his former dealer and Azealia Banks).

A year ago, California Highway Patrol officers spied a man passed out drunk in the driver's seat of a Tesla while it was barreling down Highway 101 in Silicon Valley at 70 mph, Wired reported. Cops pulled in front of him, slowed down, and stopped — the Tesla did, too. He was charged with DUI.

This story went viral, and news outlets took two different angles on the hammered motorist and his futuristic car.

Some saw a story of triumph: Tesla's autopilot "saved the drunk driver's life," wrote TopSpeed.

Some saw a terrifying warning: "A sleeping Tesla driver highlights autopilot's biggest flaw," wrote Wired.

[Tesla CEO Elon Musk with his then-wife, Talulah Riley, in 2012. Photos from Shutterstock.]

Other stories of Tesla shenanigans are out there. Dudes joke about getting BJs while on autopilot. Checking email. Texting.

And, like teenagers, Tesla owners are trying to make autopilot even more autonomous, to give the machine even more control. Drivers are supposed to touch the wheel every two minutes, or an alarm goes off. But they've figured out that if you wedge an orange in the wheel, it tricks the car into thinking you're touching it. All this even though there's video of Teslas nearly killing dudes, and an autopiloted Model S rammed the back of a fire truck.

With all this uncertainty, will Evel Knievel behavior continue and expand? Probably. Driving is tedious; LSD and fellatio are fun. About 40 percent of us want a self-driving car. But do we trust our new robot overlords to ferry around our irresponsible brothers and sisters? And is letting autopilot drive better than substance users steering through our landscape wasted or faced — which they're doing now?

My tripping friend in his Model X won't stop dosing and driving. As both a computer guy and a psychedelics dude, he thinks LSD has shown him "99 percent" that we live in a computer simulation. (Elon Musk, too, thinks there's only a "one in billions" chance we're not living in someone else's computer.) So if the Tesla drives him off a Utah cliff, it's not death, it's just an exit from the computer code. But what if he and his Tesla take out one of the rest of us along the way?