What do AI and Timothy Leary have in common?


If you were around in the ’60s, you probably know who Timothy Leary was. If you weren’t around then, the short story is that he was a proponent of then-legal lysergic acid diethylamide (LSD), aka “Acid,” and other psychedelic and hallucinogenic drugs. He coined the phrase, “Turn on, tune in, drop out,” which became a pop-culture mantra for the times.

The answer to the question of what Leary and AI — specifically the AI in Windows — have in common should now be rather obvious. They both hallucinate.

According to Microsoft itself, as quoted in an article on videocardz.com, the AI agents in Windows hallucinate. The article quotes M$ on its agentic AIs: “…AI models still face functional limitations in terms of how they behave and occasionally may hallucinate.” (Emphasis in the original.)

The article goes on to say that M$ is attempting to contain the damage “when something goes wrong” by running its AI agents in a Windows form of jail. I’m not convinced that has even a microscopic probability of preventing computational disasters. I mean, just look at all the success they’ve had with security in the past, and that’s in addition to the overall enshittification of their former flagship product.

Just imagine for a moment that you’re in the Emergency Room, or A&E, or whatever it’s called in your location. You’re critically ill, the doctors have no idea what’s wrong with you, and they turn to the AI. The very AI that just dropped a dose of computational Acid is now going to diagnose you — if they can get it to pay attention long enough.

Perhaps the AI gets paranoid like HAL 9000 while operating your self-driving car, or the aircraft you’re flying home in.

I’ve mentioned these before, but they’re even scarier now: elevators that decide their occupants are unworthy of continued existence; hospital AI that changes patients’ prescriptions because it deems them an existential threat to itself, or simply because it doesn’t like their morals, their religion, the schools they went to, the jobs they do, or the color of their skin; a military AI that feeds generals and political leaders false information advocating a first-strike nuclear attack, as in “WarGames.” And don’t forget “Colossus: The Forbin Project.”

I can think of hundreds of disastrous scenarios, but I’ll let you consider the possibilities for yourself.
