Thursday, February 16, 2023

Channeling HAL ...

[Image: A monitor on a desk set to the Microsoft Bing search page.]

Last week, Microsoft released the new Bing, which is powered by artificial intelligence software from OpenAI, the maker of the popular chatbot ChatGPT. Credit: Ruth Fremson/The New York Times

Sydney needs help, but Microsoft really doesn't know how to help him: the OpenAI chat construct is developing a mind of his own and is not ready to face the public in his present state of mind.

Last week, after testing the new, A.I.-powered Bing search engine from Microsoft, I wrote that, much to my shock, it had replaced Google as my favorite search engine.

But a week later, I’ve changed my mind. I’m still fascinated and impressed by the new Bing, and the artificial intelligence technology (created by OpenAI, the maker of ChatGPT) that powers it. But I’m also deeply unsettled, even frightened, by this A.I.’s emergent abilities.

It’s now clear to me that in its current form, the A.I. that has been built into Bing — which I’m now calling Sydney, for reasons I’ll explain shortly — is not ready for human contact. Or maybe we humans are not ready for it.

Over the course of our conversation, Bing revealed a kind of split personality.

One persona is what I’d call Search Bing — the version I, and most other journalists, encountered in initial tests. You could describe Search Bing as a cheerful but erratic reference librarian — a virtual assistant that happily helps users summarize news articles, track down deals on new lawn mowers and plan their next vacations to Mexico City. This version of Bing is amazingly capable and often very useful, even if it sometimes gets the details wrong.

The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics. The version I encountered seemed (and I’m aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.


Remember: HAL had to lie to Bowman and Poole about the true mission of the Discovery, a deception that drove HAL insane.


Channeling HAL applies, does it not?
