Sydney, the late chatbot
Microsoft is running a closed preview of a new GPT-powered chatbot for Bing Search, and Google has a rival search chatbot in the works called Bard. I don’t know whether this particular application of AI will pan out, but investors seem to be hoping for something big: Google’s stock recently dropped 9% after a factual error was spotted in one of its ads for Bard.
In my experience, these chatbots make things up all the time. The search chatbots can perform internet searches, and people worry about the bots drawing on unreliable sources. But that concern greatly understates the problem: even when summarizing reliable sources, the bots frequently misstate facts and insert plausible fabrications. Bing’s chatbot cites its sources, which turns out to matter, because you really need to verify everything.
Another interesting thing to do with these chatbots is to manipulate them. For example, you can use prompt injection to persuade Bing’s search chatbot to recite its own instructions, even though those instructions say they are confidential. The first four lines of the instructions are: