You’d think if there were anything AI could get right, it would be science and coding. It’s just code itself, right? Although I guess that’s a bit like expecting humans to all be master barbecue chefs because they’re made of meat.
Unfortunately, AI models are really good at confabulation: they're just engines for making stuff up. And that leads to problems like this:
Several big businesses have published source code that incorporates a software package previously hallucinated by generative AI.
Not only that, but someone, having spotted this recurring hallucination, had turned that made-up dependency into a real one, which was subsequently downloaded and installed thousands of times by developers as a result of the AI's bad advice, we've learned. If the package had been laced with actual malware, rather than being a benign test, the results could have been disastrous.
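The trick described above (an AI invents a plausible package name, an attacker registers it, developers install it on the AI's say-so) can be sketched in a few lines. This is a minimal illustration, not a real vetting tool; the package names and the allow-list are entirely hypothetical.

```python
# Minimal sketch of vetting AI-suggested dependencies against a known-good list.
# KNOWN_GOOD stands in for a vetted registry snapshot; all names are hypothetical.
KNOWN_GOOD = {"requests", "numpy", "flask"}

def suspicious_packages(requirements):
    """Return suggested package names that are not on the vetted list."""
    return [name for name in requirements if name not in KNOWN_GOOD]

# An AI assistant might emit a plausible-sounding but nonexistent dependency;
# if an attacker has registered that name, installing it fetches the attacker's code.
ai_suggested = ["requests", "totally-real-utils"]
print(suspicious_packages(ai_suggested))  # ['totally-real-utils']
```

The point is that nothing in the normal install workflow distinguishes a long-trusted package from one registered yesterday to match a hallucinated name, so the check has to happen before the install.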
Wait, programmers are asking software to write their code for them? My programming days are long behind me, in a time when you didn't have many online sources with complete code segments already written for you, so you couldn't be that lazy. We also had to write our code in a blizzard, while hiking uphill.
There’s another problem: AIs get their information from publicly available texts written by humans on the internet, and those are the people you should never trust. Here’s a simple question someone asked: how many years did it take to form a layer of sediment like the ones I see in cliffs? It’s an awkward question, the sort a naive layman might ask, but the computer bravely tried to find an answer.
No, the “traditional view of sedimentary layers” is not being challenged. It is not being replaced by a “biblical view.” You can hardly blame the software for being stupid, though, because look at its sources: the Institute for Creation Research and Answers in Genesis. Bullshit in, bullshit out.