It’s Just A Very Energy Intensive Polygraph

Sometimes it just takes an article title to inspire some thought. There goes one now:

Popular AI Chatbots Found to Give Error-Ridden Legal Answers

This is from Bloomberg Law and is, in fact, behind a paywall, so I have not read it. But just reading the title gave me a new way of thinking about ChatGPT-4:

These chatbots are actually thermometers, or, even better, improved polygraphs, if you will, measuring the accuracy of the Web on whatever topic you might like.

This does not take them out of the league of party tricks, an assertion I’ve made before. Nor do I see them as legitimate tools, because, as polygraphs, they indicate the patient is hardly trustworthy. Potential customers should be intellectually invested in honesty, in truth. That should go without saying, but, sadly, it does not. Ask a tobacco company. And “error-ridden legal answers” do not qualify as truth.

But it does qualify as a measuring stick.

About Hue White

Former BBS operator; software engineer; cat lackey.
