3 Comments
PB

So ChatGPT makes up results? It also failed two gastroenterology self-assessment tests. So it lies and it doesn’t know s**t. What more could be wrong with it? https://www.medscape.com/viewarticle/992318#?form=fpf

Mitch Barrie

I have caught ChatGPT inventing things (guessing) when it does not know the answer. It is more like human beings than we like to admit.

tracy

It's a program written by a human. AI it ain't!
