Discussion about this post

Frank Ch. Eigler:

"because the software cannot be accused of having a bias to conclude anything"

oh sweet summer child

surely you've heard about l'affaire Gemini

there's bias in the input dataset

and then there is BIAS in the woke lobotomy imposed by openai/etc.

PB:

So ChatGPT makes up results? It also failed two gastroenterology self-assessment tests. So it lies and it doesn’t know s**t. What more could be wrong with it? https://www.medscape.com/viewarticle/992318#?form=fpf

104 more comments...