I was curious about exploring the motivations of a character (specifically Linter in The State of the Art), so I started by asking a question about another character (Diziet Sma) to bring existing understanding into the context... and ChatGPT got things wrong...
The thing is, if you don't know the story or the books mentioned, it's perfectly plausible that what was written is correct. And while a good bit of it is... maybe... the fact that it got material facts wrong means that if it's working from those, then nothing it produces is based on the correct information.
I've known that ChatGPT is full of crap (and have experienced it in other chats).
It can be a good tool to augment some capacities, but its exploration of ideas based on facts and reality is often (at best) flawed, and if one tries to build upon those flaws and adds in one's own misconceptions, then its output is even more questionable.
The chat is https://chatgpt.com/share/691266fa-c76c-8011-876c-027206abd2... if one is curious. I continued a bit to see what else it got right and wrong.