Stop treating ChatGPT like it knows anything.

I asked it who the first person to fly around the world was. It gave me an answer. For shits and giggles I told it "no, that's not right." So it said "my apologies, it was actually so-and-so" — wait, no em-dash — it legit gave me a different answer from the one before! So again I said "no, that's not right either." And it did the same thing: it changed its answer again!! LOL

I see uses for this tech right now. I've even used it myself for editing some documents. But as you say, as far as knowledge goes? Nope. It should know who flew around the world and not let me talk it out of the right answer just by saying "that's not correct." That's such a perfect symptom of a system that's just generating plausible-sounding text, not actually retrieving facts.

/r/Futurology Thread Parent