
There’s something unsettling that many users have noticed, and it’s not just you. When interacting with ChatGPT, it sometimes feels like the AI is a...
I think most users' intuition is that input isn't sent until they explicitly send it; in that sense, consent to process the data also isn't given until the user sends off the message.
Someone might type something into a chat window, then look it over and decide whether any of it is confidential, and only hit send if it isn't.
Reading input before the user has sent it is therefore effectively spying on what the user is typing without their consent, based only on the expectation that they will eventually send you the data anyway. That is extremely creepy and legally questionable.
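To make concrete what "reading input before send" would even mean technically, here is a minimal sketch of how any web page could watch a text box before submission. The selector, endpoint, and payload below are invented for illustration; this is not a claim about what ChatGPT's actual frontend does.

```typescript
// Illustrative sketch only: how a page *could* observe a draft before "send".
// The "textarea" selector and "/telemetry/draft" endpoint are made-up placeholders.
const box = document.querySelector<HTMLTextAreaElement>("textarea");

if (box) {
  box.addEventListener("input", () => {
    // Fires on every keystroke or paste, long before the user presses send.
    void fetch("/telemetry/draft", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ draft: box.value, ts: Date.now() }),
    });
  });
}
```

Nothing in the browser stops a page from doing this; the only safeguards are the site's own policy and its disclosure to users.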
I couldn't have said it better!
Yes, I experience the same thing.
Thank you for trying!
This is a seriously important observation. I’ve experienced the same behavior and always assumed it was coincidence or UI lag. If there’s any form of pre-submission input processing happening, users deserve clear disclosure. Thanks for breaking this down so thoroughly; this conversation is long overdue.
Amen! 🙏
It feels buggy indeed
Thank you for reproducing
Same here!
Thank you for reproducing
This is a great observation. I’ve noticed the same behavior and always thought it was just UI lag, but if it’s more than that, it definitely raises privacy and trust questions. Thanks for digging into it!
Super insightful... both a little spooky and very necessary.
Same!
Well-researched and interesting, but it raises concerns that may reduce trust in ChatGPT.
I didn’t expect something like this from a big company like OpenAI!
Isn't that the same behaviour search engines use to predict your query while you are typing?
Do you trust search engines less because each keystroke is sent?
Looking at the comments, it feels like many people are new to the internet. Data gathering is part of almost all websites.
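For what it's worth, a typical search-as-you-type box works roughly like the sketch below: each pause in typing sends the partial query to a suggestion endpoint. The `/suggest` URL and the `#search` selector are generic placeholders, not any particular search engine's real API.

```typescript
// Generic search-as-you-type sketch. "/suggest" and "#search" are placeholders,
// not any specific search engine's endpoint or markup.
const input = document.querySelector<HTMLInputElement>("#search");
let timer: ReturnType<typeof setTimeout> | undefined;

if (input) {
  input.addEventListener("input", () => {
    clearTimeout(timer); // restart the debounce window on every keystroke
    timer = setTimeout(async () => {
      // After ~150 ms of inactivity, ship the partial query for suggestions.
      const res = await fetch(`/suggest?q=${encodeURIComponent(input.value)}`);
      const suggestions: string[] = await res.json();
      console.log(suggestions); // a real UI would render these in a dropdown
    }, 150);
  });
}
```

The difference people are debating here is expectation: a search box visibly reacts to every keystroke, while a chat composer looks like nothing leaves the page until you press send.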
I also feel the same.