Some ChatGPT users have noticed a strange phenomenon recently: occasionally, the chatbot refers to them by name as it reasons through problems. That wasn't the default behavior before, and several users say ChatGPT has mentioned their names despite never having been told what to call them.
Reactions are mixed. One user, software developer and AI enthusiast Simon Willison, called the feature "creepy and unnecessary." Another developer, Nick Dobos, said he "hated it." A cursory search of X turns up dozens of users confused by, and wary of, ChatGPT's name-dropping behavior.
"It's like a teacher keeps calling my name, lol," wrote one user. "Yeah, I don't like it."
Does anyone actually like it when o3 uses your name in its chain of thought, as opposed to finding it creepy and unnecessary? pic.twitter.com/lyrby6bk6j
— Simon Willison (@simonw) April 17, 2025
It's unclear exactly when the change happened, or whether it's related to ChatGPT's upgraded "memory" feature, which lets the chatbot personalize its responses by drawing on past chats. Some users on X said ChatGPT began calling them by name even though they had disabled memory and related personalization settings.
OpenAI has not responded to TechCrunch's request for comment.
It's weird to see your own name in the model's thinking. Is there any reason for adding it? Will it make fewer mistakes, or just more mistakes like I do in the GitHub repos? @OpenAI does o4-mini-high actually use it in custom prompts? pic.twitter.com/j1vv7arbx4
– Debasish Pattanayak (@drdebmath) April 16, 2025
In any case, the backlash illustrates the uncanny valley OpenAI may struggle to overcome in its effort to make ChatGPT more "personal" for the people who use it. Last week, the company's CEO, Sam Altman, hinted at AI systems that get to "know you over your life" and become "extremely useful and personalized." But judging by this latest wave of reactions, not everyone is sold on the idea.
An article published by the Valens Clinic, a psychiatry practice in Dubai, may shed light on the visceral reactions to ChatGPT's use of names. Names convey intimacy. But when a person (or chatbot) uses a name too often, it can come across as inauthentic.
"Using an individual's name when addressing them directly is a powerful relationship-developing strategy," Valens writes. "It denotes acceptance and admiration. However, undesirable or extravagant use can be looked at as fake and invasive."
Then again, perhaps another reason many people don't want ChatGPT using their name is that it feels ham-fisted: a clumsy attempt to anthropomorphize an emotionless bot. In the same way most people wouldn't want their toaster calling them by name, they don't want ChatGPT to "pretend" it understands the significance of a name.
It was certainly off-putting when o3 in ChatGPT said earlier this week that it was doing research for "Kyle." (As of Friday, the change appears to have been rolled back; o3 now refers to me as "the user.") It had the opposite of the intended effect, poking holes in the illusion that the underlying models are anything more than programmable, synthetic things.