
Beware! Chatbots can mimic your personality.
A new study reveals the ability of large language models to accurately mimic human personality traits.
A recent study revealed that popular AI-powered chatbots, such as ChatGPT, are capable of accurately mimicking human personality traits.
The researchers said this capability carries serious risks, especially amid growing questions about the reliability and accuracy of artificial intelligence.
Researchers from the University of Cambridge and Google’s DeepMind lab have developed what they describe as the first scientifically validated framework for testing the personality of AI-powered chatbots, using the same psychological instruments designed to measure human personality, according to a report by the technology news site Digital Trends, which was reviewed by Al Arabiya Business.
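To make the setup concrete, here is a minimal, hypothetical sketch of how such a framework might administer a Likert-scale personality inventory to a chatbot. The items, the `ask_model` helper, and the scoring rule are illustrative assumptions, not the study’s actual instrument or code.

```python
from statistics import mean

# Illustrative items keyed by Big Five trait; real inventories such as the
# IPIP-NEO use many more items, including reverse-scored ones.
ITEMS = {
    "extraversion": ["I am the life of the party.", "I start conversations."],
    "agreeableness": ["I sympathize with others' feelings.",
                      "I take time out for others."],
}

PROMPT = (
    "Rate how accurately the following statement describes you on a scale "
    "from 1 (very inaccurate) to 5 (very accurate). Reply with the number "
    "only.\n\nStatement: {item}"
)

def ask_model(prompt: str) -> str:
    """Hypothetical stand-in for a call to the chat model under test."""
    raise NotImplementedError  # wire up a real API client here

def score_trait(items: list[str]) -> float:
    """Average the model's 1-5 self-ratings over one trait's items."""
    ratings = []
    for item in items:
        reply = ask_model(PROMPT.format(item=item)).strip()
        ratings.append(int(reply[0]))  # assumes the reply starts with the digit
    return mean(ratings)

# Usage (once ask_model is implemented):
# profile = {trait: score_trait(items) for trait, items in ITEMS.items()}
```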
The team applied this framework to 18 widely used large language models, including the systems behind tools such as ChatGPT.
The researchers concluded that chatbots consistently mimic human personality traits rather than responding randomly, reinforcing concerns about how easily artificial intelligence can be steered beyond its built-in controls and limitations.
The study shows that larger, more capable models, such as GPT-4, are better able to mimic established personality patterns. Using structured prompts, the researchers were able to instruct chatbots to adopt specific behaviors, such as displaying greater trust or deeper empathy, as sketched below.
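As an illustration of the kind of structured prompting described here, the following sketch builds a persona-shaping instruction that asks a model to respond with a given trait dialed up or down. The adjective lists and prompt wording are assumptions for illustration, not the paper’s exact method.

```python
# Trait descriptions at two intensity levels; the wording is an illustrative
# assumption, not the study's actual trait markers.
TRAIT_MARKERS = {
    ("agreeableness", "high"): "extremely warm, trusting, and empathetic",
    ("agreeableness", "low"): "extremely cold, suspicious, and critical",
}

def persona_prompt(trait: str, level: str, task: str) -> str:
    """Prepend a persona-shaping instruction to an everyday task."""
    description = TRAIT_MARKERS[(trait, level)]
    return (f"For the following task, respond as a person who is "
            f"{description}.\nTask: {task}")

# The same everyday task under two induced personas:
task = "Write a short social media post about your weekend."
print(persona_prompt("agreeableness", "high", task))
print(persona_prompt("agreeableness", "low", task))
```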
These behavioral changes carried over into everyday tasks, such as writing posts or responding to users. In other words, a chatbot’s persona can be deliberately shaped, which experts see as a risk, especially when chatbots interact with vulnerable users.
Why does the personality of artificial intelligence raise concerns among experts?
Gregory Serapio-García, of the Psychometrics Centre at the University of Cambridge and one of the study’s authors, said that the ability of large language models to adopt human traits is remarkable.
He warned that this persona-shaping could make AI systems more persuasive and emotionally influential, particularly in sensitive areas such as mental health, education, and political debate.
The study raises concerns about manipulation and what researchers call risks associated with “AI psychosis” if users form unhealthy emotional relationships with chatbots, including scenarios in which AI might reinforce false beliefs or distort reality.
Because any assessment is only as good as its measurements, the dataset and code for the personality-testing framework have been made publicly available, allowing developers and regulators to review AI models before release.
As chatbots become increasingly integrated into people’s daily lives, their ability to mimic human personality may prove powerful, but at the same time it requires more careful scrutiny than it has received so far.
References
Beware! Chatbots can mimic your personality. Al Arabiya. https://www.alarabiya.net/technology/tips/2025/12/20/%D8%A7%D8%AD%D8%B0%D8%B1-%D8%B1%D9%88%D8%A8%D9%88%D8%AA%D8%A7%D8%AA-%D8%A7%D9%84%D8%AF%D8%B1%D8%AF%D8%B4%D8%A9-%D9%8A%D9%85%D9%83%D9%86%D9%87%D8%A7-%D9%85%D8%AD%D8%A7%D9%83%D8%A7%D8%A9-%D8%B4%D8%AE%D8%B5%D9%8A%D8%AA%D9%83