Artificial Intelligence has become the shiny new tool in everyone’s toolbox — from writing essays to making artwork, generating music, and even simulating conversations. It’s fast, clever, and often eerily good. But behind the convenience and creativity lies a growing concern among psychologists: Could frequent use of AI be tied to darker parts of our personality?
According to a new study published in the journal BMC Psychology, the answer might be yes — and it’s more complicated than it sounds.
The Curious Connection Between AI and Personality
Researchers in South Korea, led by Jinyi Song from Chodang University and Shuyan Liu from Baekseok University, decided to explore how personality might influence the way students use AI. They surveyed 504 Chinese college students studying the arts — a group known for creativity, pressure to perform, and increasingly, for using AI as a tool for inspiration or output.
Instead of just asking about AI habits, the researchers dug deeper. They evaluated each student's responses against the "Dark Triad," a trio of personality traits associated with manipulation and self-centeredness:
- Narcissism: excessive self-focus, arrogance, and craving for admiration.
- Psychopathy: emotional coldness, impulsivity, and a lack of empathy.
- Machiavellianism: a tendency to manipulate others for personal gain.
What they discovered was surprising, but also telling. Students who scored higher on these traits were more likely to use AI tools like ChatGPT or Midjourney, and not just for brainstorming or fun. Many were using AI to generate schoolwork and then submitting it as their own, which the researchers classified as academic dishonesty.
Why Art Students?
You might wonder: why focus on art students?
Art and design fields are on the front lines of the AI revolution. With platforms like Midjourney, DALL·E, and Runway generating stunning visual pieces in seconds, it's tempting for students under pressure to replace their own creative efforts with machine-made results.
These students are also often juggling heavy workloads, tight deadlines, and the need to constantly prove themselves creatively. That makes AI a particularly appealing shortcut, especially for those prone to procrastination or a fear of failure.
And that’s exactly what the study uncovered.
Beyond the Dark Triad: Anxiety and Procrastination
Interestingly, students who leaned on AI weren’t just displaying dark personality traits — many were also struggling emotionally. The survey revealed that students who were anxious about their grades or performance, or who frequently procrastinated, were more likely to use AI to finish assignments at the last minute.
This adds another layer to the findings: using AI might not always stem from arrogance or manipulation. Sometimes, it’s a coping mechanism.
But even in these cases, the results can be ethically murky. Submitting AI-generated content as original work, even under stress, still poses a problem — especially in creative fields where authenticity is central.
Materialism: A Hidden Factor
The study didn’t stop with personality and stress. Researchers also explored how materialistic attitudes influenced AI usage. They found that students who valued external rewards — like money, praise, or prestige — were more likely to use AI tools in ways that could be considered unethical.
This suggests that for some, AI is simply a means to an end: a way to climb the academic or artistic ladder faster. If the goal is recognition, not growth or learning, then using AI shortcuts feels less like a moral dilemma and more like a strategy.
Academic Misconduct in the Age of AI
To be clear, the study doesn’t label AI as the villain. The real issue lies in how and why people use it.
When students use AI to enhance their learning or explore ideas, that’s one thing. But when they rely on it to replace genuine effort — especially in academic settings where integrity matters — it becomes a form of academic misconduct. That includes plagiarism, cheating, and misrepresentation, all of which undermine the learning process.
And the rise of generative AI tools makes this harder to detect. A polished AI-generated essay or an intricate digital painting could easily pass as human-made. This puts schools, teachers, and institutions in a tricky position — how do you uphold academic honesty when the tools of deception are this sophisticated?
Rethinking Education for an AI World
The researchers behind this study believe it’s time for colleges and universities to adapt.
They suggest that educational institutions rethink how they structure assignments. Instead of traditional essays or static submissions, schools could focus more on creative processes, real-time feedback, and reflective elements that are harder to fake.
Some possible solutions include:
- Asking students to submit multiple drafts that show progress.
- Encouraging oral presentations or live demonstrations.
- Grading class participation and group collaboration.
- Including AI ethics modules in coursework.
- Using tools that detect AI-generated content (though this comes with its own complications).
Ultimately, the goal is to design tasks that reward critical thinking, personal effort, and originality — qualities AI cannot easily mimic.
What This Means for the Future
This study highlights a crossroads we’re all approaching. On one path, AI can empower students, creatives, and professionals to do more, learn faster, and explore new horizons. On the other, it risks becoming a crutch — or worse, a temptation to cut ethical corners.
While the study focused on Chinese art students, its implications stretch far beyond. Around the world, students are experimenting with AI, often without clear guidelines or a full understanding of the consequences. Meanwhile, educational systems are scrambling to catch up.
It’s not just about personality traits or academic rules — it’s about shaping how people interact with technology. Do we raise a generation of critical thinkers who use AI responsibly? Or do we create shortcuts so addictive that they undermine the very purpose of education?
The Bigger Picture: AI and Humanity
It’s important to note that using AI doesn’t automatically mean someone has a dark personality. Plenty of people use ChatGPT or Midjourney out of curiosity, creativity, or sheer fun. But this study reminds us that personality, motivation, and context all play a role in how we use technology.
As AI becomes more powerful and accessible, we’ll need to ask tougher questions — not just about what the tech can do, but what it should do, and how it reflects back on us.
For now, one thing is clear: AI may be artificial, but the choices we make around it are very human.
Final Thought:
In a world increasingly shaped by intelligent machines, it’s not just about how smart the technology becomes — it’s about how wisely we choose to use it.