I’m part of a think tank, and over the past few months we have been examining patterns around AI use and social media behavior. We expected to find shifts. What we did not expect was how quickly one particular pattern kept repeating itself.
We noticed that creators who rely heavily on AI-generated images or videos often begin to lose the level of engagement they once enjoyed when they shared natural, ‘unedited’ photos or real-life videos of themselves. Not immediately. But gradually, and then quite clearly.
The reason is subtle but powerful.
Most followers can now recognize AI content almost instantly. It takes only a second. Once that recognition happens, many people disengage without thinking much about it. Somewhere in their mind, a quiet judgment forms: “this isn’t real.” And when content feels unreal, it stops inviting emotional investment (media psychology).
We are living in a time when people crave human presence more than polished presence.
Transparency, authenticity, and genuine connection carry more weight than technical perfection (media psychology).
When AI visuals dominate someone’s feed, the content may look impressive, but it often feels distant. Over time, that distance erodes trust. – Dr. Joybert Javnyuy
This is not an argument against AI. It is a caution against excess and misuse.
Used thoughtfully, AI can enhance storytelling, clarify ideas, and support creativity. Used carelessly or excessively, it can flatten engagement and weaken the human bond that social platforms were built on in the first place.
There is an art to using AI well. Knowing when to use it, when to step back, and when to show your real face, voice, and imperfections matters more now than ever.
Engagement Is No Longer About Novelty, It Is About Credibility:
AI imagery initially performed well because it was novel. Novelty always spikes attention. But novelty decays quickly. Once audiences learn to recognize a pattern, the brain categorizes it and moves on. – Dr. Joybert Javnyuy
Today, AI visuals are processed by many users not as content, but as signal noise. The moment something is labeled “synthetic,” the brain assigns it lower emotional priority. Engagement drops not because the content is bad, but because it feels non-credible in a credibility-driven environment.
Trust Is Built Through Costly Signals, Not Perfect Outputs:
In behavioral science, trust forms through what are called costly signals: actions that are difficult to fake.
Showing up imperfectly, sharing lived experiences, speaking in your own voice, appearing on camera without excessive polish: these all signal effort, vulnerability, and presence.
AI-generated visuals are cheap signals. They require little personal exposure. Audiences may not articulate this consciously, but they feel it intuitively. Over time, feeds saturated with AI content communicate distance from the audience rather than intimacy. – Dr. Joybert Javnyuy
This is why creators who once built trust through human presence often see engagement soften when AI becomes dominant in their output.
Social media has quietly shifted from “impress me” to “convince me you’re real.”
The Problem Is Not AI, It Is Substitution:
One important clarification: AI hurts engagement most when it replaces the creator, not when it supports them.
AI works best in three roles:
As a background enhancer (diagrams, mockups, conceptual visuals)
As a support tool (editing, structuring, summarizing)
As a clarifier of ideas that already originate from lived insight
AI performs poorly when it becomes the face, voice, or emotional center of the message.
The Future Belongs to Hybrid Creators:
What I am saying is, the creators who will win long-term are not anti-AI or AI-dependent. They are AI-literate but human-forward. – Dr. Joybert Javnyuy
They will:
Lead with lived experience
Use AI quietly, strategically, and invisibly
Make the human unmistakable and the AI secondary
Preserve friction, imperfection, and voice
In other words, AI will move backstage, while humanity returns to center stage.
Excessive, visible use of AI-generated visuals does not reduce engagement because audiences dislike technology, but because it weakens perceived human presence.
In an era where trust, credibility, and emotional connection drive attention, creators who allow AI to replace rather than support their humanity risk long-term disengagement.
When our research is fully published, I plan to speak more about this, and about other patterns we uncovered.
For now, the message is simple: technology should amplify humanity, not replace it.
Cheers to the future
Dr. Joybert Javnyuy
Founder, Cosdef Global Institute for Business and Technology
#AI #socialmedia #AIandSociety





