Director of Marketing
There’s a growing problem I can’t ignore: more and more professional communication doesn’t sound like anyone anymore.
While I’m not a professional fundraiser, I live in the same relationship economy. I’m constantly on the receiving end of emails from vendors, professional associations, and partners. These are organizations I know well. I understand their audiences; I recognize their voices.
Or at least, I used to.
This year, one conference in particular stood out in a way I didn’t expect.
One email was themed around a well-known 90s sitcom, with references woven through subject lines, body copy, visuals, and plenty of emojis. One line, in particular, felt closer to a consumer brand tagline than a professional conference I’ve respected for years. The intent was clearly playful and familiar.
On paper, nothing was wrong. The conference has always embraced a sense of fun. And yet something felt off. It wasn’t any single phrase so much as how the messages landed overall, both visually and rhetorically. By the third or fourth email, they all began to blur together.
In the name of efficiency, likely driven by well-meaning and overextended volunteers, something subtle but important was lost. Not because fun is bad, but because the emails no longer sounded like the people or organization behind them. The campaign felt disjointed from message to message, untethered from a clear purpose, and disconnected from the voice I recognized. What remained was language that could have belonged to almost anyone.
That’s when it became clear this wasn’t just a conference problem. It was a communication problem, and one that AI is quietly accelerating.
AI has made it easier than ever to generate content at scale.
I can’t read your mind, but I’m fairly confident AI is already part of your workflow in some way. That’s not a critique. At this point, it’s simply reality, and using it isn’t necessarily a bad thing. Used thoughtfully, it can be a helpful brainstorming partner or a way to clarify ideas that already exist. The problem shows up when AI is asked to replace voice rather than support it.
AI is very good at predicting language patterns. It knows what sounds professional and what phrases commonly appear together. What it does not know is context: institutional history, lived experience, or the subtle cues that make communication feel grounded in a real mission. As a result, it gravitates toward language that is safe, familiar, and broadly applicable. Over time, that sameness is what people feel, even if they can’t immediately name it.
This matters more than we might like to admit.
Think about your donor communications.
As the conference example illustrates, the moment technology starts speaking for us, something essential shifts. No matter how advanced AI becomes,* it can’t experience gratitude or understand the weight of a gift. When communication drifts toward generic, nobody-sounding language, trust begins to thin.
Effective donor communication sounds like it comes from a real organization with a real mission and real people behind it. It lives in memory, nuance, and long-standing expectations. Donors are encouraged to invest because communications reflect an understanding of who they are, what they value, and why they’ve stayed engaged over time. That kind of specificity cannot be mass-generated without cost.
You might assume donors won’t notice the difference between AI-written and human-written communication. Some won’t. Others will. And many will feel it before they can articulate why. I know how I react to a thank-you note that could have been sent to anyone. It doesn’t make me feel seen. It doesn’t make me feel appreciated. It certainly doesn’t make me more inclined to give again.
This is especially risky in a sector already fighting attention fatigue. Appeals, event invitations, and board updates are all pieces where relationship-building should feature prominently. When those moments are filled with language that sounds interchangeable, the mission itself begins to blur.
Please, don’t be afraid to sound like somebody.
I worry less about AI replacing human communication and more about organizations slowly training themselves to accept language that no longer sounds like them. Generic language is easy to ship. Voice takes work. But voice is how trust is built.
How willing are we to let our messages drift into language that could belong to anyone, anywhere?
At the beginning of this piece, I said that too much professional communication no longer sounds like anyone. That should concern us. Not because technology is advancing, but because we are choosing sameness over specificity, and efficiency over voice.
If we care about the future of our institutions, our professions, and our sector, then we should resist the urge to sound clever at the expense of sounding real.
*As Gandalf said, “Even the very wise cannot see all ends.” With how fast language models evolve, this statement may be outdated by the time of publication.
Rachel Canady is director of marketing for the Winkler Group. She connects nonprofit partners with resources they can use to impact their communities—providing industry insights, research, and strategy to help them expand their reach and deepen their mission. Connect with Rachel on LinkedIn.