Writing for Impact in the Age of AI

ChatGPT was only a year old when I taught my first Writing for Impact Masterclass in the fall of 2023. I told my students that generative AI was overblown, that it would never replace inspired human writing. Equal parts ego, fear, and willful ignorance—I didn't want to know what I didn't know.

At the time, my skepticism seemed justified. The writing these large language models (LLMs) produced was almost laughable—incapable of anything original, often piecing together plagiarized copy from whatever sources they'd been trained on.

For most of 2024, I ignored AI tools save for occasionally playing with LinkedIn's "Rewrite with AI" function to massage posts—even though I usually rejected its suggestions. Then, during my Fall 2024 cohort, a student mentioned she used Claude to draft content for her CEO.

I'm no Luddite. As a 90s kid, I've always prided myself on adopting new technologies while still remembering a world without smartphones, social media, or even the internet. So I started playing around with Claude.

My jaw nearly dropped. The thing could craft original, sometimes genuinely creative content. Not as good as a seasoned writer, but it could pass for the work of a junior copywriter. Like a lot of white-collar workers, I started worrying about my future.

Well, after a year of experimenting with AI across my writing—social posts, newsletter articles, sustainability communications—I'm less worried about losing business to Mr. Roboto. With so many organizations over-relying on AI, professional sustainability writers like me are needed more than ever for quality control.

To be effective, sustainability communicators must use every tool at their disposal—including AI. The trick is using it intelligently without letting it become a crutch.

Whose writing is it, anyway?

Most sustainability professionals are using AI in some capacity with their writing—though few like to admit it. There's a stigma, particularly on LinkedIn, where lazy AI use is most prevalent. I'm not sure what's more cringey: a clearly AI-generated post, or yet another human-written post condemning AI writing on LinkedIn. Or, if we want to get meta, an AI-generated post about why we shouldn't create AI-generated posts.

This has degenerated into a witch hunt where we mentally burn at the stake anyone who dares use an em dash. Meanwhile, a [2025 Pew poll] found that more than half of Americans aren't confident they can detect AI content anyway.

That yucky feeling we get when we see AI-generated content comes from our innate sense of fairness—using AI feels like cheating. And if used lazily, it is. A 2025 MIT study found that people using ChatGPT for essay writing consistently underperformed at "neural, linguistic, and behavior levels" while becoming lazier with each subsequent essay.

This creates a real conundrum for early-career sustainability communicators facing heavy workloads and tight deadlines. It's hard to resist punching out AI content when organizations signal that "good enough" is good enough.

But the best way to get better at writing is simply to write—which means to suffer. Staring at a blank page. Hitting mental blockages where you can't articulate what you're thinking. Working through multiple drafts. Sitting at your desk for two hours and managing to write three sentences. Finally achieving flow. Eventually, painfully, arriving at something thoughtful, polished, and original. Sounds fun, right?

Over-relying on your preferred LLM may save you some suffering, but it produces writing that's less inspired, less creative, and ultimately less impactful. Quantity doesn’t trump quality.

I honed my skills over nearly two decades writing hundreds of pieces—GreenBiz articles, sustainability reports, executive communications. Had AI done the heavy lifting for me, I don't know where I'd be today. That's why AI in the hands of an experienced writer is a force to be reckoned with. It amplifies abilities in ways never before possible.

One more thing worth remembering: even before AI, the vast majority of content published by anyone director-level or above—and nearly all C-suite communications—was written by somebody else. Usually somebody like me. In some respects, AI is just giving everyone their own ghostwriter.

The environmental dilemma

When generative AI first emerged, many sustainability professionals rejected it as unethical—antithetical to our values. Understandable, given the technology's energy and water demands. According to MIT researchers, the computational power required to train generative AI models demands a staggering amount of electricity, increasing carbon emissions and straining the grid. Data center electricity consumption is expected to nearly double by 2026. Water use is substantial too—for each kilowatt hour a data center consumes, it needs two liters of water for cooling. And every query adds up: a ChatGPT query consumes about five times more electricity than a simple web search.

But reluctance to accept society-altering technologies is nothing new. When personal computers proliferated in the 1980s and 90s, there was widespread "computerphobia" among people who feared being replaced by machines—or that the screens would damage their health.

Imagine if CSR professionals back then had refused to use computers.

In the effort to build a more sustainable future, we need every tool at our disposal—including generative AI. You can bet your almond milk latte that the bad guys are using it to pump out misinformation at a scale we can only match with the same technologies.

Everything in sustainability involves tradeoffs. Our job is to use tools responsibly for the greatest net impact. Rather than fixating on our personal carbon footprints, we can think about our climate shadows—a concept introduced by Emma Pattee in 2021. It goes beyond measuring direct emissions to include the ripple effects of our choices: how we vote, where we work, how we invest, and how we use our influence to shape others' understanding of climate issues. While a carbon footprint counts kilowatt-hours, a climate shadow accounts for the conversations you start, the decisions you inform, the action you inspire.

I'd extend this to a "sustainability shadow"—one that includes all environmental impacts, water included, as well as social impact.

This isn't a free pass to ask ChatGPT to generate images of your dog as a human (guilty). But it does give us license to use AI ethically to amplify our sustainability work. If you're using AI to develop content that helps your organization achieve its sustainability goals, your sustainability shadow grows. And a bigger shadow is better.

Using AI the right way 

There's a difference between AI-generated and AI-aided content. AI-generated content is a one-directional stream from the LLM. AI-aided content involves active exchange between human and machine—you're thinking together.

In my Writing for Impact Masterclass, I teach a four-step writing process:

  1. Ideation: Brainstorming, researching, outlining, translating ideas into rough words on the page.

  2. The "Crappy" First Draft: Building out your concepts into an initial draft that's meant to be imperfect.

  3. Kaizen: Revising through significant cuts, restructuring, and rewriting.

  4. Polish: Fine-tuning at the sentence level—spelling, grammar, tightening—to ensure your voice flows through.

AI has a role in every step except No. 2. Your crappy first draft is meant to be crappy, and that's the one thing AI can't help you do (though it could be fun to ask it to try).

But AI can help you organize thoughts during Ideation—turning rough notes into clearer angles or outlines. During Kaizen, you can drop your draft into Claude and ask for feedback on how to improve. Don't let it rewrite; treat it as a senior editor offering structural insights. And for Polish, AI can catch spelling and grammar issues you've gone blind to.

The key is having enough writing skill to recognize good and bad when you see it—and not blindly accepting an LLM's suggestions. There's a confidence problem here: many sustainability communicators believe their natural writing isn't "good enough" and assume the AI knows better. But these tools are designed to be obsequious. You need to know when to push back.

I think of LLMs as a genius prodigy fresh out of college: immense knowledge and raw skill, but needing significant guidance to get where you want to go.

Why voice is more important than ever

The biggest objection I hear about AI-generated content is that it "sounds like AI." What does that mean, exactly? Mostly, it's the uniformity: that machine-smooth sameness triggers a visceral reaction in us humans. Good writing isn't perfect, because it's human.

A few AI tells I find particularly grievous:

  • Generic rhetorical questions like "The result?"

  • Constructions like "It's not X—it's Y."

  • Openers like "Let's dive in" or "Let's unpack this"

  • Filler phrases like "Here's the thing" or "Here's the kicker"

  • Breathless adjectives like "revolutionary," "game-changing," or "transformative"

Notice I didn't include the em dash. LLMs love an em dash, but so do I—and I did long before OpenAI opened its doors. Check my decade of content on Trellis if you don't believe me. It's not about whether you use em dashes; it's how you use them.

Which brings me to my main point: AI-generated writing lacks a soul. It sounds like a robot wrote it because a robot wrote it. What gives writing its soul is voice—your unique style of communicating meaning through word choice, sentence structure, tone, even vibe.

AI might help you maintain consistency with your voice, but it can't develop one. That's on you. And the only way to develop a strong, unique voice is to write.

If you're a sustainability professional looking to sharpen your writing, or a communicator who wants to write better in a sustainable business context, consider my Writing for Impact Masterclass. It's a four-week virtual course designed to build confidence, strengthen your process, and develop your voice.

I'm running two cohorts in 2026: March and November. Learn more and register here.
