Video · February 17, 2026

UnTangled with Aram Fischer: The Social Sector Cannot Afford to Sit This One Out

Jen Frazier
(she/her/hers)
CEO & Founder
I have been saying for a while now that the social good sector needs to stop treating AI like a trend to evaluate from a safe distance. But a recent conversation on UnTangled with Aram Fischer, co-founder and CEO of Change Agent, pushed me to get a lot more specific about what that actually means and what the stakes really are.

Aram did not come out of tech. He was a documentary filmmaker. Then a marketer. Then a political communications consultant. His co-founder Craig Johnson was the technical brain. Together, in the summer of 2023, when most people were still asking “wait, what is ChatGPT,” they started building an AI platform specifically for social change organizations. Not a generalist tool with a nonprofit discount code. Something built from the ground up with the sector’s values, needs, and workflows in mind.

That distinction matters more than it might seem.

The problem with one-size-fits-all

Aram described two schools of thought in the AI company world right now. There are generalist platforms that can do a lot of things pretty well, and there are narrower tools built for specific tasks like editing, design, or presentations. Change Agent sits in a third category called verticalized AI, meaning it is built for a specific industry. In their case, that industry is us.

What does that actually change? A lot, it turns out. The social sector has tools and workflows that the rest of the world does not. ActBlue integrations. C3 and C4 distinctions. Program briefs, donor data, organizing infrastructure. When your AI platform is built by people who have never organized a precinct or written a grant, you spend a lot of time working around it. When it is built by people who have, you spend that time doing actual work instead.

But here is where I am genuinely still wrestling: does the sector need to build entirely new platforms, or do we need to get smarter about connecting and configuring the tools we already have? I do not think there is a clean answer. I suspect it is both, and that the right choice depends on your org’s size, capacity, and specific needs. What I am increasingly sure of is that the default of just adopting whatever big tech is selling and hoping for the best is not a strategy.

The trust question is not optional

One of the most important parts of our conversation was about data privacy, and Aram did not sugarcoat it.

When you use a mainstream AI tool, you are operating inside a system built by Silicon Valley venture capitalists whose primary obligation is to shareholders. Not to you, not to your communities, and not to your mission. Aram pointed to the New York Times lawsuit against OpenAI, where a standing court order means they cannot delete data. He also named something I think more people in the sector need to hear: Anthropic, which makes Claude and started out with a reputation as the more values-aligned option, recently shifted its policy on training with user data from opt-in to opt-out.

That is a meaningful bait and switch. And it is a useful reminder that values alignment is not a static thing. It can drift, especially when venture capital is involved and the pressure to monetize data is constant.

Change Agent’s approach is a walled garden. Your data stays on their servers. It does not feed into some larger model training pipeline. If you want to delete it, it gets deleted. And the open source models they build on are chosen in part based on how they were trained and whether that training was done with environmental considerations in mind, including carbon neutrality. Their retraining actually runs on a Mac Studio powered by solar panels at the CTO’s house, with the heat from the computers helping to warm his home in winter. I am not making that up.

None of this means Change Agent is the only answer. But it is an example of what it looks like when the people building the tool actually share your values, not just your market.

The beautifully banal wins

Here is the part I want skeptics to hear, because I know you are out there.

Jess O’Donnell, development director at Family Forward Oregon, writes personalized thank you notes to donors every quarter. A best practice that almost no one has time for. With Change Agent, she went from 40 hours per quarter down to 90 minutes. That is 38 and a half hours back. Every single quarter.

Aram called these kinds of outcomes “beautifully banal” and I love that framing. We spend so much time talking about the frontier capabilities of AI that we skip past the part where someone in a resource-strapped nonprofit just got almost a full work week back to go apply for grants, spend time with their family, or honestly, just breathe for a second.

There is also a story about Olé New Mexico, an advocacy org that used AI during the 2024 cycle to cut the time spent on their C4 political activities so significantly that they also had capacity for their C3 work, and helped get a housing measure passed in Albuquerque. Not because AI did the organizing.

Because AI freed up the humans to do more of it.

So where does that leave us?

I keep coming back to the same place. The social sector cannot afford to sit this one out. Not because AI is magic, and not because every tool deserves our trust, but because the systems being built right now are going to shape how our work gets done whether we are at the table or not.

We can engage with skepticism, with clear values criteria, with real questions about privacy and environmental impact and alignment. We should do all of that. But opting out is not a neutral act. It just means someone else makes the decisions.

Aram’s parting thought was that connecting with people in this space and building AI together is the path forward. I think he is right. The question is whether enough of us decide that now, while there is still something to shape.

 

Watch the full episode here:

And if you are ready to start thinking through what AI could look like at your organization, I would love to talk. Let’s figure it out together.

Want to learn more about Change Agent? Find Aram on LinkedIn or visit www.changeagent.ai.
