Nonprofits shaping the future of responsible AI

For-profit enterprise giants like Amazon and Walmart may lead the charge in AI deployment and integration, but another sector is taking a more methodical, and perhaps more responsible, approach. As nonprofit organizations around the world work out how AI fits into their missions, they’re deeply considering issues like privacy, transparency, governance, and cost, often avoiding the stumbles of their more swiftly innovating counterparts.

Cost in particular is no small matter. Nonprofits are feeling squeezed in 2026, with roughly a third reporting decreased funding at the federal, state, and local levels, and more than two thirds anticipating rising demand for services this year. As a result, 87% of foundation leaders report seeing more organizations vying for grant funding.

“It’s an interesting dichotomy,” says Ben Miller, SVP of data science and analytics at Bonterra, a software solution for nonprofits like Special Olympics, NAACP, and Audubon. “On the one hand, spending extra money for resource-constrained nonprofits — the majority are under $500,000 in donations — is really tough, but at the same time, those are the very organizations that could leverage efficiencies gained by AI.”

Despite these barriers to growth, nonprofits are still finding ways to innovate.

JA Worldwide, for instance, runs Junior Achievement educational programs in 115 countries for more than 20 million students, and has navigated its scant tech budget by fostering commercial partnerships with consultancies like McKinsey, Accenture, and EY, and with financial market infrastructure group Euroclear. Partnerships, Miller says, are table stakes for nonprofits, especially smaller ones without dedicated teams like IT and legal to carry the burden.

To augment JA’s main program Student Company, where high school students can build a virtual company and present it in pitch competitions, the organization developed something called JA Boost.

“It’s an AI agent that helps students refine their pitch decks,” says Boris Kolev, JA’s global head of technology.

Within JA Boost are AI mentor agents like PitchMaster and Business Plan Helper. And while JA’s roughly 700,000 human educators and volunteers focus on meaningful coaching, students can get real-time feedback on their projects.

In PitchMaster, students upload a video of their pitch or even pitch the agent live. The agent then analyzes their performance, gives advice for their slides or language, and helps them refine their idea and go-to-market strategy. Business Plan Helper gives students advice on their financial projections and other logistics, and JA is currently piloting these tools across nearly 500 schools in 10 countries.

Ensuring oversight

Naturally, any AI that young students interact with must be equipped with serious governance. That’s something JA made sure to include from the start. “We have layers checking if sensitive data is shared, so it can be flagged and removed before being saved in the database,” says Kolev. JA’s LLMs are specially trained with conversational guardrails, chats are scrubbed of personally identifying information, and any photos or videos have faces blurred before entering the database. This, Kolev notes, makes an AI project far more expensive; costs ended up about three times initial projections. But it’s necessary, he says, to avoid regulatory, legal, or reputational backlash down the line.
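The pre-save redaction layer Kolev describes can be sketched in a few lines. This is a minimal illustration, not JA’s actual pipeline: the regex patterns, function name, and placeholder format are all assumptions, and a production system would use a dedicated PII-detection service rather than hand-rolled patterns.

```python
import re

# Illustrative patterns only; real systems detect far more categories
# (names, addresses, IDs) and use trained detectors, not just regexes.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def scrub_pii(text: str) -> tuple[str, list[str]]:
    """Replace detected PII with placeholders and report what was flagged."""
    flags = []
    for label, pattern in PII_PATTERNS.items():
        if pattern.search(text):
            flags.append(label)
            text = pattern.sub(f"[{label.upper()} REMOVED]", text)
    return text, flags

# Flagged content is cleaned before it ever reaches the database.
cleaned, flags = scrub_pii("Email me at sam@example.com or call +1 555 123 4567")
```

The key design point is ordering: the scrub runs between the chat layer and the database write, so raw personal data is never persisted in the first place.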

As a global organization without the budget for geographically bespoke versions, JA, says Kolev, takes the strictest regulation, Europe’s General Data Protection Regulation, and applies it everywhere. This spares the organization from retrofitting its systems later as regulations elsewhere inevitably catch up.

While Kolev has a firm grasp on governance, other nonprofits are forced to view it through an even more nuanced lens.

When all eyes are on the user

What happens when your user may be a target of systematized control, stalking, and abuse? That’s what Ashley Rumschlag, national director at Domesticshelters.org, an Alliance for HOPE International program, faces as she works to better serve the community through technology.

In January 2025, Domesticshelters.org launched a tool called HopeChat. On the surface, it may seem like any typical chatbot, but it’s much more intentional.

Over the last 12 years, the program has grown to more than 1,400 educational articles and a searchable database with nearly 3,000 programs and resources. To maximize efficiency, Rumschlag and her team built a retrieval-augmented generation (RAG) system to pull resources from a closed, trusted database. It serves as a solution to a very real UX problem.
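The core of such a closed-corpus RAG setup can be sketched simply. This is a toy illustration under stated assumptions: the three-document corpus, the word-overlap scorer, and the function names stand in for Domesticshelters.org’s real database and embedding-based search, which this code does not represent.

```python
# A minimal RAG sketch: answers are grounded in a closed, trusted corpus
# rather than the model's open-ended knowledge. The tiny corpus below is
# an illustrative stand-in for a real resource database.
CORPUS = [
    "How to identify the warning signs of abuse.",
    "Safety planning when preparing to escape abuse.",
    "Healing and recovery resources after leaving abuse.",
]

def retrieve(query: str, corpus: list[str], k: int = 1) -> list[str]:
    """Rank documents by word overlap; real systems use embedding similarity."""
    q = set(query.lower().split())
    return sorted(
        corpus,
        key=lambda doc: len(q & set(doc.lower().replace(".", "").split())),
        reverse=True,
    )[:k]

def answer(query: str) -> str:
    """Only text retrieved from the trusted corpus feeds the response."""
    context = retrieve(query, CORPUS)
    return f"Based on our resources: {context[0]}"
```

Because every response is anchored to a vetted document, the system cannot invent advice from the model’s general training data, which is the property that matters most in a high-risk domain like this one.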

“HopeChat is essentially a navigator for the website, so people can go through the journey that we’ve outlined, which is to identify abuse, escape from abuse, and then heal from abuse,” she says. “It guides people toward advocates in their local communities so they can get help.”

The chatbot is trained to communicate in trauma-informed language that’s kind but firm, never allowing users’ confirmation biases to shape its convictions, and it relies on expert and survivor testimonies as the foundation for its responses.

Because of the groundwork Rumschlag and her team laid, HopeChat’s hallucination issues have been rare and largely inconsequential, like making up a URL for the organization’s About page. But in the nonprofit domestic violence space, a hallucination can be dangerous, not just an inconvenience.

Other governance considerations include preventing the browser’s back button from returning to a session, and the immediate deletion of a HopeChat conversation once the user leaves the page. “We operate in a very high-risk environment,” adds Rumschlag. “Survivors are being monitored, abusers use technology for coercion, and the advice we give has real-world implications.”
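The delete-on-exit behavior amounts to keeping the transcript only in memory for the life of the session. The sketch below illustrates that idea under assumptions: the class and method names are invented for this example and are not HopeChat’s actual code.

```python
# Sketch of an ephemeral chat session: messages live only in memory and
# are wiped the instant the session ends, so no transcript survives for
# someone monitoring the device to find.
class EphemeralSession:
    def __init__(self) -> None:
        self._messages: list[str] = []

    def add(self, message: str) -> None:
        # Nothing is ever written to disk or a database.
        self._messages.append(message)

    def end(self) -> None:
        # Called when the user leaves the page: wipe the transcript immediately.
        self._messages.clear()

session = EphemeralSession()
session.add("I need help finding a local advocate.")
session.end()  # the conversation is gone
```

In a real deployment the client side matters just as much: the page would also clear its own state on navigation so neither browser history nor the back button can resurface the conversation.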

To date, HopeChat has handled roughly 70,000 conversations, and the organization is piloting the tool with other domestic violence organizations as a more cohesive resource. Collaboration, after all, only advances the mission.

Rumschlag has no delusions about being a forerunner in AI innovation, but she does believe nonprofits have an edge. “They’re centered less around profitability and more around the problem they’re trying to solve,” she says.

Leaning on the bigger players

Sue Harnett is at the helm of an organization more deeply rooted in tech literacy, but even she knows to lean on the more robust resources around her.

Harnett is CEO and founder of Rewriting the Code, a nonprofit helping to propel women in tech with 40,000 global members. This year, RTC’s biggest project is creating a new member platform equipped with tech upskilling and reskilling resources from its more than 300 company partners, including Netflix, Nvidia, and Bank of America.

As part of building trust, Harnett emphasizes bias mitigation in any of RTC’s AI and emerging tech solutions. “As an organization that’s advancing women in tech, we’re always assessing tools for their bias,” she says. This tracks with the view of Bonterra’s Miller that, when it comes to responsible AI ethics frameworks, the nonprofit sector is ahead of many of its for-profit counterparts. Miller points to Fundraising.AI, a group he’s an ambassador for, which developed a responsible AI framework for fundraising organizations.

As RTC incorporates workflow AI enhancements for its 26 full-time employees, Harnett is constantly upskilling her own staff so they feel comfortable with any transition.

“The philosophy starts with leadership and what they’re trying to accomplish,” she says. “I don’t think you can over-communicate enough internally to make sure everyone understands how they’ll participate in this new world.”

Harnett’s perspective is based on the fact that many employees at for-profit companies have reported that reskilling efforts lag behind implementation, leaving them confused and worried about their future. For her own team and the tens of thousands of members in her community, she aims to equip people with the know-how needed for a long, successful career. For members, a new program called Rewrite AI lets women who want to build the skills, confidence, and connections to get ahead in a fast-paced, AI-powered world learn directly from people in the industry.

For nonprofits deploying AI, Miller doubles down on the importance of education. “It’s really important to understand what AI is, because you also need to understand what it isn’t,” he says. “I’d ask nonprofit tech leads to roll up their sleeves, understand what they’re leveraging, and be transparent.”