Fear, Fundraising, and Artificial Intelligence
by Ben Williams | May 2023
"Our greatest concern is the possibility that leaders will think smart tech is a technical problem to be solved by the IT department rather than a profoundly important leadership issue."
Beth Kanter & Allison Fine, The Smart Nonprofit: Staying Human-Centered in an Automated World
Nonprofit leaders and fundraisers: What are we afraid of?
For the last few years, I’ve liked to proclaim – to anyone who will listen, or rather is generous enough to indulge me – “The days of the technically literate nonprofit leader and fundraiser are here!” Maybe. But whether those days have actually arrived matters less than the fact that they are possible. No one wants to look a fool. Unfortunately, if you don’t risk it, you can’t learn. And if you can’t learn, you can’t lead. In the movie Wall Street, Michael Douglas’s now-infamous character Gordon Gekko says: “In my book you either do it right or you get eliminated.” Of course, I don’t approve of the character or his villainy, but I do sometimes imagine this quote about underperforming corporations applied to nonprofit leadership and fundraising. In other words: do it right or go away.
I understand that reads as a bit harsh, although the leaders and fundraisers I know don’t mind an occasional kick in the pants. We’re human, after all, and even without incorporating new tech there is already a lot of fear in the nonprofit sector, especially around fundraising. In many ways, this is perfectly natural. An organization’s mission positively impacts people’s lives; without revenue, the mission cannot be accomplished and those people cannot be served. And so, in any organization, there is usually a significant stress point at the intersection of every other department with – you guessed it – the fundraising team.
So there’s already fear. And revenue goals. And limited resources. And now I want you to add artificial intelligence into the mix? What, should I also drop you into a shark tank and make you simultaneously redo your taxes? Who needs this?
"Very refin'd reflections have little or no influence upon us; and yet we do not, and cannot establish it for a rule, that they ought not to have any influence; which implies a manifest contradiction."
David Hume, A Treatise of Human Nature
The short answer to who needs A.I.: everyone. Who is ready? Not everyone. Ironically, for those nonprofits that are not ready, some form of A.I. is probably already available in the tools they have – they just don’t use it, don’t know how to use it, or don’t know they’re already using it. For platform examples, see the section “Summary of How Platforms Are Using Artificial Intelligence” on page 17 of Beth Kanter and Allison Fine’s downloadable report “AI4Giving: Unlocking Generosity with Artificial Intelligence: The Future of Giving”. Those nonprofits that are ready have already made a deep investment in developing a culture of philanthropy. This means they know that if you ignore or misuse the tools available to connect your communities, you are ignoring or misusing opportunities to do things right for the people you work with and the people you serve. I mean ethically right and morally right, as well as doing those things well.
Earlier this month, as I was working on this article, I decided to reach out to my colleague Robert D. Thompson, Jr., Chief Philanthropy Counsel at Strategies for Giving LLC, for an interview. Robert is not only a seasoned fundraising veteran with over 35 years in the field but also, lucky for me, a good friend, and so he agreed to connect. We of course touched on smart tech in our conversation, but what he had to share about his fears as a nonprofit leader and fundraiser stood out to me most, especially as he believes he holds these fears in common with many if not all veterans like himself. He talked about saying yes to unrealistic revenue goals too soon, only to realize after initial assessments that the goals are, for the most part, simply out of range in the short term. I certainly related to his point about the dangers of saying yes too readily, and I was interested in where he thought the mismatch between capacity and goal might come from in the first place. “I think unrealistic expectations [re: fundraising goals] are rooted in this mindset of management that all fundraisers, if they're worth their salt, have to be rainmakers. And to me that's not the role of fundraisers. It's also an unrealistic expectation because fundraising is and has always been and always will be both art (relationship-building) and science (process).”
With that prompt from Mr. Thompson, let me clarify why all nonprofits need A.I. by looking at where the process of fundraising and the process of artificial intelligence overlap. According to boodleAI’s white paper “Artificial Intelligence 101 for Nonprofits”, here are four tasks machine learning can perform, among others, to feed A.I. models: Classification, Prediction, Clustering, and Identification of Key Features. Using boodleAI’s examples, I’ve put a question under each task indicating the kind of answer we want the executed task to deliver as it relates to a fundraising event:
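For the technically curious, here is a minimal sketch of what those four machine learning tasks might look like in code. This is not boodleAI’s actual methodology – the donor fields, labels, and thresholds below are illustrative assumptions on a synthetic dataset, using the open-source scikit-learn library:

```python
# Illustrative only: the four ML tasks named above, run on synthetic
# "donor" data. Field names and the labeling rule are made up for the sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Toy data: each row is a contact; columns are
# [past_gifts, avg_gift_usd (scaled), events_attended]
X = rng.integers(0, 10, size=(200, 3)).astype(float)
# Synthetic label: did the contact give after the last event?
# (a simple rule plus 10% noise, purely for demonstration)
y = ((X[:, 0] + X[:, 2] > 9) ^ (rng.random(200) < 0.1)).astype(int)

# 1. Classification – "Which contacts are likely to give if invited?"
clf = LogisticRegression(max_iter=1000).fit(X, y)
likely_donors = clf.predict(X)

# 2. Prediction – "How likely is each individual contact to give?"
give_prob = clf.predict_proba(X)[:, 1]

# 3. Clustering – "What natural segments exist among our contacts?"
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# 4. Identification of Key Features – "Which attributes matter most?"
forest = RandomForestClassifier(random_state=0).fit(X, y)
importances = forest.feature_importances_  # sums to 1 across the 3 columns
```

The point of the sketch is not the particular models but the shape of the work: the same donor records, asked four different kinds of questions, yield four different kinds of answers a fundraising team can act on.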
Now let me put this to you, as a nonprofit leader and fundraiser: When was the last time you did not ask these questions in preparation for an event or a campaign, or while building a pipeline, portfolio, or donor relationship? And assuming you did ask these questions, did you also end up asking them again for the next event, campaign, or portfolio? And again? And again? And then, before systematizing the answers, did you move on to another organization and leave it to the next person in charge to ask the same questions?
This is where the ethics and morality come into the picture. Ethically, it is not right to ask these questions, grind yourself and your team to bits coming up with answers, and then fail to capture those answers effectively in your files, only to ask the same questions again or leave them for the next team to struggle with. That wastes time, resources, and the future of the organization, and so directly and negatively affects the mission. Morally, it is not right to ask your team to find these answers again and again without offering complete buy-in on systematizing their capture. That way madness (and burnout) lies – and not only for your people, but for the people who may very well take your place in the future.
But here's the biggest rub. The operations above technically refer to machine learning tasks, not A.I. strictly speaking. Intelligence manifests when there is adaptation to new surroundings, challenges, or information; the above, then, is a precursor to intelligence, whether human or artificial. And so the reason I rail that all nonprofits need A.I. is that all nonprofits need to use the principles of learning and analysis A.I. requires, whether they end up buying some astronomically priced A.I. enterprise software or not! Our engagement with these technical triumphs is our chance to learn how we can learn, how to use what we already have, and how to stop saying we have an “IT Guy” or a “Data Guru” or a “Specialist”. Baloney. This is a leadership issue. If engaging with artificial intelligence and its principles can show us how to bolster our own intelligence regarding our donors, our colleagues, our mission, and the people we serve, then it is incumbent upon you, the leader, to be or become the tech expert – or at least the expert tech cheerleader – on your team of one or team of one thousand.
"So, in essence, a neutrino is a particle that takes no time to get no place."
My Astronomy 101 Professor
It was over 20 years ago that my astronomy professor laid the above gem on me. He declared it with obvious insight and good humor, offering a pithy and memorable way to describe the mind-boggling experience of measuring neutrinos up until that time. And I still don’t know what the hell he was talking about. Yet it wasn’t his fault I finally withdrew from the class. My lack of understanding was at a level hitherto unknown to me, and I therefore had to get out of there as soon as possible. And I regret leaving. I let my fear get the better of me, and who knows? If I’d stuck with it, maybe I would’ve finally learned something more about the cosmos instead of haranguing my nonprofit colleagues here on terra firma.
That said, I’ll leave the conclusion to ChatGPT:
“In conclusion, an organizational culture of philanthropy is a mindset that prioritizes philanthropy as a means of achieving a nonprofit's mission. AI can play a role in supporting and strengthening a culture of philanthropy by providing valuable insights and automating certain tasks, but it is ultimately up to the organization's leadership and staff to foster a culture of philanthropy through their actions and attitudes. By embracing a culture of philanthropy and leveraging AI to support their efforts, nonprofits can build stronger relationships with donors and supporters, achieve greater impact, and ensure their long-term sustainability. And just like the lovable Muppets, nonprofits can use AI to enhance their performance and bring their mission to life in innovative and exciting ways.”
ChatGPT, excerpt from its answer to my question “What is an organizational culture of philanthropy, how does it relate to artificial intelligence, and could you please include the word ‘Muppets’?”