Most organizations believe they already know who is responsible when AI is used: the person who used the tool. But that answer assumes something that often isn't true — that the authority underneath that responsibility is clearly defined in the first place.
In practice, many nonprofits operate with informal decision structures. Over time, authority settles into roles, trusted individuals, compressed processes, and software systems. The org chart stays the same, but the real decision rights quietly move somewhere else.
This episode explores four patterns of authority drift that exist in most organizations long before AI arrives: position drift, trust drift, process drift, and tool drift. AI does not introduce these patterns — it accelerates them by removing the friction that once made them visible.
The governance challenge, then, is not simply writing AI policies. It is making operational decision rights visible before AI embeds those informal structures into systems operating at scale.
If you want to see the full video, you can watch it here:
YouTube video: https://youtu.be/rpjqYXbm218
Other relevant links:
Substack: https://brightnonprofit.substack.com/
Website: https://brightnonprofit.org