HAZ Technologies, LLC — Fractional CIO Advisory

Pocket Guide on
Technology
for Nonprofits

A practical, plain-language primer on using technology to advance mission, protect trust, and sustain impact.

Micheal Cudgel
Managing Partner · HAZ Technologies, LLC

HAZ Technologies, LLC

For Nonprofit Founders
Executive Directors
Board Members
Technology Leaders

This guide exists to help nonprofit founders and nonprofit leaders think clearly about technology as mission infrastructure, not as an afterthought or a luxury. Technology decisions shape how nonprofits deliver services, protect people, steward resources, comply with regulations, and build trust with funders and communities.

Built for Nonprofit Leaders

For the Founder

To help you think early about technology choices that will either stabilize or strain your organization.

For the Nonprofit Leader & Board

To help you govern, oversee, and invest in technology responsibly, ethically, and sustainably.

For the Technology Lead

To provide a shared strategic language for communicating technology risk and investment to leadership.

How to Use This Guide
01

Learning Points

What you should understand after each chapter.

02

Concept

Plain-language explanation of the topic.

03

Case Study

A real-world scenario nonprofits commonly face.

04

Practical Takeaway

Actionable guidance you can actually use.

Chapter 01
The Technology Misconception
Why technology is not “extra,” optional, or just an IT issue

Learning Points

  • Technology is part of nonprofit operations, not a side function
  • Cheap or free tools are not always low-risk
  • Under-investing in technology often costs more later
  • Technology decisions are leadership decisions
  • The mission does not exempt an organization from technological responsibility

Concept

A common misconception is that nonprofits are too small, too mission-driven, or too resource-constrained to think seriously about technology. This belief often shows up in boardrooms, in staff meetings, and in budget conversations as:

  • “We’ll deal with tech later.”
  • “We can’t afford real systems.”
  • “Technology takes money away from mission.”
  • “We’re a small shop—this doesn’t apply to us.”

Each of these statements carries a hidden cost. In most cases, the cost of delay is higher than the cost of action.

In reality, technology is how the mission is delivered. Case management systems, donor platforms, email, cloud storage, payroll, scheduling, data reporting, and cybersecurity all directly affect whether an organization can serve its community, maintain funder relationships, and sustain operations.

Nonprofits are businesses. Technology either amplifies impact or amplifies dysfunction. There is rarely a neutral outcome.

The Cost of the Misconception

When technology is treated as optional or secondary, the consequences compound over time:

  • Operational fragility — Processes built on personal email accounts and spreadsheets collapse when staff transition or systems fail.
  • Compliance exposure — Nonprofits without proper data systems face risk during audits, grant reviews, and regulatory inquiries.
  • Missed opportunity — Technology-enabled nonprofits serve more people, respond faster, and demonstrate impact with data.
  • Donor and funder skepticism — Funders increasingly evaluate technology infrastructure as part of due diligence.

Technology is not a reward for organizational maturity. It is a condition of it.

Case Study

A growing nonprofit relied on personal email accounts and shared passwords to manage donor records, grant documentation, and client communications. When the executive director transitioned unexpectedly, the organization discovered it had no centralized records, no documented processes, and no way to recover access to critical accounts. A funder report was missed. A donor database was lost. A grant was placed under review.

The organization had not experienced a cyberattack. It had experienced the slow, quiet cost of technological neglect.

Practical Takeaway

  • Conduct an honest audit: What technology does your organization use? Who owns each system? What happens if a key staff member leaves tomorrow?
  • Reframe technology in budget conversations. It is not overhead. It is infrastructure.
  • Include a technology line item in every annual budget as a strategic allocation.
  • Bring technology into board discussions. If the board does not know what systems the organization depends on, that is a governance gap.

“If your nonprofit stopped using technology tomorrow, how much of your mission would still function?”

If technology is ignored, it does not disappear—it accumulates risk.

Chapter 02
Digital Infrastructure vs. Digital Tools
Understanding systems, platforms, and long-term architecture

Learning Points

  • Tools are not systems
  • Infrastructure decisions compound over time
  • Fragmentation increases risk and cost
  • Owning a tool is not the same as building capacity
  • The right question is not “what do we use?” but “what do we own and control?”

Concept

There is a difference between buying a hammer and building a house. Nonprofits often buy hammers—individual software tools that solve an immediate problem—without ever building the house. The result is many tools but no coherent system, many platforms but no architecture.

Digital infrastructure includes:

  • Identity and Access Management — Who can log in? To what? What happens when someone leaves?
  • Data Storage and Ownership — Where does your data live? Who owns it? Can you export it?
  • Backup and Recovery — If a system fails, how quickly can you restore operations? Have you tested it?
  • Security Controls — What protects your systems from unauthorized access or data loss?
  • Integration Between Systems — Do your tools communicate, or do staff manually transfer data?

Tools come and go. Infrastructure stays.

The Fragmentation Problem

Nonprofits that adopt tools reactively often end up with fragmentation, which creates:

  • Duplicate data — The same information lives in multiple places and cannot be trusted.
  • Manual workarounds — Staff copy data between systems, increasing error and reducing capacity.
  • Inconsistent reporting — Producing accurate reports becomes a project rather than a function.
  • Security gaps — Every unmanaged platform is a potential vulnerability.
  • Vendor dependency — Organizations locked into contracts, unable to export their data.

Building With Intention

Infrastructure planning requires clarity about what the organization needs and discipline in selecting tools that align with that need. Start with:

  • What are our core operational functions?
  • What technology does each function depend on?
  • Who is responsible for each system?
  • What is our plan if a system fails or a vendor closes?

Case Study

A regional nonprofit had accumulated seventeen separate software subscriptions over ten years. Reports required pulling data from six platforms. When the organization applied for a federal grant requiring audited outcome data, it took three staff members six weeks to compile what should have been a one-day report. The grant was submitted late. The organization did not receive the award.

The problem was not the tools. It was the absence of infrastructure thinking behind them.

Practical Takeaway

  • Create a Technology Inventory: list every platform, subscription, and tool with owner, data, and annual cost.
  • Identify core systems: email, file storage, financial management, donor or client relationship management.
  • Establish a Technology Review Cycle: annually review your inventory, eliminate redundancies, and plan intentionally.
  • Before any new tool, ask: Does this integrate with what we have? Who will own it? What happens to our data if we stop using it?
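The Technology Inventory above can start as a simple structured list. The snippet below is an illustrative sketch only (the fields, sample tools, and costs are hypothetical, not a prescribed schema): it totals annual spend and flags systems with no assigned owner or no data export path, the two gaps this chapter warns about.

```python
# Illustrative technology inventory review (hypothetical data and fields).
# Each entry records a tool, its internal owner, annual cost (USD), and
# whether the organization can export its data if the tool is retired.
inventory = [
    {"tool": "Email & docs suite", "owner": "Operations Mgr", "annual_cost": 1800, "data_exportable": True},
    {"tool": "Donor CRM",          "owner": None,             "annual_cost": 3600, "data_exportable": True},
    {"tool": "Survey tool",        "owner": None,             "annual_cost": 420,  "data_exportable": False},
]

def review(inventory):
    """Return total spend plus tools needing attention (no owner or no export path)."""
    total = sum(item["annual_cost"] for item in inventory)
    unowned = [i["tool"] for i in inventory if not i["owner"]]
    locked_in = [i["tool"] for i in inventory if not i["data_exportable"]]
    return {"total_annual_cost": total, "no_owner": unowned, "no_export_path": locked_in}

report = review(inventory)
print(report)
# → {'total_annual_cost': 5820, 'no_owner': ['Donor CRM', 'Survey tool'],
#    'no_export_path': ['Survey tool']}
```

Even kept in a spreadsheet rather than code, the same three columns (owner, cost, export path) make the annual review cycle concrete.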

Chapter 03
Start With the Mission, Not the Software
Technology decisions before you buy anything

Learning Points

  • Technology should follow strategy, not lead it
  • Vendors do not define your needs—you do
  • “Best in class” may be wrong for your organization
  • A technology purchase without a purpose statement is a liability
  • The most expensive technology decision is the wrong one

Concept

Nonprofits often ask: “What software should we buy?” The better question is: “What problem are we trying to solve?”

Mission Before Software

Every technology decision should begin with four strategic questions:

  • What outcome are we trying to achieve?
  • What does our staff actually need to do their work?
  • What are our compliance requirements?
  • What is our realistic capacity to implement and maintain this system?

The Vendor Relationship

Vendors are partners, not authorities. Know what you need before you learn what they offer. When evaluating options, ask:

  • Can we see a reference from a nonprofit of our size and type?
  • What does implementation actually look like, and who is responsible for it?
  • What happens to our data if we end the contract?
  • What does the true total cost of ownership look like over three years?

The Technology Purpose Statement

Before purchasing any system, write a one-page Technology Purpose Statement answering:

  • What specific problem does this solve?
  • Who will use it, and how often?
  • How will we measure whether it is working?
  • What is our plan if it does not work?

If you cannot answer these questions, you are not ready to buy. That is not a failure—it is discipline.

Case Study

A nonprofit workforce development organization received a technology grant and adopted a nationally recognized case management platform designed for organizations ten times its size. The implementation required eighteen months, consumed the entire grant, and was never fully adopted by staff. Two years later, the organization abandoned the platform and returned to spreadsheets.

The tool was not the problem. The mismatch between the tool and the organization’s needs, capacity, and context was.

Practical Takeaway

  • Before any technology purchase, complete a Technology Purpose Statement: problem, users, success measure, contingency plan.
  • Involve staff in the selection process.
  • Request references from similar organizations before committing to any significant platform.
  • Negotiate contract terms carefully. Ensure you retain ownership of your data and have a clear exit path.
  • Budget for implementation, not just licensing.

“The most expensive technology decision is not the one that costs the most. It is the one made without clarity.”

Chapter 04
Data, Privacy, and Trust
How nonprofits are regulated as data stewards—and why ethics go further than compliance

Learning Points

  • Nonprofits are data custodians, not just data collectors
  • Privacy is an ethical obligation, not just a legal one
  • Trust is easier to lose than rebuild
  • Data governance is a leadership responsibility, not an IT function
  • State and federal regulations apply to nonprofit data handling—whether or not you know their names

Concept

Every time a client fills out an intake form, a donor submits payment information, or a program participant discloses a personal circumstance, your organization becomes a custodian of someone’s story. Nonprofits routinely collect and store:

  • Personally Identifiable Information (PII) — names, addresses, Social Security numbers, dates of birth
  • Protected Health Information (PHI) — medical diagnoses, treatment history, behavioral health records
  • Financial Data — donor payment details, bank account information, grant disbursement records
  • Educational Records — student progress, enrollment data, family information (governed by FERPA)
  • Information About Vulnerable Populations — children, survivors of abuse, individuals in the justice system

The Regulatory Landscape

  • HIPAA — Applies to organizations handling protected health information.
  • FERPA — Governs student education records. Nonprofits running afterschool or tutoring programs may be covered.
  • COPPA — Applies to organizations collecting data from children under 13 online.
  • State Privacy Laws — California (CCPA), Virginia, Colorado, and Ohio all have active or developing frameworks.
  • Grant and Funder Requirements — Many funders impose their own data handling requirements as a condition of award.

Privacy Is Ethical, Not Just Legal

Compliance is the floor—not the ceiling. Nonprofits are morally accountable to the people they serve. Many clients engage with nonprofits because they have nowhere else to turn. They share information out of need, not preference. Breaching that trust causes real harm—discouraging people from seeking help and damaging community confidence.

What Data Governance Actually Looks Like

Data governance begins with four questions:

  • What data do we collect? Create an inventory.
  • Why do we collect it? If you cannot answer this, reconsider collecting it.
  • Who has access? Access should be role-based and regularly reviewed.
  • How do we protect and dispose of it? Retention schedules and secure deletion matter.

Data you do not need is data you should not keep. Every record retained beyond its purpose is a liability.
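The retention principle above can be made operational with even a very small check. The sketch below is a minimal illustration under stated assumptions (the record categories, dates, and retention periods are hypothetical, not legal guidance): it flags records held longer than their scheduled retention period.

```python
from datetime import date, timedelta

# Hypothetical retention schedule: how long each record category is kept.
RETENTION_DAYS = {"intake_form": 365 * 3, "donor_payment": 365 * 7, "event_signup": 365}

def past_retention(records, today):
    """Return ids of records held longer than their category's retention period."""
    flagged = []
    for rec in records:
        keep_until = rec["collected"] + timedelta(days=RETENTION_DAYS[rec["category"]])
        if today > keep_until:
            flagged.append(rec["id"])
    return flagged

records = [
    {"id": "R1", "category": "event_signup",  "collected": date(2020, 5, 1)},
    {"id": "R2", "category": "donor_payment", "collected": date(2022, 5, 1)},
]
print(past_retention(records, today=date(2024, 6, 1)))  # → ['R1']
```

The actual retention periods belong in your documented policy, reviewed with counsel; the code only enforces whatever the policy says.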

Case Study

A nonprofit providing legal services to immigrant communities maintained client files in a shared, unprotected cloud folder. When a staff account was compromised, attackers accessed names, immigration status, case details, and attorney notes for hundreds of clients. The breach created fear among the population the organization existed to protect. Several clients disengaged from legal services entirely.

The data was never the problem. The absence of governance around it was.

Practical Takeaway

  • Conduct a Data Inventory — Know what data you collect, where it lives, who can access it, and how long you keep it.
  • Identify Applicable Regulations — Work with a qualified attorney to identify federal, state, and funder requirements.
  • Develop a Privacy Policy — Document how data is collected, used, protected, and disposed of.
  • Establish a Breach Response Plan — Know what you will do before something goes wrong.
  • Train Your Staff — Most breaches begin with human behavior.
  • Review Vendor Agreements — Every platform touching your data needs a data processing agreement.
  • Brief Your Board — Data privacy is a governance issue.

“If you collect data you cannot protect, you are carrying a risk you may not see—until someone else does.”

Trust is not lost through hacks alone. It is lost through neglect.

Chapter 05
Cybersecurity and Risk
Why nonprofits are targets and how to reduce exposure

Learning Points

  • Nonprofits are frequent and intentional targets for cybercrime
  • Cybersecurity risk is about impact, not just likelihood
  • Most nonprofit breaches start with human behavior, not hackers
  • Cybersecurity is a leadership and governance issue—not just an IT task
  • Preparation matters more than perfection

Concept

Nonprofits are among the most attractive targets for cybercriminals because they combine three conditions attackers look for: high-value data, limited security resources, and mission pressure that encourages speed over caution. Cyberattacks on nonprofits are rarely sophisticated—they are usually opportunistic.

Common nonprofit cybersecurity incidents include:

  • Phishing emails that trick staff into sharing credentials
  • Ransomware that encrypts files and halts operations
  • Account takeovers caused by reused or weak passwords
  • Accidental data exposure through misconfigured cloud storage
  • Lost or stolen laptops and mobile devices without encryption

The Human Factor

Most nonprofit breaches begin with a rushed staff member clicking a convincing email, a volunteer using a personal device, shared passwords, or a well-meaning employee trying to move faster. Technology alone cannot fix this. Cybersecurity is as much about culture as it is about controls.

What Cybersecurity Readiness Actually Looks Like

  • Strong Identity Controls — Unique user accounts, strong passwords, and multi-factor authentication (MFA)
  • Device Protection — Encrypted laptops and mobile devices with remote wipe capability
  • Email Security — Spam filtering, phishing detection, and regular user awareness training
  • Backups — Regular, tested backups stored separately from the primary network
  • Access Management — Role-based controls with immediate removal upon departure
  • Incident Response Plan — Documented plan with assigned responsibilities and notification timelines
  • Vendor Security Review — Baseline review of any vendor with access to organizational data

Case Study

A community-based nonprofit received an email appearing to come from its executive director requesting urgent document review. A staff member clicked the link and entered login credentials. Within hours, attackers accessed the email system, created forwarding rules, and sent fraudulent wire transfer requests to finance. Funds were transferred. Sensitive records were accessed.

The nonprofit had no multi-factor authentication, no phishing training, and no incident response plan. The breach involved a convincing email, a moment of urgency, and a complete absence of preparation. The cost of prevention would have been a fraction of the cost of recovery.

Practical Takeaway — Every nonprofit should be able to answer:

  • If we were attacked tomorrow, who would be responsible for responding?
  • Do we require multi-factor authentication for email and core systems?
  • Are our devices encrypted and recoverable if lost or stolen?
  • Do we have backups, and have we tested restoring from them?
  • Have staff received basic cybersecurity awareness training in the last twelve months?
  • Do we have a cyber liability insurance policy that reflects our current risk?
  • Does our board receive an annual cybersecurity risk briefing from leadership?

“The goal of cybersecurity is not to avoid every incident. It is to ensure that one incident does not end the mission.”

Cybersecurity is not paranoia. It is preparedness.

Chapter 06
Operations, Automation, and Teams
Using technology to support people—not replace them

Learning Points

  • Technology should reduce friction, not create it
  • Automation must be designed with people in mind
  • Training is not optional—it is part of implementation
  • The right technology, poorly implemented, still fails
  • Staff adoption is the true measure of a successful technology deployment

Concept

Technology is only as valuable as the people who use it. A system that staff avoid, misuse, or work around is not a solution—it is a new problem.

The Implementation Gap

Effective implementation includes:

  • Staff involvement in selection — People who use a system daily should have input in choosing it.
  • Dedicated training — Structured, role-specific training that prepares staff for their actual work.
  • A transition period — Allow parallel operation before decommissioning legacy processes.
  • A feedback mechanism — A way for staff to report problems and suggest improvements.
  • An identified system owner — Every platform needs a designated internal owner.

Automation: Opportunity and Responsibility

Automation done well removes repetitive tasks, reduces human error, improves consistency, and enables scale. Done poorly, it produces impersonal communications, lets errors go unmonitored, and erodes the relational quality that defines nonprofit service delivery.

The governing principle: automate the process, not the relationship. Use technology to handle administrative burden. Preserve human judgment and human connection for the work that requires it.

Case Study

A nonprofit providing housing navigation services implemented a new case management platform. Leadership provided a two-hour training and set a go-live date. Six months later, case managers maintained parallel records in both systems. Data was incomplete. Reports were unreliable.

There had been no role-specific training, no transition period, no internal champion, and no feedback mechanism. A targeted re-implementation effort—including individual coaching sessions and a designated system administrator—resolved the challenges within ninety days.

Practical Takeaway

  • Before implementing any new system, develop an Implementation Plan: timeline, training, system owner, feedback, and success metrics.
  • Involve staff in selection and configuration.
  • Distinguish between processes appropriate for automation and those requiring human judgment.
  • Identify a Technology Point of Contact within the organization.
  • Evaluate technology at 30, 90, and 180 days post-implementation.

“If staff are avoiding a system, the system—not the staff—is the problem.”

Chapter 07
Governing Technology Well
The board’s role in oversight, risk, and accountability

Learning Points

  • Boards oversee risk—including technology risk
  • Technology is a governance issue, not just an operational one
  • Accountability in technology protects the mission
  • The board does not need to choose tools—it needs to ask the right questions
  • Ungoverned technology is an organizational liability

Concept

Many nonprofit boards treat technology as a staff matter—managed below the board level and reported on only when something goes wrong. This posture leaves organizations exposed. Technology decisions carry significant financial, legal, reputational, and operational risk. Boards that are not engaged in technology governance are boards that are not fully governing.

What Boards Should Know and Ask Annually

  • What are our most significant technology risks, and how are we managing them?
  • Do we have a cybersecurity incident response plan, and has it been tested?
  • Is our data protected in accordance with applicable law and our own policies?
  • What technology investments are planned in the next fiscal year, and how do they align with mission?
  • Do we have cyber liability insurance, and is the coverage appropriate?
  • Has our technology infrastructure been reviewed by an independent expert in the last two years?

Technology Policy as Governance Infrastructure

Every nonprofit should have, at minimum:

  • Acceptable Use Policy — Defines how staff, volunteers, and board may use organizational technology
  • Data Privacy Policy — Documents how data is collected, used, stored, and disposed of
  • Cybersecurity Policy — Establishes baseline security requirements for all users and systems
  • AI Use Policy — Governs the use of artificial intelligence tools by staff and leadership
  • Incident Response Plan — Documents steps to be taken in the event of a breach or failure
  • Vendor Management Policy — Establishes expectations for reviewing and monitoring technology vendors

Case Study

A regional nonprofit experienced a ransomware attack that encrypted its entire file system—including client records, financial documents, and grant reports. The organization had no usable backups. Recovery took four months and cost more than the annual technology budget. The board had never discussed cybersecurity, had no technology policy, and had received no technology risk reporting from leadership.

A single annual board conversation about technology risk would not have prevented the attack—but it might have prompted the backup policy that would have made recovery possible in four days instead of four months.

Practical Takeaway

  • Add technology risk to the board’s annual risk assessment.
  • Request an annual Technology Health Report from leadership.
  • Ensure the organization has cyber liability insurance and that the board has reviewed the coverage.
  • Adopt a board-level Technology Policy Checklist and review it annually.
  • If the board lacks technology literacy, consider adding an advisor with relevant expertise.

“Ignoring technology is still a decision—just not a governed one.”

Chapter 08
Artificial Intelligence and Emerging Technology
What AI means for nonprofits—and how to lead it, not just allow it

Learning Points

  • AI is not a trend—it is an operational reality nonprofits must engage now
  • Emerging technology carries mission opportunity and mission risk
  • Boards and leaders must govern AI, not just allow it
  • Ethical use of AI in the nonprofit space requires intentionality and policy
  • AI does not replace mission—but ungoverned AI can quietly compromise it

Concept

Artificial intelligence has entered the nonprofit sector whether organizations planned for it or not. The question is no longer whether AI is present. The question is whether you are governing it.

AI creates real opportunity for nonprofits:

  • Drafting grant narratives and donor communications faster
  • Analyzing program data to surface outcomes and trends
  • Automating repetitive administrative tasks
  • Translating materials for multilingual communities
  • Providing 24/7 service touchpoints through AI-assisted communication tools
  • Generating reports and impact summaries that once required dedicated analyst capacity

But opportunity without governance is exposure.

The Risks That Matter Most for Nonprofits

  • Data privacy violations — Staff entering client or donor data into public AI tools without understanding where that data goes
  • Bias and inaccuracy — AI-generated content reflecting biases embedded in training data
  • Relational erosion — Automation displacing the human connection central to nonprofit service delivery
  • Embedded vendor AI — AI features introduced into existing platforms the organization never reviewed
  • Over-reliance — Treating AI outputs as authoritative without human review in high-stakes decisions

The Ethical Dimension

Nonprofits operate in a space of public trust. The ethical use of AI is a practical commitment to:

  • Transparency with clients about when and how AI is being used
  • Human oversight of AI-assisted decisions, particularly those affecting eligibility or access
  • Ongoing review of AI outputs for accuracy, bias, and appropriateness
  • Staff training that includes not just how to use AI tools, but when not to

Case Study

A mid-size social services nonprofit began using an AI writing tool to accelerate grant reporting. Staff adopted it broadly. Within months, leadership discovered that staff had been entering client case notes—including names, diagnoses, and housing status—into the public tool. No data processing agreement existed. No HIPAA review had been conducted. No client consent had been obtained. The organization faced regulatory exposure, a funder audit, and a trust crisis.

The tool was not the problem. The absence of governance was.

Practical Takeaway

  • Establish an AI Use Policy — What tools are approved? What data may never be entered into external AI systems?
  • Create a Data Classification Standard — Public, internal, sensitive, restricted. Apply before any AI tool is used.
  • Conduct Staff Training — What AI can and cannot do, and where human judgment must remain primary.
  • Brief the Board — Confirm that AI policy and oversight exist.
  • Review Vendor Agreements — Understand what AI features exist in current platforms and what happens to your data.
  • Establish a Human Oversight Protocol — Define when AI outputs must be reviewed by a qualified human before use.
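A Data Classification Standard can be partially enforced in tooling. The sketch below is a deliberately simple illustration (the regex patterns are hypothetical and incomplete; a real control requires a proper data loss prevention review) of screening text for obvious PII before staff paste it into an external AI tool.

```python
import re

# Hypothetical, intentionally incomplete patterns for obvious PII.
PII_PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def screen_for_pii(text):
    """Return the PII categories detected in text; an empty list means none found."""
    return [name for name, pattern in PII_PATTERNS.items() if pattern.search(text)]

print(screen_for_pii("Client J.D., reachable at jd@example.org, SSN 123-45-6789"))
# → ['ssn', 'email']
```

A screen like this catches the obvious cases; it does not replace the policy, the training, or the human judgment the chapter calls for.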

“Artificial intelligence is not the future of nonprofits—it is the present. The only question is whether you are leading it or following it.”

Appendix A
Incident Response One-Pager
A practical, printable checklist for when something goes wrong

No organization wants to experience a cybersecurity incident. But every organization should be prepared for one. The following checklist is designed to be adapted, printed, and kept accessible by leadership and IT contacts.

Phase 1: Identify and Contain — First 1–4 Hours

  • Confirm the incident — Real breach, ransomware, account compromise, or phishing attempt?
  • Identify what is affected — Which systems, accounts, or data are involved?
  • Isolate affected systems — Disconnect compromised devices. Do not turn them off—preserve evidence.
  • Revoke compromised credentials — Change passwords and disable unauthorized accounts.
  • Preserve evidence — Document what you know, when you learned it, and what actions were taken.
  • Activate your incident response team — Notify the designated lead, executive director, and IT contact.

Phase 2: Assess and Notify — First 24–72 Hours

  • Determine the scope — What categories of data were exposed? How many individuals were affected?
  • Engage legal counsel — Do not make public statements or send notifications without legal guidance.
  • Notify your cyber liability insurance carrier — Most policies require prompt notification.
  • Assess mandatory notification requirements — State attorneys general, HHS (for HIPAA), and other regulators may require notification.
  • Notify funders if required — Review grant agreements for breach notification requirements.
  • Communicate internally — Staff need to know what happened and who to contact.

Phase 3: Remediate and Recover — Days 3–30

  • Restore systems from clean backups — Only restore from backups verified as unaffected.
  • Patch and harden systems — Address the vulnerability that enabled the incident.
  • Implement or reinforce security controls — MFA, access reviews, updated passwords.
  • Conduct staff training — If human error was involved, training is part of the remediation.
  • Document the full incident timeline — From first awareness to full recovery.

Phase 4: Review and Strengthen — 30–90 Days Post-Incident

  • Conduct a post-incident review — What happened? Why? What could have prevented it?
  • Update the incident response plan — Incorporate lessons learned.
  • Brief the board — Clear, factual summary and steps taken to prevent recurrence.
  • Review insurance coverage — Does the current policy reflect the organization’s risk profile?
  • Consider an independent security assessment — Identify vulnerabilities before the next incident.

Key Contacts — Complete Before an Incident

Role | Name | Contact
Executive Director | __________ | __________
IT Contact / MSP | __________ | __________
Legal Counsel | __________ | __________
Cyber Insurance Carrier | __________ | __________
Board Chair | __________ | __________
Key Funder Contacts | __________ | __________

“The organization that prepares for an incident before it happens recovers faster, communicates better, and sustains less lasting damage than the one that does not.”

Appendix B
Technology Governance Quick Reference
For Boards and Leadership Teams

Annual Board Technology Review: Questions to Ask

On Risk

  • What are our three most significant technology risks this year?
  • Have we experienced any incidents, near-misses, or policy violations in the past twelve months?
  • Is our cyber liability insurance current and appropriate for our risk profile?

On Data and Privacy

  • Do we know what data we collect, where it lives, and who can access it?
  • Are we in compliance with applicable data privacy laws and funder requirements?
  • Have we reviewed our vendor data agreements in the past year?

On Cybersecurity

  • Do we have an incident response plan, and has it been tested?
  • Do all staff use multi-factor authentication for organizational systems?
  • Have staff received cybersecurity awareness training in the past twelve months?

On AI and Emerging Technology

  • Are staff using AI tools? Do we have a policy governing their use?
  • Has our AI use policy been reviewed in the past year?
  • Do we understand how our vendors are using AI within their platforms?

On Governance

  • Do we have current acceptable use, data privacy, and cybersecurity policies?
  • Is there a designated technology lead or point of contact within the organization?
  • When did we last conduct an independent technology assessment?

Technology Policy Checklist

Policy | In Place | Last Reviewed | Owner
Acceptable Use Policy | ____ | __________ | __________
Data Privacy Policy | ____ | __________ | __________
Cybersecurity Policy | ____ | __________ | __________
AI Use Policy | ____ | __________ | __________
Incident Response Plan | ____ | __________ | __________
Vendor Management Policy | ____ | __________ | __________
Device and Remote Work Policy | ____ | __________ | __________
Data Retention and Disposal Policy | ____ | __________ | __________

Technology Health Indicators

Leadership should be able to report on the following annually:

  • All staff accounts use multi-factor authentication
  • All organizational devices are encrypted
  • Backups are performed regularly and restoration has been tested
  • Access is reviewed and updated quarterly
  • No staff share passwords or use personal accounts for organizational work
  • All vendors with data access have signed data processing agreements
  • Cyber liability insurance is current
  • An incident response plan exists and has been reviewed in the past year

“Governance is not a response to failure. It is the condition that makes failure survivable.”

Micheal Cudgel

Micheal Cudgel is a fractional Chief Information Officer and technology strategist with nearly two decades of experience helping nonprofits, municipalities, and mission-driven organizations build technology infrastructure that serves people—not just operations.

As Managing Partner of HAZ Technologies, LLC, a fractional CIO firm operating across Central Ohio, Micheal works alongside executive directors, board members, and community leaders to translate complex technology decisions into plain-language, practical strategies and sustainable infrastructure. His clients range from grassroots nonprofits with volunteer-only staff to established organizations managing multi-million-dollar budgets and federal grants.

Micheal also serves as fractional Director of Technology for ROOTT (Restoring Our Own Through Transformation), fractional Director of Technology for The Center for Healthy Families, and fractional Chief Information Officer for Lead the Way Learning Academy.

He is the co-founder of Columbus Executive, a leadership development and advisory platform, and the architect of VeritySignal, a cybersecurity intelligence platform designed to bring transparency and accountability to organizational security reporting.

Micheal is a first-year master’s student in Theology and Divinity, exploring the intersection of faith, leadership, and community flourishing—themes that inform his approach to technology as a tool for justice rather than efficiency. He is based in Columbus, Ohio.

Disclaimer

This guide is for educational purposes only. It does not constitute legal, technical, or cybersecurity advice. When making decisions with significant risk or regulatory impact, consult qualified professionals including legal counsel, licensed cybersecurity professionals, and certified compliance advisors.

Technology does not replace mission. It reveals how seriously we take it.