Community Service · 2 Years

Service Beyond the Uniform

After the Army, I traded the theater of war for the theater of community. Two years of nonprofit work taught me that the most complex system any engineer will ever face is a human being in crisis.

The Transition from Military to Mission

When I left the Army, I had a choice: pursue the highest-paying role my security clearance and engineering background could land me, or go further into service. I chose service — not out of naïveté, but because I understood something that took many of my peers years to learn: the hardest problems in technology are not technical.

For two years, I served in a nonprofit organization working directly with communities underserved by mental health resources, economic opportunity, and educational infrastructure. This work was not peripheral to my academic development — it is my academic development.

Why Nonprofits Belong in an AI Portfolio

Every AI ethics framework — from Harvard's "Humanity Meets AI" symposium to Princeton's Center for Theology, Science, and Human Flourishing — asks the same question: who gets harmed by the systems we build? Two years of direct service work gives me an answer that most applicants can only theorize about. I have sat across from people whose lives were shaped — for better and worse — by algorithmic decision-making in housing, healthcare, and criminal justice.

Areas of Service

Year 1

Community Outreach & Program Coordination

Designed and coordinated outreach programs connecting community members to mental health services, legal aid, and educational resources. Managed volunteer cohorts and donor communications.

Year 2

Technology Access & Digital Literacy Initiative

Led a technology access initiative helping underserved community members gain digital literacy skills. This work directly informed my research interest in AI equity and algorithmic bias.

What I Learned About Technology Ethics

In the nonprofit sector, I witnessed firsthand how algorithmic systems — deployed without community input, designed without cultural context — can simultaneously claim to help and systematically harm. Benefits-eligibility algorithms that exclude applicants despite documented need. Predictive-policing tools that treat ZIP codes as risk factors. Credit-scoring models that penalize the unbanked.

This is not an abstract research interest for me. It is lived experience. And it is the foundation of my argument that AI practitioners must have genuine relationships with the communities their systems will affect — before, during, and after deployment.

Photo Gallery

Ten photographs from two years of nonprofit service.