I’m a
Lead Designer
with 12 years of experience, who loves
the messy middle, the space where a vague idea needs to become a tangible, scalable reality.
I bridge the gap between product strategy, brand vision, systems thinking and high-fidelity craft. If you’re looking for a partner to
help scale a design culture, go from 0 to 1,
make the complex simple, or
turn lemons into lemonade,
let’s talk. Outside of design, I also make films and paintings.


Design Process
CV
Email
LinkedIn



Project | Client
1. Post Composer | Nextdoor
2. Events | Nextdoor
3. Content Moderation System | Nextdoor
4. 3 Year Product Vision | Nextdoor
5. Financial Dashboard | Flychain
6. Brand Refresh & Website Redesign | Flychain
7. Store Experience and Design | Trove
8. Cheddar Counter | Trove

Content Moderation System



Team


Head of Product
Lead Designer 🙋🏻‍♀️
Engineering Manager
4 Engineers
Data Scientist

This project mapped Nextdoor’s entire content moderation system to identify key gaps and opportunities, leading to a reduction in harmful content, increased operational efficiency, and greater trust and retention.







Project Context

Nextdoor’s content moderation system involved four different actors balancing local community nuance with centralized safety oversight. Because the system had been built incrementally as the technical architecture evolved, how the actors worked in concert with one another was unclear. Designing from a user-centric point of view, we were guided by the principles of the “Procedural Justice Framework,” which designs for fairness, transparency, and dignity throughout the user journey to build trust and legitimacy in the decision-making process, rather than focusing only on the final outcome.








Key Design Decisions




Streamlined, Context-Aware Reporting


We designed a streamlined logic tree tailored to the context of the content (e.g. DMs vs. Business Pages). With additional contextual data, Leadbot and Community Moderators can now make decisions more accurately and efficiently.





Transparency in Report Status


The outcome of the report (Hidden, Kept, or Account Suspended) is communicated back to the Reporter and Post Creator. Educating users on how and why the decision was made increases the likelihood they will accept the outcome and trust the platform.






Structured Appeals Process


If a creator feels a community vote was biased, they can appeal to NOPS for a final, neutral review, giving them a path to being heard. Ensuring the system has a formal recourse path was also a regulatory requirement in the EU.





Proactive Intervention: "Kindness Tips" for Repeat Violators


Data showed that a tiny fraction of users caused the majority of the issues: 0.4% of users accounted for 57% of removals, and a majority of posts removed by NOPS and community moderators (51%) were removed for being “uncivil or unkind.”

I designed a proactive approach to interrupt the behavior of repeat violators. When a user reaches 3+ content removals, the system triggers “Kindness Tips,” an in-app education unit that interrupts the posting flow to provide tips and examples before the user can post again.









Impact

Reduction in Harmful Content


The “Kindness Tips” project aimed to significantly lower the volume of Harmful/Hurtful (H/H) content without banning thousands of accounts. In a system where content is the lifeblood, we needed to remove harmful content without discouraging posting.


Increased Operational Efficiency


The streamlined reporting logic tree enabled NOPS to sift through the noise more efficiently and focus on high-stakes violations rather than minor neighborhood squabbles.


Increased Trust & Retention


The “Reporter Experience” and “Report Status” designs led to higher acceptance of content decisions and, as a result, increased user retention. When a neighbor feels heard, they are less likely to churn from the platform, even after having their content removed.