A research-led competitive experience that united 100+ participants at MSU to design and evaluate real-world innovation through user-centered methods.

Project Type: Design Strategy & UX Research
Developed competition strategy and UX research for MSU’s first Designathon, integrating user insights into a real-world design sprint.
Role: Experience Researcher & Designer
Led market research and co-developed strategy and rubrics to guide user-centered evaluations.
Duration: October 2024 - March 2025
Tools: Figma, Notion, & Google Workspace

TL;DR Summary

Problem: MSU’s first 24-hour Designathon needed to support students in designing user-centered solutions and establish a fair structure for assessing innovation under time pressure.
Approach: Led experience research and co-developed the judging structure. Conducted cross-team market research, synthesized insights, and supported strategic evaluation.
Outcome: Developed the rulebook and rubric system used by judges, guided over 100 participants, and contributed to the event’s overall design quality and strategic success.

Framing the Problem

With over 100 participants from multiple universities, the MSU Designathon needed a structure that balanced design freedom with fair evaluation. There was no existing system to benchmark success, evaluate user research integration, or guide teams through experience-driven thinking.

User Research & Insights

PHASE 1: Research & Benchmarking
Led research efforts on industry standards, target audiences, and design trends. Conducted comparative analyses to identify key design sprint differentiators and common participant pain points, such as unclear evaluation goals and limited research scaffolding.

PHASE 2: Framework Development & Support
Translated insights into practical tools, including judging rubrics, innovation criteria, and feedback channels. Developed a tiered scoring model (sketched below) and co-authored a rulebook to guide fair evaluation and support participant research.

PHASE 3: Insight Sharing & Implementation
Facilitated insight-sharing before final pitches to ground ideas in validated research. Shared toolkits and support materials to help teams apply user-centered design and conduct usable research.
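To illustrate how a tiered scoring model like this can work, here is a minimal sketch: judge scores are averaged per criterion, combined with rubric weights, and mapped to a tier. The criterion names, weights, and tier cutoffs below are hypothetical placeholders for illustration, not the actual Designathon rubric.

```python
# Hypothetical sketch of a tiered scoring model. Criterion names, weights,
# and tier cutoffs are illustrative assumptions, not the real event rubric.

CRITERIA_WEIGHTS = {
    "user_research": 0.30,   # depth and validity of user insights
    "innovation": 0.30,      # novelty of the proposed solution
    "usability": 0.25,       # clarity and feasibility of the design
    "presentation": 0.15,    # quality of the final pitch
}

TIERS = [  # (minimum weighted score on a 0-10 scale, tier label)
    (8.5, "Tier 1 - Exceptional"),
    (7.0, "Tier 2 - Strong"),
    (5.0, "Tier 3 - Developing"),
    (0.0, "Tier 4 - Emerging"),
]

def weighted_score(judge_scores: list[dict[str, float]]) -> float:
    """Average each criterion across judges, then apply rubric weights."""
    averaged = {
        criterion: sum(s[criterion] for s in judge_scores) / len(judge_scores)
        for criterion in CRITERIA_WEIGHTS
    }
    return sum(CRITERIA_WEIGHTS[c] * averaged[c] for c in CRITERIA_WEIGHTS)

def tier_for(score: float) -> str:
    """Map a weighted score to the highest tier whose cutoff it meets."""
    for cutoff, label in TIERS:
        if score >= cutoff:
            return label
    return TIERS[-1][1]

# Example: three judges scoring one team, 0-10 per criterion.
scores = [
    {"user_research": 9, "innovation": 8, "usability": 7, "presentation": 8},
    {"user_research": 8, "innovation": 9, "usability": 8, "presentation": 7},
    {"user_research": 9, "innovation": 8, "usability": 8, "presentation": 9},
]
total = weighted_score(scores)
print(f"{total:.2f} -> {tier_for(total)}")  # e.g. 8.22 -> Tier 2 - Strong
```

Separating criterion weights from tier cutoffs keeps the two judging questions distinct: how much each criterion matters, and what overall quality each tier represents.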

Key Insights & Design Decisions

1. Teams needed more upfront research to design effectively under time constraints.
2. Judges sought objectivity—leading to structured, tiered evaluation criteria.
3. Real-time research synthesis created stronger final outcomes and made judging more consistent.

Projected Impact

This framework lays the groundwork for stronger innovation, clearer evaluation, and long-term impact across future MSU Designathons and related events.

1. Created a replicable research-evaluation framework used by Designathon leadership.
2. Supported innovation direction across 20+ teams.
3. Rubric and research materials set the foundation for future MSU hackathons and events.

Design Solutions

I led 15+ market research efforts alongside judging criteria and rulebook development, drawing from industry benchmarks and user-centered design frameworks. Our team created a tiered scoring model used by 15+ judges and developed materials to help 20+ teams conduct usable research. We also facilitated insight-sharing before final pitches to ensure ideas were grounded in validated understanding.

Reflections & Opportunities for Growth

The Designathon reminded me of the power of UX research under pressure. Conducting real-time research pushed me to prioritize clarity, brevity, and immediate application, and it exposed me to designing in the physical realm, a refreshing contrast to the digital-first work I was used to. I’d like to expand this experience into a toolkit for future teams, incorporating usability testing and inclusive design checklists to elevate participant outputs. With early survey data pending, I’m eager to measure how research impacted user satisfaction and design quality.