CoCar
Designed, created, and prototyped a voice agent for a university-based ride-sharing app

Connecting students for shared trips between campus and surrounding suburbs

Challenge

Project overview

CoCar is a ride-sharing platform designed specifically for university students seeking more accessible, affordable, and social ways to commute. Our project centred on designing "Coco," a friendly and inclusive chatbot that helps users register, estimate ride costs, find compatible student drivers, and get support. The aim was to replace clunky form-based flows with natural, contextual, and responsive interactions. Coco was developed through a rigorous, user-centred, iterative design process that prioritised accessibility and conversational UX. Our work spanned Figma wireframes, Voiceflow prototypes, Wizard of Oz testing, accessibility research, A/B testing, and chatbot personality crafting.

My Role

I directed and defined the design process, covering ideation, scope, testing, and prototyping pipelines, while organising group meetings to ensure timely delivery and presenting work in progress to gather feedback. I broke the project down into manageable tasks; led user testing, including A/B tests, Wizard of Oz sessions, and desktop walkthroughs; and built the first draft of our Voiceflow prototype, identifying where JavaScript logic and API integration would be beneficial. I also collaborated with the team to establish the chatbot's tone and flow using conversation-design principles, designed WCAG-compliant, accessible Figma wireframes for the app, created the initial conversation flow and custom components to organise user flows, and conducted research to inform and justify design decisions.
Process

Design Process

The design process began with extensive ideation and research to deeply understand student needs, behaviours, and accessibility requirements. Using 10+10 sketching, HMW framing, and priority matrices, we explored dozens of ideas and mapped them against user journeys, focusing on core chatbot functionalities such as registration, fare estimation, driver matching, and FAQs. Secondary research into student transport behaviours, accessibility needs, and conversational design best practices, alongside real-world data from the University of Melbourne, PWDA, and NDIS, informed personas representing diverse users, including those with mobility impairments, vision limitations, and sensory needs. Early prototyping involved low-fidelity Figma flow diagrams, storyboards, and physical walkthroughs, progressing to high-fidelity conversation flows in Figma and semi-functional Voiceflow prototypes to test logic, tone, and usability. Attention to accessibility included WCAG AAA colour contrast, responsive layouts, screen reader compatibility, and button-based interactions to reduce typing and cognitive load.

We iteratively refined Coco through rigorous testing and modular development. Wizard of Oz sessions, A/B testing, and feedback loops shaped the chatbot’s conversational tone, flow, and interface, ensuring she felt like a helpful peer rather than robotic or overly cheerful. High-fidelity Figma prototypes visualised interaction flows across mobile, tablet, and desktop, while Voiceflow was used to build flexible, intent-based flows with API integrations for real-time cost estimation and location validation. Each flow was designed with fallback paths, progressive prompts, and reusable components to handle errors gracefully and support unexpected user behaviours. Accessibility testing with personas like Chloe (cerebral palsy) and Alex (low vision) ensured that interactions were inclusive, with considerations for voice-to-voice input, keyboard navigation, and simplified input methods.
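The fallback-driven cost estimation described above can be sketched as a small JavaScript function of the kind a Voiceflow custom-code step might run. The function name, fare rates, and fallback prompt below are illustrative assumptions, not the production CoCar implementation:

```javascript
// Hypothetical sketch of fare-estimation logic with a graceful fallback,
// as a Voiceflow custom-code step might run it. BASE_FARE, RATE_PER_KM,
// and the fallback prompt are assumed values for illustration.

const BASE_FARE = 3.5;    // assumed flag-fall, in AUD
const RATE_PER_KM = 1.2;  // assumed per-kilometre rate, in AUD

// Estimate a shared-ride fare and split it across passengers.
function estimateFare(distanceKm, passengers = 1) {
  // Guard against malformed slot values so the chatbot can fall back
  // to a clarifying prompt instead of failing mid-conversation.
  if (!Number.isFinite(distanceKm) || distanceKm <= 0 ||
      !Number.isInteger(passengers) || passengers < 1) {
    return {
      ok: false,
      prompt: "Sorry, I couldn't work that out. How far is the trip in km?",
    };
  }
  const total = BASE_FARE + RATE_PER_KM * distanceKm;
  const perPerson = total / passengers;
  return {
    ok: true,
    total: Number(total.toFixed(2)),
    perPerson: Number(perPerson.toFixed(2)),
  };
}
```

Returning a structured `ok`/`prompt` result rather than throwing keeps the conversation flow in control: the failure branch can route straight into a progressive re-prompt, which mirrors the fallback paths described above.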
Results

Product

The final chatbot was successfully launched as a modular, intent-based system, fully integrated with Voiceflow and supplemented with real-time APIs and JavaScript logic. Coco supported key user tasks, including driver matching, registration, cost estimation, and FAQs, with flexible flows that could handle mid-conversation topic changes. Inclusivity was central to the design from the start: interactions favoured buttons over free text to assist users with mobility impairments, layouts were responsive with readable fonts and WCAG AAA colour contrast, and the system supported screen readers, keyboard navigation, and voice-to-voice interactions for blind or low-vision users. Fallback flows and progressive prompts were built in to reduce cognitive load, demonstrating both technical robustness and strong accessibility consideration.

User testing informed over 50 refinements, leading to a highly usable and accessible experience. The platform gained 200+ active beta users, ran a paid trial with FEIT at the University of Melbourne, and shipped a fully functioning web app alongside a mobile app on TestFlight, demonstrating both technical robustness and positive user reception.

For more details regarding this project, such as Figma files, please contact me at miaicasey1971@gmail.com