


Dari
Bridging Deaf and hearing worlds.
UX Design
UI Design
Accessibility
Overview
Dari is an AI-powered assistive technology that bridges the communication gap between Deaf and hearing individuals. It features the Woori armband, which translates ASL gestures into speech, and Sari smart glasses, which provide real-time captions and environmental awareness. By combining gesture recognition, AI translation, and contextual alerts, Dari enables natural, casual communication, fostering inclusivity and seamless interactions.
My Role
UI Design / Visual Lead
Duration
26 Weeks
Tools
Figma, Rhino 3D, Keyshot, Blender, Adobe Suite, After Effects, DaVinci Resolve
Team
Sejoon Kim, Lukas Wiesner, Lara Kurt

WHY
Communication isn’t equal — yet.
Despite major technological advances, the Deaf community still encounters significant communication barriers, hindering their ability to engage in seamless everyday interactions.

HOW
Using real stories to inspire inclusive tech.
Ground inclusive technology in the real stories of the Deaf community, turning expert interviews, school and association visits, and user testing into actionable design criteria.

WHAT
An ecosystem of wearables, app, and community.
An assistive ecosystem that pairs the Woori armband and Sari smart glasses with a mobile app and online community, enabling two-way communication between Deaf and hearing individuals.
Background
Uncovering the unnatural
Existing technologies primarily support one-way communication unless interpreters are involved. Current ASL tools, such as gesture-recognition gloves, still fall short of doing the language justice, and their limitations hinder truly seamless interaction.
Wearable
Tech Gloves
Research
Secondary Research
We needed to make sure we understood the Deaf community and the industry at large before getting into the details of how American Sign Language works. We collected the meaningful information that would explain why this project needed to exist in the first place.
360 million people are considered Deaf.
More than 90% of Deaf children are born to hearing parents.
More than 70 million Deaf individuals use sign language as their primary language.
ASL is NOT a word-for-word translation of English.
Stakeholder Map
We built a stakeholder map to understand the people and services closely tied to the lives of Deaf individuals. This helped us visualize the daily interactions Deaf individuals typically have.



Primary Research
With a clear understanding of the scope and industry, we first spoke to experts to validate our findings and gather a holistic view before engaging with Deaf individuals, to prevent any miscommunication or misunderstanding.
Interviews were held on Zoom.
Topic guides were developed from the primary research questions.
Interviews were recorded and transcribed.
Insights
Building on our analysis, we identified key insights from our refined themes that not only clarified the core challenges but also revealed meaningful opportunities for design. These insights helped us understand how we might better support Deaf individuals in conversations with the hearing population. From these opportunities, we distilled four actionable criteria to guide our concept development and spark meaningful ideation.
4
actionable criteria
Being bilingual in the Deaf Community
Working with English through caption technology
ASL interpretation constructs expressive meaning
ASL auto-translate tech is unable to interpret
POV Statement
Connection
The Deaf Community needs an empathetic technological ecosystem to express their message articulately because ASL conveys nuanced meaning through spatial grammar, facial expressions, and cultural context that aren’t easily translated into written English.
System Design
System Flow
To support seamless interaction, the team selected a semantic engine for matching ASL terms with English sentences. Sentiment analysis enhances emotional understanding, and gesture recognition links ASL terms to specific motion data.
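To make this flow concrete, here is a minimal TypeScript sketch of how the three stages could hand off to one another: gesture recognition produces an ASL term, the semantic engine maps it to an English sentence, and sentiment analysis tags the tone. Every type, function, threshold, and dictionary entry below is an illustrative stand-in, not Dari's actual implementation.

```typescript
// Toy stand-ins for the three stages described above: gesture recognition ->
// semantic matching -> sentiment analysis. Names and values are hypothetical.

interface AslTerm {
  gloss: string;       // e.g. "COFFEE WANT"
  confidence: number;  // 0..1 score from the gesture-recognition model
}

type Sentiment = "positive" | "neutral" | "negative";

// Stand-in for the gesture-recognition model that maps IMU/EMG motion data to a term.
function recognizeGesture(motionFrames: number[][]): AslTerm {
  return { gloss: "COFFEE WANT", confidence: motionFrames.length > 0 ? 0.92 : 0 };
}

// Stand-in for the semantic engine: ASL gloss -> natural English sentence.
const glossToEnglish: Record<string, string> = {
  "COFFEE WANT": "I'd like a coffee, please.",
  "HELP NEED": "I need some help.",
};

// Stand-in for the sentiment analysis that colors the spoken output's tone.
function analyzeSentiment(sentence: string): Sentiment {
  if (/please|thank|great/i.test(sentence)) return "positive";
  if (/problem|wrong|stop/i.test(sentence)) return "negative";
  return "neutral";
}

function translateSign(motionFrames: number[][]): { sentence: string; sentiment: Sentiment } | null {
  const term = recognizeGesture(motionFrames);
  if (term.confidence < 0.6) return null; // too uncertain: prompt the user to re-sign instead of guessing
  const sentence = glossToEnglish[term.gloss] ?? term.gloss;
  return { sentence, sentiment: analyzeSentiment(sentence) };
}

console.log(translateSign([[0.1, 0.3, 0.2]]));
// -> { sentence: "I'd like a coffee, please.", sentiment: "positive" }
```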



Ecosystem
After extensive discussions, we developed an ecosystem for seamless interaction across various situations. Wearables serve as the primary interface, enabling screen-free communication, while a mobile app and an online community supply the data needed for fluid and diverse interactions; without them, the wearables would have too much data to process in real time.
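A rough sketch of that offloading idea, assuming the armband keeps only a small cache of the user's saved terms and hands anything unmatched to the phone app, which holds the full vocabulary. The names, cache size, and distance threshold are hypothetical.

```typescript
// Sketch only: cheap on-wearable matching against cached saved terms, with a fallback
// to the phone for full translation. Not the shipped design.

interface SavedTerm {
  gloss: string;
  motionSignature: number[]; // compact fingerprint of the paired gesture
}

const ON_DEVICE_CACHE_LIMIT = 50; // wearable keeps only the most-used saved terms

function distance(a: number[], b: number[]): number {
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - (b[i] ?? 0)) ** 2, 0));
}

// Runs on the wearable: nearest-neighbor match against the cached saved terms.
function matchOnWearable(signature: number[], cache: SavedTerm[]): SavedTerm | null {
  let best: SavedTerm | null = null;
  let bestDist = Infinity;
  for (const term of cache.slice(0, ON_DEVICE_CACHE_LIMIT)) {
    const d = distance(signature, term.motionSignature);
    if (d < bestDist) { best = term; bestDist = d; }
  }
  return bestDist < 0.5 ? best : null; // unmatched gestures fall through to the phone
}

// Runs on the phone: placeholder for the heavier full-vocabulary translation.
function translateOnPhone(signature: number[]): string {
  return `(full translation of ${signature.length}-sample gesture handled by the mobile app)`;
}

function handleGesture(signature: number[], cache: SavedTerm[]): string {
  const hit = matchOnWearable(signature, cache);
  return hit ? hit.gloss : translateOnPhone(signature);
}

const cache: SavedTerm[] = [{ gloss: "THANK-YOU", motionSignature: [0.2, 0.8, 0.1] }];
console.log(handleGesture([0.22, 0.79, 0.12], cache)); // -> "THANK-YOU"
```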

Wearables
Mobile App
Online Community
Visual Design
Branding
As Visual/UI Lead, I defined DARI’s brand identity, typography, and visual language to reflect clarity, trust, and inclusivity. Named after the Korean word for “bridge,” DARI connects Deaf and hearing communities through seamless, human-centered communication. Gradients and glassmorphism create a fluid, dynamic aesthetic that enhances openness and symbolizes progress. More than a product, DARI is a movement toward connection, accessibility, and shared understanding.






UI Development
I designed the entire mobile app interface, starting with wireframes and low-fidelity screens, then applying our brand system to develop a cohesive mid-fidelity prototype. To support upcoming user testing, I prioritized completing the mobile app first, ensuring a strong visual foundation for evaluation.










User Testing
at Atlanta Area School for the Deaf & Three Rivers Association of the Deaf
Young Generation
Atlanta Area School for the Deaf was kind enough to let us present our initial products to students in the age group of our target audience. Everything had to be very intentional, since an interpreter was conveying our message to the students.
💡
Adjusted color indicators to be accessible for color-blind users.
💡💡
Improved text contrast and readability in the mobile UI based on legibility concerns.
💡💡💡
Reinforced visibility of environmental sound cues within users’ visual field.
💡💡💡💡
Prioritized refinement of the interpretation feature based on high engagement and feedback.
Old Generation
We were honored to be part of the Valentine’s event held by the Three Rivers Association of the Deaf in Cave Spring, Georgia. This time there was a wide range of ages, and the adults were able to validate our direction with very technical, detailed examples of how Dari could impact the lives of Deaf individuals.
💡
Strengthened support for ASL-first interaction, making selection of ASL terms more prominent and accessible.
💡💡
Continued to refine contrast and readability in the mobile app UI for Deaf-blind accessibility.
💡💡💡
Validated overall product direction and expanded context-specific use cases to inform feature development.
Final Design
My Dari
As the UI/Visual Lead, I designed DARI with accessibility, clarity, and delight at its core. The app includes a soft gradient background that moves gently in response to sound, faster in noisy places, slower in quiet ones, making the experience more engaging without being distracting. MyDARI gives users real-time English captions, an AI-powered ASL signer, and voice output, so they can communicate smoothly, even without the wearable device.
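A small sketch of how the sound-reactive gradient could be driven in a web view using the standard Web Audio API: ambient loudness is smoothed and mapped to the animation speed of an existing CSS gradient animation. The smoothing factor and speed range are illustrative values, not the shipped ones.

```typescript
// Sketch of a sound-reactive gradient. Assumes `element` already has a CSS keyframe
// animation that shifts its gradient; this only adjusts how fast that animation runs.

async function startReactiveGradient(element: HTMLElement): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const audioCtx = new AudioContext();
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = 256;
  audioCtx.createMediaStreamSource(stream).connect(analyser);

  const samples = new Uint8Array(analyser.frequencyBinCount);
  let smoothed = 0;

  const tick = (): void => {
    analyser.getByteTimeDomainData(samples);

    // Rough loudness: RMS deviation from the 128 midpoint, normalized to roughly 0..1.
    let sumSquares = 0;
    for (let i = 0; i < samples.length; i++) sumSquares += (samples[i] - 128) ** 2;
    const loudness = Math.min(1, Math.sqrt(sumSquares / samples.length) / 64);

    // Low-pass filter so the gradient drifts rather than flickers.
    smoothed = smoothed * 0.95 + loudness * 0.05;

    // Noisy places animate faster (~4s per cycle), quiet ones slower (~20s per cycle).
    const secondsPerCycle = 20 - smoothed * 16;
    element.style.animationDuration = `${secondsPerCycle.toFixed(1)}s`;

    requestAnimationFrame(tick);
  };
  tick();
}

// Usage: startReactiveGradient(document.getElementById("dari-background")!);
```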
Real-Time Captioning + Saved Terms
The dashboard and real-time captions enable accessible communication even without wearables, with bold, readable text for live conversations.
Captions transition seamlessly into chat, where users can reply using pre-saved ASL terms, keeping responses fast, intuitive, and under five taps.
In-Chat AI Signer
The AI Signer translates typed text into sign language in real time, helping Deaf users visually confirm meaning. If unclear, it suggests alternative sentences to ensure accurate communication.
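The "suggest alternatives when unclear" behavior can be sketched as a simple confidence gate: sign the message when the translation is confident, otherwise offer clearer rephrasings first. The model calls, threshold, and rephrasing logic below are hypothetical placeholders.

```typescript
// Sketch of the confidence gate behind the AI Signer. All names and values are illustrative.

interface SignerResult {
  gloss: string;        // ASL gloss sequence the avatar would sign
  confidence: number;   // 0..1 confidence that the gloss preserves the meaning
}

// Stand-in for the English -> ASL translation model.
function translateToAsl(sentence: string): SignerResult {
  const short = sentence.split(/\s+/).length <= 8;
  return { gloss: sentence.toUpperCase(), confidence: short ? 0.9 : 0.55 };
}

// Stand-in for generating clearer alternative phrasings of the same message.
function suggestAlternatives(sentence: string): string[] {
  return [
    "Could you break that into shorter sentences?",
    sentence.split(",")[0] + ".", // lead with the first clause only
  ];
}

function handleTypedMessage(sentence: string): { action: "sign" | "clarify"; payload: string | string[] } {
  const result = translateToAsl(sentence);
  if (result.confidence >= 0.75) {
    return { action: "sign", payload: result.gloss };        // animate the AI Signer
  }
  return { action: "clarify", payload: suggestAlternatives(sentence) }; // ask before signing
}

console.log(handleTypedMessage("See you at noon"));
// -> { action: "sign", payload: "SEE YOU AT NOON" }
```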
Saved Terms
The Saved Terms page lets Deaf users store frequently used phrases for quick access via Woori, Sari, or in-chat. Users can also re-pair terms to improve accuracy and ensure smoother communication.
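A minimal sketch of what re-pairing could mean at the data level: each saved phrase keeps a reference to the gesture recording it is paired with, and re-pairing swaps in a fresh recording without touching the phrase itself. The shape of the record is an assumption for illustration.

```typescript
// Hypothetical saved-term store; re-pairing replaces only the gesture reference.

interface SavedTerm {
  id: string;
  english: string;            // sentence spoken aloud or shown in chat
  gloss: string;              // ASL gloss the term is signed as
  pairedRecordingId: string;  // gesture sample currently used for matching
  pairedAt: Date;
}

const savedTerms = new Map<string, SavedTerm>();

function saveTerm(term: SavedTerm): void {
  savedTerms.set(term.id, term);
}

// Re-pairing keeps the phrase but points it at a newly recorded gesture sample.
function rePairTerm(termId: string, newRecordingId: string): SavedTerm {
  const term = savedTerms.get(termId);
  if (!term) throw new Error(`Unknown saved term: ${termId}`);
  const updated = { ...term, pairedRecordingId: newRecordingId, pairedAt: new Date() };
  savedTerms.set(termId, updated);
  return updated;
}

saveTerm({ id: "t1", english: "Where is the restroom?", gloss: "RESTROOM WHERE",
           pairedRecordingId: "rec-001", pairedAt: new Date() });
console.log(rePairTerm("t1", "rec-014").pairedRecordingId); // -> "rec-014"
```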
Explore ASL Terms
The Explore page lets users browse new ASL terms from the Dari website and save them for quick access through Woori and Sari.
Conversation History
The Conversation History feature lets Deaf users revisit past conversations, reducing cognitive load and helping them recall important details with ease.
Online Community
I designed the desktop experience to help users easily manage their account, review saved ASL terms, and revisit past conversations. Most importantly, I created a space for community, a forum where users can request, suggest, and discuss ASL terms by typing in English or recording themselves signing. This helps the platform grow with its users and better meet their needs.
Wearables
Our wearables enhance our app’s capabilities. Woori, our smart ASL armband, uses IMU and EMG data to let users respond by signing, paired with saved ASL terms. Sari, our smart glasses, keeps conversations phone-free with an embedded display and discreetly indicates sound sources for environmental awareness.
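One way Sari's sound-source indication could work in principle is to compare loudness between a left and a right microphone and nudge an indicator toward the louder side, as in the sketch below. The two-mic layout, ratio threshold, and helper names are assumptions, not the actual firmware.

```typescript
// Sketch of directional sound awareness from two temple microphones. Illustrative only.

type Direction = "left" | "right" | "ahead";

// Root-mean-square loudness of one microphone's sample window.
function rms(samples: number[]): number {
  const sumSquares = samples.reduce((sum, s) => sum + s * s, 0);
  return Math.sqrt(sumSquares / Math.max(samples.length, 1));
}

// Decide which edge of the glasses display should light up for environmental awareness.
function soundDirection(leftMic: number[], rightMic: number[]): Direction {
  const left = rms(leftMic);
  const right = rms(rightMic);
  const ratio = (left + 1e-6) / (right + 1e-6);
  if (ratio > 1.3) return "left";       // noticeably louder on the left temple
  if (ratio < 1 / 1.3) return "right";  // noticeably louder on the right temple
  return "ahead";
}

console.log(soundDirection([0.4, 0.5, 0.45], [0.1, 0.12, 0.11])); // -> "left"
```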
Reflection
As a UI/Visual Lead…
This 26-week journey with DARI was one of the most meaningful and challenging projects I’ve led as a Visual and UI Designer. Designing for the Deaf community came with a deep responsibility, and for me, it wasn’t just about accessibility. I wanted every user to feel seen, included, and equal under the same sky, regardless of how they communicate. Balancing clarity with visual warmth was a core challenge.
Crafting a UI that was both easy to read and emotionally engaging required constant care, especially with font sizes, contrast, and color. I dove into accessibility standards like WCAG and applied them with intention, learning how thoughtful design can create real connection. At its heart, DARI is about bridging the communication gap between Deaf and hearing individuals. And for me, it became more than a product; it became a reminder that inclusive design isn’t just about access. It’s about belonging. :)
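The WCAG work mentioned above is easy to make concrete: the contrast-ratio formula from WCAG 2.1 is only a few lines of code, and it is the kind of check behind those contrast and color decisions. The sketch below uses the standard formula; the example colors are placeholders, not DARI's actual palette.

```typescript
// WCAG 2.1 contrast-ratio check. The formula follows the spec; colors are placeholders.

// Convert an sRGB channel (0..255) to its linearized value per WCAG 2.1.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
}

// Relative luminance of an sRGB color.
function luminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio between two colors, always >= 1.
function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const lumA = luminance(fg);
  const lumB = luminance(bg);
  const light = Math.max(lumA, lumB);
  const dark = Math.min(lumA, lumB);
  return (light + 0.05) / (dark + 0.05);
}

// WCAG AA requires 4.5:1 for normal text and 3:1 for large text.
const ratio = contrastRatio([255, 255, 255], [90, 90, 160]); // white text on a muted purple
console.log(ratio.toFixed(2), ratio >= 4.5 ? "passes AA (normal text)" : "fails AA (normal text)");
```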
Let’s
Collaborate! ☺︎
Designing with users at the core, always. If you're ready to create something meaningful, let's make it happen together!





















