Keyflow

Keyflow is an alternative to smartphone keyboards that removes the need to look at a screen to type, search, or text, which is especially beneficial to blind and visually impaired (BVI) individuals.
Letters, numbers, and special characters are arranged in a loop, and users type out short sentences with predefined gestures.


Team Members

4 Designers
2 Developers

Mentored By:
Dr. Davide Bolchini

Funded By

National Science Foundation (NSF)
Google

My Contributions

  1. Simplified the Gesture Library: Reduced the number of required gestures by 32%, making the system more intuitive and easier to learn.

  2. Enhanced Task Efficiency: Increased task efficiency by 65% by minimizing the number of modes needed to perform typing tasks, streamlining the user experience.

  3. Optimized Typing Speed: Reduced overall typing time by 2 minutes and 17 seconds, enabling faster and smoother interaction.

  4. Exploring Innovative Gestures: Currently designing new gestures tailored to blind and visually impaired (BVI) users to enhance the usability of Tapstrap on generative AI platforms, pushing the boundaries of accessibility.


Consider This Scenario…

Ryan, a 32-year-old professional excelling in his tech career, navigates a visually dominated world with remarkable determination despite being blind. With his phone, laptop bag, and guide dog accompanying him, he encounters unique challenges every day.

What are the obstacles in his path?

  1. For texting, Ryan depends on VoiceOver, which requires precise screen interactions through tapping and multi-finger gestures.

  2. Privacy? A luxury he can't afford. Everything he types is audible to those around him. On top of it all, he constantly worries about someone snatching his expensive smartphone from his hands.


The Problem

"How might we redesign smartphone keyboards to operate entirely without the need for a screen, enabling Blind and Visually Impaired users to overcome the limitations and challenges of traditional touch-based typing interfaces, thereby fostering greater accessibility and independence in digital communication?"



The video above showcases a version of the auditory keyboard, Keyflow, in which BVI (blind and visually impaired) individuals use gestures to type out messages and receive auditory feedback. It is a text-to-speech aural loop with three primary modes: Alphabet mode (A–Z), Number mode (0–9), and Special Character mode.

How will Ryan use this?

Step 1: Wear the Tapstrap on your dominant hand.

Step 2: Pair the Tapstrap with a mobile device.

Step 3: Tap to activate the modes.


Ryan has to learn the 9 main gestures shown above.


How can we improve Ryan's texting experience to make it more seamless and secure while ensuring his privacy?




What if Ryan makes a spelling mistake? Can he edit it?

The image above illustrates how Ryan can access edit mode and correct a word he entered incorrectly.

Usability Testing

The earlier version of Keyflow required significantly more time than VoiceOver, prompting optimization to achieve comparable performance. Results were based on testing with both expert and novice users.

Errors in Typing

Learning a New System

Misrecognition of Gestures

Multiple Steps Required to Edit a Typed Message

Remembering Gestures

Areas of Exploration

  1. Finding ways to shift through an array of characters faster

  2. Cognitive load reduction

  3. Gesture simplification

  4. Effort reduction

  5. Streamlining the elaborate list of modes

Benefits

  1. No Screen Required:
    The system allows any surface, including one's own thigh, to serve as a base for typing.

  2. Auditory Feedback:
    Audio-based feedback enables easy and private responses to messages.

  3. Closed-Loop Navigation:
    The layout for letters, numbers, and special characters is closed-loop, providing efficiency and predictability, along with greater control and space for self-correction.

  4. Low Learning Curve:
    Feedback from user testing indicates that users need minimal time to adapt to the new system.


Reflection

  1. This project underscored the value of conducting tests in real-world scenarios, as most of our evaluations were limited to a lab environment. Moving forward, I plan to assess the system under varied environmental conditions to better understand how external distractions impact task performance and efficiency.


  2. Initially, I focused on aligning Keyflow’s performance time with that of VoiceOver. However, as the project progressed, I realized that Keyflow is designed to complement VoiceOver, not replace it. Both systems have distinct roles and provide unique benefits.


  3. Currently, my primary focus is on helping blind and visually impaired (BVI) users engage with generative AI platforms using the Tapstrap, further expanding accessibility and usability in emerging technologies.


Thanks for scrolling 🙏