

Yoguide is my final student project at CIID: a 9-week individual project.

Student Project
9 weeks

I have always believed strongly in accessibility: the world should be equally accessible to everyone, regardless of disability. This is why I began my research with blind people, to understand the problem area and see if I could make a positive impact.

What is Yoguide?

Yoguide is an app that helps the visually impaired do yoga at home, by themselves. It acts as a personal trainer and keeps an eye on you, providing specific instructions and correcting you when you go wrong.

Yoguide explained

Yoguide uses the phone camera with computer vision to detect the body's joints and their orientations, compares the current pose to the correct one, and guides the user towards it. This helps users learn and form mental models of the yoga poses, and lets them stay fit from home.
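The comparison step can be sketched roughly as follows. This is a minimal illustration assuming 2D keypoints and a fixed angle tolerance; the function names, threshold, and phrasing are my own assumptions, not Yoguide's actual implementation.

```python
import math

# A minimal sketch of the "compare to the correct orientation" idea.
# Keypoints are (x, y) pairs; the tolerance and the spoken phrases are
# illustrative assumptions, not Yoguide's actual implementation.

def joint_angle(a, b, c):
    """Angle at joint b, in degrees, formed by points a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

def correction(current, target, tolerance=15.0):
    """Return a spoken correction, or None if the pose is close enough."""
    diff = current - target
    if abs(diff) <= tolerance:
        return None
    return "Bend a little more" if diff > 0 else "Straighten up a little"

# Example: elbow angle from shoulder, elbow, and wrist keypoints.
angle = joint_angle((0, 0), (1, 0), (1, 1))   # a right angle
print(correction(angle, target=180))          # the arm should be straight
```

In practice the tolerance would need tuning per pose and per joint; a single global threshold is only a starting point.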

In the spirit of inclusivity, even though Yoguide was designed and developed with blind people, it can be used by anyone. It is not just a product for blind people; it is a product that can also be used by blind people. Designing for the niche of blind people helped make this a better, more accommodating product.

The problem

Global problem

On a larger scale, according to research, visually impaired people are generally less fit and more obese than sighted people.

Local problem

On a more local level, it is stressful and tiring to use public transport or arrange for transport to go to fitness centers in the city, so they don't go. The problem space was well explained by an expert, Birgit, in the video below:

"Getting out of the house to go to a fitness center involves a lot of stress" - Birgit

"Going to a fitness centre takes all my energy. Exercising at home gives me energy." - Inger

Exercising alone at home is not an option either: they have no way of confirming visually that they are doing an exercise correctly, nor do they get the guidance a fitness center provides.

"I can't exercise at home because I don't know if I am doing it right, because I have no eyes on me" - Flemming S

These problems revealed a clear tension, and an opportunity: people wanted to exercise at home, but had no way to do it in the existing ecosystem.

I learned that yoga would be a good exercise to design for, as it requires minimal space and cost at home, which led to the opportunity:

Helping the blind do yoga at home

How it works

As shown in the video, one can select the exercise and time, place the phone somewhere it can see the person, and do the yoga session. This is all done on the app.

The touchpoints

The phone camera tracks the person's joints and the orientations between them. It recognises when the person is doing a pose wrong, and guides them towards the correct one.

This helps them form a mental model of the exercise, since they can't confirm it visually, and lets them stay fit from home.

Detecting the wrong orientation and correcting it

The working prototype was built on a Microsoft Kinect v2 and was demoed live at the CIID final exhibition. It uses computer vision for pose detection and computes the orientation between each pair of joints, as you can see in the video below.

Working prototype

The Process

The process took place over 9 weeks and involved 7 participants: 4 blind people and 3 experts, to tap into their wealth of knowledge. The process was fairly structured, as the image below shows, with a lot of back and forth and iteration in each stage, especially in the "Generating the concept" phase.

Finding the problem

Starting point:
In what ways can being blind reduce independence or autonomy of a person?

In order to understand more about this, I interviewed 3 people.

Three interviews to find out about the problem space
Different directions

Talking to all 3 of them, I found 3 separate problem areas relating to navigation, social security, and exercise. I found the third one quite intriguing.

Choosing the exercise route

I learned about exercise as a possible problem area from Birgit (who works at the Institute of Blind and Partially Sighted), and decided to work on it, as a lot of work had already been done in the other two problem areas I'd looked at. I wanted to let the process guide me, and this seemed like a black box I could dive into and start prototyping and testing.

Generating the concept: Build, test, repeat

During this period, I tested and interviewed 2 experts and 2 users.

I interviewed and tested with 2 experts and 2 users
1: Defining a set of guiding principles

I held a co-creation session with Flemming Davidsen, an expert in mobility for blind people with 20 years of experience.

Co-creating principles with Flemming

I took 4 extreme ideas to figure out what he liked and didn't like about them, and tried to understand why. From this, we came up with a set of principles which, if satisfied, would lead to a "successful" design.

The set of principles we came up with
2: Understanding how yoga is taught to the blind

I attended a yoga session taught by Nadia, a yoga instructor teaching the blind. I wanted to understand how she gives instructions and provides guidance.

Attending a yoga session for the blind
"Don't just give out instructions. Correct them when they are doing it wrong." - Nadia

From this quote and what I observed, I designed how Yoguide would instruct, in order to mimic a real instructor. It works as a decision tree: it gives a step and checks whether it was done correctly. If not, it corrects the user, and only then moves on to the next step.

Decision tree trying to mimic an actual instructor
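The decision-tree flow can be sketched as a simple loop. `Step`, the checker, and the phrases below are hypothetical stand-ins for the app's pose detection and text-to-speech components.

```python
from dataclasses import dataclass

# A sketch of the decision-tree instruction flow: give a step, check it,
# correct until it is right, and only then move on. `pose_is_correct`
# and `speak` stand in for the app's pose detection and text-to-speech.

@dataclass
class Step:
    instruction: str   # what to do next
    correction: str    # what to say while the pose is still wrong

def run_pose(steps, pose_is_correct, speak):
    for step in steps:
        speak(step.instruction)
        while not pose_is_correct(step):
            speak(step.correction)  # advance only once the step is right

# Example with a mock checker that needs one correction per step.
spoken, attempts = [], {}
def checker(step):
    attempts[step.instruction] = attempts.get(step.instruction, 0) + 1
    return attempts[step.instruction] > 1

run_pose([Step("Raise your arms", "Lift them higher"),
          Step("Bend forward", "Bend a little further")],
         checker, spoken.append)
```

The key property is in the inner loop: the session never advances past a step the user hasn't completed, which is exactly what Nadia does in person.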
3: Understanding how blind people learn and do yoga

I did two yoga sessions with Flemming and Inger to understand what kind of instructions they respond best to, and to better understand the nuances.

Yoga sessions

From these sessions came the metaphor that the app would behave like a personal trainer, as that is what the users gravitated towards naturally.

I got to understand the nuances of different aspects of the session, like how many times instructions should be repeated, the tone of voice, and the organic two-way communication between the app and the user.

Room for error

Each instruction repeats at most 3 times, after which the session moves on in order not to disrupt the flow.

Two-way communication

Users can tell the app vocally when to stop, resume, and repeat instructions.
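These two behaviours, the voice commands and the three-repeat cap, could be modelled with a small state machine. The command names and the way repeats are counted below are my own assumptions, sketched for illustration.

```python
# A sketch of the session's two-way communication: "stop", "resume",
# and "repeat" commands, with each instruction spoken at most 3 times
# in total. Command names and counting are my own assumptions.

MAX_UTTERANCES = 3

class Session:
    def __init__(self):
        self.paused = False
        self.last_instruction = None
        self.times_spoken = 0

    def instruct(self, text):
        """Speak a new instruction; this counts as the first utterance."""
        self.last_instruction = text
        self.times_spoken = 1
        return text

    def handle(self, command):
        """Return the text to speak in response, or None."""
        if command == "stop":
            self.paused = True
            return None
        if command == "resume":
            self.paused = False
            return self.last_instruction
        if command == "repeat":
            if self.times_spoken >= MAX_UTTERANCES:
                return None  # move on rather than disrupt the flow
            self.times_spoken += 1
            return self.last_instruction
        return None
```

After the third utterance, `handle("repeat")` returns `None` and the session would advance, matching the room-for-error rule above.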

Tone of voice

The tone of voice matters, as it is a yoga session. I tested different voices with a soundboard, and the most natural-sounding voice was chosen.

4: Form of the product: an app? a suit?

I wanted to understand what form the product should take, so I tested options ranging from a wearable suit to a separate device to a phone app. The result was overwhelmingly in favour of an app, with 4 out of 4 participants vouching for it.

In fact, 95% of blind people use apps; they are already a large part of how they navigate daily life independently. An app would fit well into their existing ecosystem and mental models.

The resulting screens -- easy to navigate with VoiceOver

The screens were made after bouncing ideas about the content off the participants. The app was built around how blind people navigate phones with gestures, using VoiceOver or TalkBack. It can be found on (open on your phone).

5: Choosing the technology

I experimented with motion capture suits and computer vision models like PoseNet and OpenPose. However, they were not as accurate or as fast as the Kinect, which is what I decided to go with for demonstration purposes.

PoseNet wasn't accurate enough
OpenPose was too slow
Kinect struck the perfect balance

The Kinect could best demonstrate the concept. It can read the body position in 3D space, which means the phone doesn't need to be at a specific angle for the yoga session to work. It can be kept anywhere as long as it has the whole body in view.
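This placement flexibility can be illustrated with a little geometry: the angle at a joint does not change when the whole body is rotated relative to the camera. The hip/knee/ankle coordinates below are made up for illustration.

```python
import math

# Illustrating why 3D tracking makes camera placement flexible: joint
# angles are unchanged when the body is rotated relative to the sensor.
# The coordinates here are made-up hip/knee/ankle positions.

def angle_3d(a, b, c):
    """Angle at joint b, in degrees, from 3D points a, b, c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def rotate_y(p, deg):
    """Rotate a point around the vertical axis (a change of viewpoint)."""
    r = math.radians(deg)
    x, y, z = p
    return (x * math.cos(r) + z * math.sin(r), y,
            -x * math.sin(r) + z * math.cos(r))

hip, knee, ankle = (0, 1, 0), (0, 0.5, 0.2), (0, 0, 0)
original = angle_3d(hip, knee, ankle)
rotated = angle_3d(*(rotate_y(p, 40) for p in (hip, knee, ankle)))
# original and rotated agree, so the exact camera angle doesn't matter
```

A 2D model, by contrast, sees these angles foreshortened as the viewpoint changes, which is why the depth-sensing Kinect made the demo more forgiving about placement.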

The prototype was tested with Inger, Flemming, and a few sighted users, and improved after each round of testing.

Next Steps

The working prototype was demoed live in CIID's final exhibition. These are the things I'd like to do next:

An attendee of the exhibition trying out the demo
1. Measuring "success"

If the number of corrective instructions decreases over a week or month, it suggests the app is working: people are learning the poses and using it regularly, which are two important parameters.
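One simple way to operationalise this would be to log the weekly count of corrective instructions and check that it trends downward. The counts below are invented for illustration.

```python
# A sketch of the proposed success measure: fewer corrective
# instructions week over week suggests people are learning.
# The weekly counts below are invented for illustration.

def is_improving(weekly_corrections):
    """True if no week needed more corrections than the week before."""
    return all(later <= earlier
               for earlier, later in zip(weekly_corrections,
                                         weekly_corrections[1:]))

print(is_improving([42, 35, 30, 24]))   # steadily improving
print(is_improving([42, 50, 30, 24]))   # week 2 regressed
```

A real metric would also need to separate "fewer corrections because of learning" from "fewer corrections because of less use", which is why regular usage is the second parameter to track.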

2. Learning more about yoga and the blind

This project was all about letting the process guide me, and as a result I made something about yoga for blind people, both areas I knew nothing about beforehand. There still remain a lot of nuances to be discovered by research and testing.

3. Exploring it for sighted people

Designing for a niche produced a solution that can be used by everyone, including sighted people. Any yoga practitioner could benefit from a personal trainer who is always on their phone and can correct them at home.

4. Exploring other use cases

The solution also works for squats, push-ups, pilates, posture correction, and other home exercises. Designing for accuracy has opened up a multitude of use cases.