
Startup of the day #11. The first AI-powered yoga assistant on your phone: the Zenia app monitors your home asana practice and gives real-time voice guidance

Tuesday, December 3, 2019

Yoga is one of the most popular fitness activities in the world: about 300 million people practice it regularly. In the USA alone, people spend 16 billion dollars a year on yoga classes and equipment. It is not surprising that new applications for home practice appear on the market every year.


And yet there is still no application that checks whether asanas are performed correctly, even though incorrectly performed yoga can easily lead to injury.

The team of the startup Zenia decided to change that and developed the world's first audio-guided yoga application built on neural networks and computer vision. It has already attracted investment from Bulba Ventures and from Misha Lyalin, CEO of ZeptoLab.

...

What is Zenia

"Indeed, there are many applications in mobile stores for practicing yoga at home, but the truth is that all of them are essentially wrappers around videos," says Alexey Kurov, CEO of Zenia. "Having studied how people practice yoga and analyzed comments on the top YouTube videos, we came to the conclusion that this format isn't really effective. At first, it can be difficult, especially for beginners, to keep up with the pace of a lesson led by a professional instructor. For some people it took up to 6 months: they had to pause and rewatch the video many times. And doing that on a mobile phone is even more inconvenient.

We decided to solve this problem. With our application, home practice will resemble studio classes more closely than ever before. You are monitored and, when necessary, corrected: the app explains step by step how to stand, how to bend, and what you should feel at each moment. All instructions are given by voice. This approach makes it much easier for people to practice at home. As our testing and our own experience of teaching yoga over Skype have shown, if a person is carefully told what to do, how to do it, and what they should feel, there are no problems with understanding.

Among the analogues on the market, we can be compared, to a certain extent, with Kinect, the contactless motion-sensing game controller. Still, our solution is less expensive and more accessible.

 

[Image: Zenia's profile on the Rocket DAO platform]

...

Team

As of today, 12 people work on the project full-time and 9 part-time. We are all from Samara, where we previously built Gentleminds, a company specializing in AI technologies, in particular deep learning and computer vision. Over that period we delivered about 20 computer vision projects. At the same time, we were working on a product for detecting key points on the human body. That is how we found a niche in therapeutic and sports applications.

Now, while developing Zenia, we collaborate with yoga instructors, who give us very detailed feedback on the correctness of the asanas.

...

Technical challenges

While developing Zenia, we use technologies based on neural networks and classical computer vision algorithms. Asana recognition accuracy currently stands at 95%, and the dataset includes about 100 thousand images.
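
Zenia's actual model is not described in the article; a common way to build this kind of recognizer is to run a pose estimator that extracts body keypoints and then classify the normalized keypoint vector with a small network. Below is a minimal sketch of that idea in Python (the 17-keypoint layout, the class list and the network shape are illustrative assumptions, not Zenia's pipeline):

```python
import numpy as np

# Hypothetical sketch: classify an asana from 17 (x, y) body keypoints
# produced by an off-the-shelf pose estimator (COCO-style joint order assumed).

ASANAS = ["tadasana", "utkatasana", "adho_mukha_svanasana"]  # illustrative subset

def normalize_keypoints(kps: np.ndarray) -> np.ndarray:
    """Make keypoints translation- and scale-invariant: center on the hips
    and divide by torso length so camera distance does not matter."""
    hips = (kps[11] + kps[12]) / 2          # left and right hip
    shoulders = (kps[5] + kps[6]) / 2       # left and right shoulder
    torso = np.linalg.norm(shoulders - hips) + 1e-6
    return ((kps - hips) / torso).flatten() # shape (34,)

class TinyPoseClassifier:
    """A two-layer MLP over the 34-dim keypoint vector; weights are assumed trained."""
    def __init__(self, w1, b1, w2, b2):
        self.w1, self.b1, self.w2, self.b2 = w1, b1, w2, b2

    def predict(self, kps: np.ndarray) -> str:
        x = normalize_keypoints(kps)
        hidden = np.maximum(0.0, x @ self.w1 + self.b1)  # ReLU hidden layer
        logits = hidden @ self.w2 + self.b2
        return ASANAS[int(np.argmax(logits))]
```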

The main difficulty in training the neural networks was creating a dataset representative of our task, in particular finding a sufficient number of images of people in a given pose from different angles. Most asanas are atypical for a person in everyday life, which is why many existing pose recognition systems handle yoga poses poorly.

The second challenge is the illumination of the scene. Obviously, if there is bright backlight and the system sees only the contours of a body blending into the furniture, it will not work correctly. Even though we accounted for different lighting conditions while training the neural network, we recommend not exercising right next to a window (more precisely, we advise positioning the phone so that it does not face a window or another bright light source).
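
The article does not say how those lighting conditions were modeled; a standard technique is photometric augmentation during training, i.e. randomly varying brightness and contrast (and mirroring frames so both profiles are covered). A rough sketch with made-up parameter ranges:

```python
import numpy as np

def augment_lighting(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Randomly jitter brightness/contrast and occasionally mirror the frame.
    `image` is an HxWx3 uint8 array; the parameter ranges are illustrative."""
    img = image.astype(np.float32)

    contrast = rng.uniform(0.6, 1.4)        # simulate dim rooms and harsh light
    brightness = rng.uniform(-40.0, 40.0)   # global exposure shift
    img = img * contrast + brightness

    if rng.random() < 0.5:                  # cover poses seen from either side
        img = img[:, ::-1, :]

    return np.clip(img, 0, 255).astype(np.uint8)
```

In a real pipeline the keypoint labels would be mirrored together with the image, with left and right joints swapped; that bookkeeping is omitted here.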

Our first prototype had many limitations: for example, the person had to stand strictly sideways to the camera and at a certain angle. Now it is possible to stand both in profile and facing the camera. For some asanas (such as Utkatasana) we still recommend a particular orientation (facing the camera only); for others we recommend standing in profile, as the picture is more informative from that side.
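
One way such a recommendation could be enforced (an assumption, not Zenia's published method) is to estimate from the keypoints whether the user is facing the camera or standing in profile and prompt them if the current asana expects the other orientation; the ratio of apparent shoulder width to torso length is a simple heuristic for this:

```python
import numpy as np

# Illustrative per-asana orientation hints (not Zenia's actual catalogue).
RECOMMENDED_ORIENTATION = {
    "utkatasana": "frontal",
    "virabhadrasana_2": "frontal",
    "adho_mukha_svanasana": "profile",
}

def detect_orientation(kps: np.ndarray) -> str:
    """Guess 'frontal' vs 'profile': when the user turns sideways, the projected
    shoulder width shrinks relative to the torso length."""
    shoulder_width = np.linalg.norm(kps[5] - kps[6])
    torso_length = np.linalg.norm((kps[5] + kps[6]) / 2 - (kps[11] + kps[12]) / 2)
    return "frontal" if shoulder_width / (torso_length + 1e-6) > 0.5 else "profile"

def orientation_hint(asana: str, kps: np.ndarray):
    """Return a voice hint when the user's orientation does not match the recommendation."""
    wanted = RECOMMENDED_ORIENTATION.get(asana)
    if wanted and detect_orientation(kps) != wanted:
        return ("Please turn to face the camera" if wanted == "frontal"
                else "Please turn sideways to the camera")
    return None
```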

One of the typical questions concerns how well the application handles complex twists and bends. For now, we have decided not to add such exercises; they will be implemented over time. The application will be used by people of different levels, and they must be carefully prepared for such content.

However, we do have a system running in test mode that assesses the degree of curvature of the back.
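
The article does not explain how that assessment works. Purely as an illustration, with standard body keypoints one could compute joint angles and use, for example, the ear-shoulder-hip angle seen in profile as a crude indicator of how rounded the upper back is:

```python
import numpy as np

def joint_angle(a: np.ndarray, b: np.ndarray, c: np.ndarray) -> float:
    """Angle in degrees at point b between the segments b->a and b->c."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-6)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def back_rounding(kps: np.ndarray) -> float:
    """Very rough proxy for upper-back rounding from a profile view: the
    ear-shoulder-hip angle. Values near 180 degrees mean ear, shoulder and hip
    are roughly in line (a straight back); smaller values indicate hunching."""
    ear = (kps[3] + kps[4]) / 2
    shoulder = (kps[5] + kps[6]) / 2
    hip = (kps[11] + kps[12]) / 2
    return joint_angle(ear, shoulder, hip)
```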

 

...

The first version of the product

The first release includes about 40 asanas. This content is intended for people at the basic and intermediate levels of training, who are in fact our main target audience. In the future, we will add material for more advanced users.

We also plan to differentiate the content not only by level but also by objective: for example, improving flexibility, building endurance, or strengthening the back muscles. Practice can also be adjusted to personal preferences: if you have had a hard day, you can choose a more relaxing program with almost no vigorous-intensity exercises, or, on the contrary, a class with strengthening asanas.
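
Nothing is said about how such programs would be assembled; one can imagine each asana carrying tags for level, objective and intensity, with a session filtered out of the catalogue. A toy sketch with invented tags:

```python
from dataclasses import dataclass

@dataclass
class Asana:
    name: str
    level: str          # "basic" | "intermediate" | "advanced"
    objectives: set     # e.g. {"flexibility", "back_strength", "endurance"}
    intensity: int      # 1 (relaxing) .. 5 (vigorous)

CATALOGUE = [
    Asana("balasana", "basic", {"flexibility"}, 1),
    Asana("utkatasana", "basic", {"endurance", "back_strength"}, 4),
    Asana("bhujangasana", "intermediate", {"back_strength", "flexibility"}, 2),
]

def build_session(level: str, objective: str, max_intensity: int):
    """Pick catalogue asanas matching the user's level, goal and desired intensity."""
    return [a for a in CATALOGUE
            if a.level == level
            and objective in a.objectives
            and a.intensity <= max_intensity]

# After a hard day: a gentle flexibility session for a beginner
print([a.name for a in build_session("basic", "flexibility", max_intensity=2)])
```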

Now practicing with the application feels like training in a studio. Moreover, the system analyzes the person's pace and adjusts to it. For instance, if during Surya Namaskar a person has just moved into Chaturanga, the app will not immediately propose switching to Downward-Facing Dog. The system tracks all movements, so the person will not get new instructions until the current asana is complete.
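
How completion is detected is not described; a plausible sketch is a small state machine that advances the sequence and issues the next voice cue only after the current pose has been recognized for a minimum number of consecutive frames (the thresholds and names below are invented):

```python
SEQUENCE = ["chaturanga", "adho_mukha_svanasana", "tadasana"]  # fragment of a flow
HOLD_FRAMES = 45  # ~1.5 s at 30 fps before the pose counts as completed

class SequenceTracker:
    """Advance through the flow at the user's own pace, not on a fixed timer."""
    def __init__(self, sequence):
        self.sequence = sequence
        self.step = 0
        self.held = 0

    def on_frame(self, recognized_asana: str):
        """Feed the classifier's per-frame prediction; returns the next voice cue
        once the current asana has been held long enough, otherwise None."""
        if self.step >= len(self.sequence):
            return None
        if recognized_asana == self.sequence[self.step]:
            self.held += 1
        else:
            self.held = 0                        # pose lost: start counting again
        if self.held >= HOLD_FRAMES:
            self.step += 1
            self.held = 0
            if self.step < len(self.sequence):
                return "Now move into " + self.sequence[self.step]
            return "Sequence complete, well done"
        return None
```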

Today the application is available in English only (and for iOS only). In the future, we will add more languages. But since the product is built around voice interaction, translation has to be done very carefully and attentively; localization is a large layer of work.

...

Monetization

We are starting with a B2C model. A person can purchase a monthly or yearly subscription for 15 and 100 dollars respectively.

We will also have free content consisting of basic exercises: we want to be useful even to people who don't purchase a subscription. The subscription is a kind of club membership that gives access to more interesting content, social interaction, and classes by famous gurus.

Having gained the community's trust and received feedback, we will start thinking about a B2B format: partnerships, corporate subscriptions, and so on.

...

Plans

In the future, the application should become a robust yoga assistant. It will also help track progress and illustrate it with graphs, so that a person sees not just a subjective feeling that the training is working but also specific numbers.

Later we will also integrate a social component. In particular, we will add an option for users to share their practice achievements on social networks (similar to how people share workout videos or running routes on Instagram). This has a positive effect on motivation.

We are also interested in supporting cooperation between a student and a yoga instructor: in that case, our application is not a replacement but an assistant. We want to let yoga instructors add their own programs, check the correctness of the asanas in a semi-automatic mode, and give recommendations for improvement. We are also planning collaborations with prominent yogis so that they can share their courses.

 

 

