Each year, Automation for Good offers the chance to build a more resilient society. The RAYS team won first place in the UiPath HyperHack with HOPE, an application that helps people with disabilities.
Education has long been instrumental to economic growth in every country. Employment opportunities improve with education and versatile skills, and the new generation demands better schools and improved access to higher education.
Sadly, in most parts of India, schooling for people with disabilities has not improved at the same pace as for the general population. Achieving global standards in education is critical to fulfilling the aspirations of today’s workers and of the next generation.
In India, as per the 2011 census, 26.8 million people live with disabilities, of whom 19% are sight or hearing impaired. Most of them fall in the 10-19 age group and do not have access to expensive educational tools.
There are non-governmental organizations (NGOs) that primarily focus on helping such people by:
Providing them with effective education
Acquainting them with the latest trends and news from around the world
Supporting their pursuit of hobbies
HOPE is an application built with UiPath Studio, Apps, AI Center, and Python scripting that allows people with disabilities who rely primarily on Braille to communicate and:
Have access to news
Learn online content
Access search engines
Convert music to Braille script for accessibility
This screen follows the design guidelines for better accessibility for visually impaired users: large buttons, large fonts, no bright colors, and so on. Now, let’s deep dive into the four scenarios that we built:
We built this bot service using the free APIs provided by Newsapi.org to fetch the latest news from around the world. We used the HTTP Request activity to call the endpoint and parsed the JSON result to extract the actual news. Once the news was fetched, the bot checked a Language input parameter to determine whether translation was needed or English text would do.
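Outside UiPath, the JSON-parsing step can be sketched in Python. This is a minimal illustration, not the project’s actual workflow: the payload below is a hypothetical sample in the shape documented for Newsapi.org’s top-headlines endpoint (a real call needs an API key), and `extract_headlines` is an illustrative helper name.

```python
import json

# Hypothetical sample payload in the shape returned by Newsapi.org's
# /v2/top-headlines endpoint (a real request requires an API key).
response_body = json.dumps({
    "status": "ok",
    "totalResults": 2,
    "articles": [
        {"title": "Example headline one", "description": "..."},
        {"title": "Example headline two", "description": "..."},
    ],
})

def extract_headlines(body: str) -> list:
    """Parse the JSON result and return the article titles."""
    data = json.loads(body)
    if data.get("status") != "ok":
        return []
    return [article["title"] for article in data["articles"]]

print(extract_headlines(response_body))
# ['Example headline one', 'Example headline two']
```

In the actual bot, the same parsing is done with UiPath’s JSON activities after the HTTP Request activity returns.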
When translation was required, we used AI Center’s out-of-the-box machine learning (ML) models to translate the text from one language to another. AI Center offers five out-of-the-box translation ML models, of which we used “English to German” and “English to French”. The output was then converted into Braille using a Python script.
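The final conversion step can be sketched as a Grade 1 (uncontracted) transcription into Unicode Braille patterns. This is an assumption-laden simplification of what the project’s Python script does: real Braille output involves contractions, numbers, and language-specific rules that a dedicated library or embosser driver would handle.

```python
# Sketch only: Grade 1 (uncontracted) Braille via Unicode Braille patterns.
# Real Braille transcription uses contractions and language-specific rules.
BRAILLE = {
    "a": "⠁", "b": "⠃", "c": "⠉", "d": "⠙", "e": "⠑",
    "f": "⠋", "g": "⠛", "h": "⠓", "i": "⠊", "j": "⠚",
    "k": "⠅", "l": "⠇", "m": "⠍", "n": "⠝", "o": "⠕",
    "p": "⠏", "q": "⠟", "r": "⠗", "s": "⠎", "t": "⠞",
    "u": "⠥", "v": "⠧", "w": "⠺", "x": "⠭", "y": "⠽",
    "z": "⠵", " ": " ",
}
CAPITAL_SIGN = "⠠"  # prefix cell marking an uppercase letter

def to_braille(text: str) -> str:
    """Transcribe plain text to uncontracted Unicode Braille cells."""
    cells = []
    for ch in text:
        if ch.isupper():
            cells.append(CAPITAL_SIGN + BRAILLE.get(ch.lower(), "?"))
        else:
            cells.append(BRAILLE.get(ch, "?"))
    return "".join(cells)

print(to_braille("Hope"))  # ⠠⠓⠕⠏⠑
```

The Unicode cells map directly onto the six-dot patterns a refreshable Braille tablet or embosser renders.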
The result was displayed in the HOPE project as a Braille impression on a Braille tablet. Alternatively, one can print the text on a Braille printer from the application itself.
We chose Wikipedia as our primary search engine and processed user input given in Braille. Translation was required here as well: using AI Center’s out-of-the-box ML models, the input was converted into English and submitted to Wikipedia. Once Wikipedia returned results for the keyword, we fetched the primary result as the data output. The bot then checked the Language input parameter to determine whether translation was needed or English text would do; the “English to German” and “English to French” AI Center translation models were used, and the output was converted into Braille using a Python script.
The result was reflected in HOPE as a Braille impression. Alternatively, one could print the text on a Braille printer from the application itself.
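The Wikipedia lookup can be sketched against the public MediaWiki search API. This is an illustrative substitute for the bot’s actual retrieval step: the URL construction follows the documented MediaWiki query parameters, while the response below is a hypothetical sample in that API’s shape, and `first_result_title` is a helper name of our own.

```python
import json
from urllib.parse import urlencode

def build_search_url(keyword: str) -> str:
    """Build a MediaWiki search API URL for the given keyword."""
    params = {
        "action": "query",
        "list": "search",
        "srsearch": keyword,
        "format": "json",
    }
    return "https://en.wikipedia.org/w/api.php?" + urlencode(params)

# Hypothetical sample response in the MediaWiki search API's shape.
sample_response = json.dumps({
    "query": {"search": [{"title": "Braille", "snippet": "..."}]}
})

def first_result_title(body: str) -> str:
    """Return the title of the primary search result, or '' if none."""
    results = json.loads(body)["query"]["search"]
    return results[0]["title"] if results else ""

print(build_search_url("Braille"))
print(first_result_title(sample_response))  # Braille
```

Taking only the first result mirrors the bot’s behavior of returning the primary match for the user’s keyword.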
We selected a prominent learning provider and fetched assignments based on user input. This was achieved by creating a dynamic page in UiPath Apps that publishes all new assignments for a given session. Once the user selects a session and an assignment, the bot loads the learning management system, signs in (including one-time password validation), and fetches the assignment in PDF format. The PDF was processed using the UiPath PDF activities package, and its text was extracted with the Read PDF activity.
The bot then checked the Language input parameter to determine whether translation was needed or English text would do. Translation was handled by AI Center’s out-of-the-box ML models, and the output was converted into Braille script using a Python script.
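The language check that recurs in each scenario can be sketched as a small routing function. The function and dictionary names here are illustrative assumptions; only the two model names come from the project itself.

```python
from typing import Optional

# Illustrative routing table mirroring the two AI Center translation
# models the project used (the function name is our own).
TRANSLATION_MODELS = {
    "German": "English to German",
    "French": "English to French",
}

def pick_model(language: str) -> Optional[str]:
    """Return the translation model to invoke, or None if English will do."""
    if language == "English":
        return None  # no translation needed
    return TRANSLATION_MODELS.get(language)

print(pick_model("German"))   # English to German
print(pick_model("English"))  # None
```

Returning `None` for English lets the workflow skip the ML call entirely and go straight to the Braille conversion step.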
The result was reflected in the HOPE application as a Braille impression on a Braille tablet. Alternatively, one could print the text on a Braille printer from the application itself.
Music pleases the ear and the mind, evokes an emotional response, and nurtures the soul. It is often a lifeline for people with disabilities. Many sight-impaired individuals pursue careers in the music industry, but because music-processing tools are expensive, it is difficult for most of them to excel at their passion. We created a solution that can be widely used by these individuals.
MusicXML is a common format for storing music sheets. We used a Python script to parse it, converted the output into Braille script, and generated a properly formatted Braille music sheet for use.
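The parsing step can be sketched with the standard library alone. The inline fragment below is a heavily simplified MusicXML example, and `note_names` is an illustrative helper; the project’s actual script would feed the extracted notes into the Braille-music transcription step.

```python
import xml.etree.ElementTree as ET

# Simplified inline MusicXML fragment (real files carry parts, durations,
# clefs, and much more metadata).
MUSICXML = """<?xml version="1.0"?>
<score-partwise>
  <part id="P1">
    <measure number="1">
      <note><pitch><step>C</step><octave>4</octave></pitch></note>
      <note><pitch><step>E</step><octave>4</octave></pitch></note>
      <note><pitch><step>G</step><octave>4</octave></pitch></note>
    </measure>
  </part>
</score-partwise>"""

def note_names(xml_text: str) -> list:
    """Extract pitch names (step + octave) from a MusicXML document."""
    root = ET.fromstring(xml_text)
    names = []
    for note in root.iter("note"):
        pitch = note.find("pitch")
        if pitch is not None:
            names.append(pitch.findtext("step") + pitch.findtext("octave"))
    return names

print(note_names(MUSICXML))  # ['C4', 'E4', 'G4']
```

Each extracted pitch then maps to a Braille music cell, which is how the formatted Braille music sheet is assembled.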
Like all the other outputs, the generated Braille scripts are visible in our HOPE application and can be used for further analysis or modification.