
Our journey so far

Problem we are solving

Imran, one of our young coders, who has a hearing impairment, told us that he had really wanted an app he could use to sign a word into his mobile phone and get a written English translation, to grow his vocabulary and writing skills. There wasn't anything available.

 

There are 11 million Deaf/hearing impaired people in the UK, and 70,000 use British Sign Language (BSL) as their main or only language. A lack of literacy in written English and reading can seriously impact a Deaf learner's educational and career prospects. BSL is a visual language with no phonetic link to written English words, so acquiring new written vocabulary is very hard.


With greater literacy, young Deaf people can overcome barriers in learning, communicate more easily with the hearing world, and lead fuller lives.


Our Solution

Our Sign2Word app will enable Deaf learners to translate a sign into written English, build up their vocabulary, add words to a personal dictionary and receive BSL keyword lists. This will aid their learning, improve their literacy and widen their career opportunities.

 

Our app is different from most Deaf signing apps: most translate written words into sign, and most use the less complex American Sign Language, whereas British Sign Language also uses the body and face. Early sign-to-word apps no longer exist or involve expensive or impractical equipment (e.g. Leap Motion, gloves, AR), and some remain at the proof-of-concept stage with limited scope.

 

Our app is unique because it will transform how Deaf people learn in educational settings and will help Communication Support Workers and class tutors make lessons more accessible by using the "share words" feature to push keyword lists to the Deaf learner. Learning vocabulary in this way helps learners commit words to long-term memory. It also promotes independent learning, as words can be accessed at home on the phone. This has the added value of promoting better communication with hearing parents and siblings: the user can sign a word and then show the other person the written word to read.


How we built it

We have created a proof of concept of our app using a k-NN classifier that can identify 15 signed words and convert them into written English for Deaf learners to use. We explored different options for coding languages and machine learning models, and decided on React Native, which works across all types of phones, ensuring everyone can download the app.
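To illustrate the k-NN idea behind our proof of concept: a classifier like this compares a new sign's pose features against labelled training examples and takes a majority vote among the k nearest ones. This is a minimal sketch in TypeScript; the names (`Example`, `knnClassify`) and the tiny 2-D feature vectors are invented for illustration, not our actual app code, which works on much larger keypoint vectors.

```typescript
// One labelled training example: a flattened feature vector
// (e.g. hand/body keypoint coordinates) plus the sign word it represents.
interface Example {
  features: number[];
  label: string;
}

// Euclidean distance between two equal-length feature vectors.
function euclidean(a: number[], b: number[]): number {
  return Math.sqrt(a.reduce((sum, ai, i) => sum + (ai - b[i]) ** 2, 0));
}

// Classify a query vector by majority vote among its k nearest neighbours.
function knnClassify(train: Example[], query: number[], k = 3): string {
  const nearest = [...train]
    .sort((x, y) => euclidean(x.features, query) - euclidean(y.features, query))
    .slice(0, k);

  // Count votes per label among the k nearest examples.
  const votes = new Map<string, number>();
  for (const ex of nearest) {
    votes.set(ex.label, (votes.get(ex.label) ?? 0) + 1);
  }

  // Return the label with the most votes.
  return [...votes.entries()].sort((a, b) => b[1] - a[1])[0][0];
}
```

In the real app the feature vectors would come from a pose-estimation step rather than being hand-written, but the nearest-neighbour vote works the same way.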

 

We used Teachable Machine as it makes it easy to train models and save data. However, we found that it is not fully compatible with React Native, so we are now looking into alternatives such as TensorFlow Lite; we still have a proof of concept showing that the translation can be done using AI. We have also created wireframes for the app, designed in Sketch and then converted into React Native code using Supernova Studio. The code was uploaded to GitHub for us to share and tidy up.


Next steps

We want to combine three computer vision machine learning models, each tracking a different part of the body (head, hands and body), to improve accuracy. We would also like to test our Minimum Viable Product by partnering with schools and colleges.
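One simple way to combine three such models is late fusion: each model outputs a score per candidate word for its own body part, the scores are averaged, and the highest-scoring word wins. A hedged sketch of that idea, with the function name and the score values invented for illustration:

```typescript
// Late fusion: average per-word scores from several models
// (e.g. head, hands, body trackers) and return the best word.
// `scores` is one array of per-word scores per model, aligned with `labels`.
function fusePredictions(scores: number[][], labels: string[]): string {
  const avg = new Array<number>(labels.length).fill(0);
  for (const modelScores of scores) {
    for (let i = 0; i < labels.length; i++) {
      avg[i] += modelScores[i] / scores.length;
    }
  }

  // Pick the word with the highest averaged score.
  let best = 0;
  for (let i = 1; i < avg.length; i++) {
    if (avg[i] > avg[best]) best = i;
  }
  return labels[best];
}
```

Averaging lets a confident hands model outvote an uncertain head model, which is one reason multi-model setups can beat a single classifier; weighted averages are a common refinement.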

