AI Technology Innovation: a new-age pivot for the Deaf community, sign and non-sign language users alike
We enable real-time social interactions for the hearing/speech impaired globally.




A deep-tech AI app helping Deaf and aphonic people, built for the future.
Our Solution
Who ?
An innovative communication app developed by AI and data-science engineers from the UK. It helps and enables sign language users to hold better social conversations, supporting them in work, study, and daily communication with able-hearing users, and vice versa, globally.
Why ?
Lack of real-time communication apps for 70 million+ sign language users globally and 250,000+ in the UK. They face a day-to-day challenge to communicate: for social inclusion, to access healthcare, to be part of mainstream education and work opportunities, to meet basic needs, to access public transport systems, and more.
What ?
We use deep tech such as machine learning (ML) and artificial intelligence (AI) to analyse sign language and convert it into text, which is overlaid for other participants online or offline. In the other direction, we analyse voice input and provide text overlays for Deaf or hard-of-hearing participants.
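The two-way overlay flow described above can be sketched in outline. The recogniser, transcriber, and data shapes below are hypothetical stand-ins for illustration, not our production models:

```python
# Minimal sketch of the two-way text-overlay flow. The recogniser and
# transcriber are hypothetical placeholders, not the real ML models.

def recognise_sign(frames):
    """Hypothetical sign-to-text model: maps video frames to a gloss string."""
    # A real model would run handshape/pose recognition on each frame.
    return " ".join(f["gloss"] for f in frames)

def transcribe_voice(audio_chunks):
    """Hypothetical speech-to-text model: maps audio chunks to words."""
    return " ".join(c["word"] for c in audio_chunks)

def overlay(text, participant):
    """Format recognised text as a caption overlay for the other participants."""
    return f"[{participant}] {text}"

# Sign language user -> text overlay for hearing participants
frames = [{"gloss": "HELLO"}, {"gloss": "HOW"}, {"gloss": "YOU"}]
print(overlay(recognise_sign(frames), "signer"))    # [signer] HELLO HOW YOU

# Hearing user -> text overlay for Deaf / hard-of-hearing participants
audio = [{"word": "good"}, {"word": "morning"}]
print(overlay(transcribe_voice(audio), "speaker"))  # [speaker] good morning
```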

An innovative technology for communicating with hearing/speech-impaired people
Silence Speaks enables and empowers public and private enterprises to collaborate with a key underserved section of society and to communicate with it better.
We harness machine learning and artificial intelligence. We use the latest ML/AI tools to analyse core data for good, serving the needs of hearing/speech-impaired people in council services, transport, education, and workplace communication.
Silence Speaks, helping to communicate with sign-language users globally.
Building Blocks of Sign Language

Handshape
Our robust algorithm analyses the handshape and motion trajectory of each manual sign, as well as its position with respect to the signer, in real time. We allow for self-occlusions and imprecise localisation of the fiducial points.
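As an illustration of the signer-relative normalisation step, here is a minimal sketch. The fiducial-point names and coordinates are made up for the example, and occluded points are simply dropped rather than causing a failure:

```python
# Illustrative sketch only: normalising hand fiducial points relative to the
# signer's wrist, so handshape comparison is position- and scale-invariant.
import math

def normalise_handshape(points, wrist):
    """Translate fiducial points to the wrist origin and scale by hand size.

    `points` maps point names to (x, y) pairs, with None marking
    self-occluded points, which are skipped.
    """
    visible = {k: v for k, v in points.items() if v is not None}
    # Hand size: largest distance of any visible point from the wrist.
    size = max(math.dist(p, wrist) for p in visible.values())
    return {k: ((x - wrist[0]) / size, (y - wrist[1]) / size)
            for k, (x, y) in visible.items()}

hand = {"index_tip": (12.0, 4.0), "thumb_tip": (8.0, 8.0), "pinky_tip": None}
norm = normalise_handshape(hand, wrist=(8.0, 0.0))
print(norm["index_tip"])  # (0.5, 0.5) after scaling by hand size 8.0
```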

Motion
In our approach, the 3D motion and place of articulation of the signs are also identified: we use structure-from-motion algorithms to recover the 3D structure and motion of the hand.
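At the heart of structure-from-motion is triangulation: recovering a 3D point from its 2D projections in two camera views. A minimal linear (DLT) sketch with toy camera matrices, not calibration data from our system:

```python
# Hedged sketch: linear (DLT) triangulation of one 3D point from two views.
# The projection matrices here are toy values for illustration.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its 2D images under projection matrices P1, P2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null vector = homogeneous 3D point
    return X[:3] / X[3]        # de-homogenise to (x, y, z)

# Two toy cameras: identity, and one translated 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 2.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]
print(triangulate(P1, P2, x1, x2))  # recovers approximately (0.5, 0.2, 2.0)
```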

Gesture articulation
With the main linguistic components of manual signs identified, our algorithm can provide robust recognition across a large variety of signers and points of view.
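Once those components are extracted, recognition can be framed as matching a sign's component feature vector against signer-independent templates. A toy nearest-template sketch, with entirely made-up features and signs:

```python
# Toy sketch: recognising a sign from its extracted linguistic components
# (handshape, motion, location) by nearest-template matching. The template
# values and feature encoding are invented for illustration.
import math

TEMPLATES = {
    "HELLO": (0.9, 0.1, 0.8),      # (handshape, motion, location) features
    "THANK-YOU": (0.2, 0.7, 0.4),
    "YES": (0.5, 0.9, 0.1),
}

def recognise(features):
    """Return the sign whose template is closest in component feature space."""
    return min(TEMPLATES, key=lambda sign: math.dist(TEMPLATES[sign], features))

print(recognise((0.85, 0.15, 0.75)))  # HELLO
```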

Our Vision
To enable social inclusion through an app that allows hearing/speech-impaired individuals to communicate better. Our objective is to provide seamless services to 250,000 British Sign Language users by 2025. Beyond that, we aim to cater to the 70 million sign language users across the globe.
Social Inclusion: interpreters, end users, advocacy groups, companies, schools, and colleges. We partner with sign-language interpreters and hearing/speech-impaired advocacy groups, ensuring maximum end value through their qualified insights and input.
A unique app helping and enabling hearing/speech-impaired sign language users to integrate with hearing users for basic services such as transportation, healthcare, work, and study, and vice versa, interfacing with enterprises' in-house platforms.
Memberships

British Deaf Association
The BDA is a Deaf people’s organisation representing a diverse, vibrant and ever-changing community of Deaf people.

National Deaf Children’s Society
We give expert support on childhood deafness, raise awareness and campaign for deaf children’s rights, so they have the same opportunities as everyone else.

Royal National Institute for Deaf People
We are RNID, the charity working to make life fully inclusive for deaf people and those with hearing loss or tinnitus.
Supported By

SETsquared UK
SETsquared is a unique enterprise partnership and a dynamic collaboration between the six leading research-led UK universities of Bath, Bristol, Cardiff, Exeter, Southampton and Surrey. Ranked as the Global No. 1 Business Incubator, we provide a wide range of highly acclaimed support programmes to help turn ideas into thriving businesses.