Building a Smart Assistant for the Deaf and Dumb Using Deep Learning

Local Chapter: Jordan Chapter

Coordinated by: Jordan

Status: Completed

Project Duration: 20 Apr 2023 - 09 Jun 2023

Open-source resources are available from this project.

Project background.

Deaf and mute individuals have used sign language as a means of communication for centuries. Sign language is a visual language that combines hand gestures, facial expressions, and body language to convey meaning. It is a complex and nuanced language with its own grammar and syntax, and it is used by millions of people around the world.

There are many different sign languages in use throughout the world, each with its own characteristics and regional variations. Among the most widely used are American Sign Language (ASL), British Sign Language (BSL), and Australian Sign Language (Auslan). Each has its own set of signs and gestures for conveying meaning, and many sign languages are not mutually intelligible.

Sign language has played an important role in the deaf community, providing an accessible means of communication for deaf and mute individuals. It has also gained legal recognition in many countries: New Zealand, for example, has made New Zealand Sign Language an official language, and sign languages hold legal recognition in many jurisdictions of the United States and Canada.

Project plan.

  • Week 1: Research

  • Week 2: Data Collection and Exploratory Data Analysis

  • Week 3: Feature Extraction

  • Week 4: Model Development and Training

  • Week 5: Model Testing

  • Week 6: App Development
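The project description does not specify how the feature-extraction step (Week 3) was implemented, but a common approach for sign-language recognition is to detect hand landmarks (e.g. the 21 keypoints produced by MediaPipe-style hand trackers) and normalize them before feeding a classifier. As a minimal sketch, assuming 21 (x, y) landmarks per hand with the wrist at index 0, the normalization could look like this:

```python
import numpy as np

def normalize_landmarks(landmarks):
    """Turn 21 (x, y) hand landmarks into a translation- and
    scale-invariant feature vector of shape (42,).

    landmarks: array-like of shape (21, 2); the wrist is assumed
    to be at index 0 (MediaPipe-style ordering -- an assumption,
    not something stated in the project description).
    """
    pts = np.asarray(landmarks, dtype=float)
    # Translate so the wrist landmark sits at the origin; this makes
    # the features independent of where the hand appears in the frame.
    pts = pts - pts[0]
    # Scale by the largest absolute coordinate so all values fall in
    # [-1, 1], making the features independent of hand size/distance.
    scale = np.abs(pts).max()
    if scale > 0:
        pts = pts / scale
    # Flatten to a fixed-length vector suitable for a classifier.
    return pts.flatten()
```

Feature vectors like this could then serve as the input to the Week 4 model (e.g. a small feed-forward network), with one vector per video frame or image.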
