
American Sign Language Alphabet Classification

(COMP-6721-SUMMER-PROJECT)

--> GROUP 07 <--

About The Project

ASL is accepted by many high schools, colleges, and universities in fulfillment of modern and “foreign” language academic degree requirements across the United States. This project tackles ASL alphabet classification to facilitate better understanding of, and interaction with, the ASL community. The goal is to develop a reliable and efficient tool for ASL alphabet recognition that can help bridge the communication gap between individuals with hearing impairments and those who do not know sign language. We use PyTorch to build and train a deep learning model that classifies images into these classes.

Link to the Dataset

The dataset used is the ASL dataset from Kaggle. It contains the 26 letters (A–Z) plus two extra classes, Space and Nothing. The link to the dataset is as follows: https://www.kaggle.com/datasets/kapillondhe/american-sign-language

Built With

  • Python
  • PyTorch
  • Colab
  • SciPy
  • NumPy
  • scikit-learn
  • PIL
  • Matplotlib
  • Pickle
  • THOP
  • CuPy

Repository Structure

  • Supervised learning Classification with Decision Trees: Jupyter Notebook files for the supervised Decision Tree experiments.
  • Semi-supervised learning Classification with Decision Trees: Jupyter Notebook files for the semi-supervised Decision Tree experiments.
  • CNN: Jupyter Notebook files for the CNN (ResNet18) experiments.

Getting Started

Prerequisites

Depending on your operating system, install Python 3.9 or above.

Installation

Option 1: Install Anaconda

Installing Anaconda lets you run all Jupyter Notebook files on a local computer. If you haven't installed Anaconda, go here: https://store.continuum.io/cshop/anaconda/ This installs everything you need.

Option 2: Run on Google Colab

Google Colab lets you run the notebooks without any local setup; it only requires a Google account. If you haven't registered for Colab, go here: https://colab.research.google.com/signup.

Decision Tree with Supervised Learning Classification

  1. Download the dataset from here
  2. Depending on whether you run on a local computer or on Google Colab, change the dataset path in this code section inside the Supervised Learning Classification .ipynb files:
alphabet_dir = "ASL_Dataset/Train"
  3. Change the following hyperparameter constants in the code (only if tuning is required; otherwise skip this step):
    param_grid = {
        'max_depth': [12],
        'criterion': ['gini', 'entropy'],
        'min_samples_split': [2, 5, 10]
    }
  4. Run the Jupyter Notebook and see the results.
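The `param_grid` above is the shape scikit-learn's `GridSearchCV` expects, so the tuning step can be sketched as follows (the `tune_decision_tree` helper and the `X`/`y` inputs — flattened image arrays and their labels — are illustrative names, not the notebook's exact code):

```python
# Sketch: grid search over the decision-tree hyperparameters listed above.
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

param_grid = {
    "max_depth": [12],
    "criterion": ["gini", "entropy"],
    "min_samples_split": [2, 5, 10],
}

def tune_decision_tree(X, y, cv=3):
    """Fit every grid combination with cross-validation; return the best tree."""
    search = GridSearchCV(DecisionTreeClassifier(random_state=0),
                          param_grid, cv=cv, n_jobs=-1)
    search.fit(X, y)
    return search.best_estimator_, search.best_params_
```

Adding values to any of the three lists widens the search without further code changes.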

Decision Tree with Semi-Supervised Learning Classification

  1. Download the dataset from here
  2. Depending on whether you run on a local computer or on Google Colab, change the dataset path in this code section inside the Semi-Supervised Learning Classification .ipynb files:
alphabet_dir = "ASL_Dataset/Train"
  3. Change the following hyperparameter constants in the code (only if tuning is required; otherwise skip this step):
    param_grid = {
        'max_depth': [12],
        'criterion': ['gini', 'entropy'],
    }
  4. Run the Jupyter Notebook and see the results.
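One common way to run a semi-supervised decision tree is scikit-learn's `SelfTrainingClassifier`, where unlabeled samples are marked with the label `-1`; this is a sketch of that recipe, not necessarily the exact scheme the notebook uses:

```python
# Sketch: self-training a decision tree on partially labeled data.
# Assumption: the notebook hides some labels and lets the tree re-label them.
import numpy as np
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.tree import DecisionTreeClassifier

def fit_self_training(X, y, unlabeled_fraction=0.5, seed=0):
    """Hide a fraction of labels (set to -1), then self-train a decision tree."""
    rng = np.random.default_rng(seed)
    y_partial = y.copy()
    mask = rng.random(len(y)) < unlabeled_fraction  # samples to "unlabel"
    y_partial[mask] = -1
    base = DecisionTreeClassifier(max_depth=12, random_state=seed)
    model = SelfTrainingClassifier(base)
    model.fit(X, y_partial)
    return model
```

The wrapper iteratively predicts labels for the `-1` samples and retrains on the ones it is most confident about.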

ResNet18

  1. Download the dataset from here
  2. Depending on whether you run on a local computer or on Google Colab, change the dataset paths in this code section inside each ResNet .ipynb file:
# path of the input dataset folder
path="/kaggle/input/ASL/Dataset"
# path to save the plot results (e.g. training curves vs. epoch, loss vs. steps)
saveFilePath="/kaggle/input/hyperparameters.pkl"
# path to save the trained model
saveModelPath="/kaggle/input/"
  3. Change the following hyperparameter constants in the code:
# input dimensions of the image fed to the CNN
inputDimension=(256,256)
# batch sizes to try
batch_sizes= [128,64,32]
# learning rates to try
learning_rates= [0.00001,0.0001,0.01]
# number of epochs
epochs=10
# loss function
criterion=nn.CrossEntropyLoss()
  4. Run the Jupyter Notebook and see the results.

