Image dehazing using Tensorflow #65

Open: wants to merge 3 commits into base: master
150 changes: 75 additions & 75 deletions 9_Deep_Learning/Convolutional_Neural_Networks/cnn.py
@@ -1,75 +1,75 @@
""" Convolutional Neural Network
"""

# Installing Theano
# pip install --upgrade --no-deps git+git://github.com/Theano/Theano.git

# Installing Tensorflow
# Install Tensorflow from the website: https://www.tensorflow.org/versions/r0.12/get_started/os_setup.html

# Installing Keras
# pip install --upgrade keras

# Part 1 - Building the CNN

# Importing the Keras libraries and packages
from keras.models import Sequential
from keras.layers import Conv2D
from keras.layers import MaxPooling2D
from keras.layers import Flatten
from keras.layers import Dense
from keras.preprocessing.image import ImageDataGenerator


def main():
    # Initialising the CNN
    classifier = Sequential()

    # Step 1 - Convolution
    # Conv2D replaces the legacy Convolution2D; the kernel size must be a
    # tuple, otherwise the third positional argument is taken as strides.
    classifier.add(Conv2D(32, (3, 3), input_shape=(64, 64, 3), activation='relu'))

    # Step 2 - Pooling
    classifier.add(MaxPooling2D(pool_size=(2, 2)))

    # Adding a second convolutional layer
    classifier.add(Conv2D(32, (3, 3), activation='relu'))
    classifier.add(MaxPooling2D(pool_size=(2, 2)))

    # Step 3 - Flattening
    classifier.add(Flatten())

    # Step 4 - Full connection ('units' replaces the removed 'output_dim')
    classifier.add(Dense(units=128, activation='relu'))
    classifier.add(Dense(units=1, activation='sigmoid'))

    # Compiling the CNN
    classifier.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

    # Part 2 - Fitting the CNN to the images

    train_datagen = ImageDataGenerator(rescale=1. / 255,
                                       shear_range=0.2,
                                       zoom_range=0.2,
                                       horizontal_flip=True)

    test_datagen = ImageDataGenerator(rescale=1. / 255)

    training_set = train_datagen.flow_from_directory('dataset/training_set',
                                                     target_size=(64, 64),
                                                     batch_size=32,
                                                     class_mode='binary')

    test_set = test_datagen.flow_from_directory('dataset/test_set',
                                                target_size=(64, 64),
                                                batch_size=32,
                                                class_mode='binary')

    # fit() replaces the removed fit_generator(); steps count batches per
    # epoch (8000 training and 2000 validation samples at batch size 32).
    classifier.fit(training_set,
                   steps_per_epoch=250,
                   epochs=25,
                   validation_data=test_set,
                   validation_steps=63)


if __name__ == '__main__':
    main()
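The legacy `fit_generator` arguments count individual samples, while the modern Keras `fit` API counts batches per epoch. A minimal sketch of the conversion, assuming the 8000 training samples, 2000 validation samples, and batch size 32 used in the script above:

```python
import math


def samples_to_steps(num_samples: int, batch_size: int) -> int:
    """Convert a legacy samples-per-epoch count to the steps-per-epoch
    (batches-per-epoch) value expected by the modern Keras fit() API.
    Rounds up so the final partial batch is still drawn."""
    return math.ceil(num_samples / batch_size)


steps_per_epoch = samples_to_steps(8000, 32)    # 250 full batches per epoch
validation_steps = samples_to_steps(2000, 32)   # 63 batches (last one partial)
print(steps_per_epoch, validation_steps)
```

Rounding up matters only when the sample count is not a multiple of the batch size, as with the 2000 validation samples here.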
12 changes: 6 additions & 6 deletions 9_Deep_Learning/README.md
@@ -1,6 +1,6 @@
## Deep Learning

[1. Artificial Neural Networks (ANN)](Artificial_Neural_Networks)

[2. Convolutional Neural Networks (CNN)](Convolutional_Neural_Networks)

10 changes: 5 additions & 5 deletions 9_Deep_Learning/requirements.txt
@@ -1,5 +1,5 @@
matplotlib==3.2.1
pandas==1.0.3
numpy==1.18.4
keras==2.4.3
scikit_learn==0.23.2
42 changes: 21 additions & 21 deletions LICENSE
@@ -1,21 +1,21 @@
MIT License

Copyright (c) 2020 Nishkarsh Raj

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
166 changes: 83 additions & 83 deletions README.md
@@ -1,83 +1,83 @@
![Cover Image](docs/cover.png)

<h1 align="center">#100DaysofMLCode</h1>

## Table of Contents

[**1. Data Pre-processing**](2_Data_Preprocessing/README.md)
* [Importing Libraries](2_Data_Preprocessing/README.md#importing_libraries)
* [Importing Data sets](2_Data_Preprocessing/README.md#importing_datasets)
* [Handling the missing data values](2_Data_Preprocessing/README.md#handling_veracity)
* [Encoding categorical data](2_Data_Preprocessing/README.md#encoding_cat_data)
* [Split Data into Train data and Test data](2_Data_Preprocessing/README.md#split_data)
* [Feature Scaling](2_Data_Preprocessing/README.md#feature_scaling)

[**2. Regression**](3_Regression/README.md)
* [Simple Linear Regression](3_Regression/Simple_Linear_Regression)
* [Multi Linear Regression](3_Regression/Multi_Linear_Regression)
* [Polynomial Regression](3_Regression/Polynomial_Regression)
* [Support Vector Regression](3_Regression/Support_Vector_Regression)
* [Decision Tree Regression](3_Regression/Decision_Tree_Regression)
* [Random Forest Regression](3_Regression/Random_Forest_Regression)

[**3. Classification**](4_Classification/README.md)
* [Logistic Regression](4_Classification/Logistic_Regression)
* [K Nearest Neighbors Classification](4_Classification/K_Nearest_Neighbors)
* [Support Vector Machine](4_Classification/Support_Vector_Machine)
* [Kernel SVM](4_Classification/Kernel-SVM)
* [Naive Bayes](4_Classification/Naive_Bayes)
* [Decision Tree Classification](4_Classification/Decision_Tree_Classification)
* [Random Forest Classification](4_Classification/Random_Forest_Classification)

[**4. Clustering**](5_Clustering/README.md)
* [K-Means Clustering](5_Clustering/K_Means)
* [Hierarchical Clustering](5_Clustering/Hierarchical_Clustering)

[**5. Association Rule**](6_Association_Rule/README.md)
* [Apriori](6_Association_Rule/Apriori)
* [Eclat](6_Association_Rule/Eclat)

[**6. Reinforcement Learning**](7_Reinforcement_Learning/README.md)
* [Upper Confidence Bounds](7_Reinforcement_Learning/Upper_confidence_Bound)
* [Thompson Sampling](7_Reinforcement_Learning/Thompson_Sampling)

[**7. Natural Language Processing** ](8_Natural_Language_Processing)
* [AWS Comprehend](8_Natural_Language_Processing)

[**8. Deep Learning**](9_Deep_Learning/README.md)
* [Artificial Neural Networks (ANN)](9_Deep_Learning/Artificial_Neural_Networks)
* [Convolutional Neural Networks (CNN)](9_Deep_Learning/Convolutional_Neural_Networks)


[**9. Dimensionality Reduction**](10_Dimensionality_Reduction/README.md)
* [Principal Component Analysis](10_Dimensionality_Reduction/Principal_Component_Analysis)
* [Linear Discriminant Analysis](10_Dimensionality_Reduction/Linear_Discriminant_Analysis)
* [Kernel PCA](10_Dimensionality_Reduction/Kernel_PCA)

[**10. Model Selection**](11_Model_Selection/README.md)
* [Grid Search](11_Model_Selection/Model_Selection)
* [K-fold Cross Validation](11_Model_Selection/Model_Selection)
* [XGBoost](11_Model_Selection/XGBoost)

**11. Data Visualization**
* Matplotlib library in Python
* Tableau
* Power BI
* Grafana

## Log of my Day-to-Day Activities

Track my daily activities [here](docs/100Days_Log.md)

## How to Contribute

This is an open project and contributions in all forms are welcome.
Please follow these [Contribution Guidelines](docs/CONTRIBUTING.md).

## Code of Conduct

Adhere to the GitHub specified community [code](docs/CODE_OF_CONDUCT.md).

## License

Check the official MIT License [here](LICENSE).