Yohnjparra edited this page Jul 25, 2019 · 5 revisions

Welcome to the group3 wiki!

Overview of the paper "Using Deep Learning for Image-Based Plant Disease Detection"

The authors analyze 54,306 images of plant leaves spanning 38 class labels. Each class label is a crop-disease pair, and the goal is to predict the crop-disease pair from the image of the plant leaf alone. For our project we made the following choices:

- Architecture: GoogLeNet (22 layers), which gave the best accuracy
- Training mechanism: transfer learning
- Data type: color images

Docker Containers in AWS

Step one: Download the AWS CLI:

```shell
$ curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
$ unzip awscli-bundle.zip
$ sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws
$ /usr/local/bin/aws --version
```
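Before the CLI can talk to AWS it needs credentials. A minimal sketch — the access key values below are placeholders, not real credentials:

```shell
# Configure credentials interactively (prompts for key, secret, region, format)
aws configure

# Or non-interactively; the AKIA... and secret values are placeholders
aws configure set aws_access_key_id AKIAEXAMPLEKEYID
aws configure set aws_secret_access_key exampleSecretAccessKey
aws configure set default.region us-east-1

# Sanity check: print the identity the CLI is acting as
aws sts get-caller-identity
```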

Step two: Launch an instance

In the AWS Console:

1. Select the EC2 service
2. Select an Ubuntu image (AMI)
3. Choose an instance type
4. Create a new key pair
5. Launch the instance
6. Set the permissions on the downloaded key pair file
7. Connect from the terminal with `ssh -i namekeypair ubuntu@…`
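The same launch can also be scripted with the AWS CLI installed in step one. A hedged sketch — the AMI ID and the `namekeypair` key-pair name are placeholders, not values from this project:

```shell
# Launch one t2.micro Ubuntu instance (ami-xxxxxxxx is a placeholder AMI ID)
aws ec2 run-instances \
    --image-id ami-xxxxxxxx \
    --instance-type t2.micro \
    --key-name namekeypair \
    --count 1

# Once the instance is running, look up its public DNS name
aws ec2 describe-instances \
    --query 'Reservations[].Instances[].PublicDnsName'

# Restrict the key file's permissions, then connect
chmod 400 namekeypair.pem
ssh -i namekeypair.pem ubuntu@<public-dns-name>
```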

Deploying Docker

1. Uninstall old versions:

```shell
$ sudo apt-get remove docker docker-engine docker.io containerd runc
```

2. Update the package index:

```shell
$ sudo apt-get update
```

3. Install packages that let apt use a repository over HTTPS:

```shell
$ sudo apt-get install apt-transport-https ca-certificates curl gnupg-agent software-properties-common
```

4. Add Docker's official GPG key and verify the fingerprint:

```shell
$ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
$ sudo apt-key fingerprint 0EBFCD88
pub   rsa4096 2017-02-22 [SCEA]
      9DC8 5822 9FC7 DD38 854A  E2D8 8D81 803C 0EBF CD88
uid           [ unknown] Docker Release (CE deb) <[email protected]>
sub   rsa4096 2017-02-22 [S]
```

5. Set up the stable Docker CE repository (on Ubuntu use apt, not the CentOS yum-config-manager command):

```shell
$ sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
```

6. Install the latest version of Docker Engine:

```shell
$ sudo apt-get update
$ sudo apt-get install docker-ce docker-ce-cli containerd.io
```

7. Allow running Docker without sudo:

```shell
$ sudo groupadd docker
$ sudo usermod -aG docker $USER
```

8. Log out and log back in so the new group membership takes effect.

9. Verify the installation:

```shell
$ docker run hello-world
```
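With Docker installed, a host directory can be bind-mounted into a container as a volume, which is the mechanism we originally hoped to use for our S3 data. A minimal sketch — the paths and file contents are made-up examples:

```shell
# Create a host directory with a sample file (paths are hypothetical)
mkdir -p /tmp/plant-data
echo "leaf-image-placeholder" > /tmp/plant-data/sample.txt

# Bind-mount it into a container at /data and read the file from inside
docker run --rm -v /tmp/plant-data:/data ubuntu cat /data/sample.txt
```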

We tried to mount S3 data storage in AWS as a volume in Docker, but S3 is an object store, not a file system. One workaround is to have S3 send a message to SQS whenever a new object is added to the bucket. The application running in the Docker container can then poll SQS for new messages and use the S3 location in each message to copy the object from S3 to local storage (using the appropriate AWS SDK) for processing.
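The polling loop described above can be sketched with the AWS CLI from inside the container. The queue URL is a placeholder, `jq` is assumed to be installed, and the `jq` filters are one possible way to pull the bucket and key out of an S3 event notification:

```shell
#!/bin/sh
# Poll SQS for S3 event notifications and copy each new object locally.
# QUEUE_URL is a placeholder; replace it with your queue's URL.
QUEUE_URL="https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"

while true; do
    # Long-poll for one message (waits up to 20 seconds)
    MSG=$(aws sqs receive-message --queue-url "$QUEUE_URL" \
          --max-number-of-messages 1 --wait-time-seconds 20)

    BODY=$(echo "$MSG" | jq -r '.Messages[0].Body // empty')
    [ -z "$BODY" ] && continue

    # S3 event notifications carry the bucket and key under Records[0].s3
    BUCKET=$(echo "$BODY" | jq -r '.Records[0].s3.bucket.name')
    KEY=$(echo "$BODY" | jq -r '.Records[0].s3.object.key')

    # Copy the new object to local storage for processing
    aws s3 cp "s3://$BUCKET/$KEY" "/data/$KEY"

    # Delete the message so it is not processed again
    HANDLE=$(echo "$MSG" | jq -r '.Messages[0].ReceiptHandle')
    aws sqs delete-message --queue-url "$QUEUE_URL" --receipt-handle "$HANDLE"
done
```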