Identification of people in videos based on their walking style using convolutional and recurrent neural networks.
- Add a section about using your own neural network with tundra.
We use several technologies, so there is a bit of setup to do before you can run tundra.
You can follow this short tutorial step by step, or just scroll down to the longer code snippet that you can copy-paste into your terminal.
- Ruby and Ruby on Rails (Frontend Server)
  - You will have to run `bundle install` in the `front-server` directory
- Node.js (Backend Prediction Server)
  - You will have to run `npm install` in the `prediction-server` directory
- Torch7 (Neural Networks)
- ffmpeg (Data Preprocessing; see the sketch after the setup steps)
```sh
git clone https://github.com/apaszke/tundra
cd tundra/front-server
bundle install
cd ../prediction-server
npm install
./run-frontend-server.sh
./run-prediction-server.sh
```
Now you can open your browser at `localhost:1337` and test the neural network.
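
As noted in the dependency list, ffmpeg is used for data preprocessing: turning a video into a sequence of frames that the network can consume. Below is a minimal, illustrative sketch of that step in Torch, calling ffmpeg through `os.execute` and loading the frames with the `image` package. The temporary paths, frame rate, frame count, and 64x64 resolution are assumptions made for the example, not the actual tundra pipeline.

```lua
-- Illustrative preprocessing sketch (not the actual tundra pipeline):
-- dump frames with ffmpeg, then load them into a seqLen x 3 x 64 x 64 tensor.
require 'torch'
require 'image'

local function videoToTensor(videoPath, numFrames)
  -- dump evenly spaced frames as PNGs (ffmpeg must be on the PATH)
  os.execute(string.format(
    'ffmpeg -loglevel quiet -i %s -vf fps=10 -frames:v %d /tmp/frame_%%03d.png',
    videoPath, numFrames))

  local frames = torch.FloatTensor(numFrames, 3, 64, 64)
  for i = 1, numFrames do
    local img = image.load(string.format('/tmp/frame_%03d.png', i), 3, 'float')
    frames[i] = image.scale(img, 64, 64)  -- rescale every frame to 64 x 64
  end
  return frames
end

local video = videoToTensor('walk.mp4', 20)
print(video:size())  -- 20 x 3 x 64 x 64
```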
We built a prototype of tundra during AGHacks 2015.
People can easily recognize the walking style of their friends. Let's see if we can teach a computer to identify people in videos by the way they walk.
It identifies people in videos based on their walking style.
We built a neural network in Torch and trained it on Amazon Web Services because we needed a lot of computing power. Additionally, we made a website client in Ruby on Rails and a back-end server in Node.js which runs the trained neural network. When this neural network receives a video, it returns, for every person it knows, the probability that this person is the one walking in the uploaded video.
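
To make the architecture a bit more concrete, here is a minimal, hypothetical sketch of such a convolutional + recurrent model in Torch7, using the `nn` and Element-Research `rnn` packages: a small CNN extracts features from every frame, an LSTM aggregates them over time, and a final linear layer with a softmax outputs one probability per known person. The layer sizes and the 20-frame, 64x64 input are illustrative assumptions, not the exact network we trained.

```lua
-- Minimal, illustrative CNN + RNN sketch (not the exact tundra network).
-- Assumes the 'nn' and 'rnn' (Element-Research) packages are installed.
require 'nn'
require 'rnn'

local numPeople = 10     -- number of known people (assumption)
local featSize  = 128    -- per-frame feature size (assumption)

-- CNN applied independently to every 3 x 64 x 64 frame
local cnn = nn.Sequential()
cnn:add(nn.SpatialConvolution(3, 16, 5, 5))   -- -> 16 x 60 x 60
cnn:add(nn.ReLU())
cnn:add(nn.SpatialMaxPooling(2, 2))           -- -> 16 x 30 x 30
cnn:add(nn.View(16 * 30 * 30))
cnn:add(nn.Linear(16 * 30 * 30, featSize))

-- Full model: per-frame CNN, LSTM over time, softmax over known people
local model = nn.Sequential()
model:add(nn.SplitTable(1))                   -- seqLen x 3 x 64 x 64 -> table of frames
model:add(nn.Sequencer(cnn))                  -- CNN features for every frame
model:add(nn.Sequencer(nn.LSTM(featSize, featSize)))
model:add(nn.SelectTable(-1))                 -- keep only the last LSTM output
model:add(nn.Linear(featSize, numPeople))
model:add(nn.SoftMax())                       -- one probability per known person

-- A fake "video": 20 frames of 3 x 64 x 64
local video = torch.rand(20, 3, 64, 64)
print(model:forward(video))                   -- numPeople probabilities
```

The per-frame CNN picks up posture cues in single frames, while the LSTM looks at how those cues change from frame to frame, which is what makes a walking style distinctive.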
We're going to...
Well, the word tundra just sounds awesome, doesn't it?