Complete parts list: excel file
- RC Car body, $74
- Raspberry Pi 3, $35
- Adafruit DC & Stepper Motor HAT for Raspberry Pi, $22
- HobbyWing QUICRUN 1060 Brushed ESC, $20
- Battery pack (for Pi3), $6
- Playstation 3 Eye (webcam), $5
- MPU-6050 3-axis gyro + accelerometer, $5
We assume ROS Kinetic is installed on the system. In addition, the following packages must be installed for everything to function properly.
sudo apt-get install ros-kinetic-rosapi
sudo apt-get install ros-kinetic-rosbridge-server
Wiring: http://blog.bitify.co.uk/2013/11/interfacing-raspberry-pi-and-mpu-6050.html
RTIMULib + rtimulib_ros: http://jetsonhacks.com/2015/07/01/bosch-imu-under-ros-on-nvidia-jetson-tk1/
After installing the ROS node, make sure the I2C kernel driver is loaded and the /dev/i2c-* files are accessible. Lastly, edit catkin_ws/src/rtimulib_ros/config/RTIMULib.ini and set IMUType=0, which enables auto-detection of the proper IMU type.
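Once the IMU is readable, its raw 16-bit registers still need to be converted to physical units. The sketch below is a hypothetical helper (not part of rtimulib_ros) that assumes the MPU-6050's default full-scale ranges from the datasheet: ±2 g for the accelerometer (16384 LSB/g) and ±250 °/s for the gyro (131 LSB per °/s).

```python
ACCEL_LSB_PER_G = 16384.0   # +/-2 g full-scale range (datasheet default)
GYRO_LSB_PER_DPS = 131.0    # +/-250 deg/s full-scale range (datasheet default)

def raw_to_signed(raw):
    """Interpret a raw 16-bit register pair as a signed integer."""
    return raw - 65536 if raw >= 32768 else raw

def accel_g(raw):
    """Raw accelerometer reading -> acceleration in g."""
    return raw_to_signed(raw) / ACCEL_LSB_PER_G

def gyro_dps(raw):
    """Raw gyro reading -> angular rate in degrees per second."""
    return raw_to_signed(raw) / GYRO_LSB_PER_DPS
```

If you change the full-scale range in the IMU config, the two scale constants must change with it.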
You can run the following simple test programs to check whether both the steering (left/right) and throttle motors work.
$ sudo ./servod
--> launches the Pi 3's PWM driver, which controls the throttle motor (run once).
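The PWM driver ultimately turns steering commands into servo-style pulse widths. As an illustration only (the constants below are assumptions, not values taken from servod), a steering angle in degrees can be mapped to a pulse width in microseconds like this, assuming the usual 1000-2000 µs range with 1500 µs as center:

```python
CENTER_US = 1500   # assumed neutral pulse width
US_PER_DEG = 10    # assumed gain: 10 us per degree of steering
MAX_ANGLE = 30     # assumed mechanical steering limit, degrees

def angle_to_pulse_us(angle_deg):
    """Clamp the angle to the steering limits and convert to a pulse width."""
    angle = max(-MAX_ANGLE, min(MAX_ANGLE, angle_deg))
    return CENTER_US + int(angle * US_PER_DEG)
```

Commands beyond the mechanical limit are clamped rather than passed through, so the servo never gets driven past its end stops.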
$ sudo ./picar-kbd.py
--> manual control program.
The basic usage of the 'picar-kbd.py' is as follows:
- 'j' : left by one step
- 'k' : right by one step
- 'a' : increase the throttle.
- 'z' : decrease the throttle (negative: backward)
- 's' : stop
- 'q' : quit
- 'r' : start/stop recording a video
- 't' : reset the steering angle to 0. Use this if the wheels aren't centered when 'picar-kbd.py' starts.
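The key bindings above can be sketched as a small dispatch function. This is a hypothetical mirror of the behavior described (the real logic lives in picar-kbd.py); steering and throttle are tracked as integer steps, 's' stops, and 't' recenters the steering:

```python
STEER_STEP = 1
THROTTLE_STEP = 1

def handle_key(key, state):
    """Update {'steer': int, 'throttle': int} in place for one keypress."""
    if key == 'j':
        state['steer'] -= STEER_STEP        # one step left
    elif key == 'k':
        state['steer'] += STEER_STEP        # one step right
    elif key == 'a':
        state['throttle'] += THROTTLE_STEP  # speed up
    elif key == 'z':
        state['throttle'] -= THROTTLE_STEP  # slow down (negative: backward)
    elif key == 's':
        state['throttle'] = 0               # stop
    elif key == 't':
        state['steer'] = 0                  # recenter steering
    return state
```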
Data is gathered whenever a video is recorded: while recording, the estimated steering angle is collected continuously until the recording finishes. The dataset is saved in a new directory /datasets/dataset#, where # is an integer identifying that dataset. Each dataset directory has three main components:
- A png directory. This contains the png images of each frame from the recorded video. These are used by the Keras model for training.
- A data.csv file. This contains the local paths to each image from the png directory and the estimated steering angle at that frame. This is used by both deeptesla and the Keras model for training.
- An out-mencoder.avi video file. This is the recorded video from which the frames in the png directory were extracted. This is used by deeptesla for training.
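A training script consumes the data.csv file described above. Here is a hedged sketch of loading it, assuming each row holds an image path followed by the steering angle at that frame (the exact column layout or any header line in the real files may differ):

```python
import csv
import io

def load_samples(csv_file):
    """Return a list of (image_path, steering_angle) tuples from a data.csv."""
    samples = []
    for row in csv.reader(csv_file):
        path, angle = row[0], float(row[1])
        samples.append((path, angle))
    return samples

# Example with an in-memory CSV standing in for datasets/dataset1/data.csv
# (file names and angles below are made up for illustration):
demo = io.StringIO("png/frame_0001.png,-2.0\npng/frame_0002.png,0.5\n")
samples = load_samples(demo)
```

In a real training pipeline you would pass `open("datasets/dataset1/data.csv")` instead of the in-memory example.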
You can remotely control the picar and access its real-time camera video stream. First, launch the ROS nodes as follows.
$ roslaunch picar_base picar_base.launch
If everything worked, you should see the following topics:
$ rostopic list
/cmd_vel
/imu
/rosout
/rosout_agg
The '/cmd_vel' topic is subscribed to by the controller (src/motor_driver.py), which generates motor control commands based on the received messages (type: geometry_msgs/Twist). At this point, any controller that publishes Twist messages can be used to control the robot.
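The mapping from a Twist message to motor commands can be sketched as below. The scaling constants are assumptions for illustration, not the actual values used in src/motor_driver.py, and a plain class stands in for geometry_msgs/Twist so the sketch runs without ROS:

```python
class FakeTwist:
    """Stand-in for geometry_msgs/Twist with just the fields used here."""
    def __init__(self, linear_x=0.0, angular_z=0.0):
        self.linear_x = linear_x      # forward speed
        self.angular_z = angular_z    # turn rate

def twist_to_motor_cmd(msg, max_speed=1.0, max_turn=1.0):
    """Convert a Twist into normalized (throttle, steering) in [-1, 1]."""
    throttle = max(-1.0, min(1.0, msg.linear_x / max_speed))
    steering = max(-1.0, min(1.0, msg.angular_z / max_turn))
    return throttle, steering
```

The clamp keeps out-of-range commands from any publisher (keyboard, browser, autonomy code) from saturating the hardware.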
For example, you can use the standard keyboard based controller by doing the following.
$ rosrun teleop_twist_keyboard teleop_twist_keyboard.py
Alternatively, you can control the robot from a browser by connecting to http://devboard-picar-wifi.ittc.ku.edu:8080.
Lastly, you can open the live video stream at http://devboard-picar-wifi:8080/?action=stream
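Browser control works because rosbridge (installed above as ros-kinetic-rosbridge-server) speaks a JSON protocol over WebSockets. A publish request for a Twist on /cmd_vel looks roughly like the sketch below; the rosbridge port (it defaults to 9090) and how the web page wires this up are outside this sketch:

```python
import json

def cmd_vel_publish_msg(linear_x, angular_z):
    """Build a rosbridge 'publish' operation for a Twist on /cmd_vel."""
    return json.dumps({
        "op": "publish",
        "topic": "/cmd_vel",
        "msg": {
            "linear": {"x": linear_x, "y": 0.0, "z": 0.0},
            "angular": {"x": 0.0, "y": 0.0, "z": angular_z},
        },
    })
```

A browser client would send this string over a WebSocket connection to the rosbridge server; motor_driver.py then receives it as an ordinary Twist message.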
- DeepTesla python source code: https://github.com/lexfridman/deeptesla
- Use 3D printed platform to mount components.
- Use Intel Up board (x86), instead of PI3.
- Use Traxxas Rustler 1/10. (This could eliminate the need for the HobbyWing ESC.)
- Use VESC (open-source ESC): http://vedder.se/2015/01/vesc-open-source-esc/