Codebase Development
- Pick-n-Place
- Look
- Grasp
- Lift
- Flip
- Release
- Navigation (Base/Arm)
- Object Models
- Kinect Postural Detection
- Joystick Control (in progress) by Dirk Ruiken & Hee-Tae Jung
- Driving by Dirk Ruiken
- Pushup by Scott Kuindersma
- Endpoint Controller by Hee-Tae Jung
- How to run (each in a separate terminal, or backgrounded with &):
  - roslaunch ubot5 joystick.launch
  - roslaunch ubot5 endpointcontrollers.launch
  - roslaunch ubot5 rviz.launch
  - roslaunch ubot5 server_loopback.launch
- Model-referenced Perception
- Q-Learning (SMACH)
- Model Based Planning
- RFID
- People Tracker
- Dexterous Mobility
- Knuckle Walk
- Prone Mobility
- Transitions - Push up
- Hardware
- C# Search Code
- Gesture Interface
- ARTag Detection by Shiraj Sen
- How to run (each in a separate terminal, or backgrounded with &):
  - roslaunch ar_pose ar_pose_multi_kinect.launch
  - roslaunch ubot5 kinect.launch
- Teleoperation I (in progress) by Hee-Tae Jung & Takeshi Takahashi
- How to run (each in a separate terminal, or backgrounded with &):
  - roslaunch ubot5 server_loopback.launch
  - roslaunch ubot5 rviz.launch
  - roslaunch ubot5 teleoperation.launch
  - roslaunch ubot5 teleoperation_joint_publisher.launch
  - roslaunch ubot5 gui.launch (optional)
- Caution: teleoperation.launch SHOULD be launched BEFORE teleop_command_convert_joint
- Suggested Setting: Main computer runs server.launch, rviz.launch, and teleop_command_convert_joint; laptop runs teleoperation.launch
- Misc.: Run Skype with LD_PRELOAD=/usr/lib32/libv4l/v4l1compat.so skype
- Data Logging (in progress) by Hee-Tae Jung & Takeshi Takahashi
- Logging trajectories in multiple trackable frames
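Logging a trajectory in multiple trackable frames amounts to re-expressing each recorded point under a different homogeneous transform. A minimal numpy sketch of that idea (the transform values and frame names here are hypothetical illustrations, not the uBot-5 logging code, which would obtain transforms from the robot's frame tree):

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical transform from the robot's base frame to a world frame:
# rotate 90 degrees about z, then translate by (1, 2, 0).
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])
T_world_base = make_transform(R, np.array([1.0, 2.0, 0.0]))

# A short end-effector trajectory logged in the base frame (N x 3 points).
traj_base = np.array([[0.3, 0.0, 0.5],
                      [0.3, 0.1, 0.5],
                      [0.3, 0.2, 0.5]])

# Append a homogeneous coordinate and map every point into the world frame.
homog = np.hstack([traj_base, np.ones((len(traj_base), 1))])
traj_world = (T_world_base @ homog.T).T[:, :3]
```

Logging the same trajectory in another frame is then just applying that frame's transform to the same recorded points, so one log can be replayed in any trackable frame.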
- Speech Recognition (in progress) by Hee-Tae Jung
- Speech Synthesis (in progress) by Takeshi Takahashi & Hee-Tae Jung
- Multi-objective Control (in progress) by Hee-Tae Jung
- How to run (each in a separate terminal, or backgrounded with &):
  - roslaunch ubot5 server_loopback.launch
  - roslaunch ubot5 rviz.launch
  - roslaunch ubot5 multipointcontrollers.launch
  - rosrun uBotCartesianPositionController uBotMultiPositionController_Client_left -0.3 0.4 0.3 0.5 0.1 0.3
- Caution: As of 6/25, the code works correctly, but it still needs extensive testing on the uBot-5.