Forum member Zenta has just posted a teaser video of his improved MorpHex MK-II. The MorpHex is a robot capable of transforming from a hexapod crawler into a mechanically rolling ball. The upper half of the robot has been upgraded from a 1-DOF linkage to a more flexible leg. We’re excited to see more demos of the MorpHex in action!
ROBOTIS has posted a news article asking for applicants to beta-test their new XL-320 servos. These new servos are similar to the AX-12A servos, communicating via the DYNAMIXEL data packet, allowing you to set the position and speed of the servo as well as retrieve positional data, temperature data and more! You can even set the compliance slopes on the servo, just as you could with an AX-12A. The new servos are smaller than AX-12As, making them great for projects like ROBOTIS’s mini-darwin project.
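For a sense of what that data packet looks like, here’s a sketch of a Protocol 1.0 instruction packet as the AX-12A uses it; the Goal Position register address and checksum rule follow the AX-12 documentation, but treat this as an illustration rather than the XL-320’s exact wire format:

```python
def dxl_v1_packet(servo_id, instruction, params):
    """Build a DYNAMIXEL Protocol 1.0 instruction packet.

    Layout: 0xFF 0xFF ID LENGTH INSTRUCTION PARAM... CHECKSUM,
    where LENGTH = number of params + 2 and CHECKSUM is the
    bitwise-inverted low byte of the sum from ID through the params.
    """
    body = [servo_id, len(params) + 2, instruction] + list(params)
    checksum = (~sum(body)) & 0xFF
    return bytes([0xFF, 0xFF] + body + [checksum])

# WRITE (0x03) to the Goal Position register (address 30) of servo 1,
# target 512 (center of travel), sent low byte first.
goal = 512
pkt = dxl_v1_packet(1, 0x03, [30, goal & 0xFF, goal >> 8])
```

Reading temperature or present position works the same way, just with a READ (0x02) instruction and the appropriate register address.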
To apply for the beta program, ROBOTIS has asked customers with DYNAMIXEL experience to share the news post on Facebook and describe what makes them a good candidate for testing the XL-320. 10 teams/individuals will be chosen by ROBOTIS. Beta testers will receive two XL-320 servos, a CM9 control board, two Li-ion battery packs, a charger and several OLLO pieces.
There is no release date for the XL-320s as of yet, but we’ll keep you posted on any news!
Forum member dburongarcia has been doing research and development on using robotics to assist people with cerebral palsy. He’s using an Arduino Uno with a USB Host shield to control a pair of motors in a motorized wheelchair, allowing the user to control the chair with limited head movements. The system can even be controlled via facial recognition running on a PC.
What really caught our eye was the second half of his project – using one of our PhantomX Reactor Robot Arms to help the user feed themselves! The video is in Spanish, but the project speaks for itself.
Using 4 pushbuttons, the user can pick food from 1 of 3 bowls, put the food back in the bowl, or get a drink of water. There’s also a version that uses facial recognition to feed the user – we’ll have the video of that up soon.
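A four-button control scheme like this can be dispatched with something as simple as the sketch below; the button-to-action assignment here is a guess from the description, not the project’s actual code:

```python
def handle_button(state, button):
    """Map one of four pushbuttons to a high-level feeding action.

    Hypothetical assignment: button 1 cycles through the three bowls,
    2 picks food, 3 returns food, 4 offers water.
    """
    if button == 1:
        state["bowl"] = state["bowl"] % 3 + 1      # cycle bowls 1 -> 2 -> 3 -> 1
    elif button == 2:
        state["action"] = f"pick from bowl {state['bowl']}"
    elif button == 3:
        state["action"] = f"return food to bowl {state['bowl']}"
    elif button == 4:
        state["action"] = "offer water"
    return state

s = {"bowl": 1, "action": None}
handle_button(s, 1)   # select bowl 2
handle_button(s, 2)   # pick from bowl 2
```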
Projects like this really mean a great deal to us – we truly believe that robots are an amazing tool for assisting humans. When we see that our robots have been integrated into thoughtful designs like this, we know we’re doing something right. We’re definitely looking forward to seeing more documentation and details on this project.
UPDATE: Here’s the video of the facial-gesture-based commands.
User anestsurfer has posted a video of his PhantomX Reactor sorting colored blocks. It looks like he’s using a downward-facing camera, vision processing software and a grid pattern to locate objects, then sending data to the arm to have it move the blocks to different locations. On his channel he has also shown off his custom inverse kinematics engine for the arm. We really love seeing the Reactor used with vision processing software.
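As a rough illustration of that pipeline, here’s a sketch of the two steps: mapping a grid cell from a downward-facing camera into workspace coordinates, then a textbook closed-form IK solution for a planar 2-link arm. The cell size, origin, and link lengths are made-up values, not anestsurfer’s actual engine:

```python
import math

def cell_to_xy(row, col, cell_mm=40.0, origin=(100.0, -80.0)):
    """Map a grid cell seen by the overhead camera to workspace
    coordinates in mm (cell size and origin are illustrative)."""
    return (origin[0] + row * cell_mm, origin[1] + col * cell_mm)

def two_link_ik(x, y, l1=150.0, l2=150.0):
    """Closed-form IK for a planar 2-link arm, elbow-down solution.
    Returns (shoulder, elbow) joint angles in radians."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    elbow = math.acos(c2)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

A real arm adds a base rotation joint and a wrist, but the same closed-form approach extends naturally once the target is projected into the arm’s vertical plane.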
Ever wonder what it would be like to be a huge robot spider stalking prey in the night? Well, now you can know. Check out this cool FPV view from a PhantomX Mark II at night. At 6:19 you can see the head-tracking setup.
In these experiments the researchers are examining algorithms that let robots ‘recover’ from damage. In this case, they shortened one leg of the hexapod and used a T-Resilience algorithm to calculate a new walking pattern/gait.
Their initial gait achieved a top speed of 26 cm/s. When they shortened one leg to half its original length, performance dropped to 8 cm/s. At this point the robot begins calculating a new gait: it runs 40 simulations, then tries the best of these in real life, and repeats this experiment 25 times to determine the best new gait. In only 20 minutes the hexapod is able to adopt a new gait that reaches 18 cm/s – more than double the performance of the original gait under damaged conditions!
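The simulate-then-transfer loop described above can be sketched roughly as follows. This is a simplified stand-in, not the actual T-Resilience algorithm (which also builds a transferability model from the real-world trials to steer the simulator), and the gait encoding is made up:

```python
import random

def random_gait():
    # Illustrative gait encoding: one phase offset per leg of a hexapod.
    return [random.uniform(0, 1) for _ in range(6)]

def recover_gait(simulate, test_on_robot, n_sims=40, n_trials=25):
    """Simplified simulate-then-transfer loop: each trial scores
    n_sims candidate gaits in simulation, tries the best one on the
    real robot, and keeps the fastest gait measured so far."""
    best_gait, best_speed = None, 0.0
    for _ in range(n_trials):
        candidates = [random_gait() for _ in range(n_sims)]
        candidate = max(candidates, key=simulate)   # best in simulation
        speed = test_on_robot(candidate)            # measured speed (cm/s)
        if speed > best_speed:
            best_gait, best_speed = candidate, speed
    return best_gait, best_speed
```

With 40 simulations per trial and 25 hardware trials, only 25 gaits ever run on the real robot – which is why the recovery fits in about 20 minutes.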
Now this is no normal PhantomX Hexapod. This model has a custom chassis, an IR camera, MX-28s, an on-board computer, and a custom-built ‘damaged’ leg.
Researchers Sylvain Koos, Antoine Cully and Jean-Baptiste Mouret have done a great job with this experiment as well as documenting it. More information about the paper can be found here and here. A PDF of the paper can be found here.
Interbotix Labs is proud to announce our first entry into our line of Edge Kits!
Edge Kits are robotics kits that we are releasing from our R&D labs early. In an effort to make more kits available to the community and to accelerate the innovation cycle, we have decided to start releasing “Edge Kits” for advanced builders. The name comes from the term “cutting edge,” denoting that these kits are on the front edge of development. The kits are intended as hardware kits and may or may not come with certain levels of code that we and the community are working on, though code will often be available in unsupported beta form that people are sharing and banging away on to improve.
We made the choice to release kits in this format due to the demand we have seen over the years from the community wanting to get their hands on advanced kits faster than the normal cycle. We as roboticists are notoriously impatient people, and when we want to build, we want to build now. So the Edge Kit was born. The kits do NOT come with the same level of documentation, code, and/or assembly instructions that our fully supported Interbotix Kits do. As a result, these kits are best suited for individuals with previous experience with our kits or other similar applications.
Our first Edge Kit is the Interbotix Octopod. As the name suggests, the kit is an eight-legged crawler. Community member KevinO has already gotten his kit up and running with some seriously advanced features and sensing capabilities. His robot Charlotte started life as a PhantomX Hexapod running an Arbotix, and from there he modified the legs several times to his liking (a great feature of the modular AX-12 bracket system). Soon a Raspberry Pi and Kinect 3D camera sensor were added, and he ported over the Phoenix code (another community project for our crawlers, led by KurtE) to run natively on the Raspberry Pi. As he experimented with face tracking, obstacle avoidance and gesture tracking, KevinO expressed a desire to make Charlotte a bit closer to a real spider by adding an additional pair of legs, bringing the total up to 8. And thus, the Octopod project was born.
This is a perfect example of how the synergy of small companies and online communities is ever increasing the speed of innovation in technology. A big thanks goes out to Kevin for his VERY impressive project; we can’t wait to see where he takes it next! Check out his video of Charlotte taking some of her early steps:
Maker n8zach has built himself an amazing 3D-printed arm. The arm is powered by 1 MX-106, 3 MX-64s, 1 MX-28 and 2 AX-18As (one of which is used with our PhantomX Gripper). The arm is controlled from a PC using a USB2DYNAMIXEL and the DYNAMIXEL SDK.
This arm on its own is pretty amazing, but on top of everything else, n8zach can control his arm via a Kinect. He can control the arm manually, or he can set it to automatically perform certain tasks.
Phil Williammee has created an application that can communicate with the PhantomX Pincher through PyPose. pyPincher will allow you to control the Pincher via rotation, extension, and height, along with gripper angle and the gripper itself. The program will then also give you a 3D representation of the arm’s current position. You can even load different coordinates and toggle between them! This is just another great example of how users can leverage the open software and firmware of the InterbotiX robots to create custom setups.
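A rotation/extension/height control scheme is essentially cylindrical coordinates. Here’s a minimal sketch of the conversion to a Cartesian wrist target; the function name and frame conventions are illustrative, not pyPincher’s actual API:

```python
import math

def cylindrical_to_target(rotation_deg, extension, height):
    """Convert (rotation, extension, height) controls into a Cartesian
    wrist target: the base yaw handles rotation, while extension and
    height place the target in the arm's vertical plane."""
    yaw = math.radians(rotation_deg)
    x = extension * math.cos(yaw)
    y = extension * math.sin(yaw)
    return x, y, height
```

From there, a standard IK routine for the shoulder/elbow/wrist chain can place the gripper at the computed target.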
You can grab a copy of the code at his GitHub page.