The OpenMV Cam Donkey Car is designed to be easy to build from parts you can buy online and assemble with basic tools. Below is the list of essential parts you’ll need to build the self-driving car.
Here is the car in action:
kwagyeman has shared the servo controller board on OSH Park:
The PIDDYBOT is currently built around an ATmega32U4 microcontroller. It uses three potentiometers that let you manually tune the PID loop to get the robot balancing, so you can see how each term affects the performance of the system. It is a great teaching tool for the classroom and is currently being used by students at McMaster University.
The aim of this project is to lower the barrier of entry into dynamic robotics. After seeing Boston Dynamics’ WildCat I became interested in working on something similar, but was disappointed with what the hobbyist scene had to offer. The designs I found all used static locomotion. I wanted it to feel alive!
I hope that if people can see that this style of robotics is reproducible with basic development skills, it will attract a wider range of people to legged robots than just those who want to see a vaguely spider-looking device re-implement the same kinematic equations over and over again.
The approach is based on the work of Fumiya Iida and Rolf Pfeifer at the University of Zurich in the mid-2000s. Dr. Pfeifer is well known in the field of embodied cognitive science, and these experiments were an attempt to generate movement in quadruped robots based on those principles.
[David Brown]’s entry for The Hackaday Prize is a design for a tool that normally exists only as an expensive piece of industrial equipment — out of the reach of normal experimenters, in other words. That tool is a 6-axis micromanipulator: essentially a small robotic actuator capable of very small, very precise movements.…
This summer, I am once again diving into designing mechanical personality quirks. I’ll be investigating new and exciting ways for my robot, NoodleFeet, to interact with the world. This time, my focus is the wet, tingly, and preferential aspect of TASTE.
From now until the end of August, my goal is to produce four different tasting modules that each demonstrate some aspect of sampling or preference. You could think of them as the “four tasters of the apocalypse.”
Bristlebots are great because no coding is required – they’re completely analog circuits that just go! But if you wanted them to go in a specific direction, how would you do that? Facelesstech has released their design for a light-following bristlebot that uses two LDRs to drive either side of the bristlebot (so you could turn it, somewhat – see video below for demo!). It’s pretty simple and pretty clever.