Arduino Neural Network Robot

Sean Hodgins created an Arduino-based robot that avoids light by navigating using a neural network:
This project is meant to teach about utilizing neural networks in robotic platforms. There will be a three-part video series on the Make: YouTube channel on building the robot. It will start with prototyping and design, then move on to assembly and testing, and finally programming and running the neural network. You will be able to follow along and make your own robot in the end.
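Sean's actual network isn't shown here, but the core idea can be sketched as a tiny feedforward net that maps two light-sensor readings to two motor speeds. The layer sizes and hand-picked weights below are illustrative assumptions, not his trained values:

```cpp
#include <array>
#include <cmath>

// A 2-input, 3-hidden, 2-output feedforward network. The weights are
// hand-picked for illustration (not trained): brighter light on one side
// drives that side's wheel faster, steering the robot away from the light.
using Vec2 = std::array<double, 2>;
using Vec3 = std::array<double, 3>;

Vec2 forward(const Vec2& light) {  // light readings normalized to [0, 1]
    // Hidden layer: 3 tanh units.
    const double Wh[3][2] = {{ 2.0, -2.0},   // left-right contrast
                             {-2.0,  2.0},   // right-left contrast
                             { 1.0,  1.0}};  // overall brightness
    const double bh[3] = {0.0, 0.0, -1.0};
    Vec3 h{};
    for (int i = 0; i < 3; ++i)
        h[i] = std::tanh(Wh[i][0] * light[0] + Wh[i][1] * light[1] + bh[i]);

    // Output layer: left and right motor commands in [-1, 1].
    const double Wo[2][3] = {{1.5, 0.0, 0.5},   // left motor
                             {0.0, 1.5, 0.5}};  // right motor
    Vec2 motors{};
    for (int i = 0; i < 2; ++i)
        motors[i] = std::tanh(Wo[i][0] * h[0] + Wo[i][1] * h[1] + Wo[i][2] * h[2]);
    return motors;
}
```

With light brighter on the left, the left motor command comes out larger than the right, so a differential-drive robot would turn away from the light; with equal light on both sides the outputs match and the robot drives straight.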

Part 2: Soldering and Assembly


Arduino Neural Network Robot

Donkey Self-Driving Car

Kwabena Agyeman shows how to create a DIY Robocar forked off of the “Donkey” self-driving car platform using the OpenMV Cam:


Donkey Self-Driving Car

The OpenMV Cam Donkey Car is designed to be easy to build out of parts that you can buy online and assemble with basic tools. Below is the list of essential parts you’ll need to build the self-driving car.


Here is the car in action:

kwagyeman has shared the servo controller board on OSH Park:

OpenMV Cam Servo Controller


Order from OSH Park

Donkey Self-Driving Car

PIDDYBOT: DIY Arduino Balancing Robot

Sean Hodgins designed this open source balancing robot to help teach PID control:


The PIDDYBOT

The PIDDYBOT currently uses an ATmega32U4 microcontroller. It has three potentiometers that allow you to manually tune the PID loop to get the robot balancing, which lets you see how each term affects the performance of the system. It is a great teaching tool for the classroom and is currently being used by students at McMaster University.
The design files and source code are available on GitHub:

IdleHandsProject/thePIDDYBOT

PIDDYBOT: DIY Arduino Balancing Robot

Mr. Runner

Alex Martin is creating a four legged robot with a running bound gait:


Mr. Runner

The aim of this project is to lower the barrier of entry into dynamic robotics. After seeing Boston Dynamics’ WildCat I became interested in working on something similar, but was disappointed with what the hobbyist scene had to offer. They all used static locomotion. I wanted it to feel alive!

I hope that if people can see that this style of robotics is reproducible with basic development skills, it will attract a wider range of people to legged robots than just those who want to see a vaguely spider-looking device re-implement the same kinematic equations over and over again.


The approach is based on the work of Fumiya Iida and Rolf Pfeifer at the University of Zurich in the mid-2000s. Dr. Pfeifer is well known in the field of embodied cognitive science, and these experiments were an attempt to generate movement in quadruped robots based on those principles.

Mr. Runner

DIY 6-Axis Micro Manipulator

[David Brown]’s entry for The Hackaday Prize is a design for a tool that normally exists only as an expensive piece of industrial equipment; out of the reach of normal experimenters, in other words. That tool is a 6-axis micro manipulator and is essentially a small robotic actuator that is capable of very small, very precise movements.…

via Hackaday Prize Entry: DIY 6-Axis Micro Manipulator — Hackaday

DIY 6-Axis Micro Manipulator

Robotic Arts: Noodle is Gettin’ Bean Feet

Sarah Petkus posts an update on her Robotic Arts blog about her NoodleFeet robot:

Noodle is Gettin’ Bean Feet!

This summer, I am once again diving into designing mechanical personality quirks. I’ll be investigating new and exciting ways for my robot, NoodleFeet to interact with the world. This time, my focus is the wet, tingly and preferential aspect of TASTE.


From now until the end of August, my goal is to produce four different tasting modules that each demonstrate some aspect of sampling or preference. You could think of them as the “four tasters of the apocalypse”.

If you’re unfamiliar with Sarah and NoodleFeet, then check out her great talk from the Hackaday SuperConference:

Robotic Arts: Noodle is Gettin’ Bean Feet

Bristlebot with LDRs Becomes Light-Following Bristlebot

Bristlebots are great because no coding is required – they’re completely analog circuits that just go! But if you wanted them to go in a specific direction, how would you do that? Facelesstech has released a design for a light-following bristlebot that uses two LDRs to drive each side of the bristlebot (so you can turn it, somewhat – see the video below for a demo!). It’s pretty simple and pretty clever.


The KiCad design files are available on GitHub:


Bristlebot with LDRs Becomes Light-Following Bristlebot