Our Process and Our Code

The Process…

 

We started this project hoping our robot could draw nearly any artistic picture we imagined. We affectionately named the robot Stewie, after the Family Guy character of the same name; he is pictured in Figure 1.

Figure 1. Our robot, Stewie

 

After building the robot and tinkering only briefly with the equipment, we realized that our ideas would not quite come to fruition, as described throughout this report. For this project we were given a car-style robot (Stewie), four motors, an Arduino microcontroller, two H-bridges, one ultrasound sensor, two IR sensors, two encoders, six batteries for power, and a pen holder. The goal was to have the robot draw a design of our choice on a large piece of paper within a 6×6 walled-in area. In principle, the robot and the provided parts should be able to draw any figure on any given surface. As we found out, in practice that is not quite the case.

We ran into quite a few issues throughout the project, which did make us quite proficient at fixing our robot. It seemed like every time we started working on it, a new issue would arise. The issues we encountered, along with their solutions, are discussed next.

Once we had built the robot and installed the Arduino, we could start working on it. At this point only one issue arose: the batteries were not making contact with the battery pack. This was fixed easily by putting washers between each battery and the pack. With that fixed, our first real task was to get the tires moving by way of the two H-bridges. We decided that one H-bridge would control the front tires and the other would control the back tires. We coded the robot to these specifications and the motors worked fine. This simple task, however, surfaced our first chronic issue: loose wires. Many times throughout the semester we would turn the robot on and the tires would not turn, and the cause was always a loose wire. After looking into the problem, we reasoned that shorter wires would be less likely to get knocked out, and that proved true: after shortening the wires, the loose-wire issue almost entirely disappeared.
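The H-bridge control described above boils down to two direction pins plus a PWM duty cycle per side. As a minimal sketch in plain C++ (the `driveCommand` helper is hypothetical, written for illustration; it is not part of our actual Arduino sketch), the mapping from a signed speed to pin levels looks like this:

```cpp
#include <cassert>

// Hypothetical helper: maps a signed speed (-255..255) to the two
// direction-pin levels and the PWM duty for one side of an H-bridge.
// The DIR1/DIR2 naming mirrors the aliases in our sketch below.
struct BridgeCommand {
    bool dir1;   // level for the first direction pin
    bool dir2;   // level for the second direction pin
    int  duty;   // PWM duty cycle, 0..255
};

BridgeCommand driveCommand(int speed) {
    BridgeCommand cmd;
    if (speed > 0) {            // forward: DIR1 low, DIR2 high
        cmd.dir1 = false; cmd.dir2 = true;  cmd.duty = speed;
    } else if (speed < 0) {     // reverse: DIR1 high, DIR2 low
        cmd.dir1 = true;  cmd.dir2 = false; cmd.duty = -speed;
    } else {                    // stop: both pins low, no PWM
        cmd.dir1 = false; cmd.dir2 = false; cmd.duty = 0;
    }
    return cmd;
}
```

On the Arduino side, the two booleans would go to `digitalWrite` on the direction pins and the duty to `analogWrite` on the enable pin.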

We then moved on to the sensors. The first we worked on was the ultrasound sensor, which is positioned at the front of the robot and used to measure the robot's distance from the walls. Our one issue with the sonar was that we burnt a sensor out by switching the ground and 5V connections; we remedied this simply by replacing it. Beyond that, the ultrasound was very reliable; we just had to make sure it was in TTL mode for our use.

The next sensors we looked at were the IR sensors, placed on the right and left sides of the robot. They were meant to tell the robot whether there was an object on either side and how far away it was. After looking into the IR sensors, we found that they are not good at close range and not all that reliable even within their intended range. Given this, we decided not to use them.

Another issue we ran into was that we accidentally used pins 0 and 1 on the Arduino (which double as the hardware serial RX/TX), and this froze the Arduino: it was no longer recognized on its serial port. Another student in the class fixed this for us, as he had run into the same problem and solved it using an external boot loader.

 

The rest of the issues we had all arose in trying to implement the design…

 

 

Figure 2. Conceptual ideas we had in the beginning for our robot to draw

 

At the beginning of the class we came up with ideas of what our robot could draw. We were first assigned to have our robot draw in software with little or no limitation, then gradually port the idea to new platforms so that new limitations and issues would arise in a slow, controlled manner. Of course, limitations mean some things cannot be done, which forced us to look at what we had to work with one step at a time. Figure 2 shows two of our early ideas: fractals built from triangles, and a 3-D cube. Both were vetoed when we realized they would most likely not be doable by our robot. The first problem was the wheels, which are of quite low grade and poorly engineered: they are concave in the middle, touching the ground only at the very edges, and the treads are plastic rather than rubber. These shortcomings left us with a robot that did not drive straight and had wobbly wheels on top of that. We realized early that these designs would have to be scratched and replaced with ones our robot could feasibly draw. Even after switching ideas, we still had to deal with the wheel problems: we straightened the treads every so often to keep the wheels from wobbling, and we fixed the straightness issue by applying different PWM duty cycles to the two motors until the robot drove acceptably straight.
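The PWM trim we tuned by hand can be restated as a simple proportional correction: if one side has logged more encoder ticks than the other, back off its duty cycle a little. This is a hedged sketch in plain C++, not the trim we actually shipped; the gain `kP` and the duty values are illustrative.

```cpp
#include <cassert>
#include <algorithm>

// Illustrative proportional trim: slow a side down in proportion to
// how far ahead of the other side it is, clamped to the 0..255 PWM range.
int trimmedDuty(int baseDuty, long ticksThis, long ticksOther, int kP) {
    long error = ticksThis - ticksOther;        // positive: this side is ahead
    long duty  = baseDuty - (long)kP * error;   // back off when ahead
    return (int)std::max(0L, std::min(255L, duty));
}
```

In practice we simply hard-coded duty cycles of 240 and 245 (as in `moveForward` below) after trial and error, which amounts to a fixed version of this correction.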

 

Figure 3. Our re-examined design for the robot to draw

We then decided to draw the design in Figure 3. We thought we could pull this design off, but with the IR sensors effectively useless and the ongoing wheel problems, we began to doubt its feasibility. Then more issues surfaced. We realized we did not yet know what the robot would actually be writing with, since our first idea for a drawing tool had been vetoed, nor what it would be writing on, such as the type of paper. Other minor issues also came up that led us to change this idea a little.

 

Figure 4. Our further re-examined design for the robot to draw

As Figure 4 shows, the concept did not change much, and we fully expected to use this design. That was until we were shown the device that would hold the marker and realized the design was not possible: the marker cannot be lifted off the ground, so every turn sweeps a large arc along the paper. Avoiding the arcs would require mounting the marker at the robot's center of rotation, which is impossible with the holder we were given. This led us to change what we expected our drawing to look like.
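The size of those arcs follows directly from geometry: during a pivot turn the robot rotates about a fixed point, so a pen mounted at some offset from that point sweeps a circular arc whose length is the offset times the turn angle. A small sketch of that relation (the numbers in the test are illustrative, not measured from Stewie):

```cpp
#include <cassert>
#include <cmath>

// Arc length the pen traces during an in-place turn: a pen offsetCm
// away from the center of rotation sweeps radius * angle.
double penArcLengthCm(double offsetCm, double turnDegrees) {
    const double PI = 3.14159265358979323846;
    double radians = turnDegrees * PI / 180.0;
    return offsetCm * radians;   // arc length = radius * angle (radians)
}
```

For example, a pen mounted 10 cm ahead of the center of rotation leaves roughly a 15.7 cm arc on every 90-degree turn, which is why the arcs in Figures 5 and 6 are so prominent.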

 

Figure 5. Our even further re-examined design for the robot to draw

Figure 5 shows what we then expected the drawing to look like: the Figure 4 design, but with arcs during the turns as described above. With all the parts set up, we buckled down and worked intently toward this design; we finally felt we had a focus we could really work towards.

Our first approach used the ultrasound sensor to measure distance and the encoders to measure the turns. The encoders count wheel rotations, which helps a great deal. We did have an issue where the encoders counted extremely high, extremely fast, making them unusable, but we fixed it by making sure they were wired properly and by detaching and re-attaching the interrupts with a delay in between. We then started to realize the encoders were more reliable than even the sonar, which was becoming erratic for reasons still unknown to us, so we switched to using the encoders alone. Going further, we found that only one encoder was working up to par, so we ended up using that single encoder for all movement.

By Friday night, December 2nd, Saul and Matthew were at FAU with all of the equipment, which had arrived very, very late. We now had the walls set up and the paper to draw on. We ran into various issues characterizing the friction of the provided paper: wheel grip, slippage, and so on. We were given several kinds of paper, ranging from an abundant kind to an expensive kind in limited supply. We practiced for hours that night until the robot could trace the Figure 5 design on the abundant paper, though without the marker. When we then tested the abundant paper with the marker attached, the marker made no difference to the turn angle.
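The detach/delay/re-attach trick we used on the encoder interrupt can be restated as a minimum-interval filter: a pulse only counts if enough time has passed since the last accepted one. The sketch below is an illustration of that idea in plain C++, not our actual interrupt handler; the 5 ms gap in the test is an arbitrary example value.

```cpp
#include <cassert>

// Minimum-interval pulse filter: counts a pulse only if at least
// minGapMs have elapsed since the last accepted pulse. Timestamps
// are in milliseconds, as Arduino's millis() would supply them.
struct DebouncedCounter {
    long count = 0;
    unsigned long lastMs = 0;
    bool first = true;

    void pulse(unsigned long nowMs, unsigned long minGapMs) {
        if (first || nowMs - lastMs >= minGapMs) {
            count++;          // accept the pulse
            lastMs = nowMs;   // restart the quiet window
            first = false;
        }
    }
};
```

A filter like this rejects the burst of spurious edges that made our raw counts climb "extremely high, really fast," while letting legitimately spaced ticks through.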
Throughout the night we ran into reliability issues with the equipment: misreadings, batteries dying, and so on. Once we were finished with the abundant paper, we switched to the limited paper, since it could stand up to being drawn on, unlike the abundant paper. The limited supply gave us essentially one shot, as other groups would need it too. We taped the twelve necessary sheets together and started working out the design on this new paper. Testing without the marker, we couldn't quite get it, so we tried taking off the treads and actually got better results. Without the treads, we were finally able to get the robot to follow Figure 5 as we wanted, still without the marker. Then we attached the marker, and of course the readings turned unreliable for some unknown reason, so we stopped the robot and flipped the paper over to try again…

 

Figure 6. The design our robot drew

So, with the paper flipped, having worked on the robot for about 14 hours straight through the night, and knowing this was the last night we both had for the project and that our allotted paper was used up, we set up the robot one last time with the camera ready. The robot started drawing, and at first it looked like the Figure 5 design, but on this paper the marker seemed to keep the robot from turning the full 90 degrees, something we could not test for because of the limited paper. The result is the final drawing our robot made, shown in Figure 6.

With more time, supplies, and better equipment we could have produced any one of the designs above, but we used what we had to make our robot draw a picture on the ground. Considering that was the goal, our robot succeeded in drawing a very interesting design… The video of our robot drawing this figure is on YouTube at: http://www.youtube.com/watch?v=pmQTOVnOOtE&feature=mfu_in_order&list=UL

 

All in all, and it surprises me to say it, the project was worth it and we learned a lot. We have heard throughout our years here that projects like this are 10% thinking, conceptualizing, and planning, and 90% debugging. The numbers may be off, but that is what it feels like, and truth be told, if there is no frustration there is no learning. So, if you want to learn how to make a robot work and do what you want it to, or, as in our case, how to recognize a robot's limitations and work within them, then take this class!

 

Code That We Used

 

/* ex: set syntax=cpp :
DistanceBySoftwareSerial.pde - URM 37 Control Library Version 2.0.0
Author: Miles Burton, miles@mnetcs.com
Copyright (c) 2009 Miles Burton All Rights Reserved

This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 2.1 of the License, or (at your option) any later version.

This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
*/

 

// Test Program written for Arduino Robot v1.0
// Saul     Beniquez
// Marcorel Atilus
// Matthew  Herland

#include "URMSerial.h"

 

 

// The measurement we're taking
#define DISTANCE 1
#define TEMPERATURE 2
#define ERROR 3
#define NOTREADY 4
#define TIMEOUT 5

URMSerial urm;

 

/* Aliases to make the code easier to read */
const int LEFT  = 0;
const int RIGHT = 1;
const int DIR1  = 0;
const int DIR2  = 1;

int controlPinsL[] = { 4, 6 };    // direction pins, left H-bridge
int controlPinsR[] = { 8, 10 };   // direction pins, right H-bridge

int pwm[] = { 5, 9 };             // PWM (enable) pins, left and right

int IR_R = A0;                    // IR sensors (unused in the final run)
int IR_L = A1;

volatile int ticks_L = 0;         // encoder tick counts, updated in the ISRs
volatile int ticks_R = 0;

 

void encoderInterrupt_L()
{
    ticks_L++;
}

//void encoderInterrupt_R()
//{
//    ticks_R++;
//}

 

void setup()
{
    Serial.begin(9600);
    urm.begin(13, 12, 9600);                 // RX pin, TX pin, baud rate

    for (int i = 0; i < 2; ++i)
    {
        pinMode(controlPinsL[i], OUTPUT);
        pinMode(controlPinsR[i], OUTPUT);
    }

    pinMode(pwm[LEFT],  OUTPUT);
    pinMode(pwm[RIGHT], OUTPUT);

    digitalWrite(controlPinsL[DIR1], 0);
    digitalWrite(controlPinsL[DIR2], 0);
    digitalWrite(controlPinsR[DIR1], 0);
    digitalWrite(controlPinsR[DIR2], 0);

    analogWrite(pwm[LEFT], 0);
    analogWrite(pwm[RIGHT], 0);

    attachInterrupt(0, encoderInterrupt_L, CHANGE);   // left encoder on external interrupt 0

    delay(5000);   // time to position the robot before it starts moving
}

 

void loop()
{
    //Serial.print("R:");
    //Serial.println(ticks_R, DEC);

    moveForward(40);
    detachInterrupt(0);
    delay(200);
    attachInterrupt(0, encoderInterrupt_L, CHANGE);
    sleep(1000);

    turnLeft(22);
    detachInterrupt(0);
    delay(200);
    attachInterrupt(0, encoderInterrupt_L, CHANGE);
    sleep(1000);

    moveForward(40);
    detachInterrupt(0);
    delay(200);
    attachInterrupt(0, encoderInterrupt_L, CHANGE);
    sleep(1000);

    turnLeft(22);
    detachInterrupt(0);
    delay(200);
    attachInterrupt(0, encoderInterrupt_L, CHANGE);
    sleep(1000);

    moveForward(40);
    detachInterrupt(0);
    delay(200);
    attachInterrupt(0, encoderInterrupt_L, CHANGE);
    sleep(1000);

    turnRight(29);
    detachInterrupt(0);
    delay(200);
    attachInterrupt(0, encoderInterrupt_L, CHANGE);
    sleep(1000);
}

 

void moveForward(int ticks)
{
    ticks_L = 0;

    digitalWrite(controlPinsL[DIR1], LOW);
    digitalWrite(controlPinsL[DIR2], HIGH);
    digitalWrite(controlPinsR[DIR1], LOW);
    digitalWrite(controlPinsR[DIR2], HIGH);

    analogWrite(pwm[LEFT], 240);    // unequal duty cycles trim the robot
    analogWrite(pwm[RIGHT], 245);   // so it drives acceptably straight

    while (ticks_L < ticks)
    {
        Serial.print("F:");
        Serial.println(ticks_L, DEC);
    }
}

 

void sleep(int time)
{
    ticks_L = 0;

    digitalWrite(controlPinsL[DIR1], HIGH);
    digitalWrite(controlPinsL[DIR2], LOW);
    digitalWrite(controlPinsR[DIR1], LOW);
    digitalWrite(controlPinsR[DIR2], HIGH);

    analogWrite(pwm[LEFT], 0);    // duty cycle is zero, so the motors stop
    analogWrite(pwm[RIGHT], 0);   // regardless of the direction pins

    delay(time);
}

 

 

void turnLeft(int ticks)
{
    ticks_L = 0;

    digitalWrite(controlPinsL[DIR1], HIGH);   // left side reverses...
    digitalWrite(controlPinsL[DIR2], LOW);
    digitalWrite(controlPinsR[DIR1], LOW);    // ...right side drives forward
    digitalWrite(controlPinsR[DIR2], HIGH);

    analogWrite(pwm[LEFT], 200);
    analogWrite(pwm[RIGHT], 200);

    while (ticks_L < ticks)
    {
        Serial.print("L:");
        Serial.println(ticks_L, DEC);
    }
}

 

void turnRight(int ticks)
{
    ticks_L = 0;

    digitalWrite(controlPinsL[DIR1], LOW);    // left side drives forward...
    digitalWrite(controlPinsL[DIR2], HIGH);
    digitalWrite(controlPinsR[DIR1], HIGH);   // ...right side reverses
    digitalWrite(controlPinsR[DIR2], LOW);

    analogWrite(pwm[LEFT], 200);
    analogWrite(pwm[RIGHT], 200);

    while (ticks_L < ticks)
    {
        Serial.print("R:");
        Serial.println(ticks_L, DEC);
    }
}

 

 

int getMeasurement(int mode)
{
    int value;

    // Request a reading from the URM37
    switch (urm.requestMeasurementOrTimeout(mode, value))   // Find out the type of request
    {
    case DISTANCE:      // Double check the reading we receive is of DISTANCE type
        //Serial.println(value);   // The distance in centimeters from the URM37
        return value;
    case TEMPERATURE:
        return value;
    case ERROR:
        //Serial.println("Error");
        break;
    case NOTREADY:
        //Serial.println("Not Ready");
        break;
    case TIMEOUT:
        //Serial.println("Timeout");
        break;
    }

    return -1;
}

 


Day 82

For our final report, we have chosen to optimize and improve upon the Arduino’s SoftSerial library and make it interrupt-driven.

Our URM37 supports both TTL and RS232 modes, but we are only using TTL. The Arduino SoftSerial library also supports an RS232 mode of operation, and it has several other options for full control of the sonar, including one where commands can trigger reads from the sensor.

 

Since we can only use one of these modes at a time, we can significantly reduce code size by removing all the code to handle the serial modes we don’t need.

The source code is available in Linux under /usr/share/arduino/libraries/SoftwareSerial or in Windows under .\arduino-0022\libraries\SoftwareSerial
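One straightforward way to strip the unused modes is conditional compilation: guard the RS232-only paths behind a preprocessor flag so they never reach the binary. The sketch below only illustrates the idea; the macro name `URM_TTL_ONLY` and the `decodeFrame` helper are invented for this example and are not the real SoftwareSerial internals.

```cpp
#include <cassert>
#include <string>

// Illustrative compile-time stripping: with URM_TTL_ONLY defined,
// the RS232 branch is excluded from the build entirely, which is
// where the code-size saving comes from.
#define URM_TTL_ONLY

std::string decodeFrame(unsigned char raw) {
#ifndef URM_TTL_ONLY
    // Hypothetical RS232 path: undo the line inversion first.
    raw = static_cast<unsigned char>(~raw);
#endif
    return std::to_string((int)raw);   // TTL path: bytes pass straight through
}
```

The same `#ifdef` pattern applied throughout the library would let one build keep only the TTL code while another keeps only RS232.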

 

Day 25 9/21/2011

Extending from Day 19’s post…

As per the assignment, this is the Java simulation for our Robotics class, using the same design we built in Processing for Day 19's post. In Processing we are virtually limitless in our ability to create a design. In Java we hit a few roadblocks, e.g., the fact that we couldn't turn or move at 45 degrees; instead we are limited to 90 or 180 degrees. This is the idea of top-down design: start at the most abstract level, with the fewest restrictions, then keep leveling down and deal with the restrictions one at a time instead of all at once. That is how this class is set up, and we are excited to see what we can accomplish by the end of the semester…

Day 19 – 9.15.2011

After talking to Professor Shankar,

Our group decided to go with a simpler design than the one I suggested, seeing as our mobot (mobile robot) will not be able to move at the angles the bouncing ball could. Matt Herland's design looked more doable, but Professor Shankar urged us to be more creative.

In order to do so, I coded a class in Processing to simulate the Becker RobotSE class’s interface so that we may translate the simulation to the Becker Library.

Here is a preliminary result:

The code for the class is posted below:

Day 10 – 9.7.2011

Continued working on our robotic platform today. We have decided to attach the wheels to the robot either way. Perhaps, if it is used as a stationary platform, we could elevate it with a stack of books or some sort of stand.
Our TA, Jordan, helped us in class with a wiring diagram for the motors, which you can see below:

To do so we had to tin our wires (i.e., cover them in a thin layer of solder) so that when we soldered them to the terminals we did not end up melting the plastic on the motors:

Day 1 – 08.25.2011

Started our Robotics project with some basic assembly. Our project in this class is to create an autonomous chess set controlled via an Android smart phone or tablet. To do so we will engineer a system of control beacons and mobile platforms using the open-source Arduino platform.

Today we received our 4-Wheel Drive Arduino robot platform and began the basic assembly:

As our platform will be one of the stationary platforms, we are unsure whether to include wheels or not. Including them would give us the benefit of being able to switch the platform's role from a beacon to a chess piece simply by changing the software.

At this moment in time, we are not sure if this will be necessary.

Hello world!

This is a project blog detailing the progress of our Embedded Robotics course at FAU.

My name is Saul Beniquez and I am working with Matthew Herland  & Marcorel Atilus to complete a part of a larger, more complex system of robots designed to function as a remote-controlled chess set.

Hopefully this will be a fun experience as well as a learning experience for us.

 

 

Our Mission statement:

Step 1:

For the first step, our goal is to decide exactly what we want the robots to do. We will use the art program Processing to come up with a design for the robot to create. Following this, we will simulate the chosen design in Java using Dr. Becker's robot library, offered at learningwithrobots.com…

Step 2:

The second step will focus on making our robot actually do what we have simulated. This will be done using a variety of languages and tools: Java, Arduino, AVR-C, AVR Studio, C, C++, and Assembly. The goal is to put the functionality we want in the lowest-level language possible so the robot can perform at peak performance.

Step 3:

The third step is to start putting the robot into an environment similar to the chess game in which it will eventually be used. The environment will not be as demanding or strenuous as the chess game, but it will get our feet wet in autonomous robotics, which will be crucial for the chess game. The idea is to use mobots and beacons so that the mobots can maneuver the way a chess game requires: in chess, robots need to reach their target squares without hitting the other pieces. There are many steps involved in that, and our goal is to give the robot(s) used in this class at least one, if not many, of the abilities needed for those steps.