How to Track Your Robot With OpenCV

by Ladvien in Circuits > Robots



UPDATE: Lets Make Robots, my home digital hackerspace, was purchased by RobotShop. I don't want to go into it, but the new owners banned me and most of the veteran members. That stated, most of the links there will be broken. Luckily, I was able to make a copy of my content (the reason they banned me) and moved it to a GitHub site. It was a hurried move, so the content may be a little rough right now.


UPDATE: I have tweaked the code in these instructions into a Python module. Muuuch easier to use.

Friendly Overlord


These are redneck instructions on how to control a robot with a static webcam for under 50 USD.

I'm a robot builder and I got tired of seeing universities play robot soccer or something with computer vision guiding their players, and no matter how much I begged, darn Ivy Leagues wouldn't share.

So, I wrote my own. And while I did it, I swore I'd try to help anyone trying something similar.


Proof-of-Concept

I don't like reading half an Instructable before I know its use, so I'll put the proof-of-concept video up front. No offense taken if you don't stick around; time's precious. :)

Overlord and Dot Muncher

Overview: How It Works

So, here's an overview of how the code works:

Red Hatted Robot

  1. The webcam sends the PC images of its field-of-view.
  2. OpenCV looks for the largest red blob.
  3. It begins tracking the red blob's X, Y position.
  4. The PC averages these X, Y positions over around 150 camera frames.
  5. If the blob hasn't moved much, the PC assumes the red blob is the robot.
  6. The PC gets frisky and gives our robot a random target within the webcam's field-of-view.
  7. The PC calculates the angle between the bot and the target.
  8. Meanwhile, the robot's microcontroller is taking readings from a magnetometer on the robot.
  9. The robot, with a one-time human calibration, translates true north to "video-game north," aka the top of the PC's screen.
  10. The microcontroller transmits this heading to the PC.
  11. The PC compares the angle from the bot to the target with the robot's heading.
  12. The PC sends a code to the bot telling it to turn left, turn right, or move forward (closer to the target).
  13. When the robot has made it within an acceptable distance of the target, he "Munches the Dot."
  14. A new random dot appears. Rinse, repeat. (For real though, don't rinse the bot. Consider Asimov's Third Law.)
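If you'd rather see that loop as code, here's a minimal sketch of the decision-making. The function and names are my own shorthand, not the actual script's, though the 45-pixel munch radius and the plus-or-minus 30 degree "go forward" tolerance are the same ones the real code uses:

#A minimal sketch of the loop above; names are placeholders, not the real script's.
from math import atan2, degrees

def control_step(bot_x, bot_y, bot_heading, target_x, target_y):
    #Close enough to the target? Munch the dot and ask for a new one.
    if abs(target_x - bot_x) < 45 and abs(target_y - bot_y) < 45:
        return "new dot"

    #Angle from bot to target, with 0 degrees at the top of the screen
    #("video-game north"); screen Y grows downward, hence the sign flip.
    angle_to_target = degrees(atan2(target_x - bot_x, -(target_y - bot_y))) % 360

    #Steer: drive forward if roughly facing the target, otherwise turn.
    error = (angle_to_target - bot_heading + 180) % 360 - 180
    if abs(error) <= 30:
        return "forward"
    return "right" if error > 0 else "left"

print control_step(320, 240, 90, 400, 150)   #prints "left"

The real script is messier (it averages positions, waits 150 frames before trusting its lock, and sends number codes instead of strings), but that's the skeleton.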

Bio -- (uh, I'd Skip It, Boring!)


I'm a homeless outreach worker. The job's amazing, but I'll say, emotionally taxing. Skipping the politics and the sermon on harm reduction, I decided at the start I needed something far from the job to let my mind rest and prevent compassion fatigue. Something that consumed my brain-power so I'd not be stressing over the six-months-pregnant 17-year-old shooting up under a bridge on I-35. Something to protect my down-time so I'd be frosty for the next day.

Well, I saw that TED talk about the Sand Flea and I told Bek, "That's awesome, think I could build one?"
"Oh crap," she said, "new obsession?"

Now, robots are my relief. My way to prevent white-matter from becoming dark-matter as I rake through sludge looking for those who want out.

I started reading a lot. I discovered Arduino, SparkFun, eBay, Raspberry Pi, ferric chloride, Python, Hackaday, the HC-SR04, Eagle, OSH Park, and the list goes on. But every time I Googled something about robots, I'd end up at the same place.

Lets Make Robots (edit: sadly, RobotShop bought them and banned the member who helped me the most).

These guys are brilliant. Their collaboration is a college education in itself, I swear.

Soon, I ended up with my first bot. A piece of sh...short-circuits. Although, I did learn a lot interfacing the bot with the Raspberry Pi. Also, while I was working with a Raspberry Pi, I played with OpenCV, and was considering adding a face tracker to my bot before I got distracted. But before I quit, I created a proof-of-concept.

So, all these experiences began to culminate.

Meanwhile, I was taking a graduate Research Methods class at UTA and my professor disappeared. The university was amazing; good professors filled in and made sure our education didn't suffer. But we wondered for many months. Sadly, it was discovered he had killed himself.

It shook me. I deal with suicidality every other day, but it's usually on the street. Why a successful research professor? My thoughts got dark for a bit, which meant I sunk into robots even more. Yet, now, a question sat at the front of my mind: Will robots one day kill themselves?



This may sound silly. But I believe the formula for self-termination can be expressed in Boolean logic, and therefore coded.

Pseudo-code would be:

if painOfExistence > senseOfPurpose:
    self_terminate()


Derived from work and life experience, I genuinely believe the root motive for suicide is existential anxiety, which seems to me entangled within both constructs.

Ok. Skipping the Time bit.

Someday, I'd like to delve into swarm robotics. Or, at least, attempt to replicate organic group behavior within a robot group. And I thought it might be possible to control a group of robots with a setup similar to those universities or research groups keep showing off. (Jockish Ivy Leagues :P)

Well, I found these desires, information, and tools synergized into a passion. After two days, I was able to write a basic OpenCV Python script that could control a robot using a static webcam looking down on it. Let me clarify: I'm of average intelligence, simply obsessive, so when I mention "two days" I'm trying to convey the utter feasibility of this project for anyone. Python, Arduino, and OpenCV make it so very easy; any idiot like me can hack it out.

Of course, my purpose for this platform is to control robot groups. The group being the second social collection (one to eight members), and social interaction seems to be essential in developing a positronic brain. The white-matter brain being necessary for me to test the above-mentioned self-termination formula. So, maybe, I'll learn if robots will commit suicide, or perhaps, have a better understanding of why humans do.

Dark and depressing! I know, right? Who writes this crap!?

What You'll Need: Robot




A robot

It doesn't matter what sort of robot you use, it only needs:

  1. A microcontroller (e.g., Arduino, PicAxe, etc.)
  2. Built from material of a bold, solid color.
  3. The ability to change directions and move.
  4. A magnetometer. I used the HMC5883L. They're like 2 USD on eBay.
  5. A wireless serial connection. Bluetooth, XBee, and the nRF24L01 would be my recommendations, since all are well documented for bridging a PC and a microcontroller.

I personally built my own using a red cutting board I stole from Bek (shh). For my serial connection I used two $10 Bluetooth 4.0 modules; I've written an Instructable on setting up a Bluetooth 4.0 module to work with an Arduino and PC: Bluetooth 4.0 and Arduino.



A PC

Probably something less than 10 years old. It could be running Linux or Windows; though, I'll be using Windows Vista (hey, I'm first-world poor and can't afford Windows 7 :P).

  1. The PC will need to be running Python 2.7
  2. It'll need OpenCV 2.4.4
  3. It will need a wireless serial connection that pairs with your bot. Again, I used my BT 4.0 modules.



A Webcam


It's really up to you. I'm not going to lie, I went with the cheapest webcam I saw, which costs 6.87 USD. But I would not recommend this webcam. It didn't like my PC, so every time my Python script stopped I had to unplug the webcam and re-plug it in. A real annoyance for debugging.

  1. I'd suggest a high-resolution webcam. Maybe even an IP cam, if you're rich? If you are, would you buy me one too?
  2. Long male-to-female USB cable. Again, I got two 15' USB cables on eBay for around 4.50 USD. If you get everything setup and you notice problems with the webcam at the end of the cable, you can put a powered hub at the end of the cable with an extension cord and it'll take care of the issue. Though, I didn't have this problem at 15'.
  3. A wife that'll let you screw your webcam into the ceiling. Or...don't ask...

Dot Muncher: Overview


So, I made my robot, Dot Muncher, using an Arduino Uno, Motor Shield, and a Bluetooth 4.0 module. The chassis was made from HDPE, a cutting board I stole from my wife. The motors and tires were from eBay.

Now, just about any robot will work, as I've stated, so Google away and select a robot build you like.

Of course, everything you'd ever want to know can be found at

www.letsmakerobots.com

I'm just sayin'.

But the code, that's the part we want to focus on. Really, our robot has only nerves and muscles; the brain will actually be in the PC. All the robot does is:

  1. Calculate the compass heading.
  2. Send the compass heading to the PC.
  3. Read the movement codes from the PC.
  4. Translate each movement code received into a motor action.

That's it. Pretty simple.

//I've been using Zombie_3_6_RC in Processing to interact.

// Reference the I2C Library
#include <Wire.h>
// Reference the HMC5883L Compass Library
#include <HMC5883L.h>

// Store our compass as a variable.
HMC5883L compass;

// Record any errors that may occur in the compass.
int error = 0;

//int pwm_a = 10;  //PWM control for motor outputs 1 and 2 is on digital pin 10
int pwm_a = 3;   //PWM control for motor outputs 1 and 2 is on digital pin 3
int pwm_b = 11;  //PWM control for motor outputs 3 and 4 is on digital pin 11
int dir_a = 12;  //dir control for motor outputs 1 and 2 is on digital pin 12
int dir_b = 13;  //dir control for motor outputs 3 and 4 is on digital pin 13

int lowspeed = 120;
int highspeed = 140;

//Distance away
int distance;

//Sets the duration each keystroke captures the motors.
int keyDuration = 10;

int iComp;

void setup()
{
  Serial.begin(9600);

  Wire.begin(); // Start the I2C interface.

  Serial.println("Constructing new HMC5883L");
  compass = HMC5883L(); // Construct a new HMC5883 compass.

  Serial.println("Setting scale to +/- 1.3 Ga");
  error = compass.SetScale(1.3); // Set the scale of the compass.
  error = compass.SetMeasurementMode(Measurement_Continuous); // Set the measurement mode to Continuous.

  pinMode(pwm_a, OUTPUT);  //Set control pins to be outputs
  pinMode(pwm_b, OUTPUT);
  pinMode(dir_a, OUTPUT);
  pinMode(dir_b, OUTPUT);

  analogWrite(pwm_a, 0);  //Start both motors at 0% duty cycle (stopped).
  analogWrite(pwm_b, 0);

  pinMode(2, OUTPUT); //attach pin 2 to vcc
  pinMode(5, OUTPUT); //attach pin 5 to GND
  // initialize serial communication:
  Serial.begin(9600); // (redundant; serial was already started above)
}

void loop()
{
  // Retrieve the raw values from the compass (not scaled).
  MagnetometerRaw raw = compass.ReadRawAxis();

  // Retrieve the scaled values from the compass (scaled to the configured scale).
  MagnetometerScaled scaled = compass.ReadScaledAxis();

  // Values are accessed like so:
  int MilliGauss_OnThe_XAxis = scaled.XAxis; // (or YAxis, or ZAxis)

  // Calculate heading when the magnetometer is level, then correct for signs of axis.
  float heading = atan2(scaled.YAxis, scaled.XAxis);

  // Once you have your heading, you must then add your 'Declination Angle',
  // which is the 'error' of the magnetic field in your location.
  // Find yours here: http://www.magnetic-declination.com/
  // Mine is 2° 37' W, which is 2.617 degrees, or (which we need) 0.0456752665 radians. I will use 0.0457.
  // If you cannot find your declination, comment out these two lines; your compass will be slightly off.
  float declinationAngle = 0.0457;
  heading += declinationAngle;

  // Correct for when signs are reversed.
  if(heading < 0)
    heading += 2*PI;

  // Check for wrap due to addition of declination.
  if(heading > 2*PI)
    heading -= 2*PI;

  // Convert radians to degrees for readability.
  float headingDegrees = heading * 180/M_PI;

  // Normally we would delay the application by 66ms to allow the loop
  // to run at 15Hz (default bandwidth for the HMC5883L).
  // However since we have a long serial out (104ms at 9600) we will let
  // it run at its natural speed.
  // delay(66);

  //This throttles how much data is sent to the Python code.
  //Basically, it sends every 30 loops (a 10 ms delay times 30 counts is roughly a third of a second).
  if (iComp >= 30){

    int adjHeading = 0;
    //floor() turns the float into an integer, rounding it down.
    headingDegrees = floor(headingDegrees);
    if (headingDegrees >= 280){
        adjHeading = map(headingDegrees, 280, 360, 0, 79);
    }
    else if (headingDegrees <= 279) {
        adjHeading = map(headingDegrees, 0, 279, 80, 360);
    }

    Serial.println(adjHeading);
    iComp=0;
  }
  iComp++;

  delay(10); //For serial stability.

  int val = Serial.read() - '0';

  if (val == 1)
  {
    Back();
  }
  else if (val == 2)
  {
    Right();
  }
  else if (val == 3)
  {
    Forward();
  }
  else if (val == 4)
  {
    Left();
  }
  else if (val == 5)
  {
    Stop();
  }
}

void Back(){
  //Straight back
  analogWrite(pwm_a, highspeed);
  analogWrite(pwm_b, highspeed);

  digitalWrite(dir_a, HIGH);  //Reverse motor direction, 1 high, 2 low
  digitalWrite(dir_b, LOW);   //Reverse motor direction, 3 low, 4 high

  delay(keyDuration);
}

void Left(){
  //Left
  analogWrite(pwm_a, lowspeed);
  analogWrite(pwm_b, lowspeed);

  digitalWrite(dir_a, HIGH);
  digitalWrite(dir_b, HIGH);

  delay(keyDuration);
}

void Right(){
  //Right
  analogWrite(pwm_a, lowspeed);
  analogWrite(pwm_b, lowspeed);

  digitalWrite(dir_a, LOW);
  digitalWrite(dir_b, LOW);

  delay(keyDuration);
}

void Forward(){
  //Set both motors to run at a high duty cycle (fast).
  analogWrite(pwm_a, highspeed);
  analogWrite(pwm_b, highspeed);

  //Straight forward
  digitalWrite(dir_a, LOW);   //Set motor direction, 1 low, 2 high
  digitalWrite(dir_b, HIGH);  //Set motor direction, 3 high, 4 low

  delay(keyDuration);
}

void Stop(){
  //Set both motors to 0% duty cycle (stopped).
  analogWrite(pwm_a, 0);
  analogWrite(pwm_b, 0);

  //Set the direction pins (moot while the duty cycle is 0).
  digitalWrite(dir_a, LOW);
  digitalWrite(dir_b, HIGH);

  delay(keyDuration);
}
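Before moving on, here's the sketch's heading math restated in Python, in case the Arduino version is hard to follow. This is my own transcription for reference, not code from the project:

#The loop()'s heading math, transcribed to Python.
from math import atan2, pi

def heading_degrees(x_gauss, y_gauss, declination_rad=0.0457):
    heading = atan2(y_gauss, x_gauss) + declination_rad
    if heading < 0:
        heading += 2 * pi         #correct for reversed signs
    if heading > 2 * pi:
        heading -= 2 * pi         #wrap caused by adding the declination
    return heading * 180 / pi     #radians to degrees

print heading_degrees(120.0, -35.0)   #roughly 346 degrees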

 


Dot Muncher: Compass Code

The first bit of robot code I'd like to focus on is the compass. Now, I've not detailed how to use the HMC5883L, since SparkFun has done this for me. I also won't go into tilt compensation, since I was more worried about proving the concept than dead-on accuracy. But if you're a smart cookie and would like to take that challenge, feel free. Just be sure to share the code with us all when you're done :P

No. Instead, I want to focus on adjusting the compass heading from a value relative to true north to what we want it to think is north: in our case, whatever is the top of our screen. This process takes a little involvement, since the numbers must be set manually and with a little guesstimation.
 
104    if (iComp >= 30){
105  
106      int adjHeading = 0;
107      //floor() turns the float into an integer, rounding it down.
108      headingDegrees = floor(headingDegrees);
109      if (headingDegrees >= 280){
110          adjHeading = map(headingDegrees, 280, 360, 0, 79);
111      }
112      else if (headingDegrees <= 279) {
113          adjHeading = map(headingDegrees, 0, 279, 80, 360);
114      }
115  
116      Serial.println(adjHeading);
117      iComp=0;
118    }
119    iComp++;
120  
121    delay(10); //For serial stability.

So, I got my compass module lying as flat as possible and then bolted it to my robot. This helps assure you're getting a full 360° and keeps you from having to re-calibrate what we'd like to call north every time the compass module gets knocked out of place.

106-114: These modules and the Arduino library are both designed to have 0° be north, but we want to set our own north, video-game north, which is exactly what lines 106-114 are about. I found 80° was the value my robot read when he was headed toward the top of the screen, and I had to find a way to adjust the readings so the top of the screen became 0°. I ended up with this simple code to spin the compass.

I had to divide the adjustment into two sections for the math to stay simple. Lines 109-111 map raw readings of 280-360° onto 0-79°, making the robot think 280-360° is 0-79°. Lines 112-114 do the same for raw readings of 0-279°, converting them to 80-360°.
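If it helps to fiddle with the numbers, here's the same remap in Python. The arduino_map() helper is my stand-in for Arduino's map(), which does integer linear interpolation:

#Python stand-in for Arduino's map(): integer linear interpolation.
def arduino_map(x, in_min, in_max, out_min, out_max):
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

def adjust_heading(heading_degrees):
    if heading_degrees >= 280:
        return arduino_map(heading_degrees, 280, 360, 0, 79)   #280-360 becomes 0-79
    return arduino_map(heading_degrees, 0, 279, 80, 360)       #0-279 becomes 80-360

print adjust_heading(280), adjust_heading(359)   #-> 0 78
print adjust_heading(0), adjust_heading(279)     #-> 80 360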

Honestly, spatial reasoning isn't my strong suit, so I have a hard time thinking through this; I just know it works. So, if you have problems, I'll answer emails and Skypes and we can work through it together. And, if you want to submit a better explanation, I'll post it and be sure to give you credit.



Do know, my redneck solution was to change the orientation of the camera.  Pfft.  Too easy.

Moving on,

116: Sends the robot's adjusted heading to the PC.

117-119: iComp is the counter that decides how often we send data to the PC; line 117 resets it after a send, and line 119 increments it every loop. We don't want to send data before the PC is ready, or before the robot is warmed up, or we'd be dealing with poor readings.

121: This is a delay that makes sure we are not bogging down the serial line, since every time we call Serial.println("whatever"), both the PC and the robot have to spend some processing power dealing with it. In short, it makes sure the robot is not talking the computer's ear off.

Dot Muncher: Motor Codes

125    int val = Serial.read() - '0';
126  
127    if (val == 1)
128    {
129      Back();
130    }
131  
132    else if (val == 2)
133    {
134      Right();
135    }
136  
137    else if (val == 3)
138    {
139      Forward();
140    }
141  
142    else if (val == 4)
143    {
144      Left();
145    }
146  
147    else if (val == 5)
148    {
149      Stop();
150    }

This bit is pretty easy. It reads the codes being sent from the PC and translates them into function calls. I write all my robot-PC interactions this way: if I ever want a code to mean something completely different (for instance, swapping the robot's right and left movements), I'd just swap lines 134 and 144.

Easy.

125: This line reads a byte of serial data sent from the PC and subtracts the character '0', converting an ASCII digit like '3' into the number 3, so val holds a clean movement code.
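If you want to poke this protocol from the PC's side before wiring up the full script, a few lines of pySerial will do it. Swap 'COM34' for your own port; the two-second sleep is there because the Arduino resets when the port opens:

#Minimal PC-side test of the movement codes.
import serial
from time import sleep

ser = serial.Serial('COM34', 9600)
sleep(2)           #give the Arduino a moment to reset after the port opens
ser.write("3")     #'3' -> Forward()
sleep(0.5)
ser.write("5")     #'5' -> Stop()
ser.close()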

Dot Muncher: Motor Code

Easy one.

This is one of the functions called to make the motor move, or in the case of this function, stop.

188-189: These set the PWM duty cycle on the Arduino pins specified by the variables pwm_a and pwm_b to 0. This effectively stops our robot.

192-193: This bit tells each motor which direction to turn. The pins (dir_a and dir_b) are set either HIGH or LOW, and this changes the direction the motor moves.

Tell you what, my good friend ChickenParmi explains it better here:


 
186  void Stop(){
187    //Set both motors to 0% duty cycle (stopped).
188    analogWrite(pwm_a, 0);
189    analogWrite(pwm_b, 0);
190  
191    //Set the direction pins (moot while the duty cycle is 0).
192    digitalWrite(dir_a, LOW);
193    digitalWrite(dir_b, HIGH);
194  
195    delay(keyDuration);
196  }

Setting Up Python 2.7 and IDE

Now that we have our little robot set up, let's set up our Python environment.

I'm going to use Python 2.7 (just found later versions piss me off).

Python 2.7 download

For Windows, use the MSI installer respective to your architecture, either x86 or x64. Of course, Linux and Mac versions are there as well. Go ahead and install Python 2.7, but I'm not a fan of their IDE. Instead, I use:

Geany
 


Though, this IDE is a little tricky to get running on Windows, since it's meant for Linux.  These posts over at Stack Overflow go through some popular Windows Python IDEs.  Pick what you feel comfortable in.  I suggest running 'Hello World' in each until you decide you like one.

Setting Up Python Modules

Here we are, the hardest part of this whole project; if we're not careful, we fall into dependency hell.

I'm going to try and help you setup all the modules needed to run the Python code.  It's been difficult for me to do this right, so I'll try to be descriptive.

There are seven modules we will use.
  1. OpenCV (which we'll call cv2).
  2. Numpy
  3. Serial
  4. Time
  5. Math
  6. Random
  7. Threading
Of these we will need to install OpenCV, Numpy, and Serial, since the rest come built into Python 2.7.

The main trick with any module you install in Python is to make sure the exact path you install it to gets added to your environment variables (this is true for both Windows and Linux).

To explain this I'm going to hand it over to Lovely Ada as she tells us how to install the Serial module:

pySerial installation

Note the bit about adding the environment variable, since none of the other modules will explain this, but each will need to be there.
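A quick way to check that all three installs took (and that the paths got set) is to import them and print their versions:

#If any of these imports throws an ImportError, that module isn't installed
#or isn't on your path yet.
import cv2
import numpy
import serial

print cv2.__version__     #should read 2.4.4
print numpy.__version__
print serial.VERSION      #pySerial's version string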

Now, let's try OpenCV and Numpy.  My favorite installation guide (meaning it worked for me) was written by Abid Rahman:

OpenCV 2.4.4 installation

 
 
At this point, you might bring up Python and try some simple webcam capture test code (if you have problems copying and pasting, I've added the webcam capture code as an attachment as well):

 
import cv2.cv as cv
import time

cv.NamedWindow("camera", 1)

capture = cv.CaptureFromCAM(0)

while True:
    img = cv.QueryFrame(capture)
    cv.ShowImage("camera", img)
    if cv.WaitKey(10) == 27:
        break

If you see a live feed from your webcam, you're almost good to go.



If there are any problems, like I said: you and me, buddy. Feel free to ask questions here or Skype me: thomas_ladvien



Beware All Ye Who Enter Here

Okay. Here's all the Python code in one go. Don't be scared if this looks confusing. I feel the same way. In fact, some of it I still don't understand. (Hey, honesty is a rare fault I seem to possess.) Again, don't worry, we're going to walk through it one section at a time, you and me, buddy. Until the end.

On the flip side, if you are a Python guru, or yanno, just a sassy-pants: Feel free to add corrections and comments on this page.  I'd love to make this code grow through critique.  Do know, I guarantee the following: Typos, grammar problems, illogical coding, artifacts from debugging, and the like.  But don't worry, I'm thick skinned and usually wear my big-boy panties.

I should state, the basic code for color tracking was written by Abid Rahman in a reply on Stack Overflow.

Also, I've included the code as an attachment, it's at the bottom.  Video-game south.
 
#Written by the pathos filled hack: C. Thomas Brittain

import cv2
import numpy as np
import serial
from time import sleep
import threading
import math
from math import atan2, degrees, pi
import random

#Open COM port to tether the bot.
ser = serial.Serial('COM34', 9600)

#For getting information from the Arduino (tx was taken by Target X :P)
global rx
rx = " "

#For sending information to the Arduino
global tranx
tranx = 0

#For converting the compass heading into an integer
global intRx
intRx = 0

#I've not used this yet, but I plan on scaling motor duration based on
#how far away from the target we are.
global motorDuration
motorDuration = 0

#A flag variable for threading my motor timer.
global motorBusy
motorBusy = "No"

#Holds the frame index
global iFrame
iFrame = 0


def OpenCV():
    #Create video capture
    cap = cv2.VideoCapture(0)

    #Globalizing variables
    global cxAvg  #<----I can't remember why...
    global cxFound
    global iFrame
    global intRx
    global rx
    global tranx

    #Flag for getting a new target.
    newTarget = "Yes"

    #Dot counter. He's a hungry hippo...
    dots = 0

    #This holds the bot's centroid X & Y average
    cxAvg = 0
    cyAvg = 0

    #Stores old position for movement assessment.
    xOld = 0
    yOld = 0

    #Clearing the serial send string.
    printRx = " "

    while(1):

        #"printRx" is separate in case I want to
        #parse out other sensor data from the bot
        printRx = str(intRx)
        #Bot heading, unmodified
        headingDeg = printRx
        #Making it a number so we can play with it.
        intHeadingDeg = int(headingDeg)

        headingDeg = str(intHeadingDeg)

        #Strings to hold the "Target Lock" status.
        stringXOk = " "
        stringYOk = " "

        #Incrementing frame index
        iFrame = iFrame + 1

        #Read the frames
        _,frame = cap.read()

        #Smooth it
        frame = cv2.blur(frame,(3,3))

        #Convert to hsv and find range of colors
        hsv = cv2.cvtColor(frame,cv2.COLOR_BGR2HSV)
        thresh = cv2.inRange(hsv,np.array((0, 80, 80)),np.array((20, 255, 255)))
        thresh2 = thresh.copy()

        #Find contours in the threshold image
        contours,hierarchy = cv2.findContours(thresh,cv2.RETR_LIST,cv2.CHAIN_APPROX_SIMPLE)

        #Finding contour with maximum area and store it as best_cnt
        max_area = 0
        for cnt in contours:
            area = cv2.contourArea(cnt)
            if area > max_area:
                max_area = area
                best_cnt = cnt

        #Finding centroids of best_cnt and draw a circle there
        M = cv2.moments(best_cnt)
        cx,cy = int(M['m10']/M['m00']), int(M['m01']/M['m00'])
        cv2.circle(frame,(cx,cy),10,255,-1)

        #After 150 frames, it compares the bot's X and X average;
        #if they are the same + or - 5, it assumes the bot is being tracked.
        if iFrame >= 150:
            if cxAvg < (cx + 5) and cxAvg > (cx - 5):
                xOld = cxAvg   #was "xOld == cxAvg", a comparison that did nothing
                stringXOk = "X Lock"
            if cyAvg < (cy + 5) and cyAvg > (cy - 5):
                yOld = cyAvg   #was "yOld == cyAvg"
                stringYOk = "Y Lock"

        #This is finding the average of the X coordinate.  Used for establishing
        #a visual link with the robot.
        #X
        cxAvg = cxAvg + cx
        cxAvg = cxAvg / 2
        #Y
        cyAvg = cyAvg + cy
        cyAvg = cyAvg / 2

        #//Finding the Target Angle/////////////////////////////////////

        #Target coordinates.
        #Randomizing target.
        if newTarget == "Yes":
            tX = random.randrange(200, 400, 1)
            tY = random.randrange(150, 350, 1)
            newTarget = "No"

        if iFrame >= 170:
            if tX > cxAvg - 45 and tX < cxAvg + 45:
                print "Made it through the X"
                if tY > cyAvg - 45 and tY < cyAvg + 45:
                    print "Made it through the Y"
                    newTarget = "Yes"
                    dots = dots + 1

        #Slope
        dx = cxAvg - tX
        dy = cyAvg - tY

        #Quad I -- Good
        if tX >= cxAvg and tY <= cyAvg:
            rads = atan2(dy,dx)
            degs = degrees(rads)
            degs = degs - 90
        #Quad II -- Good
        elif tX >= cxAvg and tY >= cyAvg:
            rads = atan2(dx,dy)
            degs = degrees(rads)
            degs = (degs * -1)
        #Quad III
        elif tX <= cxAvg and tY >= cyAvg:
            rads = atan2(dx,-dy)
            degs = degrees(rads)
            degs = degs + 180
        #Quad IV
        elif tX <= cxAvg and tY <= cyAvg:
            rads = atan2(dx,-dy)
            degs = degrees(rads) + 180

        #Convert float to int
        targetDegs = int(math.floor(degs))

        #Variable to print the degrees offset from target angle.
        strTargetDegs = " "

        #Put the target angle into a string to be printed.
        strTargetDegs = str(math.floor(degs))

        #///End Finding Target Angle////////////////////////////////////


        #//// Move Bot //////////////////////////////////////

        #Don't start moving until things are ready.
        if iFrame >= 160:
            #This compares the bot's heading with the target angle.  It must
            #be +-30 for the bot to move forward, otherwise it will turn.
            if intHeadingDeg <= (targetDegs + 30) and intHeadingDeg >= (targetDegs - 30):  #was ">+", a typo
                tranx = 3
                motorDuration = 10 #I'll use later
                #Forward
            else:
                if intHeadingDeg < targetDegs:
                    if 1 < (targetDegs - intHeadingDeg):
                        tranx = 2
                        motorDuration = 10
                        print (intHeadingDeg - targetDegs)
                        print "Right 1"
                    elif 1 > (targetDegs - intHeadingDeg):
                        tranx = 4
                        motorDuration = 10
                        print (intHeadingDeg - targetDegs)
                        print "Left 1"
                elif intHeadingDeg >= targetDegs:
                    if 1 < (targetDegs - intHeadingDeg):
                        tranx = 2
                        motorDuration = 10
                        print (intHeadingDeg - targetDegs)
                        print "Right 2"
                    elif 1 > (targetDegs - intHeadingDeg):
                        tranx = 4
                        motorDuration = 10
                        print (intHeadingDeg - targetDegs)
                        print "Left 2"

        #//// End Move Bot //////////////////////////////////


        #////////CV Drawing//////////////////////////////

        #Target circle
        cv2.circle(frame, (tX, tY), 10, (0, 0, 255), thickness=-1)

        #Background for text.
        cv2.rectangle(frame, (18,2), (170,160), (255,255,255), -1)

        #Target angle.
        cv2.line(frame, (tX,tY), (cxAvg,cyAvg),(0,255,0), 1)

        #Bot's X and Y is written to image
        cv2.putText(frame,str(cx)+" cx, "+str(cy)+" cy",(20,20),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))

        #Bot's X and Y averages are written to image
        cv2.putText(frame,str(cxAvg)+" cxA, "+str(cyAvg)+" cyA",(20,40),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))

        #"Lock" is written to the screen if the X&Y are close to X&Y Avg for several iterations.
        cv2.putText(frame,stringXOk,(20,60),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))
        cv2.putText(frame,stringYOk,(20,80),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))

        #Print the compass to the frame.
        cv2.putText(frame,"Bot: "+headingDeg+" Deg",(20,100),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))
        cv2.putText(frame,"Target: "+strTargetDegs+" Deg",(20,120),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))

        #Dots eaten.
        cv2.putText(frame,"Dots Ate: "+ str(dots),(20,140),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))

        #After the frame has been modified to hell, show it.
        cv2.imshow('frame',frame) #Color image
        cv2.imshow('thresh',thresh2) #Black-n-White Threshold image

        #/// End CV Draw //////////////////////////////////////

        if cv2.waitKey(33) == 27:
            # Clean up everything before leaving
            cv2.destroyAllWindows()
            cap.release()
            #Tell the robot to stop before quit.
            ser.write("5")
            ser.close() # Closes the serial connection.
            break

def rxtx():

    # Below 32 everything in ASCII is gibberish
    counter = 32

    #So the data can be passed to the OpenCV thread.
    global rx
    global intRx
    global tranx
    global motorDuration
    global motorBusy

    while(True):
        counter += 1

        # Read the newest output from the Arduino
        rx = ser.readline()

        #This is for threading out the motor timer, allowing for control
        #over the motor burst duration.
        if motorBusy == "No":
            ser.write(str(tranx))  #send as an ASCII digit; the Arduino does Serial.read() - '0'
            ser.flushOutput() #Clear the buffer?
            motorBusy = "Yes"

        #Delay one tenth of a second
        sleep(.1)

        #This takes only the first three characters.
        rx = rx[:3]

        #This removes any EOL characters
        rx = rx.strip()

        #If the number is less than 3 digits, a decimal point can sneak in;
        #we get rid of it so we can have a clean str to int conversion.
        rx = rx.replace(".", "")

        #If rx isn't a clean number, fall back to 0.
        try:
            intRx = int(rx)
        except ValueError:
            intRx = 0

        #Reset counter if over 255.
        if counter == 255:
            counter = 32

def motorTimer():
    global motorDuration
    global motorBusy

    while(1):
        if motorBusy == "Yes":
            sleep(.2) #Sets the motor burst duration.
            ser.write("5")
            sleep(.3) #Sets time in between motor bursts.
            motorBusy = "No"

#Threads OpenCV stuff.
OpenCV = threading.Thread(target=OpenCV)
OpenCV.start()

#Threads the serial functions.
rxtx = threading.Thread(target=rxtx)
rxtx.start()

#Threads the motor functions.
motorTimer = threading.Thread(target=motorTimer)
motorTimer.start()

Importing Modules

 1  #Written by the pathos filled hack: C. Thomas Brittain
 2  
 3  import cv2
 4  import numpy as np
 5  import serial
 6  from time import sleep
 7  import threading
 8  import math
 9  from math import atan2, degrees, pi
10  import random

Ok.  The beginning.  

So lines 3-10 pull in the modules we will need. My take on a module is the following: "Code some smart guy wrote and doesn't want anymore, so he gave it to me to use."

To be specific:
  • cv2 is the OpenCV module.
  • Numpy, which we'll call "np" throughout the code, handles the heavier number-crunching OpenCV needs to do her magic.
  • Serial is the module that lets us establish a serial connection between the PC and the robot, via whichever wireless device you've chosen.
  • Time allows us to basically idle the code. This is important for controlling many things, for instance, how far the robot moves: we tell the motors to turn on, wait, then turn off. Because the sleep function actually puts the code into an idle state, we must have the threading module, since our code requires the PC to do several things at once.
  • Math. From the math module we get the functions that simplify the trigonometry calculations, like the angle between the robot and the target.
  • The random module is only used to give us a random target.
  • Threading. Important module. Basically, threading allows the computer to do several tasks at the same time. This becomes important when we are both trying to track the robot and receive his position (there's a bare-bones sketch of the pattern after this list). Throughout this code we will have three threads:
    1. The thread running the OpenCV stuff. This tracks the robot and is also the largest.
    2. A thread controlling the serial connection between the robot and PC.
    3. And a thread with the small job of telling the motors how long to stay on, thereby controlling how far the robot moves.
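Here's that bare-bones sketch: two loops running at the same time, each free to sleep without stalling the other. The function names are mine, not the script's:

#A stripped-down version of the threading pattern the script uses.
import threading
from time import sleep

def watch():
    while True:
        print "tracking the robot..."
        sleep(0.5)

def talk():
    while True:
        print "reading the serial line..."
        sleep(0.1)

watcher = threading.Thread(target=watch)
talker = threading.Thread(target=talk)
watcher.start()
talker.start()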

Setup

12  #Open COM port to tether the bot.
13  ser = serial.Serial('COM34', 9600)
14  
15  #For getting information from the Arduino (tx was taken by Target X :P)
16  global rx
17  rx = " "
18  
19  #For sending information to the Arduino
20  global tranx
21  tranx = 0
22  
23  #For converting the compass heading into an integer
24  global intRx
25  intRx = 0
26  
27  #I've not used this yet, but I plan on scaling motor duration based on
28  #how far away from the target we are.
29  global motorDuration
30  motorDuration = 0
31  
32  #A flag variable for threading my motor timer.
33  global motorBusy
34  motorBusy = "No"
35  
36  #Holds the frame index
37  global iFrame
38  iFrame = 0

13: This is where we actually open a serial connection to the wireless device you're using. Note, we've named the opened serial connection "ser," so when we go to send information it will look something like ser.write("What you want to send here").

15-38: Here we declare a bunch of variables. The "global variable" statement lets the code know that the variable is going to jump between all threads; the variable = 0 line actually declares it. Do know, you'll have to remind each thread a variable is global by restating "global variable" inside it.

One thing I should state: iFrame = 0 is an actual variable declaration, as well as setting it to 0. Of course, this is how one declares an integer variable with an initial value of 0. On the flip side, rx = " " is also a variable declaration, but this time a string. You'll know I switched information from an integer to a string if you see something like this:

headingDeg = str(intHeadingDeg)

That tells the code, "I want to convert the value in intHeadingDeg, which is an integer, into a string and call it 'headingDeg'"

The comments indicate what each variable is meant for. Not going to lie, I'm not sure I don't have some declared variables I meant to use, didn't, and forgot to remove.

One important variable is the iFrame variable, since it tracks which frame we are on.  This becomes key in all aspects of tracking our robot.

OpenCV Function Begins

42  def OpenCV():
43      #Create video capture
44      cap = cv2.VideoCapture(0)
45  
46      #Globalizing variables
47      global cxAvg  #<----I can't remember why...
48      global cxFound
49      global iFrame
50      global intRx
51      global rx
52      global tranx
53  
54      #Flag for getting a new target.
55      newTarget = "Yes"
56  
57      #Dot counter. He's a hungry hippo...
58      dots = 0
59  
60      #This holds the bot's centroid X & Y average
61      cxAvg = 0
62      cyAvg = 0
63  
64      #Stores old position for movement assessment.
65      xOld = 0
66      yOld = 0
67  
68      #Clearing the serial send string.
69      printRx = " "

42: Here we start the function that does most of the work, OpenCV(). It is one of the functions that will be threaded at lines 345-347.

44: We open up the webcam and give it the nickname cap. If I remember right, the "0" in the parentheses refers to whatever camera comes first on your USB bus, so if you have more than one camera you can specify which by changing this number, e.g., cap = cv2.VideoCapture(3). Notice we called the OpenCV module cv2, so we are using the OpenCV module to access the webcam.

46-52: Just making the variables we declared work within this function. This might not be needed, but hey, I didn't read the whole Python manual.

55: This is just a string flag that is flipped to tell the PC to generate a new target for the robot.  Note, we initially set it to "Yes" meaning the first time we run through this function a target needs to be generated.

58: This is an integer variable to count how many dots the robot has "ate."

Ok, before I get to the next bit I need to take a minute and explain how we approach actually getting the coordinates of our robot. As you know, OpenCV does the hard work for us, giving us the X and Y coordinates of the largest red blob on the screen. Though, the coordinates it gives us are the center of the mass. Now, this is all just a logical guess because I didn't read the whole OpenCV manual, but I believe the X and Y coordinates that refer to the center of this mass are called the centroid.

This might seem simple. That's because it is; I'm not sure why we don't just call it the damn center or something. Eh, oh well. Though, it will become important when we do collision detection between the robot and its target.

61-62: All that to say, the "c" in cyAvg and cxAvg stands for centroid.  So, these are variables that will hold the running average for the X and Y coordinates of the red blob's centroid.

65-66: These are back-up variables of the cxAvg and cyAvg and will be important around line 122-127 when  we are trying to decide if the color we are tracking is actually the robot or some other piece of junk with enough red in it to fool OpenCV.

69: This simply clears the string variable with data that came from the robot, like the robot's heading, before another iFrame starts.
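One aside on those averages before we move on: cxAvg = (cxAvg + cx) / 2 isn't a true mean over all the frames; it's a running blend in which the newest reading always carries half the weight, so old positions fade out quickly. A toy run (made-up readings, my own example) shows it settling:

#The script's "average" is a running blend: each new reading gets half the weight.
cxAvg = 0
for cx in [100, 104, 96, 102, 100]:   #made-up centroid readings
    cxAvg = (cxAvg + cx) / 2
    print cxAvg   #prints 50, 77, 86, 94, 97 -- chasing the latest readings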

Main Loop

71  while(1):
72  
73      #"printRx" is separate in case I want to
74      #parse out other sensor data from the bot
75      printRx = str(intRx)
76      #Bot heading, unmodified
77      headingDeg = printRx
78      #Making it a number so we can play with it.
79      intHeadingDeg = int(headingDeg)
80  
81      headingDeg = str(intHeadingDeg)
82  
83      #Strings to hold the "Target Lock" status.
84      stringXOk = " "
85      stringYOk = " "

71: Creates a loop within the OpenCV() function.

73-81: Ok, I need to be humble here and say I'm not sure what the Cthulhu's Kitchen I was doing. I know printRx = str(intRx) is taking the information received from the robot and converting it into a string. intRx is a global variable, and it is loaded with robot data at line 326. headingDeg = printRx is moving the heading data from one variable to another; the idea here was that if I wanted more information to come from the robot besides the compass heading, it would come in through printRx, and then I could chop it up and load it into variables respective to their purpose.

For instance, printRx.split(",") gives a list of strings based on the commas currently held within printRx:

printRx = "2, 23, 88"
compass, sonar, battery_life = printRx.split(",")

Now compass is "2", sonar is " 23", and battery_life is " 88" (still strings, and still wearing their spaces, so they'd need a strip() and an int() before use).

 
But the part that confuses me is that I turn right back around and convert the string back to an integer? I'm not sure, guys. I might have been watching South Park while coding again.

At the end of that poor coding we end up with two variables to use: intHeadingDeg and headingDeg. We use the integer intHeadingDeg for any calculations that involve the robot's heading. The other, headingDeg, is for printing the robot's heading to the screen, which is done at line 263.

84-85: These are string variables that will hold "X Lock" and "Y Lock" once we are tracking the robot. These strings are needed so we can print the lock status to the screen at lines 259-260.

OpenCV Begins

 87      #Incrementing frame index
 88      iFrame = iFrame + 1
 89  
 90      #Read the frames
 91      _,frame = cap.read()
 92  
 93      #Smooth it
 94      frame = cv2.blur(frame,(3,3))
 95  
 96      #Convert to hsv and find range of colors
 97      hsv = cv2.cvtColor(frame,cv2.COLOR_BGR2HSV)
 98      thresh = cv2.inRange(hsv,np.array((0, 80, 80)), np.array((20, 255, 255)))
 99  
100      thresh2 = thresh.copy()
101  
102      #Find contours in the threshold image
103      contours,hierarchy = cv2.findContours(thresh,cv2.RETR_LIST,cv2.CHAIN_APPROX_SIMPLE)
104  
105  
106      #Finding contour with maximum area and store it as best_cnt
107      max_area = 0
108      for cnt in contours:
109          area = cv2.contourArea(cnt)
110          if area > max_area:
111              max_area = area
112              best_cnt = cnt
113  
114      #Finding centroids of best_cnt and draw a circle there
115      M = cv2.moments(best_cnt)
116      cx,cy = int(M['m10']/M['m00']), int(M['m01']/M['m00'])
117      cv2.circle(frame,(cx,cy),10,255,-1)
118  
119      #After 150 frames, it compares the bot's X and X average;
120      #if they are the same + or - 5, it assumes the bot is being tracked.
121      if iFrame >= 150:
122          if cxAvg < (cx + 5) and cxAvg > (cx - 5):
123              xOld = cxAvg   #was "xOld == cxAvg", a no-op comparison
124              stringXOk = "X Lock"
125          if cyAvg < (cy + 5) and cyAvg > (cy - 5):
126              yOld = cyAvg   #was "yOld == cyAvg"
127              stringYOk = "Y Lock"

We're in the meat now. 

88: This increments our frame counter.

91: We read a single frame from the webcam we declared, cap, at line 44.

OPENCV! Sorry, I just love it so much.  

So, by now you know I've not read the OpenCV manual.  And please don't tell me, "What! Go RTFM!"  You go RTFM! I've got a wife, kid, and a job I love.  I'm just going to tinker with crap and get it to work. But this attitude will begin to show as we go through the OpenCV calls, since I don't know their inner working.  Instead, I'm going to offer my best guess, and as always, if someone wants to correct me or offer better explanation, I'll post and give credit. 

94: This blurs the image we got.  You may say, "But I thought higher resolution was better?" It is.  But jagged edges and color noise are not.  A simple shape is much easier for math of OpenCV to wrap around then a complex one.  Therefore, we blur the image a little, giving us softer edges to deal with.

Also, blur melds colors, so if there are 2 blue pixels and 1 red pixel in a group, they become 3 blue-purplish pixels.  This has the nifty benefit of speeding up the image processing a lot.  How much? I don't know; I didn't RTFM.


97-100: Our image is converted to the HSV color space here.  Having the image in HSV format lets us make comparative statements about color.  What we use it for is to get rid of all the colors except the one we are trying to find.  Line 98 does the actual thresholding, giving us a black and white image, the white being only the color we are looking for.  Line 98 is also where your color is defined (it's the two "np.array"s).

In the next step I'll go through how to select your robot's exact color.


103: Finds the contours of the white area in the resulting image.

107-112: OpenCV then measures the white area of each contour it found in the webcam image.  It assumes whichever has the most white area (aka, "mass") is our object.

114-117: After we've decided which object we want to track, we need its centroid coordinates.  That is what lines 115-116 do.  The moments math is less mysterious than it sounds: m00 is the blob's total area, while m10 and m01 are the sums of all its pixels' x and y coordinates, so m10/m00 and m01/m00 are just the average x and average y, which is the centroid.  Line 117 then draws a circle there.
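
A quick numpy sanity check of that claim (the tiny 5x5 mask is made up for illustration; cv2.moments(best_cnt) works from the contour outline, so its numbers can differ a hair):

    import numpy as np

    #A fake threshold image: a 2x2 white blob in a 5x5 frame.
    mask = np.zeros((5, 5), dtype=np.uint8)
    mask[1:3, 2:4] = 255

    ys, xs = np.nonzero(mask)   #coordinates of every white pixel
    cx = xs.mean()              #same idea as M['m10'] / M['m00']
    cy = ys.mean()              #same idea as M['m01'] / M['m00']
    print(cx, cy)               #(2.5, 1.5), the blob's center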

121-127: Here we lock onto the mass we believe is the robot.  The code collects 150 frames before it will state it is tracking the largest mass.  After that, lines 122-127 try to stay locked onto it.  In essence, we allow the mass to move enough to be considered movement by the robot, but not so much that noise (like a stray hand in the webcam image) will cause the tracking to switch off the robot.

OpenCV: Selecting Your Color

LMR_logo_150_Python.jpg
98:        thresh = cv2.inRange(hsv,np.array((130, 70, 110)), np.array((190, 190, 200)))

This particular line defines what color you are looking for, specifically the two sets of values: 130, 70, 110 and 190, 190, 200.  These set the lower limit and the upper limit of the color you are trying to find.  The reason we use upper and lower limits, which we'll call color thresholds, is that our robot will move through different lighting.  Different light sources have a tendency to change how the webcam reads the color.

The color format we are using is HSV, which stands for hue, saturation, value.  Later, I'll probably write code to select the robot's color from within our actual program, but for now I use Gimp and the following method:

  1. Set up your webcam in the area you'll be using, just like you're ready to control him.
  2. Run the webcam program attached in step 10.

     
  3. While the webcam program is watching your robot, hit Ctrl + Print Screen
  4. Open Gimp.
  5. Hit Ctrl + V to paste the screen capture into gimp.

     
  6. Now, find the Color Selector tool.

     
  7. Select the main color of your robot.
  8. Now double click on the color square on the toolbar.

     
  9. A window should pop open with color information regarding the color you selected, your robot.
  10. Now, the three numbers listed should be close to what we need.  Sadly, we have to convert from Gimp's HSV ranges to OpenCV's.  In Gimp, H = 0-360, S = 0-100, and V = 0-100.  In OpenCV, H = 0-180, S = 0-255, and V = 0-255.  So, some conversion needs to take place.
  11. From my selection I ended up with Gimp numbers of H: 355, S: 50, and V: 61.  I could get all fancy and calculate the exact numbers, but I figure 180 (OpenCV) is half of 360, so for my H I just divided by two: 177. The other two I kinda guessed at: I doubled each and added 25, giving S: 125 and V: 147.
  12. In the end, this gave me middle numbers.  But I wanted an upper and lower threshold, so I took each number, subtracted 20 for the lower, and added 20 for the upper.
  13. The result for my robot was:
        thresh = cv2.inRange(hsv,np.array((152, 105, 127)), np.array((180, 145, 167)))
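
If you'd rather skip the eyeballing, the exact scaling is H / 2, S * 2.55, V * 2.55.  Here's a little helper sketch (gimp_to_cv_range is my made-up name, not part of the tutorial code; the +/- 20 spread matches what I did above):

    import numpy as np

    def gimp_to_cv_range(h, s, v, spread=20):
        #Converts a Gimp pick (H 0-360, S/V 0-100) into OpenCV
        #lower/upper bounds (H 0-180, S/V 0-255), with a +/- spread
        #to tolerate lighting changes.
        center = np.array((h / 2.0, s * 2.55, v * 2.55))
        maxes = np.array((180, 255, 255))
        lower = np.maximum(center - spread, 0).astype(np.uint8)
        upper = np.minimum(center + spread, maxes).astype(np.uint8)
        return lower, upper

    lower, upper = gimp_to_cv_range(355, 50, 61)
    #then: thresh = cv2.inRange(hsv, lower, upper)

One caveat: red sits near OpenCV's hue wrap-around (179 rolls over to 0), so a very red robot may need two inRange() masks OR'd together.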

I'll try to code a color selector into the program to make this whole damn thing a cinch.

If you'd like to read more, here are two good posts on Stack Overflow.
  1. Choosing HSV
  2. Finding HSV in image.

OpenCV: Centroid Average and Target Generation

LMR_logo_150_Python.jpg
129:        #This is finding the average of the X coordinate.  Used for establishing
130:        #a visual link with the robot.
131:        #X
132:        cxAvg = cxAvg + cx
133:        cxAvg = cxAvg / 2
134:        #Y
135:        cyAvg = cyAvg + cy
136:        cyAvg = cyAvg / 2
137:
138:        #//Finding the Target Angle/////////////////////////////////////
139:
140:        #Target coordinates.
141:        #Randomizing target.
142:        if newTarget == "Yes":
143:            tX = random.randrange(200, 400, 1)
144:            tY = random.randrange(150, 350, 1)
145:            newTarget = "No"
146:
147:        if iFrame >= 170:
148:            if tX > cxAvg - 45 and tX < cxAvg + 45:
149:                print "Made it through the X"
150:                if tY > cyAvg - 45 and tY < cyAvg + 45:
151:                    print "Made it through the Y"
152:                    newTarget = "Yes"
153:                    dots = dots + 1

132-136: Here we keep a running average of the centroid's X and Y, loaded into the variables cxAvg and cyAvg.  (Strictly speaking, each new frame is averaged with the old average, so recent frames count for more; a crude rolling average.)  Again, this is to assure we are tracking the robot.

142-145: Here the target, or "dot," for the robot to run after is randomly generated.  As you may notice, I restricted the generation area of the dots toward the center of my webcam's field-of-view.  That's because I'm messy and dots were going where the little robot couldn't get.

147-153: This is a rough collision detection check.  Basically, if the robot gets close enough to the target (within 45px on both the X and Y axes), it is considered to have "eaten" the dot.  If it has, the dots variable is incremented, showing the total amount he's done ate, and the newTarget string variable is flipped so a new target gets generated on the next run through.
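
Side note: the nested ifs above make the "eaten" zone a 90x90 box around the dot.  If you ever want a true circle of radius 45 instead, math.hypot does it in one comparison (a sketch using the same variable names, with sample values so it runs on its own):

    import math

    #Sample values; in the real loop these come from the tracker.
    iFrame, tX, tY, cxAvg, cyAvg = 200, 300, 250, 310, 240
    dots, newTarget = 0, "No"

    if iFrame >= 170 and math.hypot(tX - cxAvg, tY - cyAvg) < 45:
        newTarget = "Yes"
        dots = dots + 1
    print(dots)  #1, since hypot(-10, 10) is about 14.1, inside the circle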

OpenCV: Find the Angle

LMR_logo_150_Python.jpg
155:        #Slope
156:        dx = cxAvg - tX
157:        dy = cyAvg - tY
158:
159:        #Quad I -- Good
160:        if tX >= cxAvg and tY <= cyAvg:
161:            rads = atan2(dy,dx)
162:            degs = degrees(rads)
163:            degs = degs - 90
164:        #Quad II -- Good
165:        elif tX >= cxAvg and tY >= cyAvg:
166:            rads = atan2(dx,dy)
167:            degs = degrees(rads)
168:            degs = (degs * -1)
169:        #Quad III
170:        elif tX <= cxAvg and tY >= cyAvg:
171:            rads = atan2(dx,-dy)
172:            degs = degrees(rads)
173:            degs = degs + 180
174:            #degs = 3
175:        elif tX <= cxAvg and tY <= cyAvg:
176:            rads = atan2(dx,-dy)
177:            degs = degrees(rads) + 180
178:            #degs = 4

156-177: Here we find the angle between the robot and his target.  We basically divide the entire screen into four quadrants, always using the robot's centroid as the point of origin.  We then calculate the slope between the target's X and Y (tX, tY) and the robot's X and Y (cxAvg and cyAvg).

Something like this: [diagram of the four quadrants drawn around the robot's centroid]

If the target were located in quadrant III, it would go something like this: [diagram working through the quadrant III case]
If you'd like to dig further into Trigonometric Functions in Python, have fun.  Share if you find better math :)
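
For the record, the whole four-quadrant dance can collapse into a single atan2 call if you pick the arguments so 0º lands at "video-game north."  A sketch of what I mean (my assumption about the convention: screen Y grows downward, 0º is the top of the screen, and angles increase clockwise):

    from math import atan2, degrees

    def target_bearing(cxAvg, cyAvg, tX, tY):
        dx = tX - cxAvg   #how far right of the bot the target sits
        dy = cyAvg - tY   #flipped, since screen Y grows downward
        #atan2 with x first gives 0 at north, growing clockwise.
        return degrees(atan2(dx, dy)) % 360

    print(target_bearing(320, 240, 320, 100))  #0.0, straight "up"
    print(target_bearing(320, 240, 420, 240))  #90.0, due "right"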

OpenCV: Stupid Floats

LMR_logo_150_Python.jpg
180:        #Convert float to int
181:        targetDegs = int(math.floor(degs))
182:
183:        #Variable to print the degrees offset from target angle.
184:        strTargetDegs = " "
185:
186:        #Put the target angle into a string to be printed.
187:        strTargetDegs = str(math.floor(degs))
188:
189:        #///End Finding Target Angle////////////////////////////////////

181: When we find the angle between the robot and the target and convert it into degrees, we end up with a float. That's more precision than we need, so here we convert the float (degs) to an integer (targetDegs) so we can compare it to the robot's compass heading.

184: We declare an empty string called strTargetDegs.
187: Then we convert the float degs into a string so we can print the target angle onto the screen at line 264.

OpenCV: Time to Move!

LMR_logo_150_Python.jpg
192: #//// Move Bot //////////////////////////////////////
193:
194: #Don't start moving until things are ready.
195: if iFrame >= 160:
196:     #This compares the bot's heading with the target angle.  It must
197:     #be +-30 for the bot to move forward, otherwise it will turn.
198:     if intHeadingDeg <= (targetDegs + 30) and intHeadingDeg >= (targetDegs - 30):
199:         tranx = 3
200:         motorDuration = 10 #I'll use later
201:         #Forward
202:     else:
203:         if intHeadingDeg < targetDegs:
204:             if 1 < (targetDegs - intHeadingDeg):
205:                 #abs(intHeadingDeg - targetDegs) >= 180:
206:
207:                 tranx = 2
208:                 motorDuration = 10
209:                 print (intHeadingDeg - targetDegs)
210:                 print "Right 1"
211:             elif 1 > (targetDegs - intHeadingDeg):
212:                 #abs(intHeadingDeg - targetDegs) < 180:
213:
214:                 tranx = 4
215:                 motorDuration = 10
216:                 print (intHeadingDeg - targetDegs)
217:                 print "Left 1"
218:         elif intHeadingDeg >= targetDegs:
219:             if 1 < (targetDegs - intHeadingDeg):
220:                 #abs(intHeadingDeg - targetDegs) <= 180:
221:
222:                 tranx = 2
223:                 motorDuration = 10
224:                 print (intHeadingDeg - targetDegs)
225:                 print "Right 2"
226:             elif 1 > (targetDegs - intHeadingDeg):
227:                 #abs(intHeadingDeg - targetDegs) > 180:
228:
229:                 tranx = 4
230:                 motorDuration = 10
231:                 print (intHeadingDeg - targetDegs)
232:                 print "Left 2"
233:
234: #//// End Move Bot //////////////////////////////////

This is where I need help, guys.  My turning code has a bug, so if you find it and come up with a correction, I'll send you a prize.  I dunno? A lint ball?  It'd probably be one of my leftover circuit boards, or some piece of hardware I hacked together.

But for now, let's take a look.

The idea is like:



The code is supposed to go as follows:

if target1 == True:
    MoveForward()
elif target2 == True:
    TurnRight()
elif target3 == True:
    TurnLeft()

And for the most part that happens, but occasionally it's dumb and turns left when it should turn right.  Not sure what I'm doing wrong.  Hey, that "You and me buddy, until the end" is a two-way street. :P
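
(A hunch, which the UPDATE further down seems to confirm: the bug is the 0/360 wrap-around.  A heading of 350º and a target of 10º are only 20º apart clockwise, but naive subtraction says -340, which sends the bot turning the long way around.  Wrapping the difference into the -180 to 180 range fixes the decision:)

    heading, target = 350, 10
    naive = target - heading              #-340: "turn left 340 degrees"
    shortest = (naive + 180) % 360 - 180  #20: "turn right 20 degrees"
    print(shortest)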

Let's step through it

195: We want to make sure we are deep into tracking the robot before we start moving it towards the target.

198: We compare intHeadingDeg, which is the robot's heading angle, with targetDegs, which is the angle between the robot and the target.  But we do this + or - 30º.  This means the robot's heading does not have to match the angle to the target exactly; it only needs to be pointing approximately in the right direction.

199: The movement code for the robot to go forward is 3, so here, given the robot is approximately headed in the right direction, we tell it to move forward.  This happens by loading 3 into the variable tranx, which is transmitted to the robot at line 308.  When this code gets transmitted to my robot, the Arduino code at line 137 tells the Forward(); function to fire.

202: If our robot isn't headed in the right direction, then which way should he turn?

203-232:  Still debugging here.  I'm sorry, guys.  I can tell you this code works "Ok."  But once I'm done with this tutorial, I'll go back and focus on making it turn perfectly.  Sorry, this code took me two days to write, but this tutorial has taken too many days.

Though, within each of the if statements we have two variable assignments: tranx = X and motorDuration = 10.  The tranx value tells the robot which direction to move, and motorDuration tells it how long to move that way (the duration is not yet being utilized in my code).

UPDATE: Fixed the turning code with the following. Works great now :)
 
        #Don't start moving until things are ready.
        if iFrame >= 160:
            #This compares the bot's heading with the target angle.  It must
            #be +-30 for the bot to move forward, otherwise it will turn.
            shortestAngle = targetDegs - intHeadingDeg
            
            if shortestAngle > 180:
                shortestAngle -= 360
            
            if shortestAngle < -180:
                shortestAngle += 360
            
            if shortestAngle <= 30 and shortestAngle >= -31:
                tranx = 3
                print "Forward"
            elif shortestAngle >= 1:
                tranx = 2
                print "Turn Right"
            elif shortestAngle < 1:
                tranx = 4
                print "Turn Left"

OpenCV: Print to Screen

LMR_logo_150_Python.jpg
239:        #////////CV Drawing//////////////////////////////
240:
241:        #Target circle
242:        cv2.circle(frame, (tX, tY), 10, (0, 0, 255), thickness=-1)
243:
244:        #ser.write(botXY)
245:
246:        #Background for text.
247:        cv2.rectangle(frame, (18,2), (170,160), (255,255,255), -1)
248:
249:        #Target angle.
250:        cv2.line(frame, (tX,tY), (cxAvg,cyAvg),(0,255,0), 1)
251:
252:        #Bot's X and Y is written to image
253:        cv2.putText(frame,str(cx)+" cx, "+str(cy)+" cy",(20,20),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))
254:
255:        #Bot's X and Y averages are written to image
256:        cv2.putText(frame,str(cxAvg)+" cxA, "+str(cyAvg)+" cyA",(20,40),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))
257:
258:        #"Ok" is written to the screen if the X&Y are close to X&Y Avg for several iterations.
259:        cv2.putText(frame,stringXOk,(20,60),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))
260:        cv2.putText(frame,stringYOk,(20,80),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))
261:
262:        #Print the compass to the frame.
263:        cv2.putText(frame,"Bot: "+headingDeg+" Deg",(20,100),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))
264:        cv2.putText(frame,"Target: "+strTargetDegs+" Deg",(20,120),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))
265:
266:        #Dots eaten.
267:        cv2.putText(frame,"Dots Ate: "+ str(dots),(20,140),cv2.FONT_HERSHEY_COMPLEX_SMALL,.7,(0,0,0))
268:
269:        #After the frame has been modified to hell, show it.
270:        cv2.imshow('frame',frame) #Color image
271:        cv2.imshow('thresh',thresh2) #Black-n-White Threshold image
272:
273:        #/// End CV Draw //////////////////////////////////////

Here, we draw everything onto the frame before we show it.

242: Red circle for target.

247: White box to display black text on.  Note, things drawn later are painted on top of things drawn earlier.  So, if you want something to sit on top, put it towards the bottom of this section; the white box comes first so all the text lands on top of it.

250: This is the green line between the target and our robot.

253-267: We display all our info here.  Compass heading, target-lock, etc.

270: This actually shows the color window (the window we wrote everything on).

271: This shows the black-and-white threshold copy of the captured frame.  Notice the white area; that's what gets assessed as our robot.

OpenCV: Quit It

LMR_logo_150_Python.jpg
276:        if cv2.waitKey(33) == 27:
277:            # Clean up everything before leaving
278:            cap.release()
279:            cv2.destroyAllWindows()
280:            #Tell the robot to stop before quit.
281:            ser.write("5")
282:            ser.close() # Closes the serial connection.
283:            break

276: An if-statement that waits for the ESC key to be pressed.  If it gets pressed, we close stuff.

278: This releases our webcam.

279: This closes the windows we were displaying the color and HSV frames.

281: We send the code to stop our robot.  If we don't do this and we hit the ESC in the middle of a robot movement, that move will continue forever.

282: Here we close the serial connection.

283: We quit.

Towards the beginning of this article I stated my webcam had crappy drivers; well, while writing this I noticed I had placed cv2.destroyAllWindows() before cap.release(). This is what was causing the problem.  My interpretation: our camera was being sucked into the void where the destroyed windows go.  Anyway, I switched the order and it seems to have solved the problem.

RXTX: Beginning

LMR_logo_150_Python.jpg
285: def rxtx():
286:
287:
288:     # Below 32 everything in ASCII is gibberish
289:     counter = 32
290:
291:     #So the data can be passed to the OpenCV thread.
292:     global rx
293:     global intRx
294:     global tranx
295:     global motorDuration
296:     global motorBusy
297:
Finally, we open our second threaded function.  This function is much smaller than the OpenCV function; all the serial communication takes place here.

289: This sets a counter starting at 32, since everything below 32 in ASCII is non-printable gibberish.

292-296: Global variables for passing robot information to other threads.


RXTX: the Wireless Magic

LMR_logo_150_Python.jpg
298:    while(True):
299:
300:
301:
302:        # Read the newest output from the Arduino
303:        rx = ser.readline()
304:
305:        #This is for threading out the motor timer.  Allowing for control
306:        #over the motor burst duration.
307:        if motorBusy == "No":
308:            ser.write(tranx)
309:            ser.flushOutput() #Clear the buffer?
310:            motorBusy = "Yes"
311:
312:        #Delay one tenth of a second
313:        sleep(.1)
314:
315:        #This is supposed to take only the first three digits.
316:        rx = rx[:3]
317:
318:        #This removes any EOL characters
319:        rx = rx.strip()
320:
321:        #If the number is less than 3 digits, the decimal point will be included;
322:        #we get rid of it so we can have a clean str to int conversion.
323:        rx = rx.replace(".", "")
324:
325:        #We don't like 0.  So, this does away with it.
326:        try:
327:            intRx = int(rx)
328:        except ValueError:
329:            intRx = 0
        

303: We read information into the variable rx. The information comes in over the serial line we opened at the code's beginning.

307: This is a flag gate: it makes it so our Python code can only send a motor command when the robot isn't already in the middle of a movement.

308: We write whatever value is in tranx, which should be loaded with some sort of movement from lines 192-232.

313: I think I threw this in there so polling the serial line wouldn't bog down my code.

316: We strip the reading down to its first three digits; remember, this is the compass heading in degrees, e.g., 000-360º.

319: When something is sent over serial it gets an end-of-line character. We don't want that.

323: The robot collected this number from a compass, which gave a number with a decimal involved.  This removes the decimal so we are only dealing with whole numbers.

326-329: Looking at it now, I believe this handles the oddities of serial reads: ser.readline() can time out and return an empty string (or a garbled partial line), and int() throws a ValueError on anything that isn't a clean number.  So we convert the reading when we can and fall back to 0 when we can't, instead of crashing the thread.
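
A quick illustration of the failure mode, with made-up readings:

    for raw in ["182", "", "18$"]:
        try:
            intRx = int(raw)
        except ValueError:
            intRx = 0      #a timed-out or garbled read becomes 0
        print(intRx)       #prints 182, then 0, then 0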


Motor Sleeper

LMR_logo_150_Python.jpg
334: def motorTimer():
335:     global motorDuration
336:     global motorBusy
337:
338:     while(1):
339:         if motorBusy == "Yes":
340:             sleep(.2) #Sets the motor burst duration.
341:             ser.write("5")
342:             sleep(.3) #Sets time in between motor bursts.
343:             motorBusy = "No"

This is a short threaded function.  It really only has one job: controlling how long the motors on the robot stay on.  It works like this: if we send the robot a message to move forward, it continues to do so until line 341.  There, the command to stop is sent to the robot, and the motorBusy flag is set back to "No," meaning the motors are ready to be used again.

340: This sets how long the motors will stay on.  For instance, if it were changed to sleep(1), the robot's motors would continue in the direction they were told for 1 second.

342: This makes the robot wait between movements.  In theory, this was meant to ensure OpenCV could keep up with the little guy.  So, if you have a fast robot, you might set this higher.

Thread's End

LMR_logo_150_Python.jpg
#Threads OpenCV stuff.        
OpenCV = threading.Thread(target=OpenCV)
OpenCV.start()

#Threads the serial functions.
rxtx = threading.Thread(target=rxtx)
rxtx.start()

#Threads the motor functions.
motorTimer = threading.Thread(target=motorTimer)
motorTimer.start()

Ok. Code's End.

This bit starts all three threads: OpenCV, rxtx, and motorTimer.

And here is my poor attempt to explain Python threading.  Most Python code runs sequentially; the order it's written is the order it's executed.  One problem with that is timing: if we need a delay in the code, the whole program has to pause.  Threading lets us get around this.  I see it like a juggler doing that trick where he keeps all the balls going in one hand while holding one ball still in the other.  I dunno, just how I see it.
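
If the juggler doesn't do it for you, here's about the smallest demo I can manage (the worker names are made up for the demo): the sleep() in one thread doesn't stall the other.

    import threading
    from time import sleep

    def slow_worker():
        for i in range(3):
            sleep(1.0)                 #only this thread pauses here
            print("slow tick %d" % i)

    def fast_worker():
        for i in range(6):
            sleep(0.5)
            print("fast tick %d" % i)

    t1 = threading.Thread(target=slow_worker)
    t2 = threading.Thread(target=fast_worker)
    t1.start()                         #both loops now run at once
    t2.start()
    t1.join()
    t2.join()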


Done?

Well, like I said, "You and me, buddy, until the end."  And here we are.  The end.  

I hope this code has been helpful.  But do know, you're not alone.  

cthomasbrittain@hotmail.com

Skype: thomas_ladvien


Skype or email me if you have any questions. Likewise, for all that crap I did a poor job explaining, coding, or writing, just shoot me an email and I'll fix it.

I still want to develop this into a Swarmie platform; so you might keep an eye out on www.letsmakerobots.com since I'll post my unfinished work there.  Alright, I'm off to work on the 8th iteration of my Swarmie...ugh.