Roborealm and hexapod
valentina from Italy  [43 posts]
6 years
Hi,
I am trying to run this tutorial:
http://www.hexapodrobot.com/guides/HexEngine_2_RoboRealm_Tutorial_01.html
because I need to implement an AI program that makes the robot sort some cubes of different colors.

My problem is that I can't make the robot receive the commands: the serial port is set up fine (I can send commands through a terminal) but it seems I'm not able to make it move through RoboRealm. In particular, it seems that every time I open a project, RoboRealm keeps the settings of the last one in memory (I set a sleep time for camera input), even when it is restarted by double-clicking on the original project file....

Can you help me? Thanks!
Anonymous 6 years
Most likely RR is still running in the background (since it is a separate process) which is why it appears that it remains in memory ...

Other than that, it's hard to tell what is going wrong with your project from such a brief description. If you know that the serial messages are reaching your robot, then most likely the serial script that you are using in RR is not sending the right commands ... can you include it here?

What MCU are you using?

What commands is the MCU expecting?

Are you using the API since you mention a project?

STeven.
valentina from Italy  [43 posts] 6 years
Well, first of all thanks for the reply ^^

I am using a p.Brain SMB (MSR-H01 HexEngine, Micromagic hexapod) and I'm working with PIP commands. We are 2 computer scientists in our first experience with robots, so all the hardware/serial part is very difficult for us.

The serial messages are reaching my robot (I tried a .exe that tests the serial connection for movements and it works well: the robot correctly receives the PIP commands from the PC).

The RR script that I used is the one linked (I include here the original zip file that includes both the RR file and the script). I am trying to make it run before making any changes to it.

In the end, the goal is to have the hexapod connected to the PC and receiving commands based on a radio-camera input. The environment where the robot has to work will be a white floor with a 2-colored line (2 lines of different colors, one next to the other, so the bot knows which side it is on) and some colored cubes (some red and some blue, for example). The main algorithm has to take the video stream, check for a line in it, then check for a red cube on the side where the robot is. If a red cube is found, the robot has to push it to the other side, then (on the other side) do the same with a blue cube (if found, it has to be pushed to the other side). All this has to be repeated until all red cubes are on one side and all blue ones on the other side.

I am also wondering how I can "reset" the image filtering: I want to recognize the line at the beginning with some black and white filters and then get the color image back to recognize the cubes....

thanks for all!

Hexapod_BLOB_Tracking_V1.zip
Anonymous 6 years
to "reset" the image use the math plugin.  have image 1 as the original, image 2 as current and choose replace #2
valentina from Italy  [43 posts] 6 years
thanks anonymous!

Using the Math module with "replace #2" doesn't give me the original image, but with "difference" it does :) Thinking about it, it seems it should work as you said and not as it does, but well, what matters to me for now is the result; if I can also understand it better, even better ;D
valentina from Italy  [43 posts] 6 years
well, nope, I was wrong: the "difference" works only in some positions, but anyway it'll work ;)

hope to receive suggestions about the communications, thanks! ^^
James Dickinson from Australia  [49 posts] 6 years
Hi there!

http://www.youtube.com/watch?v=rhxH0W3dkZo

Here is my MSR-H01 + OQO2 running Roborealm.

I have included my modification of the tutorial at www.hexapodrobot.com

I love this program, and I love this robot!  I've got an SRF08 to hook up yet to give my robot depth perception!
MSRH-01 tracking.zip
James Dickinson from Australia  [49 posts] 6 years
oops I forgot to add :

I have the p.Brain u24 with bluetooth module connecting the computer with the pod.

to change the port it connects to just double click the serial plugin and change it at the top.

I find I sometimes have problems with the hexapod and RoboRealm just not wanting to talk. Restarting RoboRealm always fixes it though.
valentina from Italy  [43 posts] 6 years
thanks very much James! :)

I also have a BT module (ESD200), on a different p.Brain than yours (you have the small hexapod, right? The one that has the BT module "natively"?) but I couldn't make it work so far (I made bad solder joints -it was my first time- then I fixed them but haven't spent more time on it yet): it can be seen by the PC's BT manager and it pairs correctly, but it doesn't work with Teraterm as expected. A lot of problems came from there... I want to use this project for a university exam (Artificial Intelligence), and I am working to make the robot work in time. After that I will work on the BT module with less hurry :)

I just downloaded your changed version, to see if it works on my hexapod (I don't know if the different p.brain would affect it).

I'd like you to keep following this thread, so that I can count on your help with hexapod-related matters, if you can. Our time for the exam is running out: we spent a lot of time just making it move at all and I'm a bit anxious about it :P Thanks!

PS: if you prefer, or if we go OT on non-RoboRealm matters, I can give you my personal email or FB username, just tell me what you prefer (if you like, and if it's no bother, of course! ;)
valentina from Italy  [43 posts] 6 years
PPS: oh, it's not the smaller one, it's just like mine :) I subscribed to your youtube channel, James :)
James Dickinson from Australia  [49 posts] 6 years
I am really impressed with Roborealm, I can use it as my "engine" for my own personal AI project in VB Script.  

In other words I will be around for a while yet!

Make sure you hang around and show me how yours goes!
valentina from Italy  [43 posts] 6 years
yes, for sure James, thanks! :D
James Dickinson from Australia  [49 posts] 6 years
I have modified the tutorial VB script and changed it to work with the Associated Video Memory (Navigator) plugin for roborealm.

unfortunately it seems you cannot use the keyboard input plugin along with AVM, as manual control stops working. As a workaround I have added a bunch of "set once" variables at the top: just open this plugin and untick then retick "set once" on the variable you want to set (to power up or power down, for example)
AVM for MSR-H01.zip
valentina from Italy  [43 posts] 6 years
James, that's awesome! the function is working (the robot moves :)

Let me see if I understood everything correctly (I'm testing it with a webcam on my PC, so I can't really see for now where the robot is going or when it will stop; I can only get an idea):

SET FUNC: by unticking and re-ticking a variable in the Set function I can make the robot wake up, sleep, and enable walking (setting walk_enable=1) or not (setting walk_enable=0).

AVM navigator: I can:
- train an object to be recognized with the "object recognition" radio button
- make the robot move in the direction of the tracked object with the "navigate mode" radio button
- in the "nova gate" mode I can left-click on an object and then click on "walking by way" to make the robot move in the direction of the clicked position
- in the "marker" mode I can make the robot look around and draw a virtual map of what it's seeing
- in the "navigation by map" mode I can make the robot move inside the map it drew in the previous point, avoiding obstacles (how?)

It seems there are a lot of functions in RoboRealm that I don't know yet and that I can use for our project: which do you think is (or are) the best for my goals? Tracking an object, or the colors and/or the line, etc.?

thanks James, you are giving me awesome tips and ideas: the hexapod world is different now for me :D
valentina from Italy  [43 posts] 6 years
PS: we would like to publish the final work (which we expect to finish by the end of July at the latest) on our project's blog. Do we have your permission to mention your name and credit you for the great help you are giving us, James? Of course we will say here when we publish any content with your name or your code, so that you can quickly correct us on anything you prefer :)
valentina from Italy  [43 posts] 6 years
Another question: what's the meaning of having one internal and one external variable for each command and state? thanks!
EDV  [328 posts] 6 years
Valentina,

Thank you for your enthusiasm!
And you can find out more details about "AVM Navigator" here:
http://letsmakerobots.com/node/22733
http://forums.trossenrobotics.com/showthread.php?t=4764

Let me know if you have any additional questions.

Dmitriy.
James Dickinson from Australia  [49 posts] 6 years
hi Valentina,

your particular project is an interesting one indeed.

you really need 3 things :

line finder - to find the line to follow
red ball finder - to find red balls only
blue ball finder - to find blue balls only

depending on your robot's surroundings, these things could easily be done with simple filters and line/point finders

what type of progress have you made so far?
James Dickinson from Australia  [49 posts] 6 years
valentina

I missed your question about internal and external

I believe you are talking about the VBS script in roborealm?

if so the two commands

vbs_variable = getvariable("variable_name_in_roborealm")
and
setvariable("variable_name_in_roborealm", vbs_variable)

are used to pass values between the VBS plugin in roborealm and other plugins (such as centre of gravity or AVM).
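For example, a small hedged sketch of that round trip (COG_X and IMAGE_WIDTH are variables produced by RoboRealm's COG module and pipeline; "scaled_turn" is a made-up output name for illustration):

```vbscript
' Read the blob's COG X from the COG module, compute a steering value,
' and write it back for other plugins (e.g. the serial script) to use.
' "scaled_turn" is a hypothetical variable name, not a built-in.
cog_x = GetVariable("COG_X")
width = GetVariable("IMAGE_WIDTH")
turn = (cog_x - width / 2) / (width / 2)   ' -1 (far left) .. +1 (far right)
SetVariable "scaled_turn", turn
```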
James Dickinson from Australia  [49 posts] 6 years
Sorry for hijacking your thread!

EDV, I have a question about AVM. How do the move speed and turn speed values correlate to real-world movement? For example, my hexapod turns and moves slower than your tank in the videos; how important is it that these values (move speed and turn speed) accurately represent the robot's movement values?
EDV  [328 posts] 6 years
Move/Turn speed influences the values of the motor control variables (NV_L_MOTOR, NV_R_MOTOR, NV_L_MOTOR_128 and NV_R_MOTOR_128). These parameters can limit the values of the control variables.

But it has no influence on the variables NV_LEFT, NV_RIGHT, NV_FORWARD, NV_BACKWARDS and NV_TURRET_BALANCE that you used in your VBScript.

So, you can leave it at the default 99/99.
valentina from Italy  [43 posts] 6 years
@EDV: thanks very much Dmitriy! Right now we are at university studying the links you provided ^___^ it's so helpful to have your reply

@James: yes, you understood my question about internal and external variables perfectly, thanks :)
Please, keep hijacking the thread ;) we are getting a lot of interesting info :)

We were stuck for several days because we could not make the robot walk through RR. The problem was the initial setting of the variables for walk and power (the ones that you added at the top with the "set once" feature). Now we can make the 6pod walk without any problem with the blob tracking script and we are making good progress.
At this point we filtered the blobs and we are able to make RR show (and store in variables) the red and blue blobs (which will be balls or perhaps cubes), solving the image filtering problems (so we have only the right blobs in our image).
We still don't have a camera on the robot (the radio camera we have doesn't work very well), so we are waiting for a new one in a few days and doing tests with the webcam on the PC for now.
This is our TODO list for now:
- mix the line recognition and blob recognition functions, learning how to use the Math module to get the original (unfiltered) image back in the middle of the pipeline.
- set up the functions and learn how to use RR variables in the script to make the robot move the way we want (go in front of the object and push it to the other side of the line)... that's why I asked the question about variables :)
- mix all that and make the robot follow our initial algorithm
- document it all on our blog, to make it easy to understand for Italian people who want to use RR and/or the MSR-H01
- present the project to our professor :)
James Dickinson from Australia  [49 posts] 6 years
I had a little play around with how to have 2 filters running in a single robo file and this is what I came up with

<Roborealm Plugins>
Filter plugin set 1
write image c:\temp\filter1.jpg
math (source, source, replace #2)
Filter plugin set 2
write image c:\temp\filter2.jpg
</Roborealm Plugins>

when I say "filter plugin set" I mean a combination of plugins from Roborealm that give me a final output.

you can then use a vb script to load the correct image. this would work especially well in your scenario as you could have 3 images, blue, red, line

read image c:\temp\filter1.jpg

will allow you to change the current image so that the next plugin uses that output
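If the file to re-read needs to change at runtime, a small sketch like the following could drive the choice from VBS. This assumes a variable-driven image-loading step; the variable names "current_task" and "image_to_load" (and the file paths) are made up for illustration:

```vbscript
' Pick which saved filter output to re-read depending on the current task.
' "current_task" and "image_to_load" are hypothetical variable names.
task = GetVariable("current_task")   ' e.g. "line", "red" or "blue" - set elsewhere

if task = "line" then
  SetVariable "image_to_load", "c:\temp\filter_line.jpg"
elseif task = "red" then
  SetVariable "image_to_load", "c:\temp\filter_red.jpg"
else
  SetVariable "image_to_load", "c:\temp\filter_blue.jpg"
end if
```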

I hope that made sense LOL


If you are wondering what I am doing with my MSR-H01, well, I am trying to make a "pet" robot AI that is very interactable. This is the reason I chose the MSR-H01, as it has an amazing amount of movement capability while remaining very stable.
I have an SRF08 ultrasonic range finder and a Sharp IR distance sensor yet to install and integrate. The IR sensor is going on the head to give accurate aiming distances for the camera, and the SRF08 will be mounted (eventually) on a 360 degree turret for spatial mapping.
valentina from Italy  [43 posts] 6 years
Wow James, you are at a very advanced level :) congrats!
The filter/math settings let me return to the source image, but it filters and shows only one RGB channel (even if "All" is selected). However, I found another way: there is another function that "resets" everything, the "marker". When I put a Marker I can simply change the current image back to the source one.

Now we are facing a little development problem: it seems it is not possible to use arrays in the VBScript (?).... it gives errors when we try to run a for loop over an array. In particular, if we declare it with "Dim" we get an error saying it expects an end of statement (I am translating from Italian, so sorry if it's not precise), and if we use a termination condition with "UBound" (upper bound) we get a type mismatch error.
We have to use an array because we store in an array (BLOBS) the coordinates of all the blue blobs we see...

valentina from Italy  [43 posts] 6 years
OK, we found RoboRealm's specific array functions ;)
valentina from Italy  [43 posts] 6 years
update for today:

we set up an image with the line in 2 colors (blue and yellow: the cubes will be red and green), we recognized the blobs of these 2 colors and put the Y coordinate of the biggest one in a variable. We compared the 2 Y coordinates of the blue and yellow blobs to see which is nearest to the viewer (the minimum Y) and wrote the result on the screen.

So now we have recognized the line and we know where we are with respect to the line.

We also filtered the red cubes and put their coordinates in an array. Tomorrow we will see how to tell where the cube is... we still don't know how we will do it, because if we simply get the Y variable, is it related to the COG? If yes, it could be that a cube has its COG above the line even if it's on our side. Tips? :)
valentina from Italy  [43 posts] 6 years
And, in the meanwhile, James, your project is really interesting: where are you at? What have you already implemented while you are waiting to install the new sensors?
James Dickinson from Australia  [49 posts] 6 years
Hi Valentina,

I have only had my MSRH01 for 2 months, and I only got Roborealm last week, so I have not gotten very far LOL

what I have so far is :

face tracking and detection outputting location on screen of any faces
basic visual object avoidance

So not much; however, I have been designing the AI for quite a while. I have decided on a modular system that uses weights to decide on the final action.

for example I would have a space module. This module would check how close objects are to the sides of the robot, wanting to move the robot to the centre of the area, equally distant from the objects on each side.
another module would be the light module. Using a combination of the image and the light sensor on the SRF08 I would calculate the amount of light in the area, and have the robot want to move into lighter places.

These two modules would each produce a full set of movement commands; each module is given a weight based on how important it is. I would then scale each module's output movement by its weight, then average their outputs.

The idea is to have specific controlled behaviours that can blend and mix together on the fly.

While I am a complete noob when it comes to robots, I have been programming AI systems for almost 20 years. This is actually my first step on moving my AI creations out of the virtual realm and into the real world!

------ As for your issue with localisation and positioning I can think of two things off the top of my head :

You could use the marker/navigate-by-map mode of the AVM plugin to track both movement and ball positions in 3d space. It would be difficult to get AVM working efficiently with the MSR-H01, but once you do, your robot will be unstoppable at its task.

The second, more DIY approach is to use trigonometry to calculate the distance to an object (or gauge distance by how big the cube is on screen).
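The size-on-screen idea can be sketched with the usual pinhole-camera relation, distance = (real size × focal length in pixels) / apparent size in pixels. A minimal, hedged VBS sketch (the cube size and focal length constants must be measured for your own camera; "cube_distance_cm" is a made-up output name, and COG_BOX_SIZE is assumed to hold the blob's apparent size from the COG module):

```vbscript
' Estimate distance from apparent size (pinhole-camera model).
' CUBE_SIZE_CM and FOCAL_PX must be calibrated by hand for your setup.
CUBE_SIZE_CM = 5.0    ' real edge length of the cube (assumed)
FOCAL_PX = 700.0      ' focal length in pixels (assumed; calibrate once)

blob_width = GetVariable("COG_BOX_SIZE")  ' apparent size in pixels
if blob_width > 0 then
  distance_cm = (CUBE_SIZE_CM * FOCAL_PX) / blob_width
  SetVariable "cube_distance_cm", distance_cm
end if
```

To calibrate FOCAL_PX, place a cube at a known distance and solve the same relation backwards: FOCAL_PX = (distance × apparent size) / real size.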
James Dickinson from Australia  [49 posts] 6 years
here is a small VBS script to control the MSRH01 using basic Binary Object Avoidance.

In order to use this, follow the "object avoidance" tutorial here at RoboRealm. What you want is a binary (black and white) image of the area the robot is allowed to move in.

Then we use the "Point Location" plugin to get "Furthest" "Nearest", "Highest (Left Most)", "Highest (Right Most)".
X should be set to (IMAGE_WIDTH/2)
Y should be set to (IMAGE_HEIGHT)

The code does the following :

first it checks to see if the nearest point is too close to the robot (bottom of image); if it is, the robot backs off until it is no longer too close.
If there is nothing too close, the robot will attempt to move to the furthest point, providing that the furthest point is further than halfway up the image. If it is not, it looks for the furthest point to the left or right, and turns in that direction.

It's extremely simple and far from robust, but it gets your MSR-H01 roaming around its environment autonomously with just a camera!
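The decision logic described above could be sketched roughly like this. The variable names are indicative only (check the Point Location plugin's actual variable list), and "walk_y"/"turn" are made-up outputs; this assumes standard image coordinates where Y grows downward:

```vbscript
' Sketch of the back-off / steer-to-furthest logic described above.
' NEAREST_Y, FURTHEST_X/Y, HIGHEST_LEFT_X and HIGHEST_RIGHT_X are
' assumed names for the Point Location outputs; "walk_y" and "turn"
' are hypothetical movement variables.
width = GetVariable("IMAGE_WIDTH")
height = GetVariable("IMAGE_HEIGHT")

if GetVariable("NEAREST_Y") > height * 0.8 then
  ' something is too close to the bottom of the image: back off
  SetVariable "walk_y", -1
elseif GetVariable("FURTHEST_Y") < height / 2 then
  ' furthest point is in the top half: walk towards it, steering to face it
  SetVariable "walk_y", 1
  SetVariable "turn", (GetVariable("FURTHEST_X") - width / 2) / (width / 2)
else
  ' no good forward goal: turn towards whichever side looks more open
  if GetVariable("HIGHEST_LEFT_X") < width - GetVariable("HIGHEST_RIGHT_X") then
    SetVariable "turn", -1
  else
    SetVariable "turn", 1
  end if
end if
```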
MSRH01BOA.zip
valentina from Italy  [43 posts] 6 years
Grats James, I think it must be very satisfying to see your AI programming become visible on the robot, and I think you did a lot in the short time since you got it!

I was a complete noob about robots too, so I can understand you, but you have 20 more years of AI development than me LOL I am only starting now ^^

We tried to use the object recognition module of the AVM, but it's really difficult to track a cube (too many cubes in the view), so we decided to go back to color recognition, which seems easier.
Our problem with distance is not recognizing which cube is closest to the robot (we will also do that, but later.... we think we'll use the method that looks for the biggest one): for now we have to see whether the cube is on the right side of the line.
Our bi-color line is on the floor and the robot has to put red cubes on one side and green cubes on the other side. The robot already knows which side it is on, so we are at this step:
- look around and see if a cube of the wrong color is on your same side of the floor with respect to the line.
Of course we cannot work this out from the size of the cube relative to the line, because we don't know (and don't want to know, to keep it simple) how far the cube is from the line.
Our project specification also says that the robot has to move in an environment where only the line and the cubes are visible, so we can't place other objects (like little flags etc.) for triangulation.
The easiest thing that came to our minds is to look at the cube's coords, but what we don't know is what those coords really are... if they are the coords of the COG they aren't meaningful. Do you know how to get the Y coord of the lowest point of an object? This would solve our problem :)
valentina from Italy  [43 posts] 6 years
(sorry for the bad typing, I hope it will be clear anyway :P)
valentina from Italy  [43 posts] 6 years
I'm also looking at the Point Location plugin and at the last example you posted.... we just started studying it; today we will be here only for a few hours (if you don't see me later you'll see me tomorrow for sure!)

thanks again ^^
valentina from Italy  [43 posts] 6 years
oh yes! :) the Point Location module works great! :D
we have to adjust the image to be seen more clearly.... I'm studying the modules about normalizing, contrast etc. now :)
James Dickinson from Australia  [49 posts] 6 years
Hi there,

Please ignore my attachment above, it has the y axis reversed!!

attached to this entry however IS a really good working Binary Object Avoidance robo file and VBS script

this uses Canny edge detection to find edges like walls and things, then Side Fill plus Harris Corners to get points.
The VBS script turns the Harris corners array into an array of x,y positions for all the points, then finds the highest point overall, the lowest point overall, the right highest point and the left highest point.
Then the robot checks to see if the lowest point is too low. If it is, it backs away, turning to face that point at the same time.
Otherwise the robot checks to see if the highest point is 75% of the way up the image. If it is, the robot moves towards that point while turning to face it. If the highest point is too close, it then checks to see if the left highest point is higher than the right one, and turns left if it is, right if it isn't.

it is simple but works excellently, providing the Canny edge detector doesn't screw up! I have found that using the camera's built-in contrast and brightness settings to tweak it for each use is the best method of compensating for light changes

I have noticed that sometimes the Harris Corners point finder returns more than 100 points (which is the limit I've put on the point arrays) and can crash the VB script. When this happens the robot stops moving (it shouldn't ever be stationary); just open the VBS plugin to get it going again

Here is a video of it in action :

http://www.youtube.com/watch?v=pIJU8_3d0iQ
Binary Object Avoidance.zip
James Dickinson from Australia  [49 posts] 6 years
I wish you could edit posts!

then the robot checks to see if the lowest point is too low. if it is, it backs away, turning to face that point at the same time.

should be

then the robot checks to see if the lowest point is too low. if it is, it backs away, turning away from the point at the same time.
Anonymous 6 years
thanks james! :)

Can I ask you more about the contrast and brightness settings to compensate for light changes? Today we had to change all the hue and intensity values in the RGB filter when the sun went down and we switched on the lights in the room :( obviously it can't be done every time, and we fear that simply moving the robot to another room will mess it all up!

Today we did this:
caught the variables for the lowest point of each colour
made the robot move forward when it's seeing a red blob, and push it.
We also made a short function to make it crab until it's in front of the blob, using the image_width and COG variables.
We now have to see how and when to call these functions :) tomorrow's next step :)
James Dickinson from Australia  [49 posts] 6 years
I use brightness on a low setting, then contrast on a high setting to make it wash out the colors.  this is how I managed to get the robot to work last night in that video!
James Dickinson from Australia  [49 posts] 6 years
Here's an update on my own project:

Firstly I fixed an error in my above application that made the robot always turn left!

Secondly, I have added crab movement into the system to "push" the robot away from things close to the left or right while moving and turning. This keeps the path clear for longer and makes turning more effective at finding a good direction to go in.

Next up is to get the robot to pan its head left and right if it is having problems finding a direction. Currently there is the possibility of the robot getting into a left/right loop in rare scenarios.

I have attached the update.  I have also added some markers and a math plugin to give you a better view of what the robot is seeing on screen as well as a description of what it is currently doing
MSRH01 BOA 2.zip
valentina from Italy  [43 posts] 6 years
I have some problems calibrating the head.... it turns left when it wakes, and I couldn't find the settings in the calibration menu (in Teraterm); I think it's because I have an old version of the HexEngine.
Anyway I won't use the head for now, but I would like to fix it, at least for esthetic reasons ;) For this purpose it should be sufficient to make it pan a little in the wake function...
valentina from Italy  [43 posts] 6 years
today we made the robot rotate if it's not in front of the line. To implement it (2-coloured line: two lines of different colour, one next to the other) we made the robot check the highest and lowest points of the colour blobs and compare them: it seems to work :) Then we worked on a way to make the robot stop after a while when it's pushing cubes.... this made us work a lot, because we can't use "while" or other loops for that (because the pipeline continuously restarts) or a sleep (which blocks the execution).
James Dickinson from Australia  [49 posts] 6 years
you can use

SetTimedVariable "ROBOREALM_VARIABLE", new_value, milliseconds

to get delays for example :

if GetVariable("Timer") = 1 then
  ' do something
  SetTimedVariable "Timer", 1, 1000
  SetVariable "Timer", 1
end if

the code inside this if statement will run once every second
Anonymous 6 years
James,

I think there is a small typo ... the SetVariable should be set to 0 instead of 1 otherwise the statement will execute every time the pipeline iterates.

if GetVariable("Timer") = 1 then
  ' do something
  SetTimedVariable "Timer", 1, 1000
  SetVariable "Timer", 0
end if

STeven.
James Dickinson from Australia  [49 posts] 6 years
thanks steven! I completely missed that!

http://www.youtube.com/watch?v=1qMjACWKEDE
Here is a video of further object avoidance checking.

I would ignore the file I posted above, it's got errors in it :p

I won't post another one until I am sure it's working!

Does anyone have any ideas how I could reliably remove the grout lines on the floor from the image? My current method is a combination of gamma/intensity and brightness to white-out the ground, but it causes the robot to not be able to see my walls, as they are white too!
valentina from Italy  [43 posts] 6 years
We have been trying for 2 days with SetTimedVariable, but the problem is that we need to make it do something until a certain time, not
refrain from doing something until a certain time.
That way it just evaluates the conditions of whether to move or not, and which movement to do, for a limited time, while we need it to perform a particular action for a certain time.... example:

if the line is not straight, rotate
if a red blob is in the view
walk 'here it has to walk, for example, for 10 seconds
after that it has to stop walking.

If we use a timed variable to say that after 10 seconds it has to stop, for those 10 seconds it won't just make the forward movement, but will recalculate the whole pipeline and possibly stop or rotate etc.

We are used to developing with while loops and similar.... probably the solution is easy, but we still couldn't find it :P
Any suggestion?
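One common way to express "keep doing X for 10 seconds" in a restarting pipeline is to store a deadline instead of sleeping. A minimal sketch (the variable names "RED_BLOB_FOUND", "walk_until" and "walk_y" are made up for illustration, and "walk_until" is assumed to start at 0):

```vbscript
' Walk for 10 seconds without blocking the pipeline.
' When the trigger is first seen, record a deadline 10 seconds from now.
if GetVariable("RED_BLOB_FOUND") = 1 and GetVariable("walk_until") = 0 then
  SetVariable "walk_until", Timer() + 10
end if

' On every pipeline iteration, keep walking until the deadline passes.
if GetVariable("walk_until") > 0 then
  if Timer() < GetVariable("walk_until") then
    SetVariable "walk_y", 1      ' keep walking forward
  else
    SetVariable "walk_y", 0      ' time is up: stop and reset
    SetVariable "walk_until", 0
  end if
end if
```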
valentina from Italy  [43 posts] 6 years
OK, we got it :) we were only checking the variable in the wrong place ;)

I think that in a few days our project could be ready.... as soon as it is cleaned up (right now the code is a bit too messy ;) I will post the robo file here :)
valentina from Italy  [43 posts] 6 years
oh, and we set the intensity and contrast functions as you said, James: tomorrow with the daylight we will see if I set them properly :) thanks

now our next step is to clean up the code (split it into functions, as now it's one big blob) and call the right functions in the right places :)
Anonymous 6 years
valentina,

Can you write out what you're doing using a while loop, as you would expect it to work, and we can translate that into a state system that will work in the pipeline.

The trick is to save your current state in a variable that is recalled each time the VB (or C/Python) script executes. When something is triggered to move to the next state, you update that variable. Using a select statement, it would look something like what can be seen in

http://www.roborealm.com/tutorial/ball_picker/slide080.php
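A minimal sketch of that select-statement pattern (the state names and the "state", "LINE_FOUND", "RED_BLOB_FOUND" and "CUBE_PUSHED" variables are illustrative, not built-ins; "state" is assumed to start empty):

```vbscript
' Minimal state-machine sketch for a restarting pipeline.
state = GetVariable("state")
if state = "" then state = "FIND_LINE"

select case state
  case "FIND_LINE"
    if GetVariable("LINE_FOUND") = 1 then state = "FIND_CUBE"
  case "FIND_CUBE"
    if GetVariable("RED_BLOB_FOUND") = 1 then state = "PUSH_CUBE"
  case "PUSH_CUBE"
    SetVariable "walk_enable", 1
    if GetVariable("CUBE_PUSHED") = 1 then
      SetVariable "walk_enable", 0
      state = "FIND_LINE"
    end if
end select

' Persist the state so the next pipeline iteration picks it up.
SetVariable "state", state
```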

STeven.
Anonymous 6 years
Thanks for the video. Cool bot! Can you include an image of the grout lines? There are many ways to do this ... having an image of what your robot sees may help design a new approach that works better for you.

STeven.
James Dickinson from Australia  [49 posts] 6 years
Sure thing, I will put one up tonight

I actually came up with a pretty good idea :

use gamma/intensity/contrast to make the grout not stand out much
run a mean filter at 25+ to make the grout go away
threshold to white=floor
marker this image (output 1)
draw gradient (white at top, black at bottom)
math: subtract gradient from output 1
marker this image (output 2), return to source
run Prewitt edges on the source
threshold to get edges only
math: subtract output 2 from current
threshold to remove <60%
run side fill
valentina from Italy  [43 posts] 6 years
Thanks Steven, we solved it using a variable as written above :) but I will for sure write if I have more problems with it.
I also read the page you linked: how is "Timer()" used there? Just asking to learn. Thanks.


I am having some trouble setting the image light and contrast... I'm putting my RR pipeline here so you can check it.... it has to run in different lighting environments, but currently it doesn't.
I am also sending the photos of the cubes.

thanks :)

  

program.robo
valentina from Italy  [43 posts] 6 years
PS: I posted photos, but in reality we take the input video stream from a webcam, so the light will change ;) just to be clear.

Thanks ^^
valentina from Italy  [43 posts] 6 years
And these tapes will be the line (also to be recognized with the RR file linked above)...

 
Anonymous 6 years
Timer is a function in VBScript that returns the time of the machine expressed in seconds, with milliseconds to the right of the decimal point.

Can you try to disable the auto-white-balance of your camera? The images you included have approximately the same lighting but have a cyan/red shift in them, which will make color checking more difficult. There is a workaround in software, but the best results will come from disabling that feature on your webcam.

Once you disable this, can you retake the photos and also include photos at a different lighting level (i.e. during day/night)?

Can you also take a picture of the tape on the floor instead of in the package?

Note that taking pictures as close as possible to what your robot will see is ideal. In your above photos the overhead light is present, which will cause the camera to do strange things due to the bright light. Try taking the above images at ground level, as your robot would see them. One of the reasons for this is that a lot of floors are shiny, which can cause other issues. So it's good to know that ASAP.

STeven.
valentina from Italy  [43 posts] 6 years
thanks Steven.

I actually didn't take the shots I posted yesterday with the same camera that I use for the robot (I took them with the built-in camera on my laptop, while I use an external one for the robot).
The ones I'm posting now are from the robot camera. I disabled the white balance in the software that was installed with the camera driver, but I'm not sure the setting stays in effect when I shut that software down (and if I don't shut it down, RR doesn't recognize the camera, which is busy).

On Monday I will upload more photos taken in the place where we are working on the robot (the university, now closed for the weekend), but the fact is that I would like to have it running in different environments... if I tune the RR file to take good images in that room, it won't work in other places with my current file, and that's not good because I have to present it for an exam and we don't know in which room and at what time that will be...

Anyway, here are new photos of a cube (red and green sides) made with the same paper as the others, taken with the robot camera in my home.
I think I'll stick the tape on a plastic towel (the one in the photo, shown with nothing on it and, in another photo, with the tapes on it) to make a portable line. I only have 2.75 meters of tape and it isn't cheap, so before sticking it down I would like to see whether the tape is recognizable on that plastic towel, if possible. The other option is to stick it on white paper (I will buy a big sheet if the plastic towel is no good, but if you have better ideas, please suggest them).

Hope this helps... I would like to work on it a bit tomorrow at home, so that on Monday we can try the line-recognition code that we wrote.



thanks again!


    
valentina from Italy  [43 posts] 6 years
PS: the cubes are also shot on the plastic towel.
James Dickinson from Australia  [49 posts] 6 years
Hi Valentina!

I have attached a robo file which isolates the red, green, blue, and yellow into markers for you.

I have tested it on all your images and the output should be more than enough to get you going. From the cube outputs, use blob filters etc. to remove artifacts or non-cube shapes.

I trialled it in my house and, provided you have a controlled environment purposefully lacking these key colours elsewhere, it works brilliantly regardless of lighting (within reason!)
program.robo
James Dickinson from Australia  [49 posts] 6 years
here is an example of the output of each marker combined

 
valentina from Italy  [43 posts] 6 years
James.... can I say I LOVE YOU? :D

You got the point perfectly: light balancing at the beginning and then filtering OUT the wrong colours, while I was only filtering IN the right ones.... that's the way, thanks, thanks and thanks again! :)

I can't wait to try it tomorrow at university :)
James Dickinson from Australia  [49 posts] 6 years
Are you allowed to take some videos of your testing at uni?  I'd love to see your hexapod in action!

and it's my pleasure! I learned a long time ago in 3D rendering that removing unwanted colours is easier than finding wanted ones.
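That filter-out idea can be sketched as keeping only pixels with a clearly dominant channel pattern and discarding everything else. A rough sketch of the concept, not James's actual .robo modules; RGB channel order and the `margin` value are assumptions:

```python
import numpy as np

def isolate_colours(img, margin=40):
    """Classify pixels as red/green/blue/yellow markers by channel
    dominance; anything without a clear pattern is thrown away."""
    r, g, b = (img[..., i].astype(np.int16) for i in range(3))
    return {
        "red":    (r - g > margin) & (r - b > margin),
        "green":  (g - r > margin) & (g - b > margin),
        "blue":   (b - r > margin) & (b - g > margin),
        # yellow: red and green both well above blue, and close together
        "yellow": (r - b > margin) & (g - b > margin) & (abs(r - g) < margin),
    }
```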
valentina from Italy  [43 posts] 6 years
of course, I will send videos ^___^
valentina from Italy  [43 posts] 6 years
big problem: with the color filters we used, I get a red shadow on the green cube and vice versa, and I'm stuck trying to filter it out.... here is the photo. If I filter out the shadow of the green cube, I can't see the red one anymore either....

Can you help me?

 

program.robo
James Dickinson from Australia  [49 posts] 6 years
Hi Valentina!

I have looked at your file and included a modified version. To be honest, there were a lot of modules in there whose purpose I couldn't figure out, so I disabled them (the blob-filter ones with colour names) and replaced them with a Threshold module to isolate the blobs instead. I tested this on all the images you have posted so far and it worked reliably, so hopefully this is all you need to move forward!

Oh, and remove the Image Load at the top :p
program.robo
valentina from Italy  [43 posts] 6 years
the blob filter modules are there to get an array with the X and Y coords of every blob, so we can compare them against our conditions....

I am experiencing another stressful problem, and I would like Steven to look at it too....
It seems that the Lowest_Middle and Highest_Middle variables have some sort of bug, so that they don't appear in the middle....
Please look at the attached demo file: the mouse is placed on the point that the coords indicate, but the X mark is shown at a different (the correct) point.

We are getting the lowest and highest points of the blue and yellow blobs and comparing their X values to see if the line is horizontal (if not, the robot will rotate).
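The comparison described above can be sketched directly: take the topmost and bottommost pixels of the line's blob mask and compare their X coordinates. This is a sketch of the idea behind the HIGHEST/LOWEST point variables, not RoboRealm's Point Location module; `max_dx` is an arbitrary tolerance:

```python
import numpy as np

def line_is_horizontal(mask, max_dx=10):
    """For a horizontal line the topmost and bottommost points nearly
    share an X coordinate; for a tilted line they sit at opposite ends."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # no blob at all
    top_x = xs[ys == ys.min()].mean()     # middle X of the topmost row
    bottom_x = xs[ys == ys.max()].mean()  # middle X of the bottommost row
    return bool(abs(top_x - bottom_x) <= max_dx)
```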
valentina from Italy  [43 posts] 6 years

 
James Dickinson from Australia  [49 posts] 6 years
have you looked into using the Straight_Line filter on your lines? Also, unless I am mistaken, the lines should always go from one edge of the image to another. I think you should approach the lines differently than you do the cubes
valentina from Italy  [43 posts] 6 years
Our lines don't actually go from one edge to the other: we have a fairly short line and the robot can see where it ends, so the end of the line is at an indefinite point in the image.
I don't know if there are functions or variables to get info about the individual lines in the image, but we are trying to solve the above problem using the angle variables of the Geometric Statistics module: our blobs are not very well defined, but it seems we can get sufficient results with the ANGLE_ALT_2 variable.

Even if it's clear that, if the LOWEST_MIDDLE and HIGHEST_MIDDLE of the Point Location modules are really bugged as they seem to us, they can't be fixed in time for our project, I still hope it will be useful if Steven has a look at them and eventually fixes them :)

In the meanwhile we changed our code to work with the Geometric variables, but we still have color issues with the plastic towel, which is a little darker on one side (it's an old one and the edge is getting yellow/brown). Our plan is to cut it so we have only the line and use it directly on the floor or on white paper that we have already prepared. Let's see tomorrow if we can finally get into the last part of the project, developing an end point for our algorithm! :)

James, thanks very much for your tips on color and threshold: they were very useful! :)
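The kind of angle a geometric-statistics variable measures can be computed from a blob's central second moments. A sketch of the standard moment-based orientation; RoboRealm's ANGLE_ALT_2 may use a different convention or sign:

```python
import numpy as np

def blob_angle_degrees(mask):
    """Blob orientation from central second moments: 0 = horizontal,
    +/-90 = vertical (in image coordinates, y pointing down)."""
    ys, xs = np.nonzero(mask)
    x = xs - xs.mean()
    y = ys - ys.mean()
    mu20, mu02, mu11 = (x * x).mean(), (y * y).mean(), (x * y).mean()
    return np.degrees(0.5 * np.arctan2(2 * mu11, mu20 - mu02))
```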
valentina from Italy  [43 posts] 6 years
we solved the issue of the lowest points; I don't know if it's a trick or if it's meant to work exactly like this: we lowered the threshold count in the blob filter and could filter out another blob, so that the only one remaining is ours.
We don't understand how it can see a black blob inside a black image, but in the end we solved it for our particular purposes.
Anyway, we are still using the angle variable to check if the line is horizontal: we are now checking that everything goes right before writing the last function. We'll let you know how it goes and, at the end, post a video :)
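Tightening a blob filter's size threshold amounts to dropping every connected component below a minimum pixel count. A plain flood-fill sketch of that operation, not RoboRealm's implementation:

```python
import numpy as np

def keep_large_blobs(mask, min_area):
    """Remove every 4-connected blob smaller than min_area pixels."""
    h, w = mask.shape
    out = np.zeros_like(mask)
    seen = np.zeros((h, w), dtype=bool)
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                seen[sy, sx] = True
                stack, blob = [(sy, sx)], []
                while stack:  # flood-fill one blob, collecting its pixels
                    y, x = stack.pop()
                    blob.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(blob) >= min_area:  # keep only big-enough blobs
                    for y, x in blob:
                        out[y, x] = mask[y, x]
    return out
```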
valentina from Italy  [43 posts] 6 years
Ok, our project is coming to an end :)
Today we presented it for the exam and it was successful ^^

videos and (italian) blog:
http://hexapodproject.wordpress.com/2011/07/06/233/

Let me have a little rest and then I will comment the RR and VB files in English, so that you can all understand them :) An Italian version will soon be posted on the "Progetto" page of the blog linked above!
James Dickinson from Australia  [49 posts] 6 years
Very nice indeed. Loved the videos, and it was quite a good read (although I think Google Translate didn't really get some of it!)

congratulations on a successful project!
valentina from Italy  [43 posts] 6 years
Thanks James :)
In the end we only made him sort the red blocks, rather than also going to the other side to sort the green ones, but we were satisfied anyway with our first robot and first AI challenge :)
The blog will be completed in the next days with more explanations and a tutorial for the robo file. We will also add downloadable files for both the robo and the vbs (I just have to find out how to upload a non-media file to wordpress.com :D).
Our hexapod gAItano looks a bit funnier and less professional than yours and the others that can be found on YouTube, but we liked him that way :) - and smiling! - and, if we can, we will give him a better way to push the glasses than a wire ;)

As soon as I have updated the blog I will of course send you a message here ^^
valentina from Italy  [43 posts] 6 years
Here is the link to the page where you can download both the robo and the vbs files.

Both come in 2 versions: one with Italian comments and one with English ones.
On the page (in case you don't translate it) I wrote in English where I put the English versions ;)

Wordpress.com doesn't allow uploading (and downloading) certain file extensions, so remember to rename the files: they all currently have a .doc extension (the right one is in the file name).

Hope you enjoy them! ^__^
http://hexapodproject.wordpress.com/2011/07/07/file-del-progetto-di-ordinamento-blocchi/
valentina from Italy  [43 posts] 6 years
I finally added the step-by-step guide:
http://hexapodproject.wordpress.com/2011/08/07/progetto-roborealm-passo-passo/

At the end of the page there is a special thank-you to Steven and James for their help :)

This forum thread has been closed due to inactivity (more than 4 months) or number of replies (more than 50 messages). Please start a New Post and enter a new forum thread with the appropriate title.
