|
Basic Question on Cameras from United States [60 posts] |
15 year
|
I have been playing with obstacle avoidance and have looked at the tutorial examples. They all work fine on the still images, but when run on my web camera image they don't work as well. There are two issues.
The first is that the real world is messier than the examples. Shadows cause problems because they are seen as obstacles.
The second is that the image changes from frame to frame even though there is no motion. With the Canny module the edges jump and move quite a lot regardless of the settings.
I assume this is mainly the camera. When the value of a pixel is borderline it jumps between two (or more) RGB values. I've tried a number of the filters and, while they help, there is still a lot of jitter. Even Flickr didn't seem to help. (I expect it just delayed the changes by a frame or so.) Any suggestions on filters to try?
Right now I'm using a Live! Cam Notebook Pro, which was not expensive. Are other cameras better? I'd appreciate comments on other cameras. I don't want to spend $500, but if $100 will make a big difference I can do that.
|
|
|
Anonymous |
15 year
|
Rud,
You have identified a common issue with vision. Whenever a threshold is introduced into the processing pipeline, some pixels will flip back and forth between being above and below the threshold from one frame to the next. These borderline pixels cause the image to fluctuate quite a bit. The trick is to ignore them and focus on the pixels that are clearly within the appropriate range, but that is not easy to do.
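To make that concrete, here is a minimal "dead band" sketch in Python/OpenCV (this is not a RoboRealm module, and the threshold/band values are just example numbers): pixels near the threshold are marked as uncertain so later steps can ignore them instead of letting them flicker between black and white from frame to frame.

import cv2
import numpy as np

def stable_threshold(gray, thresh=128, band=10):
    # Classify only pixels that are clearly above or below the threshold.
    # Anything within +/- band of the threshold is marked 'uncertain' (127)
    # so downstream logic can skip it rather than watch it jitter.
    out = np.full_like(gray, 127)          # uncertain by default
    out[gray >= thresh + band] = 255       # clearly above
    out[gray <= thresh - band] = 0         # clearly below
    return out

gray = cv2.cvtColor(cv2.imread("frame.jpg"), cv2.COLOR_BGR2GRAY)  # placeholder path
mask = stable_threshold(gray)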
I would not worry about getting a better camera at this point. A better camera will help to stabilize this issue but not eliminate it; you will always have it to some degree regardless of the hardware. The shadow problem, for example, will not be eliminated by a better camera and plagues every vision system to date. What most people do is work around the issue. For example, color can be used to reduce shadow effects: convert the RGB image into HLS and use just the H and S channels, which reduces the dependence on intensity.
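As a rough sketch of that HLS idea (written in Python/OpenCV rather than as a .robo pipeline, and with invented hue/saturation ranges that you would tune for your own floor):

import cv2

bgr = cv2.imread("frame.jpg")                 # placeholder image path
hls = cv2.cvtColor(bgr, cv2.COLOR_BGR2HLS)    # channel order is H, L, S

# Match the floor on hue and saturation only; lightness (where shadows
# show up) is left wide open so a shadowed patch of floor still matches.
floor_mask = cv2.inRange(hls, (10, 0, 40), (40, 255, 255))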
While the tutorials do use color for obstacle avoidance, they are limited to a constructed environment (with perhaps the floor_finder module doing better than the other techniques) and thus will ultimately have issues on carpets that are not a single color, or when a large amount of shadow/sunlight makes it into the room.
Other techniques we are looking at involve optical flow or movement-based approaches, i.e. close objects appear to move faster than farther objects. These have better lighting properties but do not give you a fully detailed map and can also suffer from noise. Plus, they require movement (i.e. the robot has to move) in order to work correctly.
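A minimal optical flow sketch, assuming Python/OpenCV and a webcam on index 0, just to show how flow magnitude in the lower part of the frame can hint that something close is in front of the moving robot:

import cv2

cap = cv2.VideoCapture(0)                      # assumed camera index
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    # Large average flow in the lower half of the frame suggests
    # something close ahead (only meaningful while the robot moves).
    close = mag[mag.shape[0] // 2:, :].mean()
    prev_gray = gray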
The other technique is stereo, which can work quite well in most environments but requires expensive CPU processing and does not operate as well indoors as outdoors. This is because indoor environments consist of large single-color planes (walls, carpet, etc.). We are looking at techniques to overcome these limitations (mainly the CPU cost and camera alignment) which should be quite promising. The downside is that two cameras are required ... but if they are two cheap webcams (we use two $7 webcams for testing) that should be ok.
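For reference, the basic stereo computation is only a few lines once the two cameras are aligned; a hedged Python/OpenCV sketch (the block-matching parameters are placeholders, and the alignment/rectification it assumes is the hard part mentioned above):

import cv2

left  = cv2.cvtColor(cv2.imread("left.jpg"),  cv2.COLOR_BGR2GRAY)   # placeholder paths
right = cv2.cvtColor(cv2.imread("right.jpg"), cv2.COLOR_BGR2GRAY)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disp = stereo.compute(left, right)   # fixed-point disparity (divide by 16); larger = closer
# Untextured single-color walls and carpet give the block matcher little
# to lock onto, which is why this tends to work better outdoors.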
Unfortunately this is not a solved problem and is still in active development/research. Perhaps you can post some of the hard cases you have encountered, and the collective folks here on the forum may be able to find something that works better for your environment.
STeven.
|
|
|
from United States [60 posts] |
15 year
|
I've attached a .robo file with a crude approach to obstacle avoidance for my Create. My real goal with this approach was a quick check to see whether obstacles might be in front of the moving robot; if so, more detailed processing would be used.
It actually worked better than I expected, even in this crude form. To test it I set up a rough steering capability in the VBScript.
The basic process is to use Movement to generate a white image of the changes from the previous frame, then Collapse the image to the bottom, and finally use ClusterPoints set at 5 to determine the location of the most movement.
In this file I just averaged the X positions and used that to control the steering. I'm not that familiar with VB, so I didn't want to fuss with adding more complexity.
Something similar would work with Optical Flow, but I thought Movement would take less time for a quick test prior to more complex analysis. program.robo
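In case it helps anyone who doesn't want to open the .robo file, here is a rough Python/numpy approximation of the Movement -> Collapse -> ClusterPoints idea (the difference threshold and cluster count here are just guesses; the attached file is the real thing):

import cv2
import numpy as np

def movement_steering(prev_gray, gray, clusters=5):
    # "Movement": white pixels wherever the image changed between frames.
    diff = cv2.absdiff(prev_gray, gray)
    _, moved = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

    # "Collapse": sum each column toward the bottom of the image, giving
    # a per-column measure of how much movement there is.
    column_motion = moved.sum(axis=0).astype(np.float32)

    # Crude stand-in for ClusterPoints: take the N columns with the most
    # movement and average their x positions for the steering input.
    top_x = np.argsort(column_motion)[-clusters:]
    return float(top_x.mean())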
|
|
|
from United States [214 posts] |
15 year
|
Hi Rud,
I haven't tried obstacle avoidance based on movement yet, but this looks like a promising approach. What I *have* tried is obstacle avoidance based on edge detection which I summarized in a tutorial at:
http://www.pirobot.org/blog/0004/
Like you, I found the Canny Edge filter to be very unstable. I'm guessing this is the fault of the algorithm, not RoboRealm or your camera. So instead I use the Prewitt Edge filter followed by an Auto Threshold and a Clean. This tends to produce nice edges even in a dim room with uneven lighting. I then do a Side Fill down to the bottom edge of the frame, followed by an Erode and a Smooth Hull, to get a profile of the edge landscape ahead of the robot. The low points on this profile correspond to nearby obstacles and the high points to clear paths. To check that a gap is wide enough to pass through, I use the Harris Corners module or the Sample_Line module (with border padding set to around 7) to get the array of x,y coordinates of the profile contour. You can then use the API or VBScript to compute the best gap to head for.
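If it helps to see the shape of the computation outside of RoboRealm, here is a hedged Python/OpenCV sketch of the same idea (Prewitt-style edges, a fill up from the bottom of the frame, then a clear/blocked profile). The kernels, thresholds and gap height are only illustrative values, and RoboRealm's Side Fill and Smooth Hull do this far more cleanly:

import cv2
import numpy as np

gray = cv2.cvtColor(cv2.imread("frame.jpg"), cv2.COLOR_BGR2GRAY)  # placeholder path

# Prewitt-style edge detection via two simple convolution kernels.
kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=np.float32)
ky = kx.T
gx = cv2.filter2D(gray.astype(np.float32), -1, kx)
gy = cv2.filter2D(gray.astype(np.float32), -1, ky)
edges = np.clip(cv2.magnitude(gx, gy), 0, 255).astype(np.uint8)
_, binary = cv2.threshold(edges, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)

# Stand-in for the Side Fill: for each column, measure how far you can
# travel up from the bottom of the frame before hitting an edge pixel.
h, w = binary.shape
profile = np.zeros(w, dtype=np.int32)
for x in range(w):
    ys = np.nonzero(binary[:, x])[0]
    profile[x] = h - ys[-1] if ys.size else h   # no edges: column is clear

# High runs of the profile are candidate gaps; a real version would also
# check that the run is wide enough for the robot to fit through.
clear = profile > h // 3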
I've attached the .robo file that uses the Sample_Line module in the final step.
--patrick
program.robo
|
|
|
from United States [60 posts] |
15 year
|
Hi Pi,
I've gone back to your web site a couple of times on the obstacle avoidance problem. I think I got "Collapse" from your approach. Remembering what all the RR modules do is almost as hard as figuring out how to combine them.
Thanks for the file. I need to think about Sample_Line a little bit.
|
|