Matching omnidirectional images
from United States  [214 posts]
9 years ago

I am trying to determine the relative position of a robot that uses an omnidirectional imaging system.  The idea is that a snapshot is taken at a starting location, then I move the robot away from that position and take a second snapshot.  I now want to compare the starting and ending snapshots and compute where the robot is relative to the starting position.

I get a really good estimate of the relative orientation of the robot by doing a cross-correlation on the center horizontal strip of the two images. Now I want to compute the displacement of the robot between the two images. To make things simpler, assume that the orientation of the robot does not change between the two snapshots.
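For anyone trying the same thing, the orientation step can be sketched as a 1-D circular cross-correlation. This is only a sketch under my own assumptions (the function name is mine, and I assume each strip has been reduced to a single 1-D intensity array spanning the full 360 degrees, both strips the same length):

```python
import numpy as np

def relative_orientation(strip_a, strip_b):
    """Rotation of strip_b relative to strip_a, in degrees.

    Assumes each strip is a 1-D intensity array covering the full
    360 degrees of the unwrapped panorama (equal lengths).
    """
    # Zero-mean so overall brightness differences don't dominate.
    a = strip_a - strip_a.mean()
    b = strip_b - strip_b.mean()
    # Circular cross-correlation via the FFT; corr[k] peaks at the
    # column offset by which strip_b is rotated relative to strip_a.
    corr = np.fft.ifft(np.fft.fft(b) * np.conj(np.fft.fft(a))).real
    shift = int(np.argmax(corr))
    return 360.0 * shift / len(strip_a)
```

The FFT form gives the correlation at every possible offset in O(n log n), which matters if the strips are wide; parabolic interpolation around the peak would give sub-degree resolution if needed.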

I have attached five sample images.  The first is the initial snapshot after feeding it through the Polar and Crop modules.  The other four were taken with the robot displaced roughly 1 foot to the right of, behind, to the left of, and in front of the initial location, in that order.  The task is to compare each of these images in turn with the initial image and compute the corresponding displacement; even just the direction of the displacement would be good enough.  For example, a successful result for the second image would be "90 degrees clockwise at 12 inches", or even just "90 degrees clockwise".  The angle does need to be fairly accurate, though, since I want to use it for homing the robot back to the initial location.  Just to be clear, I'm not talking about rotating the robot about its center: I'm talking about picking the robot up and placing it 1 foot to the right, 1 foot behind, etc. without changing its orientation.
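One possible approach for the displacement direction, sketched under my own assumptions rather than anything from this thread: after compensating orientation as above, a pure translation of size d slides a landmark at distance r and bearing t by roughly (d/r)*sin(t - phi) columns, where phi is the direction of motion. So the per-column shift between the two strips is approximately sinusoidal in bearing, and the phase of a least-squares sinusoid fit gives the direction of displacement (the magnitude is not recoverable without knowing landmark distances). The window size, search range, and the convention that column 0 is straight ahead are all assumptions:

```python
import numpy as np

def displacement_bearing(strip_a, strip_b, win=12, search=15):
    """Approximate bearing (degrees) of the translation between two
    orientation-aligned 1-D panoramic strips.

    Moving the camera by d shifts a landmark at distance r and
    bearing t by about (d/r)*sin(t - phi), phi being the motion
    direction.  Landmark distances vary, so the shift amplitude is
    noisy, but the sinusoidal sign pattern still encodes phi.
    """
    n = len(strip_a)
    thetas, shifts = [], []
    for c in range(0, n - win, win):
        patch = strip_a[c:c + win]
        # Block-match this window against strip_b over a small
        # circular search range to measure the local column shift.
        best_s, best_d = 0, np.inf
        for s in range(-search, search + 1):
            idx = (np.arange(c, c + win) + s) % n
            d = np.sum((strip_b[idx] - patch) ** 2)
            if d < best_d:
                best_d, best_s = d, s
        thetas.append(2.0 * np.pi * (c + win / 2.0) / n)
        shifts.append(best_s)
    t = np.array(thetas)
    # Least-squares fit of shift(t) ~ A*sin(t) + B*cos(t); for the
    # model above, A = amp*cos(phi) and B = -amp*sin(phi).
    A, B = np.linalg.lstsq(np.column_stack([np.sin(t), np.cos(t)]),
                           np.array(shifts), rcond=None)[0]
    return np.degrees(np.arctan2(-B, A)) % 360.0
```

For homing, the robot would drive along the opposite of the estimated bearing and re-run the fit as it moves; the estimate should sharpen as the current view approaches the stored snapshot.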

Any ideas?


This forum thread has been closed due to inactivity (more than 4 months) or number of replies (more than 50 messages).