FIRST FRC 2017
sandy currie from United States  [9 posts]
7 years ago
In the FIRST FRC game for 2017 there are two targets. In your tutorials, you document two examples that process the target for the boiler stack. I wonder if you have any ideas for the second target, the peg for the gear? I was able to modify one of the provided examples to determine the distance and the robot's left/right angular offset from the target. Your example was very helpful to start the process. I am now trying to work out a way to determine the robot's angle relative to the target, so I can square the robot up to it. One thought was to use a cropped image from the Line Pattern module, find the two blobs, and compare their relative heights; I could then use those heights to calculate the angle. One problem I can see is that the peg will bisect the closest blob and create two blobs instead of one. I am wondering if you have thought about examples using the peg target that might get me there quicker. I appreciate all your help and your tutorials are great. Thanks for the FRC support.
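(A rough sketch of the relative-heights idea, assuming a simple pinhole-camera model. The focal length in pixels and the target dimensions below are illustrative guesses, not values from RoboRealm or the game manual: if each strip's real height and the horizontal separation between the strips are known, each blob's apparent pixel height gives its distance, and the depth difference across the known separation gives the skew angle.)

```python
import math

def yaw_from_strip_heights(h_left_px, h_right_px,
                           f_px=700.0,          # assumed camera focal length, pixels
                           strip_height_in=5.0,  # assumed real height of one tape strip
                           strip_sep_in=10.25):  # assumed horizontal separation of strips
    """Estimate robot yaw relative to the target plane (degrees).

    Pinhole model: distance to a strip = f_px * real_height / apparent_height.
    The depth difference between the two strips over their known horizontal
    separation gives the skew angle of the target plane.
    """
    z_left = f_px * strip_height_in / h_left_px
    z_right = f_px * strip_height_in / h_right_px
    ratio = (z_left - z_right) / strip_sep_in
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))
```

With equal blob heights the function returns 0 (head-on); if the left strip appears taller (closer), the result goes negative. The peg bisecting the near blob would need to be handled first, e.g. by merging the two half-blobs before measuring height.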
Steven Gentner from United States  [273 posts] 7 years ago
Sandy,

We had heard of the peg target before kickoff but have not seen it even mentioned since. Do you have actual images of it? We never received any from FIRST and therefore assumed the target had been changed or dropped.

Assuming they exist, the Line Pattern module should be able to find those too. In terms of driving the robot towards the peg, I would avoid any angle calculation since it's very unlikely that your robot will be able to drive that accurately. Instead, take the center of the Line Pattern result and use it to decide whether the robot needs to drive hard right, right, center, left, or hard left, and just test to see what motor values actually work best. I.e. if you find that the robot starts waggling towards the target, reduce the motor values.

For example, if the target is 200 pixels off from the center of the image, that would imply a stronger left or right move. If the target is within 100 pixels, then that's only a slight left or right. If it is within 50 pixels, then move forward. Note that these are just guesses; you will have to tune your own robot and camera solution (the above assumes a 640x480 image AND that the camera is centered on your peg-grabbing device).

Repeating these crude directions very quickly will give you much more reliable targeting than using any angles. BUT this only works when you are running the analysis over and over, processing images as fast as possible (at least 10 Hz, i.e. 10 fps).
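(The banded steering described above might be sketched as follows. The band widths and the command strings are the illustrative numbers from the post, not a tuned solution; you would replace the returned labels with your own motor values.)

```python
def steering_command(target_x, image_width=640):
    """Map the target's x position to a crude steering band.

    Bands (in pixels of offset from image center, per the post's guesses):
      <= 50   -> drive forward
      <= 100  -> slight left/right
      > 100   -> hard left/right
    """
    offset = target_x - image_width / 2
    if abs(offset) <= 50:
        return "forward"
    direction = "left" if offset < 0 else "right"
    if abs(offset) <= 100:
        return "slight " + direction
    return "hard " + direction

# Called every frame in a fast loop (10+ fps), this re-homes the robot
# on the target each cycle without needing encoders or angle math.
```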

This also deals with the fact that most robots CANNOT even drive straight (unless you are using encoders). In essence it uses the target as an absolute measurement of where the robot needs to go (left, right, etc.) and thus will home the robot correctly even if you are not using encoders to drive it.

STeven.
