Hi, I'm developing a ball-tracking project on my micro2440, using a USB webcam and OpenCV 2.0. The main idea of my code:

    int time_tick;

    void *time_cnt(void *arg)
    {
        while (1)
            time_tick++;
    }

    int main(int argc, char *argv[])
    {
        pthread_t time_thread;
        pthread_create(&time_thread, NULL, &time_cnt, 0);

        CvSize size = cvSize(640, 480);
        CvCapture *capture = cvCaptureFromCAM(0);

        while (1) {
            time_tick = 0;
            IplImage *frame = cvQueryFrame(capture);

            /* ... process the image, detect the ball ... */

            printf("Ball! x=%d y=%d r=%d t=%d\n\r", ball_x, ball_y, ball_r, time_tick);

            /* ... some code to access the GPIOs to control the motors ... */
        }
    }

My full code can be downloaded here: http://www.mediafire.com/?2k6l6zhj0vc6irh

The result when I run it: time_tick shows about 200 ms to process a frame. But when I move the ball, the application keeps printing the ball_x, ball_y, ball_r of the ball's previous position for about 5~10 more frames (about 1~2 s), and only then starts printing the new position continuously. That is too slow, and I cannot control the motors to track the ball properly.

I don't understand what in my code could cause that delay. I'm considering two possibilities:

1/ cvQueryFrame: does it return some previous frame from the capture's buffer?
2/ printf() and the GPIO code: do they buffer their output instead of reaching the hardware immediately?

Does anybody know how to solve this problem?

P/S: my USB webcam works very well with the "USB camera" example application from the manufacturer.
Image processing performance
I have just tested:

    while (1) {
        /* get a char from COM1 */
        printf(...);
        /* control motor through GPIO */
    }

and it works fast enough that I can see the motor change its speed immediately. So I think the cause lies in how the image is captured from the USB webcam, but I still have no way to analyse and solve the problem.
One more test, on my PC (Ubuntu 10.10), with this small program:

    cvNamedWindow("My_Window", CV_WINDOW_AUTOSIZE);
    CvCapture *capture = cvCaptureFromCAM(0);
    IplImage *frame = 0;
    int cnt = 0;

    while (1) {
        frame = cvQueryFrame(capture);
        cvShowImage("My_Window", frame);
        cnt++;
        printf("frame %d \n\r", cnt);
        sleep(1);
    }

I can see that My_Window shows the image that was captured 6 s earlier. So the problem comes from cvCaptureFromCAM and cvQueryFrame. How can I capture the current image?
Try using Qt and OpenCV together; it will be easier for you. There is an example project named QTCam, I think.
Thanks Usama Masood for your attention, but I don't think using Qt will solve my problem. I can process images with OpenCV quite well; the only problem is the bad delay when I capture an image. In my PC test code above, if I replace sleep(1) with usleep(200000) (a 200 ms delay), every image is shown with a 2 s delay, which matches the result of my application code on the micro2440.

If I move cvCaptureFromCAM and cvReleaseCapture into the while loop, like this:

    while (1) {
        capture = cvCaptureFromCAM(0);
        frame = cvQueryFrame(capture);
        cvShowImage("My_Window", frame);
        cnt++;
        printf("frame %d \n\r", cnt);
        sleep(1);
        cvReleaseCapture(&capture);
    }

then there is no delay and My_Window shows the current image, not a past one. But cvCaptureFromCAM and cvReleaseCapture together take about 1 s, which is too slow for continuous image processing and tracking. So I still have no solution :(
I found some discussions here:

http://tech.groups.yahoo.com/group/OpenCV/messages/71060?threaded=1&...
http://tech.dir.groups.yahoo.com/group/OpenCV/messages/62201?threaded=1&...

They relate to the internal frame buffer behind cvGrabFrame. I'm now trying some workarounds; I don't want to recompile OpenCV.
I solved this problem by adding 4 extra calls to cvGrabFrame before each cvQueryFrame, to discard the buffered frames, and the processing time is almost unchanged :D