TY - GEN
T1 - Tracker fusion for robustness in visual feature tracking
AU - Toyama, Kentaro
AU - Hager, Gregory D.
PY - 1995/12/1
Y1 - 1995/12/1
AB - Task-directed vision obviates the need for general image comprehension by focusing attention only on features which contribute useful information to the task at hand. Window-based visual tracking fits into this paradigm as motion tracking becomes a problem of local search in a small image region. While the gains in speed from such methods allow for real-time feature tracking on off-the-shelf hardware, they lose robustness by giving up a more global perspective: Window-based feature trackers are prone to such problems as distraction, illumination changes, fast features, and so forth. To add robustness to feature tracking, we present 'tracker fusion,' where multiple trackers simultaneously track the same feature while watching for various problematic circumstances and combine their estimates in a meaningful way. By categorizing different situations in which mistracking occurs, finding appropriate trackers to deal with each such situation, and fusing the resulting trackers together, we construct robust feature trackers which maintain the speed of simple window-based trackers, yet afford greater resistance to mistracking.
UR - http://www.scopus.com/inward/record.url?scp=0029519286&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0029519286&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:0029519286
SN - 0819419532
SN - 9780819419538
T3 - Proceedings of SPIE - The International Society for Optical Engineering
SP - 38
EP - 49
BT - Proceedings of SPIE - The International Society for Optical Engineering
A2 - Schenker, Paul S.
A2 - McKee, Gerard T.
T2 - Sensor Fusion and Networked Robotics VIII
Y2 - 23 October 1995 through 24 October 1995
ER -