Add a vital algo for object tracking
With #691 adding more object tracking code, it might be nice to create an algo that the existing trackers can be turned into implementations of. The main challenge I see is that different trackers have different input requirements. While all will likely take a `detected_object_set` as input and produce an `object_track_set` as output, other things vary. For instance:

- `simple_homog_tracker` (link) additionally takes a timestamp and a homography as input.
- `srnn_tracker` (link) additionally takes a timestamp and an image as input.
- `pysot_tracker` (from VIAME's `viame/next`, link) additionally takes a timestamp and an image as input, as well as an optional object track set for initialization requests.
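The shared core across the trackers above could look something like the following rough Python sketch. (KWIVER's vital algos are actually C++; every class and method name here is hypothetical, and `DetectedObjectSet`/`ObjectTrackSet` are stand-ins for the real vital types.)

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field

# Stand-ins for kwiver's detected_object_set / object_track_set
# (hypothetical; the real vital types are C++ classes).
@dataclass
class DetectedObjectSet:
    detections: list = field(default_factory=list)

@dataclass
class ObjectTrackSet:
    tracks: list = field(default_factory=list)

class TrackObjects(ABC):
    """Hypothetical abstract algo: the part every tracker shares."""

    @abstractmethod
    def track(self, timestamp: int,
              detections: DetectedObjectSet) -> ObjectTrackSet:
        """Extend the current tracks with this frame's detections."""

class ImageTracker(TrackObjects):
    """A tracker that also needs the frame image; here the extra
    input is supplied through a setter before track() is called."""

    def __init__(self) -> None:
        self.image = None  # set per frame by the caller

    def set_image(self, image) -> None:
        self.image = image

    def track(self, timestamp, detections):
        # Placeholder logic: one single-detection "track" per detection.
        return ObjectTrackSet(tracks=[[d] for d in detections.detections])
```

The sketch pushes the per-tracker extra inputs (image, homography, etc.) out of the shared `track()` signature, which is one way to keep a single abstract interface while the inputs vary.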
I think this would be really useful since it would allow swapping the trackers. One potential solution might be to create more than one type of abstract tracker. E.g., if the tracker is purely appearance-based, it would likely follow the function signature of `srnn_tracker` or `pysot_tracker`, so we could have an algo for that. Another could be motion-based, although I am not sure whether `simple_homog_tracker` would fall under that category.
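That split could look roughly like this (again a hypothetical Python sketch; the real vital algo interfaces would be C++, and the class names are made up):

```python
from abc import ABC, abstractmethod

class AppearanceTracker(ABC):
    """Trackers that associate detections by visual appearance;
    srnn_tracker and pysot_tracker would roughly fit this shape."""

    @abstractmethod
    def track(self, timestamp, image, detections):
        """Return an object track set built using the frame image."""

class MotionTracker(ABC):
    """Trackers that associate detections by scene geometry or
    predicted motion; a homography-based tracker might fit here."""

    @abstractmethod
    def track(self, timestamp, homography, detections):
        """Return an object track set using frame-to-frame geometry."""

# A concrete tracker subclasses whichever base matches its inputs:
class DummyAppearanceTracker(AppearanceTracker):
    def track(self, timestamp, image, detections):
        return [[d] for d in detections]  # trivial placeholder
```

Each abstract class then has a fixed, fully-required signature, at the cost of not being able to swap an appearance tracker for a motion tracker behind one interface.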
I agree that this will take some thought on how to deal with the variations of required inputs. Having an interface that is a superset of all possible inputs will be confusing and hard to validate. Some tracking approaches use a collection of processes, so maybe a cluster could be used to normalize the set of input ports.
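To make the validation concern concrete, here is a sketch of the superset interface being warned against (hypothetical names; not an existing kwiver API): every possible input becomes optional, and each implementation must check at run time that the combination it needs was actually supplied.

```python
class SupersetTracker:
    """The superset interface: every possible extra input is
    optional, so required-input errors only surface at run time."""

    REQUIRES = ("image",)  # varies per concrete tracker

    def track(self, timestamp, detections, **extras):
        # Each implementation carries its own validation burden, and
        # callers cannot tell from the signature what is required.
        missing = [n for n in self.REQUIRES if extras.get(n) is None]
        if missing:
            raise ValueError(f"missing required inputs: {missing}")
        # ... actual tracking would go here ...
        return []
```

A caller wiring this into a pipeline would only discover a missing homography or image when the first frame arrives, which is the validation problem a normalizing cluster would avoid.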
The other tracker is simple feature distance, which is a composite of four tracking processes in a `.pipe` file. We can probably have both cluster and vital algo type APIs; it might be hard to find a one-size-fits-all solution.