LabelMovie: a Semi-supervised Machine Annotation Tool with Quality Assurance and Crowd-sourcing Options for Videos
For multiple reasons, the automatic annotation of video recordings is challenging: first, the number of database video instances to be annotated is huge; second, tedious manual labelling sessions are required; third, multimodal annotation needs exact information about space, time, and context; fourth, the different labelling options (e.g., in the case of affects) require special agreements between annotators; and so forth. Crowdsourcing with quality assurance by experts may come to the rescue here. We have developed a special tool in which individual experts can annotate videos over the Internet, their work can be merged and filtered, the annotated material can be evaluated by machine learning methods, and automated annotation starts once a predefined confidence level is reached. High-quality manual labels provided by humans, the seeds, ensure that relatively small samples of manual annotations can effectively bootstrap the machine annotation procedure. The annotation tool features special visualization methods for crowdsourced users unfamiliar with machine learning and, in turn, ignites the bootstrapping process.
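The bootstrapping loop described above can be illustrated with a minimal self-training sketch. This is a hypothetical toy (a 1-D nearest-centroid classifier), not the tool's actual pipeline: a small expert-labelled seed set trains a model, unlabelled items predicted above a preset confidence level are promoted to labels, and training repeats until nothing passes the threshold.

```python
# Hypothetical illustration of confidence-thresholded self-training;
# the names, classifier, and threshold below are assumptions for the sketch.

CONFIDENCE = 0.8  # stands in for the predefined confidence level


def centroids(samples, labels):
    """Mean feature value per label (a toy 1-D nearest-centroid model)."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}


def predict(model, x):
    """Return (label, confidence); confidence from inverse-distance weights."""
    weights = {y: 1.0 / (abs(x - c) + 1e-9) for y, c in model.items()}
    total = sum(weights.values())
    label = max(weights, key=weights.get)
    return label, weights[label] / total


def self_train(seed_x, seed_y, pool):
    """Grow the labelled set from the seeds; return the model and leftovers."""
    x, y, pool = list(seed_x), list(seed_y), list(pool)
    while pool:
        model = centroids(x, y)
        scored = [(xi,) + predict(model, xi) for xi in pool]
        confident = [(xi, yi) for xi, yi, c in scored if c >= CONFIDENCE]
        if not confident:
            break  # nothing passes the confidence level; stop bootstrapping
        for xi, yi in confident:
            x.append(xi)
            y.append(yi)
        pool = [xi for xi, yi, c in scored if c < CONFIDENCE]
    return centroids(x, y), pool


# Two expert-labelled seeds bootstrap labels for the unlabelled pool;
# the ambiguous item (0.5) stays unlabelled for human review.
model, leftover = self_train([0.0, 1.0], ["neutral", "happy"], [0.1, 0.9, 0.5])
```

In the tool's setting, the items left in the pool would be routed back to the crowdsourced annotators, so expert effort concentrates on the cases the machine is least sure about.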