Modelling perceptions on the evaluation of video summarization

Abdalla, Kalyf; Menezes, Igor; Oliveira, Luciano

Authors

Kalyf Abdalla

Igor Menezes

Luciano Oliveira

Abstract

Hours of video are uploaded to streaming platforms every minute, and recommender systems suggest popular and relevant videos to help users save time in the search process. Recommender systems regularly rely on video summarization as an expert system to automatically identify suitable video entities and events. Since there is no well-established methodology for evaluating the relevance of summarized videos, some studies have used user annotations to gather evidence about the effectiveness of summarization methods. Aiming to model the user perceptions that ultimately form the basis for testing video summarization systems, this paper proposes: (i) a guideline for collecting unrestricted user annotations, (ii) a novel metric called compression level of user annotation (CLUSA) to gauge the performance of video summarization methods, and (iii) a study on the quality of annotated video summaries collected from different assessment scales. These contributions enable benchmarking of video summarization methods without constraints, even when user annotations are collected on a different assessment scale for each method. Our experiments showed that CLUSA is less susceptible to unbalanced compression data sets than other metrics, hence achieving higher reliability estimates. CLUSA also allows comparison of results across different video summarization approaches.

Citation

Abdalla, K., Menezes, I., & Oliveira, L. (2019). Modelling perceptions on the evaluation of video summarization. Expert Systems with Applications, 131, 254-265. https://doi.org/10.1016/j.eswa.2019.04.065

Journal Article Type Article
Acceptance Date Apr 29, 2019
Online Publication Date Apr 30, 2019
Publication Date 2019-10
Deposit Date Jun 24, 2019
Publicly Available Date May 1, 2020
Journal Expert Systems with Applications
Print ISSN 0957-4174
Publisher Elsevier
Peer Reviewed Peer Reviewed
Volume 131
Pages 254-265
DOI https://doi.org/10.1016/j.eswa.2019.04.065
Keywords Video summarization; Subjective evaluation; Evaluation metric
Public URL https://hull-repository.worktribe.com/output/2004511
Publisher URL http://doi.org/10.1016/j.eswa.2019.04.065
Additional Information ©2019, Elsevier. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/