A statistical-driven approach for automatic classification of events in AFL video highlights
Tjondronegoro, Dian, Chen, Yi-Ping Phoebe and Pham, Binh 2005, A statistical-driven approach for automatic classification of events in AFL video highlights, in Proceedings of the Twenty-Eighth Australasian Computer Science Conference (ACSC 2005), Newcastle, Australia, January 2005, Australian Computer Society, Sydney, N.S.W., pp. 209-218.
Due to the repetitive and lengthy nature of sport video, automatic content-based summarization is essential to extract a more compact and interesting representation. State-of-the-art approaches have confirmed that high-level semantics in sport video can be detected based on the occurrences of specific audio and visual features (also known as cinematic features). However, most of these approaches still rely heavily on manual investigation to construct the algorithms for highlight detection. Thus, the primary aim of this paper is to demonstrate how the statistics of cinematic features within play-break sequences can be used to construct highlight classification rules less subjectively. To verify the effectiveness of our algorithms, we present experimental results using six AFL (Australian Football League) matches from different broadcasters. At this stage, we have successfully classified each play-break sequence into one of: goal, behind, mark, tackle, and non-highlight. These events were chosen because they are commonly used in broadcast AFL highlights. The proposed algorithms have also been tested successfully on soccer video.
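As a rough illustration of the kind of rule-based classifier the abstract describes, the sketch below applies threshold rules to cinematic feature statistics of a single play-break sequence. All feature names, thresholds, and the rule ordering are invented for illustration; the paper derives its actual rules from the statistics of observed feature occurrences, not from these values.

```python
# Hypothetical sketch: classify one play-break sequence from statistics of
# its cinematic (audio/visual) features. Feature names and thresholds are
# illustrative assumptions only, not taken from the paper.

def classify_sequence(features):
    """Return an event label for a play-break sequence.

    `features` maps hypothetical feature names (e.g. durations in seconds
    of detected cues within the sequence) to their measured values.
    """
    # Rules are checked from most to least specific; in a statistics-driven
    # construction, each threshold would come from per-class distributions
    # of the feature over labelled training matches.
    if features["crowd_excitement"] > 6.0 and features["goal_area_shots"] >= 2:
        return "goal"
    if features["crowd_excitement"] > 3.0 and features["replay_duration"] > 2.0:
        return "behind"
    if features["close_up_duration"] > 4.0:
        return "mark"
    if features["whistle_count"] >= 1 and features["break_duration"] > 5.0:
        return "tackle"
    return "non-highlight"

# Example sequence: long crowd excitement plus several goal-area shots.
example = {
    "crowd_excitement": 7.5,
    "goal_area_shots": 3,
    "replay_duration": 4.0,
    "close_up_duration": 2.0,
    "whistle_count": 0,
    "break_duration": 3.0,
}
print(classify_sequence(example))  # → goal
```

The point of the statistics-driven construction is that the thresholds above would be read off from feature distributions rather than hand-tuned by inspecting individual matches.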
Field of Research
080199 Artificial Intelligence and Image Processing not elsewhere classified
Unless expressly stated otherwise, the copyright for items in Deakin Research Online is owned by the author, with all rights reserved.
Every reasonable effort has been made to ensure that permission has been obtained for items included in DRO.
If you believe that your rights have been infringed by this repository, please contact firstname.lastname@example.org.