A Viewport-Driven Multi-Metric Fusion Approach for 360-Degree Video Quality Assessment. de Albuquerque Azevedo, R. G., Birkbeck, N., Janatra, I., Adsumilli, B., & Frossard, P. In 2020 IEEE International Conference on Multimedia and Expo (ICME), pages 1-6, July 2020. doi: 10.1109/ICME46284.2020.9102936

Abstract: We propose a new viewport-based multi-metric fusion (MMF) approach for visual quality assessment of 360-degree (omnidirectional) videos. Our method computes multiple spatio-temporal objective quality metrics (features) on viewports extracted from 360-degree videos and learns a model that combines these features into a single metric that closely matches subjective quality scores. The main motivations for the proposed method are that: 1) quality metrics computed on viewports capture the user experience better than metrics computed in the projection domain; 2) no individual objective image quality metric performs best for all types of visual distortions, whereas a learned combination of them can adapt to different conditions and produce better results overall. Experimental results, based on the largest available 360-degree video quality dataset, demonstrate that the proposed metric outperforms state-of-the-art 360-degree and 2D video quality metrics.
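The sketch below illustrates the pipeline the abstract describes: rectilinear viewports are rendered from an equirectangular frame, per-viewport objective metrics are computed and pooled into a feature vector, and a learned regressor fuses the features into a quality estimate. This is a minimal illustration, not the authors' code: the viewport grid, the choice of PSNR/SSIM as the fused metrics, and the SVR regressor are stand-in assumptions for the paper's actual spatio-temporal features and learned model.

# Minimal sketch of viewport-based multi-metric fusion (MMF).
# Assumptions: equirectangular uint8 input frames; PSNR/SSIM as example
# per-viewport metrics; SVR as an example fusion regressor.
import numpy as np
from sklearn.svm import SVR
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def extract_viewport(equirect, yaw, pitch, fov_deg=90, size=256):
    """Render a rectilinear viewport from an equirectangular frame via
    inverse gnomonic projection (nearest-neighbour sampling)."""
    h, w = equirect.shape[:2]
    f = (size / 2) / np.tan(np.deg2rad(fov_deg) / 2)   # focal length, pixels
    xs, ys = np.meshgrid(np.arange(size) - size / 2,
                         np.arange(size) - size / 2)
    # Ray directions in camera space, normalized, then rotated by pitch/yaw.
    dirs = np.stack([xs, ys, np.full_like(xs, f, dtype=float)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    dirs = dirs @ (Ry @ Rx).T
    # Back to spherical coordinates, then to equirectangular pixel indices.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])            # [-pi, pi]
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))       # [-pi/2, pi/2]
    u = ((lon / np.pi + 1) / 2 * (w - 1)).astype(int)
    v = ((lat / (np.pi / 2) + 1) / 2 * (h - 1)).astype(int)
    return equirect[v, u]

def viewport_features(ref, dist, viewports):
    """Compute objective metrics per viewport, mean-pool into one vector."""
    feats = []
    for yaw, pitch in viewports:
        r = extract_viewport(ref, yaw, pitch)
        d = extract_viewport(dist, yaw, pitch)
        feats.append([
            peak_signal_noise_ratio(r, d, data_range=255),
            structural_similarity(r, d, channel_axis=-1, data_range=255),
        ])
    return np.asarray(feats).mean(axis=0)

# Fusion step: regress subjective scores from the pooled metric features.
# X: one viewport_features row per (reference, distorted) video pair;
# y: the corresponding mean opinion scores from a subjective study.
# fusion_model = SVR(kernel="rbf").fit(X_train, y_train)
# predicted_quality = fusion_model.predict(X_test)

Pooling metrics over a set of viewports approximates what a viewer actually sees on a head-mounted display, which is the paper's first motivation; the learned fusion addresses the second, that no single metric handles all distortion types.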
@inproceedings{2020_07_azevedo,
author={de Albuquerque Azevedo, Roberto Gerson and Birkbeck, Neil and
Janatra, Ivan and Adsumilli, Balu and Frossard, Pascal},
booktitle={2020 IEEE International Conference on Multimedia and Expo (ICME)},
title={A Viewport-Driven Multi-Metric Fusion Approach for 360-Degree Video
Quality Assessment},
year={2020},
pages={1-6},
abstract={We propose a new viewport-based multi-metric fusion (MMF) approach
for visual quality assessment of 360-degree (omnidirectional) videos. Our
method computes multiple spatio-temporal objective quality metrics
(features) on viewports extracted from 360-degree videos and learns a model
that combines these features into a single metric that closely matches
subjective quality scores. The main motivations for the proposed method are
that: 1) quality metrics computed on viewports capture the user experience
better than metrics computed in the projection domain; 2) no individual
objective image quality metric performs best for all types of visual
distortions, whereas a learned combination of them can adapt to different
conditions and produce better results overall. Experimental results, based
on the largest available 360-degree video quality dataset, demonstrate that
the proposed metric outperforms state-of-the-art 360-degree and 2D video
quality metrics.},
keywords={Measurement;Quality assessment;Visualization;Video
recording;Feature extraction;Two dimensional displays;Distortion;visual
quality assessment;omnidirectional video;360-degree video;multi-metric
fusion},
doi={10.1109/ICME46284.2020.9102936},
issn={1945-788X},
month={July},
}
{"_id":"u5jG7rvo3Wmwq7ert","bibbaseid":"dealbuquerqueazevedo-birkbeck-janatra-adsumilli-frossard-aviewportdrivenmultimetricfusionapproachfor360degreevideoqualityassessment-2020","authorIDs":["22qdanuzGpjfebYp2","244dmL4NwKXTfTqFB","24yoNLHLsDa4x9D2M","2wkrqiNDpnJQeYFvw","3fArXRQLQoc8ZwrMq","5YSSXn6odxExxuoFA","5e63aeed5e3c57de01000146","5yMsKZKLBnZKYvSHM","6AYHMrhkKKxQmxubb","6jHF2A9kwyWmNFCCd","6wryNCAx7Zo8fbwAB","7hPsR4oepA7Xc4HaE","884pdHmj7ppG8iMk9","8yyq55iMiNu7KsgmJ","9JGhhPKtYaFCnMcTC","AED8rhcj6RNJLxXaL","AinPa5MKBTARKabGR","BAt34XTbdkvxhXgcC","CKx8XtftEtEkTS9XX","CWCixdwNu6XxFiHB5","EZ5k5yT5wLDS5oY2k","EiPFBrZGk8szdGvdM","EnvpRHSpqMg8PPvFL","EyTjgNaspXXTjLnRA","FeT67z5eszt53SB9m","G4TmkruogoSArYmz8","G6dsN8GBTgKb4zgHK","GdSNk5rbGcsuazQAA","Gg6FyBxS4c9crJQWY","HK6F7nRGfMHsZwRQn","JBPNdykuQFNRwkJKm","Jq4oRAA6kc2cEWEMx","Js8D8d5fYQiaZimyM","K3h2ZWfWMtS5NWpEB","KzsLTkoPMJ6gNJo64","L8qBKy36xDRcmE8oH","MfBYf6SeReriSdD5q","NNShBA8E6jTJ9pc49","NwmD6quP7sfJgs4cE","P4HhxCoz5c2Bn55xN","RBufWNXJd7ArN6gPp","RKoz2F7cr9jhRzBSA","SKn7QDikeo6LsniwG","Tu33b2iijsccpNwNy","WSi47kKxoBL9zxMPk","XcDM74dsEkKXvGtSE","ZLyJ3SvsZAgkSmMhA","aHoWYt7KttYy4xsyb","dT4Q9vZmW8XW3rssQ","dYyNfMq6QDvMwqEmg","hN8avpNdk8F9Bdmih","hy6qqBWeQBxrfZQkQ","iZrZ562QLES379AeJ","kbbhu3riRm4GbuxXr","mYnbCr4EQk3aayLEc","oZTNoGwQwN6Lwjkyt","pRdcpdf3i5Fm32gWn","pfHvmrarHFaCnXruD","qfTGajRsvSg96C5Lj","s6NMQeuXCvahtn5xD","sTvD7E6KuhsRZRD3E","srPxnveRYPn3cx52j","thzkvvp2oSxPQeiE7","tshc376gy7go5omwM","vnp2utbXmofWRR7MA","wiggSGkKjJvNyXH6j","x2ZvzHk267HfqDhwM","xi7msFqQ4BePcjteq","xjLLYibFy9e2YPqtu","z2zDhcNq9rbP2GbuG","zwSxNcwtdzLJCX6eB"],"author_short":["de Albuquerque Azevedo, R. G.","Birkbeck, N.","Janatra, I.","Adsumilli, B.","Frossard, P."],"bibdata":{"bibtype":"inproceedings","type":"inproceedings","author":[{"propositions":["de"],"lastnames":["Albuquerque","Azevedo"],"firstnames":["Roberto","Gerson"],"suffixes":[]},{"propositions":[],"lastnames":["Birkbeck"],"firstnames":["Neil"],"suffixes":[]},{"propositions":[],"lastnames":["Janatra"],"firstnames":["Ivan"],"suffixes":[]},{"propositions":[],"lastnames":["Adsumilli"],"firstnames":["Balu"],"suffixes":[]},{"propositions":[],"lastnames":["Frossard"],"firstnames":["Pascal"],"suffixes":[]}],"booktitle":"2020 IEEE International Conference on Multimedia and Expo (ICME)","title":"A Viewport-Driven Multi-Metric Fusion Approach for 360-Degree Video Quality Assessment","year":"2020","volume":"","number":"","pages":"1-6","abstract":"We propose a new viewport-based multi-metric fusion (MMF) approach for visual quality assessment of 360-degree (omnidirectional) videos. Our method is based on computing multiple spatio-temporal objective quality metrics (features) on viewports extracted from 360-degree videos, and learning a model that combines these features into a metric, which closely matches subjective quality scores. The main motivations for the proposed method are that: 1) quality metrics computed on viewports better captures the user experience than metrics computed on the projection domain; 2) no individual objective image quality metric always performs best for all types of visual distortions, while a learned combination of them is able to adapt to different conditions and produce better results overall. 
Experimental results, based on the largest available 360-degree videos quality dataset, demonstrate that the proposed metric outperforms state-of-the-art 360-degree and 2D video quality metrics.","keywords":"Measurement;Quality assessment;Visualization;Video recording;Feature extraction;Two dimensional displays;Distortion;visual quality assessment;omnidirectional video;360-degree video;multi-metric fusion","doi":"10.1109/ICME46284.2020.9102936","issn":"1945-788X","month":"July","bibtex":"@inproceedings{2020_07_azevedo,\nauthor={de Albuquerque Azevedo, Roberto Gerson and Birkbeck, Neil and\nJanatra, Ivan and Adsumilli, Balu and Frossard, Pascal},\nbooktitle={2020 IEEE International Conference on Multimedia and Expo (ICME)},\ntitle={A Viewport-Driven Multi-Metric Fusion Approach for 360-Degree Video\nQuality Assessment},\nyear={2020},\nvolume={},\nnumber={},\npages={1-6},\nabstract={We propose a new viewport-based multi-metric fusion (MMF) approach\nfor visual quality assessment of 360-degree (omnidirectional) videos. Our\nmethod is based on computing multiple spatio-temporal objective quality\nmetrics (features) on viewports extracted from 360-degree videos, and\nlearning a model that combines these features into a metric, which closely\nmatches subjective quality scores. The main motivations for the proposed\nmethod are that: 1) quality metrics computed on viewports better captures the\nuser experience than metrics computed on the projection domain; 2) no\nindividual objective image quality metric always performs best for all types\nof visual distortions, while a learned combination of them is able to adapt\nto different conditions and produce better results overall. Experimental\nresults, based on the largest available 360-degree videos quality dataset,\ndemonstrate that the proposed metric outperforms state-of-the-art 360-degree\nand 2D video quality metrics.},\nkeywords={Measurement;Quality assessment;Visualization;Video\nrecording;Feature extraction;Two dimensional displays;Distortion;visual\nquality assessment;omnidirectional video;360-degree video;multi-metric\nfusion},\ndoi={10.1109/ICME46284.2020.9102936},\nissn={1945-788X},\nmonth={July},\n}\n\n","author_short":["de Albuquerque Azevedo, R. 
G.","Birkbeck, N.","Janatra, I.","Adsumilli, B.","Frossard, P."],"key":"2020_07_azevedo","id":"2020_07_azevedo","bibbaseid":"dealbuquerqueazevedo-birkbeck-janatra-adsumilli-frossard-aviewportdrivenmultimetricfusionapproachfor360degreevideoqualityassessment-2020","role":"author","urls":{},"keyword":["Measurement;Quality assessment;Visualization;Video recording;Feature extraction;Two dimensional displays;Distortion;visual quality assessment;omnidirectional video;360-degree video;multi-metric fusion"],"metadata":{"authorlinks":{"azevedo, r":"http://139.82.95.2/~roberto/publications"}},"downloads":14,"html":""},"bibtype":"inproceedings","biburl":"http://www.telemidia.puc-rio.br/~roberto/biblio/bib.bib","creationDate":"2020-03-07T14:26:13.003Z","downloads":14,"keywords":["measurement;quality assessment;visualization;video recording;feature extraction;two dimensional displays;distortion;visual quality assessment;omnidirectional video;360-degree video;multi-metric fusion"],"search_terms":["viewport","driven","multi","metric","fusion","approach","360","degree","video","quality","assessment","de albuquerque azevedo","birkbeck","janatra","adsumilli","frossard"],"title":"A Viewport-Driven Multi-Metric Fusion Approach for 360-Degree Video Quality Assessment","year":2020,"dataSources":["g2kK7LGtY6BSAGWXZ","fzQqRdpBjaqFvMqtM","2GMp8PAJ4r2b8svJX"]}