Distinction maximization loss: Fast, scalable, turnkey, and native neural networks out-of-distribution detection simply by replacing the softmax loss. Macêdo, D., Ren, T., Zanchettin, C., Oliveira, A., Tapp, A., & Ludermir, T. 2019. arXiv. Abstract: Recently, many methods to reduce neural network uncertainty have been proposed, but most of them present severe drawbacks. In this paper, we argue that the low out-of-distribution detection performance of neural networks is mainly due to the anisotropy of the SoftMax loss. We therefore build an isotropic loss that reduces neural network uncertainty in a fast, scalable, turnkey, and native way. Our experiments show that replacing SoftMax with the proposed loss does not affect classification accuracy. Moreover, our proposal typically outperforms ODIN by a large margin and usually produces competitive results against the state-of-the-art Mahalanobis method while avoiding its limitations. Hence, neural network uncertainty may be significantly reduced by a simple loss change, without relying on special procedures such as data augmentation, adversarial training/validation, ensembles, or additional classification/regression models.
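The abstract only names the idea, swapping the anisotropic, dot-product-based SoftMax loss for an isotropic, distance-based one, without spelling out the loss itself. As a minimal illustrative sketch (not the authors' exact formulation), a loss of this family computes logits as negative Euclidean distances from the feature vector to one learnable prototype per class and then applies cross-entropy; the class name IsotropicLoss, the zero prototype initialization, and the plain Euclidean distance below are assumptions:

import torch
import torch.nn as nn
import torch.nn.functional as F

class IsotropicLoss(nn.Module):
    # Sketch of a distance-based (isotropic) classification loss: logits are
    # negative Euclidean distances to learnable class prototypes, so training
    # depends only on distances rather than on dot products as in SoftMax loss.
    def __init__(self, num_features: int, num_classes: int):
        super().__init__()
        # One learnable prototype per class (zero init is an assumption).
        self.prototypes = nn.Parameter(torch.zeros(num_classes, num_features))

    def forward(self, features: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        distances = torch.cdist(features, self.prototypes)  # (batch, num_classes)
        logits = -distances  # closer prototype => larger logit
        return F.cross_entropy(logits, targets)

# Usage: drop the network's final linear layer and train with this module instead.
# loss_fn = IsotropicLoss(num_features=128, num_classes=10)
# loss = loss_fn(encoder(images), labels)
# At test time, the maximum logit (minimum distance) can serve as an OOD score.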
@misc{macedo2019distinction,
  title = {Distinction maximization loss: Fast, scalable, turnkey, and native neural networks out-of-distribution detection simply by replacing the softmax loss},
  year = {2019},
  source = {arXiv},
  author = {Macêdo, D. and Ren, T.I. and Zanchettin, C. and Oliveira, A.L.I. and Tapp, A. and Ludermir, T.},
  abstract = {Recently, many methods to reduce neural network uncertainty have been proposed, but most of them present severe drawbacks. In this paper, we argue that the low out-of-distribution detection performance of neural networks is mainly due to the anisotropy of the SoftMax loss. We therefore build an isotropic loss that reduces neural network uncertainty in a fast, scalable, turnkey, and native way. Our experiments show that replacing SoftMax with the proposed loss does not affect classification accuracy. Moreover, our proposal typically outperforms ODIN by a large margin and usually produces competitive results against the state-of-the-art Mahalanobis method while avoiding its limitations. Hence, neural network uncertainty may be significantly reduced by a simple loss change, without relying on special procedures such as data augmentation, adversarial training/validation, ensembles, or additional classification/regression models.}
}
{"_id":"b89dLvkKA3J7bkDkT","bibbaseid":"macdo-ren-zanchettin-oliveira-tapp-ludermir-distinctionmaximizationlossfastscalableturnkeyandnativeneuralnetworksoutofdistributiondetectionsimplybyreplacingthesoftmaxloss-2019","author_short":["MacÊdo, D.","Ren, T.","Zanchettin, C.","Oliveira, A.","Tapp, A.","Ludermir, T."],"bibdata":{"title":"Distinction maximization loss: Fast, scalable, turnkey, and native neural networks out-of-distribution detection simply by replacing the softmax loss","type":"misc","year":"2019","source":"arXiv","id":"48d442f2-6f41-314b-b755-a88d57c0e391","created":"2020-11-03T23:59:00.000Z","file_attached":false,"profile_id":"74e7d4ea-3dac-3118-aab9-511a5b337e8f","last_modified":"2020-11-04T11:54:36.797Z","read":false,"starred":false,"authored":"true","confirmed":false,"hidden":false,"private_publication":false,"abstract":"Copyright © 2019, arXiv, All rights reserved. Recently, many methods to reduce neural networks uncertainty have been proposed. However, most of the techniques used in these solutions usually present severe drawbacks. In this paper, we argue that neural networks low outof- distribution detection performance is mainly due to the SoftMax loss anisotropy. Therefore, we built an isotropic loss to reduce neural networks uncertainty in a fast, scalable, turnkey, and native approach. Our experiments show that replacing SoftMax with the proposed loss does not affect classification accuracy. Moreover, our proposal overcomes ODIN typically by a large margin while producing usually competitive results against a state-of-the-art Mahalanobis method despite avoiding their limitations. Hence, neural networks uncertainty may be significantly reduced by a simple loss change without relying on special procedures such as data augmentation, adversarial training/validation, ensembles, or additional classification/regression models.","bibtype":"misc","author":"MacÊdo, D. and Ren, T.I. and Zanchettin, C. and Oliveira, A.L.I. and Tapp, A. and Ludermir, T.","bibtex":"@misc{\n title = {Distinction maximization loss: Fast, scalable, turnkey, and native neural networks out-of-distribution detection simply by replacing the softmax loss},\n type = {misc},\n year = {2019},\n source = {arXiv},\n id = {48d442f2-6f41-314b-b755-a88d57c0e391},\n created = {2020-11-03T23:59:00.000Z},\n file_attached = {false},\n profile_id = {74e7d4ea-3dac-3118-aab9-511a5b337e8f},\n last_modified = {2020-11-04T11:54:36.797Z},\n read = {false},\n starred = {false},\n authored = {true},\n confirmed = {false},\n hidden = {false},\n private_publication = {false},\n abstract = {Copyright © 2019, arXiv, All rights reserved. Recently, many methods to reduce neural networks uncertainty have been proposed. However, most of the techniques used in these solutions usually present severe drawbacks. In this paper, we argue that neural networks low outof- distribution detection performance is mainly due to the SoftMax loss anisotropy. Therefore, we built an isotropic loss to reduce neural networks uncertainty in a fast, scalable, turnkey, and native approach. Our experiments show that replacing SoftMax with the proposed loss does not affect classification accuracy. Moreover, our proposal overcomes ODIN typically by a large margin while producing usually competitive results against a state-of-the-art Mahalanobis method despite avoiding their limitations. 
Hence, neural networks uncertainty may be significantly reduced by a simple loss change without relying on special procedures such as data augmentation, adversarial training/validation, ensembles, or additional classification/regression models.},\n bibtype = {misc},\n author = {MacÊdo, D. and Ren, T.I. and Zanchettin, C. and Oliveira, A.L.I. and Tapp, A. and Ludermir, T.}\n}","author_short":["MacÊdo, D.","Ren, T.","Zanchettin, C.","Oliveira, A.","Tapp, A.","Ludermir, T."],"biburl":"https://bibbase.org/service/mendeley/74e7d4ea-3dac-3118-aab9-511a5b337e8f","bibbaseid":"macdo-ren-zanchettin-oliveira-tapp-ludermir-distinctionmaximizationlossfastscalableturnkeyandnativeneuralnetworksoutofdistributiondetectionsimplybyreplacingthesoftmaxloss-2019","role":"author","urls":{},"metadata":{"authorlinks":{}}},"bibtype":"misc","biburl":"https://bibbase.org/service/mendeley/74e7d4ea-3dac-3118-aab9-511a5b337e8f","dataSources":["XkGKCoQgZDKqXZqdh","2252seNhipfTmjEBQ"],"keywords":[],"search_terms":["distinction","maximization","loss","fast","scalable","turnkey","native","neural","networks","out","distribution","detection","simply","replacing","softmax","loss","macêdo","ren","zanchettin","oliveira","tapp","ludermir"],"title":"Distinction maximization loss: Fast, scalable, turnkey, and native neural networks out-of-distribution detection simply by replacing the softmax loss","year":2019}