Distinction maximization loss: Fast, scalable, turnkey, and native neural networks out-of-distribution detection simply by replacing the softmax loss. Macêdo, D., Ren, T., Zanchettin, C., Oliveira, A., Tapp, A., & Ludermir, T. 2019.
Recently, many methods to reduce neural networks' uncertainty have been proposed. However, most of the techniques used in these solutions usually present severe drawbacks. In this paper, we argue that neural networks' low out-of-distribution detection performance is mainly due to the anisotropy of the SoftMax loss. Therefore, we built an isotropic loss to reduce neural networks' uncertainty in a fast, scalable, turnkey, and native approach. Our experiments show that replacing SoftMax with the proposed loss does not affect classification accuracy. Moreover, our proposal typically outperforms ODIN by a large margin while usually producing competitive results against the state-of-the-art Mahalanobis method, despite avoiding its limitations. Hence, neural networks' uncertainty may be significantly reduced by a simple loss change, without relying on special procedures such as data augmentation, adversarial training/validation, ensembles, or additional classification/regression models.
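The paper's core proposal, as the abstract describes it, is to swap the usual linear layer plus SoftMax cross-entropy for an isotropic, distance-based loss. Below is a minimal PyTorch sketch of that general idea, using negative Euclidean distances to learnable class prototypes as logits; the class name, initialization, and other details are illustrative assumptions, not the authors' exact formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class IsotropicLoss(nn.Module):
    # Sketch of a distance-based (isotropic) replacement for the usual
    # linear layer + SoftMax cross-entropy output. Logits are negative
    # Euclidean distances to learnable class prototypes, so a class score
    # depends only on distance in feature space, not on direction.
    # Illustrative assumption, not the paper's exact formulation.
    def __init__(self, num_features, num_classes):
        super().__init__()
        self.prototypes = nn.Parameter(torch.zeros(num_classes, num_features))
        nn.init.normal_(self.prototypes, std=0.01)

    def forward(self, features, targets):
        # logits[i, j] = -||features[i] - prototypes[j]||_2
        logits = -torch.cdist(features, self.prototypes)
        return F.cross_entropy(logits, targets)

Used in place of the final nn.Linear of a classifier, training proceeds unchanged; at test time, the largest (least negative) logit, i.e. the smallest prototype distance, gives one natural score for out-of-distribution detection, consistent with the abstract's claim that no extra models, validation data, or input preprocessing are required.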
@misc{macedo2019distinction,
 title = {Distinction maximization loss: Fast, scalable, turnkey, and native neural networks out-of-distribution detection simply by replacing the softmax loss},
 year = {2019},
 howpublished = {arXiv},
 abstract = {Recently, many methods to reduce neural networks' uncertainty have been proposed. However, most of the techniques used in these solutions usually present severe drawbacks. In this paper, we argue that neural networks' low out-of-distribution detection performance is mainly due to the anisotropy of the SoftMax loss. Therefore, we built an isotropic loss to reduce neural networks' uncertainty in a fast, scalable, turnkey, and native approach. Our experiments show that replacing SoftMax with the proposed loss does not affect classification accuracy. Moreover, our proposal typically outperforms ODIN by a large margin while usually producing competitive results against the state-of-the-art Mahalanobis method, despite avoiding its limitations. Hence, neural networks' uncertainty may be significantly reduced by a simple loss change, without relying on special procedures such as data augmentation, adversarial training/validation, ensembles, or additional classification/regression models.},
 author = {Macêdo, D. and Ren, T.I. and Zanchettin, C. and Oliveira, A.L.I. and Tapp, A. and Ludermir, T.}
}
