adversarial_training — revision 2017/11/27 15:05 by admin, updated 2018/12/02 13:59 (current) by admin
  
At training time, we learn a generative model for each class, while at test time, given an example to classify, we query each generator for its most similar generation, and select the class corresponding to the most similar one. Our approach is general and can be used with expressive models such as GANs and VAEs. At test time, our method accurately "knows when it does not know," and provides resilience to out-of-distribution examples while maintaining competitive performance for standard examples.
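The classification rule described above can be sketched in a few lines. This is a hypothetical toy sketch, not the paper's implementation: each per-class "generator" is a stand-in that samples around a fixed template rather than a trained GAN or VAE, and L2 distance stands in for whatever similarity measure the models use. The rejection threshold illustrates the "knows when it does not know" behavior.

```python
import numpy as np

rng = np.random.default_rng(0)

class TemplateGenerator:
    """Toy stand-in for a per-class generative model (GAN/VAE):
    it just produces samples near a fixed class template."""
    def __init__(self, template, noise=0.05):
        self.template = np.asarray(template, dtype=float)
        self.noise = noise

    def closest_generation(self, x, n_samples=256):
        # Query the generator for the generation most similar to x.
        samples = self.template + self.noise * rng.standard_normal(
            (n_samples, self.template.size))
        dists = np.linalg.norm(samples - np.asarray(x, dtype=float), axis=1)
        i = int(np.argmin(dists))
        return samples[i], dists[i]

def classify(x, generators, reject_threshold=1.0):
    """Pick the class whose generator reconstructs x best; abstain when
    every reconstruction is poor (out-of-distribution input)."""
    best_label, best_dist = None, np.inf
    for label, gen in generators.items():
        _, d = gen.closest_generation(x)
        if d < best_dist:
            best_label, best_dist = label, d
    if best_dist > reject_threshold:
        return "reject", best_dist
    return best_label, best_dist

generators = {
    0: TemplateGenerator(np.zeros(4)),
    1: TemplateGenerator(np.ones(4)),
}
print(classify(np.zeros(4) + 0.1, generators))   # close to class 0
print(classify(np.full(4, 10.0), generators))    # far from both classes -> reject
```

An in-distribution point is assigned to the class whose generator reconstructs it best, while a far-away point exceeds the threshold for every generator and is rejected.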
https://arxiv.org/abs/1805.10204 Adversarial examples from computational constraints

This paper gives an exponential separation between classical learning and robust learning in the statistical query model. It suggests that adversarial examples may be an unavoidable byproduct of the computational limitations of learning algorithms.

https://arxiv.org/abs/1811.11553v1 Strike (with) a Pose: Neural Networks Are Easily Fooled by Strange Poses of Familiar Objects

https://arxiv.org/abs/1809.07802v2 Playing the Game of Universal Adversarial Perturbations

https://arxiv.org/pdf/1811.04422.pdf An Optimal Control View of Adversarial Machine Learning