
Vocabulary of natural language processing (POC)


Concept information

Preferred term

adversarial attack  

Definition

  • A deliberate attempt to mislead a machine learning or deep neural network model by introducing a subtle, imperceptible perturbation into an input sample, which can cause the model to produce an incorrect prediction with high confidence. (Based on Wang et al., Adversarial Attacks and Defenses in Machine Learning-Powered Networks: A Contemporary Survey, 2023)
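
The sketch below illustrates the mechanism described in this definition with the Fast Gradient Sign Method (FGSM), one classic way of crafting such a perturbation. The PyTorch classifier, input shapes and epsilon value are illustrative assumptions, not part of this vocabulary entry.

    import torch
    import torch.nn as nn

    def fgsm_attack(model, x, y, epsilon=0.03):
        """Add a small sign-of-gradient perturbation that pushes the model toward a wrong prediction."""
        x_adv = x.clone().detach().requires_grad_(True)
        loss = nn.functional.cross_entropy(model(x_adv), y)
        loss.backward()
        # Step in the direction that increases the loss, bounded by epsilon.
        return (x_adv + epsilon * x_adv.grad.sign()).detach()

    # Toy usage with a placeholder classifier and a random "image".
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
    x = torch.rand(1, 1, 28, 28)      # clean input sample
    y = torch.tensor([3])             # its true label
    x_adv = fgsm_attack(model, x, y)  # visually near-identical, yet often misclassified
    print((x_adv - x).abs().max())    # perturbation magnitude stays within epsilon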

Broader concept

Synonym(s)

  • adversarial training

Translations

  • French

  • apprentissage adverse
  • apprentissage antagoniste
  • attaque adverse

URI

http://data.loterre.fr/ark:/67375/8LP-B29QHNSZ-4


Last modified: 13/06/2024