Vocabulary of natural language processing (POC)

Concept information

Preferred term

knowledge distillation  

Definition

  • The process of transferring knowledge from a large model to a smaller one. (Wikipedia)
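As an illustration of this definition, the sketch below shows the standard distillation loss from Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution in addition to the ground-truth labels. PyTorch, the temperature T, and the weighting alpha are assumptions made for the example, not part of the concept entry.

    # Minimal sketch of a knowledge distillation loss (assumes PyTorch).
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        """Blend soft-target loss (teacher) with hard-label cross-entropy."""
        # Soften both distributions with temperature T; scale by T^2 so the
        # soft-target gradients stay comparable in magnitude (Hinton et al., 2015).
        soft_loss = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        hard_loss = F.cross_entropy(student_logits, labels)
        return alpha * soft_loss + (1.0 - alpha) * hard_loss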

Entry terms

  • model distillation

URI

http://data.loterre.fr/ark:/67375/8LP-BBBC55RH-N

Last modified 4/26/24