ThesoTM thesaurus

Concept information

Preferred term

tokenization  

Definition

  • The task or process of recognizing and tagging tokens (words, punctuation marks, digits, etc.) in a text.
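The definition above can be illustrated with a minimal sketch in Python: a regex-based tokenizer that recognizes tokens and tags each with a coarse type (word, digit, or punctuation). This is illustrative only, not the method of any particular tool; the pattern and tag names are assumptions for the example.

```python
import re

# Hypothetical token pattern: runs of digits, runs of word characters,
# or a single non-word, non-space character (punctuation).
TOKEN_PATTERN = re.compile(r"\d+|\w+|[^\w\s]")

def tokenize(text):
    """Recognize tokens in `text` and tag each with a coarse type."""
    tokens = []
    for match in TOKEN_PATTERN.finditer(text):
        tok = match.group()
        if tok.isdigit():
            kind = "DIGIT"
        elif tok[0].isalnum() or tok[0] == "_":
            kind = "WORD"
        else:
            kind = "PUNCT"
        tokens.append((tok, kind))
    return tokens

print(tokenize("Dr. Smith paid 42 dollars!"))
# → [('Dr', 'WORD'), ('.', 'PUNCT'), ('Smith', 'WORD'), ('paid', 'WORD'),
#    ('42', 'DIGIT'), ('dollars', 'WORD'), ('!', 'PUNCT')]
```

Real tokenizers handle many more cases (abbreviations such as "Dr.", hyphenation, clitics, multi-word expressions), which is why tokenization is treated as a task in its own right.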

URI

http://data.loterre.fr/ark:/67375/LTK-T5RXB3DL-2
