Concept information
Preferred term
tokenizers
Definition
- tokenizers is an R package that provides functions, with a consistent interface, for converting natural-language text into tokens.
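Purely as illustration of the consistent interface mentioned in the definition, a minimal sketch assuming the CRAN tokenizers package is installed (tokenize_words, tokenize_sentences and tokenize_ngrams are exported functions of that package; the sample sentence is invented):

    library(tokenizers)

    text <- "The quick brown fox jumps over the lazy dog."

    # Each tokenizer takes a character vector and returns a list with one
    # element per input string, so the calling pattern is the same across functions.
    words     <- tokenize_words(text)          # lowercased word tokens
    sentences <- tokenize_sentences(text)      # sentence tokens
    bigrams   <- tokenize_ngrams(text, n = 2)  # word bigrams

    str(words)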
Broader concept
has application field
has for input language
has repository
has download location
uses software
has interface
is executed in
has for license
In other languages
- English
URI
http://data.loterre.fr/ark:/67375/LTK-XJ7ZWWDF-Q