Concept information
Preferred term
tokenizers
Definition
- tokenizers is an R package that offers functions with a consistent interface for converting natural language text into tokens.
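As a brief illustration of that consistent interface, the sketch below calls two of the package's tokenizing functions on the same input string; it assumes the tokenizers package has been installed from CRAN (`install.packages("tokenizers")`).

```r
# Minimal sketch, assuming the tokenizers package is available.
library(tokenizers)

text <- "Tokenization splits natural language text into tokens."

# Word-level tokens: returns a list with one character vector per input
# document (lowercased, punctuation stripped by default).
tokenize_words(text)

# Sentence-level tokens follow the same input/output convention.
tokenize_sentences(text)
```

Each function takes a character vector of documents and returns a list of token vectors, which is the consistent interface the definition refers to.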
Broader concept
has application field
has for input language
has repository
has download location
uses software
is encoded in
has interface
is executed in
has for license
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/LTK-XJ7ZWWDF-Q