Questions tagged [tokenization]

Tokenization is the process of breaking a stream of text up into words, phrases, symbols, or other meaningful elements called tokens.

The list of tokens becomes input for further processing such as parsing or text mining. Tokenization is useful both in linguistics (where it is a form of text segmentation), and in computer science, where it forms part of lexical analysis.
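As an illustrative aside, a minimal word-level tokenizer can be written with Python's standard re module. The regular expression below is an assumed, simplistic pattern chosen for demonstration, not a canonical tokenizer:

    import re

    def tokenize(text):
        # \w+ matches runs of word characters; [^\w\s] matches any single
        # character that is neither a word character nor whitespace, so
        # punctuation marks come out as their own tokens.
        return re.findall(r"\w+|[^\w\s]", text)

    print(tokenize("Tokenization isn't hard, is it?"))
    # ['Tokenization', 'isn', "'", 't', 'hard', ',', 'is', 'it', '?']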

11 questions
1 vote
1 answer

Question on Tokenization and the need to maintain a value to token lookup

I'm a newbie and an idiot to boot. I have a question in relation to tokenization: is there a methodology out there that would allow you to re-identify a value without using a token-to-value lookup? I'd have thought not, but I have a developer claiming something…
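(For context: what this question gestures at is sometimes called vaultless or key-based tokenization, where the token is derived by reversible encryption, so re-identification needs only the key rather than a stored token-to-value lookup table. A minimal sketch, assuming Python's third-party cryptography package:)

    from cryptography.fernet import Fernet

    # Vaultless tokenization sketch: the token IS the ciphertext, so
    # detokenization requires only the key, not a token-to-value mapping.
    key = Fernet.generate_key()
    tokenizer = Fernet(key)

    token = tokenizer.encrypt(b"4111-1111-1111-1111")  # hypothetical card number
    value = tokenizer.decrypt(token)                    # re-identify using the key alone
    assert value == b"4111-1111-1111-1111"

(Note that Fernet tokens are not format-preserving; real tokenization products often use format-preserving encryption so the token keeps the shape of the original value.)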
0 votes
2 answers

Understanding token security

I am having some problems, cryptographically speaking, digesting this information. There is a table that seems to say that tokens provide "end-to-end security" (I suppose we understand different things by that). But my main issue is with…