It seems that almost all complex code has to use tokens, for tasks like:
- converting a string to a math expression;
- converting code to another language.

But I need to understand more about what tokens are and where to use them. Can anyone explain them in more detail, or correct my understanding?
There is a lot of research on parsing text, and many tools for it. You might try searching for "lexical analysis" online.
Usually a token has both a type and its text. For example, the token "123.4" might have the type "double".
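As a minimal sketch of that idea (Python purely for illustration; the class and field names are my own), a token can be little more than a type paired with the text it matched:

```python
from dataclasses import dataclass

@dataclass
class Token:
    type: str   # e.g. "number", "operator"
    text: str   # the exact characters matched, e.g. "123.4"
```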
A tokenizer usually reads from a stream and returns the type and text of the next token. Often a language is designed so that the tokenizer can determine the end of a token by "looking ahead" no more than one character. This keeps the code simple.
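Here is a rough sketch of such a tokenizer, reusing the Token class above. It assumes a toy language of numbers and arithmetic operators, and it never looks ahead more than one character: a number ends as soon as the next character is not a digit or a dot.

```python
def tokenize(stream):
    """Yield Token objects from a character stream, one character of lookahead."""
    ch = stream.read(1)
    while ch:
        if ch.isspace():
            ch = stream.read(1)                      # skip whitespace
        elif ch.isdigit():
            text = ch
            ch = stream.read(1)
            while ch and (ch.isdigit() or ch == "."):
                text += ch
                ch = stream.read(1)                  # the lookahead character ends the number
            yield Token("number", text)
        elif ch in "+-*/()":
            yield Token("operator", ch)
            ch = stream.read(1)
        else:
            raise SyntaxError(f"unexpected character {ch!r}")
```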
The next layer up is syntax, which defines when one token can legally follow another. Again, there is a lot of research and tooling here. And again, languages are usually designed so that you don't need to look more than one token ahead to know exactly where you are in the syntax.
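To make that concrete, here is a sketch of a recursive descent parser over the token stream above (again, just an illustration, not a definitive design). It keeps exactly one token of lookahead in self.current and uses it to decide which rule applies next; the grammar, sums and products of numbers, is only an example.

```python
import io

class Parser:
    """One-token-lookahead recursive descent parser for + and * expressions."""

    def __init__(self, tokens):
        self.tokens = iter(tokens)
        self.current = next(self.tokens, None)   # the single token of lookahead

    def advance(self):
        tok, self.current = self.current, next(self.tokens, None)
        return tok

    def parse_expression(self):                  # expression := term ("+" term)*
        value = self.parse_term()
        while self.current is not None and self.current.text == "+":
            self.advance()
            value += self.parse_term()
        return value

    def parse_term(self):                        # term := factor ("*" factor)*
        value = self.parse_factor()
        while self.current is not None and self.current.text == "*":
            self.advance()
            value *= self.parse_factor()
        return value

    def parse_factor(self):                      # factor := number
        tok = self.advance()
        if tok is None or tok.type != "number":
            raise SyntaxError("expected a number")
        return float(tok.text)


# Example: the parser never needs more than the current token to choose a rule.
print(Parser(tokenize(io.StringIO("2 + 3 * 4"))).parse_expression())   # prints 14.0
```

Notice that parse_term finishes the whole product 3 * 4 before parse_expression ever looks at the next token, which is why one token of lookahead is enough here.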