
Which one to use: generate_tokens or tokenize?

From: Tim Peters <tim....@gmail.com>
Thu, 9 Sep 2004 19:39:39 -0400
[André Roberge]
> According to the Python documentation:
>
> 18.5 tokenize -- Tokenizer for Python source
> ...
> The primary entry point is a generator:
>     generate_tokens(readline)
> ...
> An older entry point is retained for backward compatibility:
>     tokenize(readline[, tokeneater])
> ====
> Does this mean that one should preferably use generate_tokens?

Yes.

> If so, what are the advantages?

Be adventurous:  try them both.  You'll figure it out quickly.  If you
have to endure "an explanation" first, read PEP 255, where
tokenize.tokenize was used as an example motivating the desirability
of introducing generators.
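
[Archive note: a minimal sketch of the generator entry point discussed above, using the stdlib tokenize module; the sample source string and the list comprehension are illustrative, not from the thread. Note that the old callback form tokenize(readline, tokeneater) was later removed in Python 3.]

```python
import io
import tokenize

source = "x = 1 + 2\n"

# generate_tokens is a generator: it lazily yields one token tuple
# (type, string, start, end, line) at a time as the source is scanned,
# instead of pushing each token into a callback the way the old
# tokenize(readline, tokeneater) entry point did.
tokens = [
    (tokenize.tok_name[tok.type], tok.string)
    for tok in tokenize.generate_tokens(io.StringIO(source).readline)
]
print(tokens)
```

Because it is a generator, the caller drives the loop and can stop early, filter, or feed the stream into another generator, which is exactly the style PEP 255 motivates.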

Messages in this thread:
André Roberge, Sep 09, 2004 11:32 pm
Tim Peters, Sep 09, 2004 11:39 pm