What is tokenization in NLP?

a) Dividing text into paragraphs
b) Dividing text into words or sentences
c) Translating text into another language
d) Generating text summaries

Answer:

b) Dividing text into words or sentences

Explanation:

Tokenization is the process of splitting raw text into individual units called tokens, most commonly words or sentences, so that later NLP steps (tagging, parsing, embedding, and so on) can operate on discrete pieces rather than on one long string.
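As a minimal sketch of the idea, regular expressions can approximate both word- and sentence-level tokenization. The function names and rules here are illustrative only; production tokenizers such as those in NLTK or spaCy handle punctuation, abbreviations, and contractions far more carefully.

```python
import re

def word_tokenize(text):
    # Naive word tokenizer: collect runs of word characters,
    # discarding punctuation entirely (an intentional simplification).
    return re.findall(r"\w+", text)

def sentence_tokenize(text):
    # Naive sentence tokenizer: split after ., !, or ? followed by
    # whitespace. Breaks on abbreviations like "Dr." in real text.
    return [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]

text = "Tokenization splits text. It produces words or sentences!"
print(word_tokenize(text))
# ['Tokenization', 'splits', 'text', 'It', 'produces', 'words', 'or', 'sentences']
print(sentence_tokenize(text))
# ['Tokenization splits text.', 'It produces words or sentences!']
```

The two functions illustrate the two token granularities named in answer (b): word-level and sentence-level segmentation of the same input string.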
