I am trying to input an entire paragraph into my word processor to be split into sentences first and then into words.
I tried the following code, but it does not work:
```python
# text is the paragraph input
sent_text = sent_tokenize(text)
tokenized_text = word_tokenize(sent_text.split)
tagged = nltk.pos_tag(tokenized_text)
print(tagged)
```
However, this does not work and raises errors. How do I tokenize a paragraph into sentences and then into words?
An example paragraph:
This thing seemed to overpower and astonish the little dark-brown dog, and wounded him to the heart. He sank down in despair at the child's feet. When the blow was repeated, together with an admonition in childish sentences, he turned over upon his back, and held his paws in a peculiar manner. At the same time with his ears and his eyes he offered a small prayer to the child.
**WARNING:** This is just random text from the internet; I do not own the above content.
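To clarify what I am aiming for, I think the sentence splitting and the word tokenizing need to happen in two separate steps, with the tagging done per sentence. Here is a rough sketch of what I have in mind (I am not sure this is the right approach; it assumes the `punkt` and `averaged_perceptron_tagger` NLTK data are downloaded, and `text` is just a stand-in for the paragraph above):

```python
import nltk
from nltk.tokenize import sent_tokenize, word_tokenize

# One-time downloads of the tokenizer and tagger models
# (assuming they are not already installed).
nltk.download('punkt')
nltk.download('averaged_perceptron_tagger')

# Stand-in for the paragraph input; in my case this would be the
# example paragraph shown above.
text = ("This thing seemed to overpower and astonish the little "
        "dark-brown dog, and wounded him to the heart. "
        "He sank down in despair at the child's feet.")

# Split the paragraph into sentences, then split each sentence into
# words and POS-tag that sentence's word list.
for sentence in sent_tokenize(text):
    words = word_tokenize(sentence)
    tagged = nltk.pos_tag(words)
    print(tagged)
```

Is this per-sentence loop the idiomatic way to do it, or is there a better pattern?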