
Splitting a large string into multiple substrings containing "n" words each in Python (source text: the United States Declaration of Independence)

How can the source text above be split into a number of substrings, each containing "n" words?

I am using split(' ') to extract each word, but I don't know how to grab several words in a single operation.

I could iterate over the list of words I have and build another list by gluing words from the first one back together (adding spaces), roughly as in the sketch below, but that approach doesn't feel very pythonic.
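
A minimal sketch of that loop, assuming the source passage is already in a string called text and n = 4 words per chunk (the names are illustrative only):

# Loop-and-join sketch: collect n words at a time, then glue them with spaces. 
words = text.split() 
n = 4 
chunks = [] 
current = [] 
for word in words: 
    current.append(word) 
    if len(current) == n:               # once n words are collected, emit a chunk 
        chunks.append(" ".join(current)) 
        current = [] 
if current:                             # keep any trailing words shorter than n 
    chunks.append(" ".join(current)) 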

Answers

text = """ 
When in the course of human Events, it becomes necessary for one People to dissolve the Political Bands which have connected them with another, and to assume among the Powers of the Earth, the separate and equal Station to which the Laws of Nature and of Nature's God entitle them, a decent Respect to the Opinions of Mankind requires that they should declare the causes which impel them to the Separation. 

We hold these Truths to be self-evident, that all Men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the pursuit of Happiness--That to secure these Rights, Governments are instituted among Men, deriving their just Powers from the Consent of the Governed, that whenever any Form of Government becomes destructive of these Ends, it is the Right of the People to alter or abolish it, and to institute a new Government, laying its Foundation on such Principles, and organizing its Powers in such Form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Governments long established should not be changed for light and transient Causes; and accordingly all Experience hath shewn, that Mankind are more disposed to suffer, while Evils are sufferable, than to right themselves by abolishing the Forms to which they are accustomed. But when a long Train of Abuses and Usurpations, pursuing invariably the same Object, evinces a Design to reduce them under absolute Despotism, it is their Right, it is their Duty, to throw off such Government, and to provide new Guards for their future Security. Such has been the patient Sufferance of these Colonies; and such is now the Necessity which constrains them to alter their former Systems of Government. The History of the Present King of Great-Britain is a History of repeated Injuries and Usurpations, all having in direct Object the Establishment of an absolute Tyranny over these States. To prove this, let Facts be submitted to a candid World. 
""" 

words = text.split()                      # split on any whitespace 
subs = [] 
n = 4                                     # number of words per substring 
for i in range(0, len(words), n): 
    subs.append(" ".join(words[i:i+n]))   # glue each n-word slice back together 
print subs[:10] 

prints:

['When in the course', 'of human Events, it', 'becomes necessary for one', 'People to dissolve the', 'Political Bands which have', 'connected them with another,', 'and to assume among', 'the Powers of the', 'Earth, the separate and', 'equal Station to which'] 

or, as a list comprehension:

subs = [" ".join(words[i:i+n]) for i in range(0, len(words), n)] 

That seems rather pythonic. – physicsmichael


Oh. Most applications of ngrams would want ['When in the course', 'in the course of', 'the course of human'], etc. –
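
For the sliding-window variant that comment describes, a minimal sketch reusing text and n = 4 from the answer above:

words = text.split() 
n = 4 
# slide a window of n words across the list instead of stepping by n at a time 
ngram_list = [" ".join(words[i:i+n]) for i in range(len(words) - n + 1)] 
print ngram_list[:3] 

which prints ['When in the course', 'in the course of', 'the course of human'].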


Are you trying to create n-grams? Here is how I do it, using NLTK.

import re 
import nltk 

punct = re.compile(r'^[^A-Za-z0-9]+|[^a-zA-Z0-9]+$')   # leading/trailing punctuation 
is_word = re.compile(r'[a-z]', re.IGNORECASE)          # token contains at least one letter 
sentence_tokenizer = nltk.data.load('tokenizers/punkt/english.pickle') 
word_tokenizer = nltk.tokenize.punkt.PunktWordTokenizer() 

def get_words(sentence): 
    # strip surrounding punctuation and drop tokens that contain no letters 
    return [punct.sub('', word) for word in word_tokenizer.tokenize(sentence) if is_word.search(word)] 

def ngrams(text, n): 
    # yield each run of n consecutive words, one sentence at a time 
    for sentence in sentence_tokenizer.tokenize(text.lower()): 
        words = get_words(sentence) 
        for i in range(len(words) - (n - 1)): 
            yield ' '.join(words[i:i+n]) 

Then

for ngram in ngrams(sometext, 3): 
    print ngram 

Interesting link! I'll definitely be looking at using that toolkit in the future. – torger


For large strings, using an iterator is recommended for speed and a low memory footprint.

import re, itertools 

# Original text 
text = "When in the course of human Events, it becomes necessary for one People to dissolve the Political Bands which have connected them with another, and to assume among the Powers of the Earth, the separate and equal Station to which the Laws of Nature and of Nature?s God entitle them, a decent Respect to the Opinions of Mankind requires that they should declare the causes which impel them to the Separation." 
n = 10 

# An iterator which will extract words one by one from text when needed 
words = itertools.imap(lambda m:m.group(), re.finditer(r'\w+', text)) 
# The final iterator that combines words into n-length groups 
word_groups = itertools.izip_longest(*(words,)*n) 

for g in word_groups: print g 

which gives the following output:

('When', 'in', 'the', 'course', 'of', 'human', 'Events', 'it', 'becomes', 'necessary') 
('for', 'one', 'People', 'to', 'dissolve', 'the', 'Political', 'Bands', 'which', 'have') 
('connected', 'them', 'with', 'another', 'and', 'to', 'assume', 'among', 'the', 'Powers') 
('of', 'the', 'Earth', 'the', 'separate', 'and', 'equal', 'Station', 'to', 'which') 
('the', 'Laws', 'of', 'Nature', 'and', 'of', 'Nature', 's', 'God', 'entitle') 
('them', 'a', 'decent', 'Respect', 'to', 'the', 'Opinions', 'of', 'Mankind', 'requires') 
('that', 'they', 'should', 'declare', 'the', 'causes', 'which', 'impel', 'them', 'to') 
('the', 'Separation', None, None, None, None, None, None, None, None) 

Can I join the words in each tuple group with spaces? – torger


Yes, just use print ' '.join(g) instead of print g. – iamamac
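
Note that izip_longest pads the final group with None (visible in the last tuple above), and ' '.join() raises a TypeError on None values. A minimal sketch of one way around that, filtering out the padding before joining (the iterators have to be rebuilt first, since the ones above are already consumed):

# rebuild the iterators, since the ones above are already exhausted 
words = itertools.imap(lambda m: m.group(), re.finditer(r'\w+', text)) 
word_groups = itertools.izip_longest(*(words,)*n) 
for g in word_groups: 
    # drop the None padding that izip_longest adds to the final group 
    print ' '.join(w for w in g if w is not None) 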
