NLTK WordNet Lemmatizer strange behavior - help with POS tagging - Python 3

Asked: 2018-06-14 15:07:56

Tags: python python-3.x nltk sentiment-analysis lemmatization

I'm trying to write a small sentiment-analysis program in Python 3, but when using the WordNet lemmatizer I'm seeing strange conversions such as "was" turning into "wa", which is completely unexpected. Below is my preprocessing code - please help me understand how to avoid these problems. Also, if POS tagging is required for correct lemmatization, how do I include it correctly and efficiently in my preprocessing function?

import string
from nltk import word_tokenize
from nltk.corpus import stopwords
from nltk.stem import WordNetLemmatizer

wordnet_lemmatizer = WordNetLemmatizer()
stoplist = stopwords.words('english') + list(string.punctuation)

def preprocess(text):
    return [wordnet_lemmatizer.lemmatize(word) for word in word_tokenize(text) if wordnet_lemmatizer.lemmatize(word.lower()) not in stoplist and not word.isdigit() and len(word)>2]

The input text is a corpus from the Multi-Domain Sentiment Dataset (version 2.0); the review that caught my attention is below:

<review_text> I just recieved my HDMI cable and am very impressed. The price is just what it should be about $5 and makes me wonder how somebody would spend over $100 for this cable at a store. The service was excellent and the cable arrived in 4 days! I highly recommend this cable. I just plugged it into my cable box and the other end into the TV and WOW what a great picture all around. The color is just so much more vivid using HDMI compared to component 3 wire connectors. Get this cable for your system and stay away from those high priced others </review_text>

and the output of the function after lemmatization is:

['recieved', 'HDMI', 'cable', 'impressed', 'price', 'make', 'wonder', 'somebody', 'would', 'spend', 'cable', 'store', 'service', 'wa', 'excellent', 'cable', 'arrived', 'day', 'highly', 'recommend', 'cable', 'plugged', 'cable', 'box', 'end', 'WOW', 'great', 'picture', 'around', 'color', 'much', 'vivid', 'using', 'HDMI', 'compared', 'component', 'wire', 'connector', 'Get', 'cable', 'system', 'stay', 'away', 'high', 'priced', 'others']

What completely throws me off is how "was" got turned into "wa". Another interesting thing: "impressed" stayed as "impressed" instead of being changed to "impress" - maybe because of the POS?

It would be very helpful if someone could show me how to use POS tags inside this same function.
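From what I can tell from the docs, WordNetLemmatizer.lemmatize() assumes pos='n' whenever no tag is passed, so a minimal check like the one below (my own sketch, not part of the dataset pipeline) seems to reproduce what I'm seeing:

from nltk.stem import WordNetLemmatizer

lemmatizer = WordNetLemmatizer()

# Without a POS tag the lemmatizer falls back to treating every word as a noun,
# so 'was' is handled like a plural noun and comes back as 'wa'.
print(lemmatizer.lemmatize('was'))                 # 'wa'

# With the verb tag it maps to the expected lemma.
print(lemmatizer.lemmatize('was', pos='v'))        # 'be'

# Same story for 'impressed': as a verb it would become 'impress'.
print(lemmatizer.lemmatize('impressed', pos='v'))  # 'impress'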

------------------- EDIT: Solution and Verification -------------------

Thanks to the kaggle link, I used exactly the same function and got much better results. I just want to verify two things, probably because I don't yet have a correct understanding of POS.

import nltk
from nltk.corpus import stopwords
from nltk.corpus import wordnet
from nltk import word_tokenize
from nltk import pos_tag
from nltk.stem import WordNetLemmatizer
import numpy as np
from sklearn.linear_model import LogisticRegression
from bs4 import BeautifulSoup
import string

nltk.data.path.append('C:\\Users\\myuser\\AppData\\Roaming\\nltk_data\\')

actualSent = "<review_text> I just recieved my HDMI cable and am very impressed. The price is just what it should be about $5 and makes me wonder how somebody would spend over $100 for this cable at a store. The service was excellent and the cable arrived in 4 days! I highly recommend this cable. I just plugged it into my cable box and the other end into the TV and WOW what a great picture all around. The color is just so much more vivid using HDMI compared to component 3 wire connectors. Get this cable for your system and stay away from those high priced others </review_text>"


wordnet_lemmatizer = WordNetLemmatizer()
stoplist = stopwords.words('english') + list(string.punctuation)

def penn2morphy(penntag):
    """ Converts Penn Treebank tags to WordNet. Copied from kaggle post https://www.kaggle.com/alvations/basic-nlp-with-nltk"""

    morphy_tag = {'NN':'n', 'JJ':'a',
                  'VB':'v', 'RB':'r'}
    try:
        return morphy_tag[penntag[:2]]
    except:
        return 'n'
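# A few example mappings, just to check my understanding of the dict above:
#   penn2morphy('VBD') -> 'v'   (past-tense verb)
#   penn2morphy('NNS') -> 'n'   (plural noun)
#   penn2morphy('JJ')  -> 'a'   (adjective)
#   penn2morphy('UH')  -> 'n'   (anything unmapped falls back to the noun default)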

def lemmatize_sent(text): 
    # Text input is string, returns lowercased strings.
    return [wordnet_lemmatizer.lemmatize(word.lower(), pos=penn2morphy(tag)) 
            for word, tag in pos_tag(word_tokenize(text))]

def preprocess(text):
    return [word for word in lemmatize_sent(text) if word not in stoplist and not word.isdigit()]

token = preprocess(actualSent)

The code above gives me the following output:

['review_text', 'recieved', 'hdmi', 'cable', 'impressed', 'price', 'make', 'wonder', 'somebody', 'would', 'spend', 'cable', 'store', 'service', 'excellent', 'cable', 'arrive', 'day', 'highly', 'recommend', 'cable', 'plug', 'cable', 'box', 'end', 'tv', 'wow', 'great', 'picture', 'around',  'color', 'much', 'vivid', 'use', 'hdmi', 'compare', 'component', 'wire', 'connector', 'get', 'cable', 'system', 'stay', 'away', 'high', 'price',  'others', '/review_text']

Now why wasn't the word "recieved" changed to "receive", and why was "impressed" not changed to "impress"?

It would make my thinking a lot clearer if someone could help me understand these two points.
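My rough guess (sketched below, and only a guess) is that "recieved" is a misspelling that WordNet simply doesn't know, and that pos_tag marks "impressed" as an adjective in this sentence, so its adjective lemma is just "impressed":

from nltk import pos_tag, word_tokenize
from nltk.stem import WordNetLemmatizer

lemmatizer = WordNetLemmatizer()

# 'recieved' is misspelled, so WordNet has no entry to map it to, even as a verb;
# the correctly spelled form lemmatizes fine.
print(lemmatizer.lemmatize('recieved', pos='v'))   # 'recieved' (unchanged)
print(lemmatizer.lemmatize('received', pos='v'))   # 'receive'

# Check what tag 'impressed' actually gets in context; if it is JJ (adjective),
# penn2morphy maps it to 'a' and the adjective lemma is simply 'impressed'.
print(pos_tag(word_tokenize("I just recieved my HDMI cable and am very impressed.")))
print(lemmatizer.lemmatize('impressed', pos='a'))  # 'impressed'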

0 Answers:

There are no answers yet.