I am trying to use StanfordSegmenter to segment a passage of Chinese text, but I ran into the problem described in the title. I first downloaded the Stanford Word Segmenter version 3.5.2 from http://nlp.stanford.edu/software/segmenter.shtml. Then I wrote a Python script:
import os

# Tell NLTK where to find the Java executable.
os.environ['JAVAHOME'] = "C:/Program Files/Java/jdk1.8.0_102/bin/java.exe"

from nltk.tokenize.stanford_segmenter import StanfordSegmenter

segmenter = StanfordSegmenter(
    path_to_jar="./stanford-segmenter-2015-12-09/stanford-segmenter-3.6.0.jar",
    path_to_slf4j="./stanford-segmenter-2015-12-09/slf4j-api.jar",
    path_to_sihan_corpora_dict="./stanford-segmenter-2015-12-09/data",
    path_to_model="./stanford-segmenter-2015-12-09/data/pku.gz",
    path_to_dict="./stanford-segmenter-2015-12-09/data/dict-chris6.ser.gz")

sentence = u"這是斯坦福中文分詞器測試"
segmenter.segment(sentence)
But I got the following error:
Error: Could not find or load main class edu.stanford.nlp.ie.crf.CRFClassifier
Where am I going wrong? Thanks.
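One thing I have been trying to rule out (a diagnostic sketch of my own, not from the segmenter documentation): in Java, "Could not find or load main class" usually means the class is not on the classpath, which here could simply mean one of the relative jar or model paths does not resolve from the directory the script runs in. The helper below only checks that each file exists; the path list mirrors the arguments in my script above.

```python
import os

def missing_files(paths):
    """Return the subset of paths that do not exist as files on disk."""
    return [p for p in paths if not os.path.isfile(p)]

# The same relative paths passed to StanfordSegmenter in the script above.
paths = [
    "./stanford-segmenter-2015-12-09/stanford-segmenter-3.6.0.jar",
    "./stanford-segmenter-2015-12-09/slf4j-api.jar",
    "./stanford-segmenter-2015-12-09/data/pku.gz",
    "./stanford-segmenter-2015-12-09/data/dict-chris6.ser.gz",
]

for p in missing_files(paths):
    print("missing:", p)
```

If this prints nothing, all the files are where the script expects them, and the problem presumably lies elsewhere (e.g. in how NLTK assembles the Java classpath).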