1 year ago
#75403
mjoy
spaCy NER custom patterns: ValueError: Can't read .jsonl file?
I am using spaCy to get NER tags and have created custom patterns that I want to add to my entity ruler. It works when I run this on my local machine. However, when I run the same code on an Ubuntu server, I keep running into the following error:
Traceback (most recent call last):
  File "nlp/model.py", line 57, in <module>
    data = get_ner_tags(data)
  File "/home/ubuntu/scripts/dir1/nlp/py_scripts/features.py", line 61, in get_ner_tags
    .from_disk("scripts/dir1/nlp/py_scripts/patterns.jsonl")
  File "/home/ubuntu/scripts/dir1/venv-dir1/lib/python3.6/site-packages/spacy/pipeline/entityruler.py", line 477, in from_disk
    self.add_patterns(patterns)
  File "/home/ubuntu/scripts/dir1/venv-dir1/lib/python3.6/site-packages/spacy/pipeline/entityruler.py", line 307, in add_patterns
    for entry in patterns:
  File "/home/ubuntu/scripts/dir1/venv-dir1/lib/python3.6/site-packages/srsly/_json_api.py", line 109, in read_jsonl
    file_path = force_path(path)
  File "/home/ubuntu/scripts/dir1/venv-dir1/lib/python3.6/site-packages/srsly/util.py", line 24, in force_path
    raise ValueError(f"Can't read file: {location}")
ValueError: Can't read file: scripts/dir1/nlp/py_scripts/patterns.jsonl
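The path in that last line is relative, so (as far as I understand) it is resolved against whatever directory the process is started from. A minimal diagnostic sketch like the one below (the path literal is just copied from the traceback) shows where that relative path actually points when run on the server:

import os
from pathlib import Path

# same relative path that from_disk receives (copied from the traceback)
patterns_path = Path("scripts/dir1/nlp/py_scripts/patterns.jsonl")

print("cwd:", os.getcwd())                   # directory the process was started from
print("resolved:", patterns_path.resolve())  # where the relative path actually points
print("exists:", patterns_path.exists())     # False would explain the ValueError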
This is the code I use:
import en_core_web_sm

nlp = en_core_web_sm.load()

# add a custom entity ruler with patterns before the built-in NER component
ruler = nlp.add_pipe('entity_ruler', before='ner') \
    .from_disk("scripts/dir1/nlp/py_scripts/patterns.jsonl")

data['spacy_ner_label'] = data['clean_words'].apply(lambda x: [ent.label_ for ent in nlp(x).ents])
It works when I run it on my local machine, and also if I create new directories locally and run the scripts from there. But I can't get it to work on the Ubuntu box.
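One thing I am considering is building the path from the script's own location instead of relying on the working directory. This is only a sketch and assumes patterns.jsonl sits next to the script (hypothetical layout, adjust the relative part otherwise); I am not sure it is the right fix:

from pathlib import Path

import en_core_web_sm

# build the patterns path from this file's location instead of the cwd
# (assumes patterns.jsonl sits next to this script; adjust if the layout differs)
patterns_path = Path(__file__).resolve().parent / "patterns.jsonl"

nlp = en_core_web_sm.load()
ruler = nlp.add_pipe("entity_ruler", before="ner").from_disk(patterns_path)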
python
ubuntu
spacy
named-entity-recognition
0 Answers