Commit

Merge pull request #130 from Franck-Dernoncourt/requirements
Add requirements, fix tests
Franck-Dernoncourt authored Feb 8, 2019
2 parents 96b525e + ee0eae1 commit 4cbfc3a
Showing 4 changed files with 15 additions and 5 deletions.
5 changes: 4 additions & 1 deletion .gitignore
@@ -5,4 +5,7 @@
 *.project
 *.pydevproject
 output/*
-data/*
+data/*
+
+# OS X file
+.DS_Store
5 changes: 2 additions & 3 deletions .travis.yml
@@ -4,12 +4,11 @@ os:
 - linux
 language: python
 python:
-- "3.5"
+- "3.6"
 # command to install dependencies
 install:
 - bash .travis_install_ubuntu.sh
-- pip install tensorflow
-- pip install -U networkx matplotlib scikit-learn scipy spacy pycorenlp
+- pip install -r requirements.txt
 - python -m spacy download en
 # ensure that NeuroNER doesn't perform too many epochs (Travis jobs are limited to 50 minutes)
 - sed -i 's/maximum_number_of_epochs = 100/maximum_number_of_epochs = 1/g' src/parameters.ini
8 changes: 8 additions & 0 deletions requirements.txt
@@ -0,0 +1,8 @@
+matplotlib==3.0.2
+networkx==2.2
+pycorenlp==0.3.0
+scikit-learn==0.20.2
+scipy==1.2.0
+spacy==2.0.18
+tensorflow==1.12.0
+# tensorflow-gpu==1.1.0
2 changes: 1 addition & 1 deletion src/utils_plots.py
@@ -143,7 +143,7 @@ def plot_classification_report(classification_report, title='Classification repo
     else:
         lines = classification_report.split('\n')
         for line in lines[2 : (len(lines) - 1)]:
-            t = line.strip().replace('avg / total', 'micro-avg').split()
+            t = line.strip().replace(' avg', '-avg').split()
             if len(t) < 2: continue
             classes.append(t[0])
             v = [float(x)*100 for x in t[1: len(t) - 1]]
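Context for the `utils_plots.py` change: scikit-learn 0.20 renamed the `classification_report` summary row from `avg / total` to separate `micro avg`, `macro avg`, and `weighted avg` rows, so the old `replace('avg / total', 'micro-avg')` no longer matched. The new `replace(' avg', '-avg')` joins each two-word label into a single token so `.split()` yields aligned columns. A standalone sketch of that parsing logic (the report string and class labels below are made up for illustration, not taken from the repository):

```python
# Hypothetical sklearn >= 0.20 classification report, for illustration only.
report = """              precision    recall  f1-score   support

           B       0.90      0.80      0.85        10
           I       0.70      0.60      0.65         5

   micro avg       0.85      0.75      0.80        15
   macro avg       0.80      0.70      0.75        15
weighted avg       0.83      0.73      0.78        15
"""

classes, matrix = [], []
lines = report.split('\n')
for line in lines[2 : (len(lines) - 1)]:
    # Join "micro avg" -> "micro-avg" (etc.) so split() gives one label token.
    t = line.strip().replace(' avg', '-avg').split()
    if len(t) < 2:
        continue  # skip blank separator lines
    classes.append(t[0])
    # Columns between the label and the trailing support count, scaled to %.
    matrix.append([float(x) * 100 for x in t[1 : len(t) - 1]])

print(classes)  # ['B', 'I', 'micro-avg', 'macro-avg', 'weighted-avg']
```

With the old `'avg / total'` pattern, the three summary rows would each split into five tokens whose first column is `micro`, `macro`, or `weighted`, silently misaligning the plotted matrix.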
