# NFETC: Neural Fine-Grained Entity Type Classification with Hierarchy-Aware Loss

Code for the paper published at NAACL 2018.
## Requirements

- tensorflow >= r1.2
- hyperopt
- gensim
- sklearn
- pandas
## Getting the Data

Run `./download.sh` to download the corpus and the pre-trained word embeddings.
## Preprocessing

Run `python preprocess.py -d <data_name> [-c]` to preprocess the data.

Available dataset names:

- `wiki`: Wiki/FIGER(GOLD) with the original Freebase-based hierarchy
- `ontonotes`: OntoNotes
- `wikim`: Wiki/FIGER(GOLD) with the improved hierarchy

Use `-c` to control whether the data is filtered.
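For illustration, two preprocessing runs might look like the following (whether to pass `-c` depends on your experiment):

```shell
# Preprocess Wiki/FIGER(GOLD) with the original hierarchy, with filtering enabled
python preprocess.py -d wiki -c

# Preprocess OntoNotes without filtering
python preprocess.py -d ontonotes
```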
Before preprocessing `wikim`, you need to:

- Create a folder `data/wikim` to store the data for Wiki with the improved hierarchy
- Run `python transform.py`
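Putting the `wikim`-specific steps together, the full sequence (using the paths given above) would be:

```shell
mkdir -p data/wikim             # folder for the improved-hierarchy data
python transform.py             # build the improved hierarchy
python preprocess.py -d wikim -c
```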
## Hyperparameter Tuning

Run `python task.py -m <model_name> -d <data_name> -e <max_evals> -c <cv_runs>` to search for hyperparameters.

See `model_param_space.py` for the available model names. The search procedure is recorded in a log file stored in the `log` folder.
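As an illustration (the model name and values below are placeholders, not recommendations — check `model_param_space.py` for the actual model names), a search might be launched as:

```shell
# 100 hyperopt evaluations, 3 cross-validation runs per setting
python task.py -m nfetc -d wiki -e 100 -c 3
```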
## Evaluation

Run `python eval.py -m <model_name> -d <data_name> -r <runs>` to evaluate the model.

The scores for each run and the average scores are recorded in a log file stored in the `log` folder.
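For example (again with a placeholder model name), evaluating over several independent runs might look like:

```shell
# Evaluate over 5 runs; per-run and average scores are written to the log folder
python eval.py -m nfetc -d wiki -r 5
```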
## Citation

If you find this codebase or our work useful, please cite:

@InProceedings{xu2018neural,
  author    = {Xu, Peng and Barbosa, Denilson},
  title     = {Neural Fine-Grained Entity Type Classification with Hierarchy-Aware Loss},
  booktitle = {The 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL 2018)},
  month     = {June},
  year      = {2018},
  publisher = {ACL}
}