BERT-CRel: Improved Biomedical Word Embeddings in the Transformer Era

Jiho Noh & Ramakanth Kavuluru
BERT-CRel jointly learns biomedical word embeddings along with concept embeddings in two phases: the vectors are first pre-trained with fastText and then fine-tuned in a transformer (BERT) setup. The goal is to provide high-quality pre-trained biomedical embeddings that the research community can use in any downstream task. The training corpus consists of biomedical citations from PubMed, and the concepts are drawn from the Medical Subject Headings (MeSH codes)...
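As a minimal sketch of how released embeddings of this kind are typically consumed, the snippet below parses vectors in the standard word2vec text format that fastText emits (header line with vocabulary size and dimension, then one token per line) and compares terms by cosine similarity. The tokens and vector values here are invented for illustration, not taken from the actual BERT-CRel release.

```python
import io
import math

# Hypothetical snippet in the word2vec text format commonly used for
# distributing embeddings: a "<vocab_size> <dim>" header, then one
# "<token> <floats...>" line per vocabulary entry. Values are made up.
SAMPLE = """\
3 4
heart 0.1 0.2 0.3 0.4
cardiac 0.1 0.2 0.25 0.45
liver 0.9 -0.1 0.0 0.2
"""

def load_vectors(handle):
    """Parse word2vec-format text into a {token: [float, ...]} dict."""
    n, dim = map(int, handle.readline().split())
    vecs = {}
    for _ in range(n):
        parts = handle.readline().rstrip().split(" ")
        vecs[parts[0]] = [float(x) for x in parts[1:1 + dim]]
    return vecs

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

vectors = load_vectors(io.StringIO(SAMPLE))
# With well-trained embeddings, related biomedical terms should score
# higher than unrelated ones.
sim_related = cosine(vectors["heart"], vectors["cardiac"])
sim_unrelated = cosine(vectors["heart"], vectors["liver"])
```

In practice one would open the distributed embedding file instead of the in-memory `SAMPLE` string; the parsing and similarity logic stays the same.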