
Ask2Transformers Domains

A2T Domains (A2TD) (Sainz and Rigau, 2021) is a lexical resource generated as part of the Ask2Transformers work. It consists of WordNet synsets automatically annotated with domain information, such as BabelDomains labels.
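As a rough illustration of how such a synset-to-domain mapping could be consumed, the sketch below parses a tab-separated layout of synset identifiers and domain labels. The identifiers and the file layout are hypothetical examples, not the actual format of the distributed package; check the README in the package for the real format.

```python
# Hypothetical sketch: load a synset -> domain mapping from a tab-separated
# file. The synset offsets and the two-column layout below are illustrative
# only; the distributed A2TD package may use a different format.
from io import StringIO

# Stand-in for open("a2t_domains.tsv") with made-up example rows.
sample = StringIO(
    "02084071-n\tAnimals\n"
    "07020895-n\tMusic\n"
)

synset2domain = {}
for line in sample:
    synset_id, domain = line.rstrip("\n").split("\t")
    synset2domain[synset_id] = domain

print(synset2domain["02084071-n"])  # -> Animals
```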

The Ask2Transformers work aims to automatically annotate textual data without any supervision. Given a particular set of labels (BabelDomains, WNDomains, ...), the system has to classify the data without any previous examples. This work is based on the Transformers library and its pre-trained LMs. For this particular resource we evaluated the systems on the BabelDomains dataset (Camacho-Collados and Navigli, 2017), achieving 92.14% accuracy on domain labelling.
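The zero-shot idea can be sketched as follows: each candidate domain label is turned into a hypothesis sentence and scored against the input text, and the highest-scoring label wins. In the sketch below, `toy_entailment_score` is a deliberately trivial stand-in (word overlap) for a real pre-trained NLI model from the Transformers library; the hypothesis template and function names are assumptions for illustration, not the paper's exact implementation.

```python
# Minimal sketch of zero-shot domain labelling via entailment: each label
# becomes a hypothesis ("The topic is about <label>") and the text is scored
# against it. toy_entailment_score is a toy stand-in for a pre-trained NLI
# model's P(entailment); a real system would call such a model here.

def toy_entailment_score(premise: str, hypothesis: str) -> float:
    # Stand-in scorer: fraction of hypothesis words that appear in the premise.
    premise_words = set(premise.lower().split())
    hypothesis_words = set(hypothesis.lower().split())
    return len(premise_words & hypothesis_words) / max(len(hypothesis_words), 1)

def classify(text: str, labels: list[str]) -> str:
    # Score every candidate label and return the best-scoring one.
    scores = {
        label: toy_entailment_score(text, f"The topic is about {label}")
        for label in labels
    }
    return max(scores, key=scores.get)

domains = ["music", "medicine", "sport"]
print(classify("The guitar solo in that song was amazing, pure music", domains))
```

With a real NLI model in place of the toy scorer, no labelled examples are needed: only the label set itself drives the classification.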

You can find the code of the Ask2Transformers work on GitHub:

For further information about A2T Domains, please check the README.


A2T Domains: [tar.gz]


This package is distributed under the Attribution 3.0 Unported (CC BY 3.0) license. You can find it at


Sainz, Oscar, and German Rigau. "Ask2Transformers: Zero-Shot Domain Labelling with Pre-trained Language Models." In Proceedings of the 11th Global WordNet Conference (GWC 2021). Pretoria, South Africa. 2021.


Camacho-Collados, Jose, and Roberto Navigli. "BabelDomains: Large-scale domain labeling of lexical resources." In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers, pp. 223-228. 2017.