This repository has been archived by the owner on Dec 16, 2022. It is now read-only.
spaCy 3 introduced new transformer-based models that can run on GPU. I think it would be a good idea to support these language models in AllenNLP, since they currently don't work out of the box.

My suggestion is to add an option specifying whether to load the spaCy model on CPU or GPU, and to use it accordingly.

This would require changes to both allennlp and allennlp-models, since some models (e.g. SRL) would also need changes to their torch tensor management.
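As a rough illustration of what the proposed option could look like, here is a minimal sketch of a hypothetical helper (`activate_gpu_if_requested` is not an existing AllenNLP or spaCy function) that follows AllenNLP's `cuda_device` convention, where `-1` means CPU and a non-negative integer selects a GPU:

```python
import importlib.util


def activate_gpu_if_requested(cuda_device: int) -> bool:
    """Hypothetical helper: switch spaCy to GPU before loading a model.

    Follows AllenNLP's convention: cuda_device == -1 means CPU,
    cuda_device >= 0 selects a GPU. Returns True only if spaCy was
    actually switched to the GPU.
    """
    if cuda_device < 0:
        return False  # CPU explicitly requested
    if importlib.util.find_spec("spacy") is None:
        return False  # spaCy not installed; nothing to configure
    import spacy

    # spacy.prefer_gpu falls back to CPU silently when no GPU
    # (or cupy) is available, so this is safe to call anywhere.
    return bool(spacy.prefer_gpu(cuda_device))
```

A tokenizer or dataset reader could then call this once during construction, so downstream torch tensors end up on the same device the spaCy pipeline runs on.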