I have converted the model into the two formats rust-bert supports: the `.ot` extension and the newly supported ONNX format. Despite my attempts, I have not been able to get the implementation working. I tried to replicate a process that had previously worked for me with BART, but without success. Could anyone provide a preliminary guide or some help on running zero-shot classification with DeBERTa-v3?
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli")
sequence_to_classify = "Angela Merkel is a politician in Germany and leader of the CDU"
candidate_labels = ["politics", "economy", "entertainment", "environment"]
output=classifier(sequence_to_classify, candidate_labels, multi_label=False)
print(output)
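For context, the zero-shot pipeline above works by turning each candidate label into an NLI hypothesis, scoring entailment with the model, and normalizing across labels; when driving an exported ONNX model directly (as I am trying to do from Rust), that post-processing has to be reproduced by hand. A minimal sketch of just the scoring step, assuming hypothetical entailment/contradiction logits already obtained from the model (no model or download involved):

```python
import math

def zero_shot_scores(entailment_logits, contradiction_logits, multi_label=False):
    """Turn per-label NLI logits into zero-shot label scores.

    entailment_logits / contradiction_logits: one value per candidate label.
    """
    if multi_label:
        # multi-label: each label scored independently as
        # softmax over its own (contradiction, entailment) pair
        return [math.exp(e) / (math.exp(e) + math.exp(c))
                for e, c in zip(entailment_logits, contradiction_logits)]
    # single-label: softmax over the entailment logits across all labels
    m = max(entailment_logits)
    exps = [math.exp(e - m) for e in entailment_logits]
    total = sum(exps)
    return [x / total for x in exps]
```

The same arithmetic would apply to the logits coming back from the ONNX session, whichever runtime ends up hosting the model.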
Below is my implementation using Facebook's BART-Large. It exposes one REST endpoint and one ZeroMQ endpoint for streaming, using batched streaming.
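The batching side of that streaming endpoint can be sketched independently of ZeroMQ; this is a minimal queue-based micro-batcher with hypothetical names, not my actual service code, that collects requests until a batch fills up or a deadline passes:

```python
import queue
import time

def next_batch(requests, max_batch=8, max_wait=0.05):
    """Collect up to max_batch items from the queue, waiting at most
    max_wait seconds in total; returns the (possibly empty) batch."""
    batch = []
    deadline = time.monotonic() + max_wait
    while len(batch) < max_batch:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        try:
            batch.append(requests.get(timeout=remaining))
        except queue.Empty:
            break
    return batch
```

A serving loop would call `next_batch` repeatedly and run the classifier once per batch; amortizing model calls this way is usually what makes the streaming endpoint worthwhile compared to the per-request REST one.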