How to use MingZhong/DialogLED-base-16384 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("MingZhong/DialogLED-base-16384")
model = AutoModelForSeq2SeqLM.from_pretrained("MingZhong/DialogLED-base-16384")
```
Can I use this model in a simple inference pipeline, i.e. feed in a conversation and get a summary back?
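Since the checkpoint loads as a standard seq2seq model, it should also work with the Transformers `pipeline` API. A minimal sketch is below; note that the example dialogue and its `#Person1#:`-style speaker tags are assumptions for illustration, and that this base checkpoint is pre-trained (not fine-tuned on a summarization dataset), so summaries may be rough without further fine-tuning.

```python
from transformers import pipeline

# Hypothetical example dialogue; the speaker-tag format is an assumption.
dialogue = (
    "#Person1#: Hi, I'd like to book a flight to Paris next Monday.\n"
    "#Person2#: Sure, would you prefer a morning or an evening departure?\n"
    "#Person1#: Morning, please, and a window seat if possible."
)

# The summarization pipeline wraps tokenization, generation, and decoding.
summarizer = pipeline("summarization", model="MingZhong/DialogLED-base-16384")

result = summarizer(dialogue, max_length=64, min_length=8)
summary = result[0]["summary_text"]
print(summary)
```

For long meeting transcripts (the model supports inputs up to 16384 tokens), the same call works; you may want to raise `max_length` for longer summaries.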