This adapter can be loaded with PEFT:
```python
from peft import PeftModel
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-uncased")
base_model = AutoModelForSequenceClassification.from_pretrained(
    "google-bert/bert-base-uncased", num_labels=2
)
model = PeftModel.from_pretrained(base_model, "./bert-lora")
model = model.merge_and_unload()  # optional: merge adapter into base weights permanently
```
Training logs (W&B workspace): https://wandb.ai/kunjcr2-dreamable/huggingface/runs/zkeka1hf?nw=nwuserkunjcr2
Note: this model was built to classify prompts as prompt injection or benign, as part of a larger project.
Model tree for kunjcr2/bert-lora:
- Base model: google-bert/bert-base-uncased