```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Himitsui/KuroMitsu-11B")
model = AutoModelForCausalLM.from_pretrained("Himitsui/KuroMitsu-11B")
```
Included in this repo is the full model for KuroMitsu-11B
┬┴┬┴┤(･_├┬┴┬┴┤･ω･)ﾟ
Hiya! This is my 11B Solar finetune.

The dataset I used for training includes hateful and toxic entries, along with rows of chat, roleplay, and instruct data. The goal was to make an uncensored roleplaying model. (・_・ゞ

Both the Vicuna and Alpaca prompt formats work well with this model.
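As a quick sketch of the two recommended formats (these are the common community conventions for Vicuna- and Alpaca-style prompting; the exact wording is not prescribed by this card, so adjust to taste):

```python
# Helpers that build prompts in the two formats this model was
# tested with. The templates below are the widely used community
# conventions, not values taken from this model card.

def vicuna_prompt(user_message: str) -> str:
    # Vicuna-style: plain USER/ASSISTANT turns.
    return f"USER: {user_message}\nASSISTANT:"

def alpaca_prompt(instruction: str) -> str:
    # Alpaca-style: headed Instruction/Response sections.
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

prompt = alpaca_prompt("Introduce yourself in one sentence.")
# Pass `prompt` to the tokenizer/model or to a text-generation pipeline.
```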
I hope this model is useful to you!
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Himitsui/KuroMitsu-11B")
```