# KIKI EMC SFT — LoRA Adapter

Fine-tuned LoRA adapter for EMC domain expertise, based on Qwen/Qwen3-8B.

Part of the KIKI Models Tuning pipeline for the FineFab platform.

## Training Details

| Parameter | Value |
| --- | --- |
| Base model | Qwen/Qwen3-8B |
| Method | QLoRA (4-bit NF4) |
| LoRA rank | 16 |
| Epochs | 3 |
| Dataset | 2,360 examples |
| Domain | emc |
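The QLoRA setup described above can be sketched with `bitsandbytes` quantization plus a PEFT `LoraConfig`. Only the 4-bit NF4 quantization and rank 16 come from the table; the alpha, dropout, and target modules below are illustrative assumptions, not the actual training hyperparameters:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# 4-bit NF4 quantization (QLoRA), as listed in the table
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# Rank 16 per the table; alpha, dropout, and target modules are assumptions
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen3-8B", quantization_config=bnb_config, device_map="auto"
)
model = get_peft_model(model, lora_config)
```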

## Usage

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model = "Qwen/Qwen3-8B"

# Load the base model, then attach the LoRA adapter on top of it
model = AutoModelForCausalLM.from_pretrained(base_model, device_map="auto")
model = PeftModel.from_pretrained(model, "clemsail/kiki-emc-sft")
tokenizer = AutoTokenizer.from_pretrained(base_model)
```
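Continuing from the snippet above, a minimal generation call using the tokenizer's chat template might look like this; the prompt and generation parameters are illustrative, not part of the model card:

```python
# Build a chat-formatted prompt (example question is illustrative)
messages = [{"role": "user", "content": "What are common causes of radiated emissions failures?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and decode only the newly produced tokens
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```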

## License

Apache 2.0

