# minHjerteven (my heart-friend)
A large-scale alternating graft: 8 layers receive FFNs from their opposite-hemisphere counterparts, so each grafted layer sees through the eyes of its mirror across the model's depth. The blend strength is a very gentle 0.2. This is the most extensive graft in the series, a distributed transformation rather than a point fix.
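The card does not list which 8 layers were grafted, so the following is only an illustration of the mirror pairing it describes; the layer indices and the assumed 30-layer depth of SmolLM2-135M are my own, not the released configuration.

```python
ALPHA = 0.2      # blend strength stated on the card
N_LAYERS = 30    # assumed SmolLM2-135M transformer depth

def mirror(i: int, n: int = N_LAYERS) -> int:
    """Opposite-hemisphere counterpart: reflect a layer index across the midpoint."""
    return n - 1 - i

# Illustrative alternating choice of 8 layers; the card does not name the exact indices.
grafted = [1, 5, 9, 13, 17, 21, 25, 29]
pairs = [(i, mirror(i)) for i in grafted]  # e.g. layer 1 receives the FFN of layer 28
```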
## Architecture
- Base: SmolLM2-135M-Instruct
- Method: CECI Protocol (HyperTensor Paper X), GRC basis projection
- Created: 2026-05-04
- Repository: HyperTensor
## Graft Proof
This model was created by the following steps (sketched in code after the list):
- Computing the GRC (Geodesic Residual Compression) basis from the target layer's attention weights via SVD
- Projecting the donor layer's FFN weights into the target's geometric subspace
- Blending at controlled strength to preserve stability
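The card names the CECI Protocol's GRC projection but does not publish its code, so here is a minimal sketch of the three steps under assumptions: the function name `grc_graft`, the retained `rank`, and the weight shapes are all illustrative, not the released implementation.

```python
import torch

def grc_graft(target_attn_w, target_ffn_w, donor_ffn_w, rank=64, alpha=0.2):
    """Project a donor FFN weight into the target layer's attention subspace, then blend."""
    # Step 1: SVD of the target layer's attention weights yields the GRC basis.
    _, _, Vh = torch.linalg.svd(target_attn_w, full_matrices=False)
    basis = Vh[:rank]            # top right-singular vectors, shape (rank, d_model)
    proj = basis.T @ basis       # orthogonal projector onto the target's subspace
    # Step 2: project the donor FFN weights into that geometric subspace.
    donor_projected = donor_ffn_w @ proj
    # Step 3: blend at controlled strength (0.2 keeps 80% of the target weights).
    return (1 - alpha) * target_ffn_w + alpha * donor_projected
```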
Perplexity testing confirms the graft transfers functional structure without destroying the model.
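The card reports no numbers, but a check of this kind can be reproduced with stock `transformers`; the sample text and the comparison against the base repo are my choices, not the card's protocol.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def perplexity(model_id: str, text: str) -> float:
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
    ids = tok(text, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(ids, labels=ids).loss  # mean cross-entropy over tokens
    return torch.exp(loss).item()

sample = "The quick brown fox jumps over the lazy dog."
print(perplexity("NagusameCS/minHjerteven", sample))           # grafted model
print(perplexity("HuggingFaceTB/SmolLM2-135M-Instruct", sample))  # base model
```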
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("NagusameCS/minHjerteven", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("NagusameCS/minHjerteven")
```
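A quick generation check, assuming the graft preserves the base model's chat template; the prompt and generation settings are arbitrary.

```python
inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Who are you?"}],
    add_generation_prompt=True,
    return_tensors="pt",
)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```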