minHjerteven (my heart-friend)

Large-scale alternating graft. Eight layers receive FFNs from their opposite-hemisphere counterparts: each grafted layer sees through the eyes of its mirror across the model's depth (layer i paired with layer L-1-i). The blend is very gentle, at 0.2 strength. This is the most extensive graft in the series, a distributed transformation rather than a point fix.
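For concreteness, here is a minimal sketch of the mirror pairing and blend. The layer indices, the pairing convention, and the direct FFN blend are illustrative assumptions: the card does not list which eight layers were grafted, and the released graft applies the GRC projection described below before blending.

import torch
from transformers import AutoModelForCausalLM

ALPHA = 0.2  # blend strength stated on the card

model = AutoModelForCausalLM.from_pretrained("HuggingFaceTB/SmolLM2-135M-Instruct")
layers = model.model.layers          # SmolLM2 follows the Llama layer layout
L = len(layers)                      # 30 decoder layers for the 135M base

# Illustrative choice of eight grafted layers; mirrors are L - 1 - i,
# so the pairs here are 2<->27, 5<->24, 8<->21, 11<->18.
grafted = [2, 5, 8, 11, 18, 21, 24, 27]

# Snapshot donor FFN weights first, since mirror pairs graft each other.
donor_ffn = {
    i: {k: v.clone() for k, v in layers[L - 1 - i].mlp.state_dict().items()}
    for i in grafted
}

with torch.no_grad():
    for i in grafted:
        for name, param in layers[i].mlp.named_parameters():
            # Gentle convex blend: 80% original weights, 20% mirror's FFN.
            param.mul_(1.0 - ALPHA).add_(donor_ffn[i][name], alpha=ALPHA)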

Architecture

  • Base: SmolLM2-135M-Instruct
  • Method: CECI Protocol (HyperTensor Paper X), GRC basis projection
  • Created: 2026-05-04
  • Repository: HyperTensor
  • Format: Safetensors, F32 (0.1B params)

Graft Proof

This model was created by:

  1. Computing the GRC (Geodesic Residual Compression) basis from the target layer's attention weights via SVD
  2. Projecting the donor layer's FFN weights into the target's geometric subspace
  3. Blending at a controlled strength (0.2) to preserve stability, as sketched below
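A minimal sketch of these three steps follows. The choice of attention matrix fed to the SVD, the basis rank k, and the exact projector are assumptions for illustration; the card does not specify them beyond the outline above.

import torch

def grc_basis(attn_weight: torch.Tensor, k: int = 64) -> torch.Tensor:
    # Step 1: SVD of the target layer's attention weights; the top-k left
    # singular vectors span the target's geometric subspace.
    U, S, Vh = torch.linalg.svd(attn_weight, full_matrices=False)
    return U[:, :k]                          # (d_model, k)

def project_ffn(donor_weight: torch.Tensor, basis: torch.Tensor) -> torch.Tensor:
    # Step 2: project the donor FFN weight into that subspace.
    # P = B B^T is the orthogonal projector onto span(basis); here it acts
    # on the d_model (input) dimension of an up/gate projection. A weight
    # whose output dimension is d_model (e.g. down_proj) would use P @ W.
    P = basis @ basis.T                      # (d_model, d_model)
    return donor_weight @ P

def blend(target_weight: torch.Tensor, projected_donor: torch.Tensor,
          alpha: float = 0.2) -> torch.Tensor:
    # Step 3: blend at controlled strength to preserve stability.
    return (1.0 - alpha) * target_weight + alpha * projected_donor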

Perplexity testing confirms the graft transfers functional structure without destroying the model.
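As a sanity check of that claim, a reader can compare perplexities of the grafted and base models. The text sample and context handling below are placeholders, not the authors' evaluation protocol.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def perplexity(model_id: str, text: str) -> float:
    tok = AutoTokenizer.from_pretrained(model_id)
    lm = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
    lm.eval()
    ids = tok(text, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = lm(ids, labels=ids).loss      # mean next-token cross-entropy
    return torch.exp(loss).item()

sample = "The graft transfers structure across the model's depth. " * 40
print("grafted:", perplexity("NagusameCS/minHjerteven", sample))
print("base:   ", perplexity("HuggingFaceTB/SmolLM2-135M-Instruct", sample))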

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the grafted model and its tokenizer from the Hub.
model = AutoModelForCausalLM.from_pretrained("NagusameCS/minHjerteven", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("NagusameCS/minHjerteven")
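A quick generation smoke test, continuing from the snippet above (the prompt and sampling settings are arbitrary, not tuned for this graft):

inputs = tokenizer("Hello, my heart-friend.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))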