Building on HF
not-lain
AI & ML interests
custom AI models with HF integration, Hugging Face fellow 🤗
Recent Activity
reacted to SeanLee97's post 13 days ago: Our lab recently released a paper introducing ShadowPEFT, a new Parameter-Efficient Fine-Tuning (PEFT) paradigm tailored for edge-computing scenarios.
Unlike traditional approaches such as LoRA and its variants, which inject trainable parameters directly into the Transformer's weights and therefore require tight coupling with the backbone, ShadowPEFT enhances the frozen large base model with a lightweight, centralized, pretrainable, and detachable Shadow network.
This shadow network operates in parallel with the base model, delivering learned corrections to each decoder layer. Because the shadow module is architecturally decoupled from the backbone, it can be independently trained, stored, and deployed, which benefits edge computing and edge-cloud collaborative computing.
- HF Paper: https://huggingface.co/papers/2604.19254
- GitHub: https://github.com/ShadowLLM/shadow-peft
- HF Collection: https://huggingface.co/collections/shadow-llm/shadow-peft-models
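The parallel-shadow design described in the post can be sketched in a few lines. This is a toy reading of the idea from the post's description alone, not the authors' implementation: the layer functions, shapes, and the form of the corrections are all illustrative assumptions.

```python
# Toy sketch of the ShadowPEFT idea as described in the post (an assumption-laden
# reading, not the authors' code): the frozen backbone runs unchanged, while a
# single detachable "shadow" module computes one correction per decoder layer
# and adds it to that layer's output.

def frozen_layer(hidden, weight):
    """Stand-in for a frozen decoder layer (no trainable parameters here)."""
    return [weight * x for x in hidden]

def shadow_network(h0, n_layers):
    """One centralized, independently trainable module emitting a learned
    correction for every decoder layer; here just a scaled copy of the input."""
    return [[0.1 * x for x in h0] for _ in range(n_layers)]

def forward(h0, frozen_weights, use_shadow=True):
    n = len(frozen_weights)
    corrections = shadow_network(h0, n) if use_shadow else [[0.0] * len(h0)] * n
    h = h0
    for w, c in zip(frozen_weights, corrections):
        h = frozen_layer(h, w)             # frozen path, untouched by training
        h = [a + b for a, b in zip(h, c)]  # shadow correction, added in parallel
    return h

hidden = [1.0, 2.0]
out = forward(hidden, frozen_weights=[0.5, 2.0])
base = forward(hidden, frozen_weights=[0.5, 2.0], use_shadow=False)
print(out)   # backbone output plus per-layer shadow corrections
print(base)  # with the shadow detached, the plain frozen backbone is recovered
```

Detaching the shadow (`use_shadow=False`) recovers the unmodified frozen backbone, mirroring the post's claim that the module can be trained, stored, and deployed independently of the base model.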
not-lain/Depth-Anything-V2-Small
Depth Estimation
• Updated • 5
not-lain/cloth_classification
Image Classification
• 85.8M • Updated • 7
not-lain/training_the_moon
Text Generation
• 2B • Updated • 5
not-lain/finetuned_mistral_on_ads
Updated • 2
not-lain/finetuned_tinyllama_on_ads
Text Generation
• 1B • Updated • 4
not-lain/finetuned_gemma2b_on_ads_dataset
Updated • 3
not-lain/Finetuned_TinyLlama
Text Generation
• 1B • Updated • 8
not-lain/my_awesome_mnist_model
22.3k • Updated • 4
not-lain/UniFormer_Light_xs_image
Updated • 2
not-lain/Gemma-2b-Peft-finetuning
Updated • 2
not-lain/newkwargsserialization
Updated • 2
not-lain/DUSt3R_ViTLarge_BaseDecoder_512_dpt
0.6B • Updated • 2
not-lain/MyModelWithParams
Updated • 4
not-lain/BaseModelWithMultipleParameters
Updated • 3
not-lain/BaseModelWithConfigAndMultipleParameters
Updated • 3
not-lain/BaseModelWithConfigAndNamedParameter
Updated • 4
not-lain/BaseModelWithJustConfig
Updated • 4