Skin Type Classification Model
A deep learning model for classifying skin types into dry and oily categories using computer vision.
Model Description
This model is based on the ResNet50 architecture and has been fine-tuned for skin type classification. It analyzes facial skin images and predicts whether the skin is dry or oily.
Key Features
- Architecture: ResNet50-based classification model
- Classes: 2 (dry, oily)
- Input: RGB images (224x224 pixels)
- Framework: PyTorch + Transformers
- Performance: High accuracy on the validation set (see Performance Metrics below)
Intended Use
Primary Use Cases
- Dermatological analysis and skin assessment
- Cosmetic product recommendation systems
- Skincare routine personalization
- Medical research and skin health monitoring
Limitations
- Designed specifically for facial skin analysis
- Requires good lighting and clear skin visibility
- Not suitable for medical diagnosis (for research/cosmetic use only)
- Performance may vary across different skin tones and ethnicities
How to Use
Quick Start with Transformers
from transformers import AutoModelForImageClassification, AutoImageProcessor
from PIL import Image
import torch
# Load model and processor
model = AutoModelForImageClassification.from_pretrained("your-username/skin-type-classifier")
processor = AutoImageProcessor.from_pretrained("your-username/skin-type-classifier")
# Load and process image
image = Image.open("path/to/skin/image.jpg").convert("RGB")  # ensure a 3-channel RGB input
inputs = processor(images=image, return_tensors="pt")
# Make prediction
with torch.no_grad():
    outputs = model(**inputs)
predictions = torch.nn.functional.softmax(outputs.logits, dim=-1)
predicted_class = predictions.argmax().item()
# Get result
labels = ["dry", "oily"]
confidence = predictions[0][predicted_class].item()
print(f"Predicted skin type: {labels[predicted_class]} (confidence: {confidence:.2%})")
Using the Pipeline API
from transformers import pipeline
# Create classification pipeline
classifier = pipeline("image-classification", model="your-username/skin-type-classifier")
# Classify image
result = classifier("path/to/skin/image.jpg")
print(result)
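Continuing from the classifier created above, the pipeline also accepts an already loaded PIL image instead of a file path, which is convenient when the image comes from an earlier processing step. The snippet below is a small usage sketch; the top_k argument (limiting how many labels are returned) is an assumption about the installed transformers version.

from PIL import Image

# Pass a PIL image directly; the pipeline returns a list of
# {"label": ..., "score": ...} dicts sorted by score.
image = Image.open("path/to/skin/image.jpg").convert("RGB")
result = classifier(image, top_k=2)  # top_k is assumed to be supported by your transformers version
print(result)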
Model Details
Architecture
- Base Model: ResNet50
- Modification: Custom classification head with 2 output classes (see the sketch after this list)
- Input Size: 224 × 224 × 3 (RGB)
- Parameters: ~25M
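As a rough illustration of the architecture described above (a sketch, not the released training code), the equivalent torchvision construction replaces ResNet50's final fully connected layer with a 2-class head; the use of torchvision and its weights enum here is an assumption.

import torch.nn as nn
from torchvision import models

# Sketch of the described architecture: a standard ResNet50 backbone whose
# final fully connected layer is swapped for a 2-class ("dry" / "oily") head.
# The released checkpoint itself is loaded via Transformers as shown above.
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
backbone.fc = nn.Linear(backbone.fc.in_features, 2)  # 2048 -> 2 logits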
Training Details
- Dataset: Custom skin type classification dataset
- Preprocessing:
  - Resize to 224 × 224 pixels
  - Normalization: ImageNet statistics
  - Data augmentation applied during training
- Training Framework: PyTorch
- Optimization: Adam optimizer with learning rate scheduling (illustrated in the sketch below)
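A minimal sketch of this training setup is shown below; the specific augmentations, learning rate, and scheduler are illustrative assumptions, not the actual released training configuration.

import torch
import torch.nn as nn
from torchvision import models, transforms

# Illustrative training configuration matching the description above.
train_transforms = transforms.Compose([
    transforms.Resize((224, 224)),                    # resize to 224 x 224
    transforms.RandomHorizontalFlip(),                # example augmentation (assumed)
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

net = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
net.fc = nn.Linear(net.fc.in_features, 2)             # 2-class head

optimizer = torch.optim.Adam(net.parameters(), lr=1e-4)                          # learning rate is assumed
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)   # scheduler type is assumed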
Performance Metrics
- Validation Accuracy: High accuracy on the held-out validation set
- Inference Speed: Fast enough for real-time applications
- Model Size: ~94MB
Technical Specifications
Input Format
- Type: RGB Images
- Size: 224 × 224 pixels
- Format: PIL Image, numpy array, or torch tensor
- Normalization: ImageNet mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225] (see the example below)
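If you want to reproduce this preprocessing manually instead of using AutoImageProcessor, a transform like the one below matches the stated input format; treat it as a sketch, since the processor bundled with the checkpoint remains the authoritative reference.

from PIL import Image
from torchvision import transforms

# Manual preprocessing matching the input format described above.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),                            # HWC uint8 -> CHW float in [0, 1]
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("path/to/skin/image.jpg").convert("RGB")
pixel_values = preprocess(image).unsqueeze(0)         # shape: (1, 3, 224, 224)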
Output Format
- Type: Classification logits (apply softmax to obtain probabilities)
- Classes:
  - 0: "dry" - Dry skin type
  - 1: "oily" - Oily skin type
- Output: Softmax probabilities for each class (see the example below)
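Continuing the Quick Start example above, the snippet below converts the raw logits into the softmax probabilities described here. It assumes the checkpoint's config defines an id2label mapping; otherwise it falls back to the ["dry", "oily"] list used earlier.

import torch

# Convert logits from the Quick Start `outputs` into per-class probabilities.
probs = torch.nn.functional.softmax(outputs.logits, dim=-1)[0]
for idx in range(probs.shape[0]):
    label = model.config.id2label.get(idx, ["dry", "oily"][idx])  # id2label assumed present
    print(f"{label}: {probs[idx].item():.2%}")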
Ethical Considerations
Bias and Fairness
- The model was trained on a range of skin types, but dataset coverage may still be limited
- Users should be aware of potential biases in skin tone representation
- Continuous evaluation needed for fair performance across demographics
Privacy
- Model processes images locally - no data transmission required
- Users responsible for ensuring proper consent when analyzing others' images
- Recommend anonymization of facial features when possible
License
This model is released under the MIT License. See LICENSE file for details.
Citation
If you use this model in your research, please cite:
@misc{skin-type-classifier-2025,
  title={Skin Type Classification Model},
  author={Your Name},
  year={2025},
  howpublished={\url{https://huggingface.co/your-username/skin-type-classifier}},
}
Contact
For questions, issues, or collaboration opportunities, please reach out through the Hugging Face model page or GitHub repository.
Disclaimer: This model is for research and cosmetic purposes only. It should not be used for medical diagnosis or treatment decisions. Always consult healthcare professionals for medical concerns.