---
tags:
- text-generation
- gpt2
- recipes
- natural-language-generation
license: apache-2.0
---

# MinimalistRecipeTextGenerator

## Overview

This model is a fine-tuned version of the **GPT-2 (small)** language model, specifically trained to generate coherent and realistic short recipe texts. Given a prompt (e.g., "A quick chicken curry"), the model completes the text, often generating ingredient lists and basic instructions.

## Model Architecture

The model uses the standard **GPT-2 language modeling architecture**.

1. **Core:** A 12-layer, 768-dimensional transformer decoder stack.
2. **Mechanism:** It uses causal self-attention to predict the next token in a sequence given all previous tokens.
3. **Training:** Fine-tuned on a dataset of simple, short recipes, enabling it to learn the structural patterns of recipes (Title -> Ingredients -> Instructions).
4. **Generation Parameters:** The `config.json` sets default generation parameters for high-quality output (see the sketch after this list):
    * `do_sample`: True (for creative text generation)
    * `temperature`: 0.7 (controls randomness)
    * `max_length`: 256 (for short, complete recipes)
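
As a quick sanity check, both the architecture and the generation defaults described above can be read from the model's configuration. This is a minimal sketch, assuming the placeholder Hub path used later in this card and that the defaults are stored in `config.json` as listed:

```python
from transformers import AutoConfig

# Placeholder Hub path -- replace with the actual repository id.
model_id = "your_username/MinimalistRecipeTextGenerator"

config = AutoConfig.from_pretrained(model_id)

# GPT-2 (small) decoder stack: 12 layers, 768-dimensional hidden states.
print(config.n_layer, config.n_embd)  # expected: 12 768

# Default generation parameters declared in config.json (per the list above).
print(config.do_sample, config.temperature, config.max_length)
```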

## Intended Use

This model is intended for creative and content generation purposes:

* **Creative Writing/Blogging:** Generating unique recipe ideas for food blogs or social media.
* **Data Augmentation:** Creating synthetic, but structurally correct, recipe texts for training other culinary-focused models (a short sketch follows the usage example below).
* **Demonstration:** Serving as a basic example of fine-tuning GPT-2 on a domain-specific corpus.

### How to use

```python
from transformers import pipeline

# Load the fine-tuned model from the Hugging Face Hub.
generator = pipeline("text-generation", model="your_username/MinimalistRecipeTextGenerator")  # Replace with the actual Hub path

prompt = "Recipe for a refreshing summer salad:"

# Sample a single completion of up to 150 tokens.
output = generator(prompt, max_length=150, num_return_sequences=1, temperature=0.8, do_sample=True)

print(output[0]['generated_text'])
```
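
The pipeline output includes the prompt at the start of `generated_text`. For the data-augmentation use case mentioned above, a simple approach is to sample several completions per recipe title and collect them. This is a sketch only, reusing the `generator` pipeline from the example above; the titles are illustrative:

```python
# Sketch: generate synthetic recipe texts for data augmentation.
titles = [
    "Recipe for a quick chicken curry:",
    "Recipe for a refreshing summer salad:",
]

synthetic_recipes = []
for title in titles:
    # Several sampled variants per prompt; do_sample=True enables temperature sampling.
    results = generator(title, max_length=150, num_return_sequences=3,
                        do_sample=True, temperature=0.8)
    synthetic_recipes.extend(r["generated_text"] for r in results)

print(f"Collected {len(synthetic_recipes)} synthetic recipes.")
```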