---
license: mit
datasets:
- chloeliu/reddit_nosleep_posts
language:
- en
tags:
- fun
- horror
- writing
widget:
- text: "[WP] We don't go to ravenholm anymore [RESPONSE] "
  example_title: "Ravenholm"
- text: "[WP] The man in the corner of my window [RESPONSE] "
  example_title: "The man in the corner"
co2_eq_emissions:
  emissions: 70
  source: "https://mlco2.github.io/impact/#compute"
  training_type: "fine-tuning"
  geographical_location: "Oregon, USA"
  hardware_used: "1x T4, Google Colab"
---

# GPT-NoSleep-1.5b
This is the largest release of GPT-NoSleep: a fine-tuned version of [GPT2-XL](https://huggingface.co/gpt2-xl) trained on the 'reddit-nosleep-posts' dataset.
Smaller releases include:
* [GPT-NoSleep-355m](https://huggingface.co/DarwinAnim8or/GPT-NoSleep-355m)

The accompanying prompt generator can be found here:
* [Space for prompt generation](https://huggingface.co/spaces/DarwinAnim8or/NoSleepWritingPromptGenerator)
* [The model](https://huggingface.co/DarwinAnim8or/NoSleepPromptGen)
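
# Usage
A minimal sketch of generating a story with the `transformers` pipeline. The repo id below is an assumption (it mirrors the naming of the 355m release), and the "[WP] ... [RESPONSE] " prompt format follows the widget examples in the metadata above; the sampling settings are only illustrative.

```python
from transformers import pipeline

# Assumed repo id, mirroring DarwinAnim8or/GPT-NoSleep-355m.
generator = pipeline("text-generation", model="DarwinAnim8or/GPT-NoSleep-1.5b")

# Prompts follow the "[WP] <prompt> [RESPONSE] " format used by the widget examples.
prompt = "[WP] We don't go to ravenholm anymore [RESPONSE] "
result = generator(prompt, max_new_tokens=200, do_sample=True,
                   temperature=0.9, top_p=0.95)
print(result[0]["generated_text"])
```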

# Training Procedure
This model was fine-tuned on the 'reddit-nosleep-posts' dataset on Google Colab (1x T4, per the metadata above).
It was trained for 2 epochs with a learning rate of 1e-2.
Special thanks to Skyler for helping to train a model of this size!
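
Below is a minimal sketch of what such a fine-tuning run could look like with the `transformers` Trainer. The dataset id comes from the front matter, and the epoch count and learning rate from the text above; the split, column names, prompt assembly, batch size, and remaining settings are assumptions, since the actual training script is not included in this card.

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2-xl")
tokenizer.pad_token = tokenizer.eos_token  # GPT2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2-xl")

# Dataset id from the front matter; the "train" split and column names are assumptions.
dataset = load_dataset("chloeliu/reddit_nosleep_posts", split="train")

def tokenize(batch):
    # "[WP] ... [RESPONSE] ..." assembly inferred from the widget examples.
    texts = [f"[WP] {title} [RESPONSE] {post}"
             for title, post in zip(batch["title"], batch["post"])]
    return tokenizer(texts, truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

args = TrainingArguments(
    output_dir="gpt-nosleep-1.5b",
    num_train_epochs=2,               # from the card
    learning_rate=1e-2,               # from the card
    per_device_train_batch_size=1,    # assumption: a T4 fits only tiny batches at 1.5B params
    gradient_accumulation_steps=8,    # assumption
    fp16=True,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```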

# Biases & Limitations
This model likely carries the same biases and limitations as the original GPT2 it is based on, plus heavy biases from the dataset.
It can generate output that is not suitable for all audiences, since its purpose is to generate horror stories.

# Intended Use
This model is meant for fun, nothing else.