---
language:
- en
tags:
- sentiment-analysis
- text-classification
- bert
- manav
- ManavDhayeCoder/sentiment-bert
- ManavDhaye
pipeline_tag: text-classification
base_model:
- google-bert/bert-base-uncased
datasets:
- imdb
library_name: transformers
widget:
- text: This movie was amazing!
- text: Worst movie I have ever seen.
model-index:
- name: sentiment-bert
  results: []
metrics:
- accuracy
---

# πŸ“˜ BERT Sentiment Analysis Model (Fine-Tuned on IMDB)

This model is a fine-tuned version of **google-bert/bert-base-uncased**, trained on the **IMDB movie reviews dataset** for binary sentiment classification.

It predicts whether text expresses **negative** or **positive** sentiment.

This model is hosted by **[@ManavDhayeCoder](https://huggingface.co/ManavDhayeCoder)**.

---

# πŸš€ Model Overview

| Property | Value |
|----------|--------|
| **Base model** | google-bert/bert-base-uncased |
| **Task** | Sentiment Analysis (Sequence Classification) |
| **Labels** | negative / positive |
| **Dataset** | IMDB |
| **Library** | Hugging Face Transformers |
| **Format** | model.safetensors |

The model has two classes (see the loading sketch below):

- `LABEL_0` β†’ **negative**  
- `LABEL_1` β†’ **positive**
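
If you load the checkpoint directly instead of going through `pipeline`, the index-to-name mapping comes from the config's `id2label`. A minimal sketch, assuming the hosted config maps index 0 to negative and index 1 to positive as listed above:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "ManavDhayeCoder/sentiment-bert"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Tokenize a review and run a forward pass without tracking gradients
inputs = tokenizer("Worst movie I have ever seen.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring class and translate the index via id2label
pred_id = logits.argmax(dim=-1).item()
score = torch.softmax(logits, dim=-1)[0, pred_id].item()
print(model.config.id2label[pred_id], round(score, 4))
```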

---

# πŸ”₯ Quick Usage Example

```python
from transformers import pipeline

clf = pipeline("text-classification", model="ManavDhayeCoder/sentiment-bert")
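# Each call returns a list like [{"label": ..., "score": ...}]; the label
# names follow the model's id2label config (e.g. "LABEL_1" or "positive").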

print(clf("This movie was amazing!"))