Byte-Level BPE Tokenizer: tur_Latn (32K)

A byte-level BPE tokenizer trained on the tur_Latn (Turkish, Latin script) split of Fineweb-2-HQ.

Training Details

Parameter          Value
-----------------  -----------------------
Algorithm          Byte-Level BPE
Language           tur_Latn
Target Vocab Size  32,000
Final Vocab Size   32,000
Pre-tokenizer      ByteLevel
Normalizer         NFC
Special Tokens     <s>, </s>, <pad>, <unk>
Training Shards    2
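
For reference, a configuration like the one above can be reproduced with the Hugging Face tokenizers library. This is a minimal sketch, not the actual training script: the shard filenames are placeholders, and the add_prefix_space setting is an assumption.

from tokenizers import Tokenizer, decoders, models, normalizers, pre_tokenizers, trainers

# Byte-Level BPE with NFC normalization, matching the table above
tokenizer = Tokenizer(models.BPE(unk_token="<unk>"))
tokenizer.normalizer = normalizers.NFC()
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=False)  # assumption
tokenizer.decoder = decoders.ByteLevel()

trainer = trainers.BpeTrainer(
    vocab_size=32000,
    special_tokens=["<s>", "</s>", "<pad>", "<unk>"],
    initial_alphabet=pre_tokenizers.ByteLevel.alphabet(),  # seed with all 256 byte symbols
)

# Two training shards (placeholder filenames)
tokenizer.train(files=["shard_00.txt", "shard_01.txt"], trainer=trainer)
tokenizer.save("tokenizer.json")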

Usage

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("flexitok/bpe_tur_Latn_32000")
ids = tokenizer.encode("Merhaba, dünya!")
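
encode returns token IDs; a quick round trip back through the tokenizer is a handy sanity check. The snippet below continues from the tokenizer loaded above (the comments describe typical byte-level output, not verified values):

tokens = tokenizer.convert_ids_to_tokens(ids)
print(tokens)                  # byte-level pieces; leading spaces appear as "Ġ"
print(tokenizer.decode(ids))   # recovers the original string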

Files

  • tokenizer.json – Full HuggingFace tokenizer
  • vocab.json – Vocabulary mapping
  • merges.txt – BPE merge rules
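
tokenizer.json alone is enough to use the tokenizer without transformers, via the lighter tokenizers package. A minimal sketch, assuming the file has been downloaded locally:

from tokenizers import Tokenizer

tok = Tokenizer.from_file("tokenizer.json")
enc = tok.encode("Merhaba, dünya!")
print(enc.tokens)  # byte-level subword pieces
print(enc.ids)     # the corresponding vocabulary IDs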