CAC-CoT: Connector-Aware Compact Chain-of-Thought for Efficient Reasoning Data Synthesis Across Dual-System Cognitive Tasks
Paper: [arXiv:2508.18743](https://arxiv.org/abs/2508.18743)
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("datumo/CAC-CoT")  # 🔧 Replace with your model path
tokenizer = AutoTokenizer.from_pretrained("datumo/CAC-CoT")

# Build a prompt and generate a compact reasoning trace
prompt = "Problem: If you have 3 apples and get 2 more, how many do you have?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
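The decoded output contains the full reasoning trace. A minimal sketch for pulling out just the final answer, assuming the trace ends with an `Answer:` marker (the actual marker this model emits may differ, so adjust the pattern accordingly):

```python
import re

def extract_final_answer(text):
    """Return the text after the last 'Answer:' marker, or None if absent.

    NOTE: the 'Answer:' marker is an assumption for illustration;
    replace it with whatever delimiter the model actually produces.
    """
    matches = re.findall(r"Answer:\s*(.+)", text)
    return matches[-1].strip() if matches else None

demo = "Problem: 3 apples + 2 more.\nStep 1: 3 + 2 = 5.\nAnswer: 5"
print(extract_final_answer(demo))  # → 5
```

This kind of post-processing is handy when scoring the model on benchmarks, where only the final answer is compared against the reference.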
BibTeX:
```bibtex
@misc{choi2025caccotconnectorawarecompactchainofthought,
  title={CAC-CoT: Connector-Aware Compact Chain-of-Thought for Efficient Reasoning Data Synthesis Across Dual-System Cognitive Tasks},
  author={Sunguk Choi and Yonghoon Kwon and Heondeuk Lee},
  year={2025},
  eprint={2508.18743},
  archivePrefix={arXiv},
  primaryClass={cs.AI},
  url={https://arxiv.org/abs/2508.18743},
}
```