New technique helps LLMs rein in CoT lengths, optimizing reasoning without exploding compute costs

March 13, 2025