GPT-5 mini is cheap because it is narrower, not because it is a universal substitute.
The current OpenAI limits summary lists GPT-5 mini at a 400,000-token context window. That is still large, but it is materially smaller than the flagship path and should be treated as a real fit constraint when routing long-context workloads.
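One way to treat the limit as a real fit constraint is to pre-check each request before routing it to the smaller model. The sketch below is a minimal illustration, not OpenAI tooling: the 400,000-token figure comes from the text above, while the ~4-characters-per-token estimate, the reserved output budget, and the safety margin are all assumptions you would tune against a real tokenizer.

```python
# Rough pre-routing fit check for a 400,000-token context window.
# Assumptions (not from OpenAI docs): ~4 characters per token for
# English text, a reserved output budget, and a 10% safety margin.

GPT5_MINI_CONTEXT_TOKENS = 400_000
CHARS_PER_TOKEN = 4  # coarse heuristic; use a real tokenizer in production

def fits_context(prompt: str,
                 max_output_tokens: int = 8_000,
                 safety_margin: float = 0.10) -> bool:
    """Return True if the prompt plus reserved output plausibly fits
    inside the context window, leaving a safety margin for estimate error."""
    est_prompt_tokens = len(prompt) / CHARS_PER_TOKEN
    budget = GPT5_MINI_CONTEXT_TOKENS * (1 - safety_margin)
    return est_prompt_tokens + max_output_tokens <= budget

print(fits_context("short question"))   # a small prompt fits easily
print(fits_context("x" * 2_000_000))    # ~500k estimated tokens: reject
```

A check like this is where "fit constraint" becomes operational: requests that fail it get routed to a larger-context model or chunked, rather than failing at the API.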
Sources