Models and Credits
Understand model access, default credit pricing, and what happens when a generation fails.
MakeClipAI uses a credit system so model usage, plan access, and billing are visible in one place.
Supported model families
The current workspace includes models from these families:
- Kling
- Pika
- Seedance
- Sora
Different durations and model families have different credit costs and minimum plan requirements.
Default model pricing snapshot
The application currently ships with the following defaults:
| Model | Credits | Minimum plan |
|---|---|---|
| Pika Basic | 40 | Free |
| Pika Effects | 80 | Free |
| Kling AI 5s | 70 | Free |
| Kling AI 10s | 140 | Free |
| Seedance 2.0 - 5s | 100 | Free |
| Sora 2 - 10s | 200 | Pro |
| Seedance 2.0 - 15s | 250 | Pro |
| Sora 2 - 15s | 300 | Pro |
| Kling AI 30s | 420 | Pro |
| Kling AI 60s | 840 | Max |
These values can be overridden by database-backed pricing, so treat the table as the product default rather than a permanent contract.
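Because database-backed pricing takes precedence over shipped defaults, price lookups should fall back rather than hardcode the table. A minimal sketch of that resolution order, assuming illustrative names (`PRICING_DEFAULTS`, `effectivePricing`) that are not MakeClipAI's actual API:

```typescript
interface ModelPricing {
  credits: number;
  minPlan: "Free" | "Pro" | "Max";
}

// A subset of the default table above, keyed by model name.
const PRICING_DEFAULTS: Record<string, ModelPricing> = {
  "Pika Basic": { credits: 40, minPlan: "Free" },
  "Kling AI 30s": { credits: 420, minPlan: "Pro" },
  "Sora 2 - 10s": { credits: 200, minPlan: "Pro" },
};

// Database-backed overrides win; the shipped defaults are the fallback.
function effectivePricing(
  model: string,
  overrides: Record<string, ModelPricing>
): ModelPricing | undefined {
  return overrides[model] ?? PRICING_DEFAULTS[model];
}
```

With no override, `effectivePricing("Pika Basic", {})` returns the default 40 credits; an override row for the same model replaces it entirely.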
How plan gating works
Each model has a minimum required plan.
That means:
- lower-cost models are available for quick testing on lower tiers
- more expensive or advanced models are reserved for higher plans
- the UI can show both credit cost and access boundary before submission
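The gate itself reduces to a tier comparison. A sketch under assumed names (`PLAN_RANK`, `canUseModel` are illustrative; the plan tiers come from the pricing table above):

```typescript
type Plan = "Free" | "Pro" | "Max";

// Free < Pro < Max, matching the minimum-plan column in the defaults table.
const PLAN_RANK: Record<Plan, number> = { Free: 0, Pro: 1, Max: 2 };

// A user can submit to a model only if their plan meets the model's minimum.
function canUseModel(userPlan: Plan, minPlan: Plan): boolean {
  return PLAN_RANK[userPlan] >= PLAN_RANK[minPlan];
}
```

This is what lets the UI show the access boundary before submission: the check needs only the user's plan and the model's minimum, not a round trip through billing.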
What credits are for
Credits are used to make generation cost predictable.
They help teams answer practical questions such as:
- how much a workflow costs per video
- which models are worth using for a given outcome
- when to move from prompt experiments to repeatable production
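The per-video cost question is simple arithmetic over the credit table. An illustrative helper (not a MakeClipAI API), using credit values from the defaults above:

```typescript
// Total credits for a workflow: each step is a model's credit cost
// times how many generations that step runs.
function workflowCost(steps: { credits: number; count: number }[]): number {
  return steps.reduce((sum, step) => sum + step.credits * step.count, 0);
}

// e.g. three Pika Basic drafts (40 each) plus one Kling AI 10s final (140):
// workflowCost([{ credits: 40, count: 3 }, { credits: 140, count: 1 }]) → 260
```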
Failure handling and refunds
MakeClipAI is designed to avoid the worst billing experience: paying for broken output with no visibility into what happened.
If a generation fails, the product tracks the task state and refunds the credits as part of the workflow, so billing stays aligned with actual outcomes.
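The shape of that flow can be sketched as a state check at settlement time. All type and field names here (`TaskState`, `GenerationTask`, `settleTask`) are assumptions for illustration, not MakeClipAI's actual schema:

```typescript
type TaskState = "queued" | "running" | "succeeded" | "failed";

interface GenerationTask {
  state: TaskState;
  creditsCharged: number; // deducted at submission
  refunded: boolean;      // guards against double refunds
}

// Returns the user's new credit balance after settling a finished task:
// failed tasks get their charge back exactly once, successes stay charged.
function settleTask(task: GenerationTask, balance: number): number {
  if (task.state === "failed" && !task.refunded) {
    task.refunded = true;
    return balance + task.creditsCharged;
  }
  return balance;
}
```

The `refunded` flag is the important design choice: settlement can be retried safely because a failed task is only ever credited back once.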
Choosing the right model
Use this rough rule of thumb:
- start with lower-cost models for prompt exploration
- move to higher-tier models when quality or duration justifies it
- reserve the most expensive options for validated workflows
Operational advice
- Review generation history alongside the credits it consumed.
- Use plan upgrades for sustained throughput, not just one-off testing.
- Revisit prompt quality before blaming model quality.
- Treat model selection as a cost-quality decision, not just a feature checklist.
Next: read Templates and Workflows.