optimize-prompt

Optimize Prompt for Token Efficiency

Content Preview
# Optimize Prompt for Token Efficiency

Takes an input prompt and returns ONLY a token-optimized version that preserves meaning while minimizing token count. Based on LLM tokenization principles: common words tokenize more efficiently, unusual words break into more tokens, and conciseness reduces cost.

**Output Format**: Return only the optimized prompt text with no additional commentary, analysis, or explanation.

## Instructions

### 1. **Analyze Input Prompt**

- Extract the prompt from `$AR
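The tokenization principle the preview describes can be sketched with a toy example. Real LLM tokenizers use byte-pair encoding, so actual counts differ; the word-level proxy below is purely an illustrative assumption showing how trimming filler words lowers the count:

```python
# Toy illustration: a word-level proxy for token counting.
# ASSUMPTION: real LLM tokenizers use byte-pair encoding (BPE),
# not whitespace splitting; this proxy only demonstrates the idea
# that a more concise prompt yields fewer tokens.

def proxy_token_count(text: str) -> int:
    """Rough proxy: one 'token' per whitespace-separated word."""
    return len(text.split())

verbose = ("Could you please take a look at the following text and "
           "provide me with a summary of its main points?")
optimized = "Summarize the main points of this text."

# The optimized prompt preserves the request while costing fewer tokens.
assert proxy_token_count(optimized) < proxy_token_count(verbose)
print(proxy_token_count(verbose), proxy_token_count(optimized))
```

With a real BPE tokenizer the gap is usually similar in direction if not in magnitude, since common short words and shorter strings both tokenize cheaply.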

How to Use

Recommended: Install to project (local)

```shell
mkdir -p .claude/skills
curl -o .claude/skills/optimize-prompt.md \
  https://raw.githubusercontent.com/qdhenry/Claude-Command-Suite/main/.claude/commands/context/optimize-prompt.md
```

The skill is scoped to this project only. Add `.claude/skills/` to your `.gitignore` if you don't want to commit it.

Alternative: Clone full repo

```shell
git clone https://github.com/qdhenry/Claude-Command-Suite
```

Then reference the skill at `.claude/commands/context/optimize-prompt.md`.
