---
tools:
  - read
  - write
  - edit
  - grep
arguments: $FOCUS
---

# WFGY BBAM - Attention Modulation

Apply BBAM (BigBig Attention Modulation) to optimize attention distribution and focus reasoning on critical elements.

Based on the WFGY project: https://github.com/onestardao/WFGY

## Formula

```
â_i = a_i * exp(-γ * std(a))

Where:
- a_i: Raw attention score for element i
- â_i: Modulated attention score
- std(a): Standard deviation of attention scores
- γ: Modulation factor (default 0.618, golden-ratio based)
```
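The formula above can be sketched in plain Python. This is a minimal illustration, not part of the WFGY codebase; the function and argument names (`bbam`, `scores`, `gamma`) are chosen here for clarity:

```python
import math

def bbam(scores, gamma=0.618):
    """Modulate attention scores: a_i * exp(-gamma * std(a)).

    Every score is scaled by the same damping factor, which shrinks
    toward zero as the spread (std) of the raw scores grows.
    """
    n = len(scores)
    mean = sum(scores) / n
    # Population standard deviation of the raw attention scores
    std = math.sqrt(sum((a - mean) ** 2 for a in scores) / n)
    damping = math.exp(-gamma * std)
    return [a * damping for a in scores]
```

Note that when all scores are equal, `std(a)` is 0 and the scores pass through unchanged; the more uneven the distribution, the more strongly all scores are damped.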
## How to Use

### Recommended: Install to project (local)

```
mkdir -p .claude/skills
curl -o .claude/skills/wfgy-bbam.md \
  https://raw.githubusercontent.com/qdhenry/Claude-Command-Suite/main/.claude/commands/wfgy/wfgy-bbam.md
```

The skill is scoped to this project only. Add `.claude/skills/` to your `.gitignore` if you don't want to commit it.

### Alternative: Clone the full repo

```
git clone https://github.com/qdhenry/Claude-Command-Suite
```

Then reference the skill at `.claude/commands/wfgy/wfgy-bbam.md`.

## Related Skills