Point. Speak. Magic.
Capture anything on your screen and let AI understand it instantly
"explain this"
function calculate(data) { return data.map(x => x * 2); }
This code maps over data, doubling each element. Perfect for batch transformations!
Combining visual selection with natural language. No copying, no pasting, no context switching.
Draw around anything -
code, images, text, UI
Ask questions in
your own words
Smart, contextual
responses instantly
Smart architecture that processes locally first, then enhances with cloud AI
Point at anything on your screen and ask questions naturally
Direct interaction with what you see - no intermediate steps
Local processing for privacy and speed, cloud AI for deep insights
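The local-first flow described above can be sketched roughly like this. All names here (analyzeSelection, ocr, cloud) are illustrative stand-ins, not Plank's actual API; the OCR and cloud steps are injected so the shape of the pipeline is clear.

```javascript
// Hypothetical sketch of a local-first pipeline: OCR runs on-device,
// and only the extracted text is handed to cloud AI when a question is asked.
async function analyzeSelection(screenshot, question, { ocr, cloud }) {
  const text = ocr(screenshot);                 // on-device: fast and private
  if (!question) return { text };               // no question: nothing leaves the machine
  const answer = await cloud(text, question);   // only extracted text is uploaded
  return { text, answer };
}
```

The screenshot pixels never reach the cloud step in this sketch, which is the privacy property the copy claims.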
Why it feels different
context switches
fluid motion
possibilities
Traditional workflow:
6 steps, multiple tools
Plank workflow:
⌘⇧X
activate
↘
drag
🎤
speak
One gesture, instant results
"make this async"
→ Converted function to async/await pattern with error handling
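A conversion like that might look roughly like the following; loadData, loadDataAsync, and fetcher are illustrative names, not actual Plank output.

```javascript
// Before: the selected callback-style function.
function loadData(fetcher, callback) {
  fetcher((err, data) => {
    if (err) return callback(err);
    callback(null, data.map(x => x * 2));
  });
}

// After: the async/await rewrite, with error handling added.
async function loadDataAsync(fetcher) {
  try {
    const data = await fetcher();
    return data.map(x => x * 2);
  } catch (err) {
    throw new Error(`loadData failed: ${err.message}`);
  }
}
```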
"what's wrong here?"
→ Missing closing bracket on line 3, undefined variable 'user'
"grab the API endpoints"
→ Found 3 endpoints: /users, /posts, /comments
"clean this up"
→ Reformatted with consistent spacing, extracted magic numbers
"typescript this"
→ Added TypeScript interfaces and proper type annotations
"make this CSV"
→ Converted JSON to CSV with proper headers and escaping
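A conversion like that might produce something along these lines; toCsv is an illustrative sketch, not Plank's output, using RFC 4180-style escaping for quotes, commas, and newlines.

```javascript
// Hypothetical JSON → CSV conversion with a header row and proper escaping.
function toCsv(rows) {
  const esc = v => {
    const s = String(v);
    // Quote fields containing commas, quotes, or newlines; double embedded quotes.
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const headers = Object.keys(rows[0]);
  const lines = [headers.map(esc).join(',')];
  for (const row of rows) {
    lines.push(headers.map(h => esc(row[h])).join(','));
  }
  return lines.join('\n');
}
```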
Plank adapts to how you think. Every interaction teaches it your preferences.
* no more endless clicking around
Does Plank work with multiple monitors?
Yes! Plank works seamlessly across all your displays. Just drag and select anywhere.
What happens to my screenshots?
Screenshots are processed locally first; only the extracted text is sent for AI analysis. Nothing is stored.
What are the system requirements?
macOS 12 Monterey and newer. Optimized for Apple Silicon, but works great on Intel too.
Can I change the keyboard shortcut?
Absolutely! Set any keyboard combo you like in Settings. We just think ⌘⇧X is pretty cool.
Ready to work smarter?
Get Plank Now