Open Source MCP Server

Give your AI assistant eyes into your Android app

ComposeProof is an MCP server that lets AI coding assistants render Compose UI headlessly, inspect live devices, and verify screenshots — no emulator needed.

Works with any MCP client

Claude Code · Cursor · Gemini CLI · Android Studio
composeproof
$ claude
You: verify my login screen matches the spec
AI: I'll render and verify the login screen.
calling render LoginScreenPreview
calling verify mode='verify'
calling diff mode='verify'
PASS (3/3) — rendered 1.8s, 99.8% golden match, 0 a11y warnings

The problem with AI + Android today

AI coding assistants are powerful at writing code, but blind when it comes to UI. The feedback loop is broken.

AI can read code but can't see the app

Your AI assistant writes Compose UI code but has zero visual feedback. It can't tell if a button is misaligned, a color is wrong, or a layout is broken.

Debugging is manual and slow

Run the app, navigate to the screen, check the output, copy the error, paste it back. Every iteration requires you as the middleman.

Visual testing needs human eyes

Screenshot tests exist, but setting them up is painful and reviewing diffs requires you to look. The AI should handle this end-to-end.

Two modes. Complete coverage.

ComposeProof works at build-time and run-time, giving AI assistants full visibility into your Compose UI.

Headless Rendering

Build-time / CI

Renders @Preview composables headlessly via Roborazzi. No device, no emulator — just PNGs the AI can see and verify.

# AI renders a preview headlessly
render LoginScreenPreview
Rendering via Roborazzi... 1.8s
✓ PNG returned (1080×1920)
diff mode='verify'
✓ 99.8% match with golden
Tools: render, verify, diff, render_batch, list_previews
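Under the hood, headless rendering like this is driven by Roborazzi's standard Gradle tasks. A minimal sketch of the equivalent manual workflow (assuming a single `app` module and the `debug` variant; ComposeProof's exact invocation may differ):

```shell
# Record golden PNGs for all Roborazzi screenshot tests (debug variant)
./gradlew :app:recordRoborazziDebug

# Re-render and fail on any mismatch against the recorded goldens
./gradlew :app:verifyRoborazziDebug

# Re-render and write diff images without failing the build
./gradlew :app:compareRoborazziDebug
```
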

Device Inspection

Run-time / Live

Connects to a live device via ADB. The AI can screenshot, tap, swipe, inspect UI trees, and navigate your app — closed-loop testing.

# AI interacts with a live device
device_interact tap_element "Login"
✓ Tapped, screenshot saved
device_interact text "user@test.com"
✓ Text entered
device_interact tap_element "Submit"
✓ Tapped, screenshot saved
Tools: device_interact, inspect_ui_tree, take_screenshot, build_and_deploy, preflight
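The transcript above maps naturally onto plain ADB primitives. A rough sketch of the kind of commands such tools wrap (an assumed mapping for illustration; ComposeProof's actual implementation may differ):

```shell
# Capture the current screen as a PNG
adb exec-out screencap -p > screen.png

# Dump the view/accessibility hierarchy as XML for inspection
adb shell uiautomator dump /sdcard/ui.xml
adb pull /sdcard/ui.xml

# Tap at absolute coordinates, then type into the focused field
adb shell input tap 540 1650
adb shell input text "user@test.com"
```
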

See it in action

One prompt, zero human intervention. AI builds the app, discovers UI patterns, tests interactions, and reports results.

Recorded on a Pixel 9 Pro Fold with the StickerExplode demo app.

Build, deploy & launch

AI runs preflight to detect the device, picks the right Gradle variant, builds, installs, and launches — all from a single natural language prompt.
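In plain ADB and Gradle terms, that preflight-to-launch sequence looks roughly like this (package and activity names are illustrative, not taken from the demo app):

```shell
# Detect connected devices and their properties
adb devices -l

# Build and install the debug variant in one step
./gradlew :app:installDebug

# Launch the app's main activity
adb shell am start -n com.example.demo/.MainActivity
```
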

AI discovers the interaction model

The FAB tap fails, so the AI reads source code, discovers the FAB opens a bottom sheet sticker tray, and switches strategy. Every action auto-screenshots.

Drag testing with coordinate recalculation

ADB swipe doesn't trigger Compose gestures. The AI switches to raw touch events, detects resolution mismatch, recalculates coordinates, and successfully drags stickers.
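The two injection styles differ at the shell level: `input swipe` synthesizes a whole gesture in one shot, which some Compose `pointerInput` drag detectors never see as a drag, while `input motionevent` (available on Android 11+) injects explicit down/move/up events. Coordinates below are illustrative:

```shell
# One-shot synthetic swipe (x1 y1 x2 y2 duration_ms); may not register as a Compose drag
adb shell input swipe 300 800 700 800 500

# Raw touch events: explicit down, move, up
adb shell input motionevent DOWN 300 800
adb shell input motionevent MOVE 500 800
adb shell input motionevent UP 700 800
```
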

Results & persistent screenshots

AI produces a structured test report (3/3 pass), notes z-ordering and animations. Every screenshot from the session is saved to disk for human review.

Full uncut session

6 minutes — from prompt to test report, no edits.

23 tools built. More coming.

Every tool is an MCP endpoint your AI can call. Headless rendering, device interaction, performance profiling, and more.

insights (Observability): Project overview — preview count, golden coverage, device status
render (Compose): Render any @Preview headlessly via Roborazzi → PNG
list_previews (Compose): Discover all @Preview functions — file, line, params
verify (Compose): Single-call PASS/FAIL — render + golden + accessibility
render_batch (Compose): Render/verify multiple previews with compact summary
diff (Compose): Golden management — verify, record, or update baselines
preflight (Observability): Check device + app state — connected, installed, screen
inspect_ui_tree (UI Inspection): Dump live Compose/View hierarchy with a11y warnings
device_interact (UI Control): Tap, swipe, type, scroll — AI navigates the app
get_recomposition_stats (Performance): Find recomposition hotspots via compiler metrics
take_device_screenshot (UI Inspection): Capture device screen, auto-saved to disk
build_and_deploy (Observability): Gradle build + install APK on device
get_build_status (Observability): Check build success and APK version match
get_network_logs (Observability): Capture OkHttp HTTP traffic from logcat
manage_proxy (Data): Set/clear device HTTP proxy
get_feature_flags (Data): Read/write SharedPreferences
inspect_permissions (Embedded Agent): Runtime permissions — granted, denied, rationale needed
inspect_process_lifecycle (Embedded Agent): Activity/Fragment lifecycle states for all components
inspect_navigation_graph (Embedded Agent): Navigation graph, back stack, deep link patterns
inspect_datastore (Embedded Agent): Jetpack DataStore preferences — all keys and values
inspect_coroutine_state (Embedded Agent): Active coroutines — state, dispatchers, job hierarchy
execute_deeplink (Embedded Agent): Fire a deep link URI and report which handler resolved it
simulate_process_death (Embedded Agent): Recreate the Activity to test save/restore state handling
audit_accessibility (Accessibility, planned): Full WCAG 2.1 audit — touch targets, contrast, focus
track_recompositions (Compose, planned): Count recompositions per composable per frame
analyze_stability (Compose, planned): Report stability classification of composable params
detect_memory_leaks (Performance, planned): LeakCanary heap analysis with reference chains
profile_startup (Performance, planned): Cold/warm/hot start breakdown with bottlenecks
test_edge_cases (Accessibility, planned): Simulate no network, low memory, date boundaries

Up and running in 3 steps

Zero-integration architecture: no changes to your project's build files.

Install ComposeProof

One command via Homebrew. No build file changes, no Gradle plugin, no SDK integration needed.

brew tap aldefy/tap
brew install composeproof

Add to your MCP config

Point your AI client at ComposeProof. Works with Claude Code, Gemini CLI, Cursor, Android Studio.

{
  "mcpServers": {
    "composeproof": {
      "command": "composeproof"
    }
  }
}
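With Claude Code specifically, the same registration can be done from the CLI instead of editing JSON by hand (assuming the `composeproof` binary is on your PATH):

```shell
claude mcp add composeproof -- composeproof
```
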

AI has eyes

Your AI assistant can now render previews, inspect live devices, verify screenshots, and iterate on UI autonomously.

You: "verify the login screen"
AI:  renders → verifies → PASS ✓

Roadmap

Five waves of capabilities, from core debugging to AI-powered autonomous testing.

Wave 1: Foundation (In progress)
inspect_navigation_graph, inspect_coroutine_state, inspect_datastore, inspect_permissions, execute_deeplink, inspect_process_lifecycle, simulate_process_death

Wave 2: Compose Intelligence (Up next)
track_recompositions, analyze_stability, inspect_compose_state, profile_lazy_list, semantic_ui_query

Wave 3: Advanced Debugging (Planned)
detect_memory_leaks, inspect_threads, capture_anr_trace, inspect_work_manager, inspect_crash_history

Wave 4: AI-Powered Workflows (Planned)
autonomous_exploration, correlate_state_and_ui, diff_app_state, verify_fix, audit_accessibility

Wave 5: Testing & CI (Planned)
test_edge_cases, snapshot_golden_image, run_smoke_test, verify_release_build, profile_startup

Get early access

ComposeProof is in active development. Join the waitlist and we'll notify you when it's ready.

No spam. Unsubscribe anytime.