
feat: AI-first submission, shader detail pages, and review system #15

Merged
devallibus merged 3 commits into master from feat/ai-first-detail-reviews on Mar 6, 2026

Conversation

@devallibus
Owner

Why this matters

ShaderBase was built as a git-first, agent-first shader registry — and it nails the data layer. The schema is rigorous, provenance is enforced, recipes are copy-paste ready. But the web frontend has been holding it back in three fundamental ways:

1. The submission flow is hostile to contributors

The current /submit page is a 30+ field manual form. A contributor needs to understand the full ShaderBase manifest schema — pipeline vs. stage, capability requirements, provenance source kinds, recipe targets — before they can add a single shader. This is backwards. The data should serve the contributor, not the other way around.

Most shader authors already have their code. They have a Shadertoy link, a GitHub gist, or raw GLSL sitting in a file. They shouldn't need to reverse-engineer our schema to share it.

The fix: An AI-first submission wizard. Paste code or a URL. AI analyzes the GLSL, extracts uniforms, infers the pipeline/stage, suggests tags and category, adapts Shadertoy conventions (iTime to uTime), and populates every field. The contributor reviews, tweaks, and submits. The old 545-line manual form is gone entirely.
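
As an illustration of the Shadertoy-adaptation step, here is a minimal sketch of the uniform renaming. The mapping table and function name are assumptions for illustration, not the wizard's actual implementation:

```typescript
// Illustrative sketch only: the real adaptation happens inside the AI parsing
// step. The mapping covers a few common Shadertoy built-ins; names are assumed.
const SHADERTOY_TO_SHADERBASE: Record<string, string> = {
  iTime: "uTime",
  iResolution: "uResolution",
  iMouse: "uMouse",
};

function adaptShadertoyUniforms(glsl: string): string {
  let out = glsl;
  for (const [from, to] of Object.entries(SHADERTOY_TO_SHADERBASE)) {
    // Whole-identifier match so e.g. "iTimeDelta" is left untouched.
    out = out.replace(new RegExp(`\\b${from}\\b`, "g"), to);
  }
  return out;
}
```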

2. Search results are a dead end

The /shaders browse page shows cards with metadata — but they're divs, not links. You can't click through to see what a shader actually looks like. There's no detail page. No rendering. No way to inspect uniforms, read the GLSL source, or copy a recipe. You search, you see a card, and... that's it.

For a shader registry, not being able to see the shaders is a fundamental gap.

The fix: A full detail page at /shaders/$name with:

  • Live Three.js preview — the actual shader running in a canvas, not a static image
  • Interactive uniform controls — sliders, color pickers, toggles for every uniform
  • Recipe code blocks — Three.js and R3F integration code with copy buttons
  • GLSL source — vertex and fragment code displayed with copy support
  • Full metadata — compatibility, inputs/outputs, provenance with upstream links

Shader cards on /shaders are also now clickable Link elements that navigate to these detail pages.
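
A rough sketch of how the uniform-controls idea maps declared uniform types to widgets. Names and the vec3 heuristic are assumptions; the actual UniformControls component may disambiguate color-like vec3s differently:

```typescript
type ControlKind = "slider" | "color-picker" | "toggle" | "slider-group";

// Hypothetical mapping, following the description above: sliders for float/int,
// color pickers for color-like values, a toggle for bool, grouped sliders for vecN.
function controlFor(type: string, name: string): ControlKind {
  switch (type) {
    case "float":
    case "int":
      return "slider";
    case "bool":
      return "toggle";
    case "color":
      return "color-picker";
    case "vec3":
      // Assumed heuristic: vec3s with color-ish names get a picker.
      return /color|tint/i.test(name) ? "color-picker" : "slider-group";
    case "vec2":
    case "vec4":
      return "slider-group";
    default:
      return "slider";
  }
}
```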

3. No feedback loop from usage

When an AI agent uses a shader during vibecoding, there's no way to capture whether it worked well. Did the user like the result? Was the recipe easy to integrate? Did the shader compile correctly in their setup? This signal is invaluable for curation — knowing which shaders are battle-tested vs. which need improvement.

The fix: A review system with:

  • SQLite database (using node:sqlite, same pattern as Better Auth) for ratings and comments
  • Server function API — submitReview and getReviews for external consumption by SDK/MCP/agents
  • Source tracking — each review records whether it came from the web UI, an SDK, an MCP tool, or a skill, plus optional agent context (which model, what task)
  • Visible ratings — average stars and review counts on shader cards and detail pages
  • Web review form — star rating + optional comment directly on the detail page
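
The ratings surfaced on cards and detail pages reduce to a simple aggregate. A sketch of that reduction, with Review field names assumed from the PR text rather than taken from reviews-db.ts:

```typescript
// Sketch of the review aggregate; field names are assumptions based on the PR text.
interface Review {
  rating: number; // 1-5 stars
  source: "web" | "sdk" | "mcp" | "skill";
  comment?: string;
}

function reviewStats(reviews: Review[]): { average: number; count: number } {
  const count = reviews.length;
  const sum = reviews.reduce((s, r) => s + r.rating, 0);
  // Average is 0 when there are no reviews yet.
  return { average: count === 0 ? 0 : sum / count, count };
}
```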

What changed

New files (10)

• components/ShaderPreviewCanvas.tsx — Three.js live shader renderer with per-pipeline geometry (plane/sphere), uniform binding, SVG fallback
• components/UniformControls.tsx — Interactive controls: sliders for float/int, color pickers for vec3/color, toggle for bool, grouped sliders for vec2/3/4
• components/CodeBlock.tsx — Monospace code display with language label and clipboard copy
• components/ReviewsSection.tsx — Star ratings, source breakdown, review list, web submission form
• components/AiSubmitWizard.tsx — 3-step wizard: paste input, AI processing, review & edit with live preview
• lib/server/shader-detail.ts — Server function loading the full manifest + GLSL source + recipe code for a shader
• lib/server/ai-parse.ts — URL resolution (Shadertoy/GitHub/gist) + Vercel AI SDK generateObject() for metadata extraction
• lib/server/reviews-db.ts — SQLite database, schema creation, CRUD helpers for reviews
• routes/shaders.$name.tsx — Detail page assembling preview, controls, metadata, recipes, source, provenance, reviews
• routes/api/-reviews.ts — Server functions for submitting and fetching reviews

Modified files (7)

• components/ShaderCard.tsx — Wrapped in Link to detail page, added star rating display
• lib/server/shaders.ts — Joins review stats (average rating, count) into ShaderEntry
• routes/submit.tsx — Replaced 545-line manual form with AiSubmitWizard; kept auth flow and saveDraftSubmission
• package.json — Added three, @types/three, ai, @ai-sdk/anthropic
• .env.example — Added ANTHROPIC_API_KEY
• bun.lock — Updated lockfile
• routeTree.gen.ts — Auto-regenerated with new /shaders/$name route

Dependencies added

  • three — Live shader preview (the registry's shaders are written as Three.js ShaderMaterial code, so rendering with Three.js itself guarantees compatibility)
  • ai + @ai-sdk/anthropic — Vercel AI SDK for provider-agnostic structured output; currently wired to Anthropic but trivially switchable

How it works

Detail page flow

  1. User clicks a shader card on /shaders
  2. getShaderDetail server function reads the full manifest, GLSL files, recipe source code, and preview SVG from the filesystem
  3. getReviews fetches review data from SQLite
  4. ShaderPreviewCanvas creates a Three.js scene with the appropriate geometry (plane for surface/postprocessing, sphere for geometry shaders) and applies a ShaderMaterial with the actual vertex/fragment code
  5. UniformControls renders interactive controls per uniform type, feeding overrides back into the material
  6. Recipes, source code, compatibility, and provenance are displayed with CodeBlock and existing UI primitives
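
Steps 4-5 above can be sketched as two small helpers. The names and the uniform shape are assumptions, not ShaderPreviewCanvas internals, though Three.js ShaderMaterial does expect uniforms wrapped as `{ value }` objects:

```typescript
type Pipeline = "surface" | "postprocessing" | "geometry";

interface UniformSpec {
  name: string;
  default?: number | number[] | boolean;
}

// Per the flow above: plane geometry for surface/postprocessing shaders,
// sphere geometry for geometry shaders.
function geometryFor(pipeline: Pipeline): "plane" | "sphere" {
  return pipeline === "geometry" ? "sphere" : "plane";
}

// Three.js ShaderMaterial takes uniforms as { name: { value } } objects;
// control overrides would later mutate these `value` fields in place.
function initialUniforms(specs: UniformSpec[]): Record<string, { value: unknown }> {
  const out: Record<string, { value: unknown }> = {};
  for (const u of specs) out[u.name] = { value: u.default ?? 0 };
  return out;
}
```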

AI submission flow

  1. User pastes raw GLSL, a Shadertoy URL, a GitHub gist URL, or a GitHub file URL
  2. resolveShaderSource detects the input type and fetches the source code (Shadertoy API, GitHub API, or direct fetch)
  3. aiParseShader sends the code to Claude via generateObject() with a Zod schema matching the form data type, plus a system prompt that explains all valid enum values and ShaderBase conventions
  4. The wizard populates the form store with AI-extracted data and shows the review step
  5. User sees a live shader preview + all editable fields + manifest preview
  6. Submission goes through the existing saveDraftSubmission pipeline (Zod validation, filesystem write, schema validation)
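
A minimal sketch of the input-type detection in step 2. The regexes and the SourceKind names are assumptions for illustration, not resolveShaderSource's actual logic:

```typescript
type SourceKind = "shadertoy" | "gist" | "github-file" | "raw-glsl";

function detectSourceKind(input: string): SourceKind {
  const s = input.trim();
  if (/^https?:\/\/(www\.)?shadertoy\.com\/view\//.test(s)) return "shadertoy";
  if (/^https?:\/\/gist\.github\.com\//.test(s)) return "gist";
  if (/^https?:\/\/github\.com\/[^/]+\/[^/]+\/blob\//.test(s)) return "github-file";
  // Anything that is not a recognized URL is treated as pasted GLSL.
  return "raw-glsl";
}
```

Each kind would then dispatch to its fetcher: the Shadertoy API, the GitHub API, or a direct fetch.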

Review API contract

POST submitReview({ shaderName, rating: 1-5, comment?, source, agentContext? })
GET  getReviews({ shaderName }) -> { reviews[], stats: { average, count } }

SDK/MCP tools call the server functions with source: 'mcp' or source: 'sdk' and optional agentContext for tracing which model and task triggered the review.
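
A plain-TypeScript sketch of the contract's input constraints. The real server function likely validates with Zod; this standalone version only mirrors the rules stated above, and the names are assumptions:

```typescript
const REVIEW_SOURCES = ["web", "sdk", "mcp", "skill"] as const;
type ReviewSource = (typeof REVIEW_SOURCES)[number];

interface SubmitReviewInput {
  shaderName: string;
  rating: number; // 1-5
  comment?: string;
  source: ReviewSource;
  agentContext?: { model?: string; task?: string };
}

// Returns an error message, or null when the payload is acceptable.
function validateReview(input: SubmitReviewInput): string | null {
  if (!input.shaderName) return "shaderName is required";
  if (!Number.isInteger(input.rating) || input.rating < 1 || input.rating > 5) {
    return "rating must be an integer between 1 and 5";
  }
  if (!REVIEW_SOURCES.includes(input.source)) return "unknown review source";
  return null;
}
```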


Test plan

  • Navigate to /shaders, verify cards are clickable links
  • Click a shader card -> detail page loads with live Three.js preview rendering
  • Adjust uniform sliders/color pickers -> preview updates in real-time
  • Copy recipe code and GLSL source via copy buttons
  • Submit a review on the detail page -> rating appears in review list
  • Navigate back to /shaders -> star rating shows on the card
  • Go to /submit -> paste raw GLSL -> "Parse with AI" -> AI populates all fields
  • Review AI-extracted fields, see live preview, submit draft
  • Verify draft appears in submissions/ directory
  • Run bun run check — tests pass, types pass, validation passes, build succeeds

Closes #14


Generated with Claude Code

devallibus and others added 3 commits March 7, 2026 00:33
- Add /shaders/$name detail page with live Three.js preview, interactive
  uniform controls, recipe code blocks, GLSL source display, and provenance
- Make shader cards clickable links to detail pages with star ratings
- Replace manual 30-field submission form with AI-first wizard: paste GLSL
  or a URL (Shadertoy, GitHub gist/file), AI extracts all metadata via
  Vercel AI SDK, user reviews and submits
- Add SQLite-backed review system with POST API for SDK/MCP/agent feedback
- Display reviews and ratings on detail pages and shader cards

Closes #14

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Dynamic import('three') so the 725KB module only loads when a shader
  preview is actually viewed, not on every page
- IntersectionObserver pauses requestAnimationFrame when canvas is
  off-screen (saves GPU cycles when scrolled past)
- Full disposal of geometry, materials, textures, and renderer on unmount
- Reduced sphere geometry from 64x64 to 32x32 segments
- Loading spinner while Three.js loads
- powerPreference: 'high-performance' GPU hint
- Raise Vite chunkSizeWarningLimit to 800KB (Three.js is inherently large)

ShaderPreviewCanvas chunk: 562KB -> 75KB (Three.js loads separately)
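
The dynamic-import point above can be sketched as a generic once-only lazy loader. This is a hypothetical helper, not the component's actual code:

```typescript
// Hypothetical sketch: the loader runs at most once, on first use, and every
// later caller shares the same in-flight promise. In the component this would
// wrap `() => import("three")` so the large module loads only when a preview
// is actually rendered.
function lazy<T>(load: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | undefined;
  return () => (cached ??= load());
}
```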

Addresses #16

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
TanStack Start v1.166 uses .inputValidator(), not .validator().
The wrong API caused "createServerFn(...).validator is not a function"
at runtime, breaking the detail page and submit page.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@devallibus devallibus merged commit 9f6ec5b into master Mar 6, 2026
1 check failed
@devallibus devallibus deleted the feat/ai-first-detail-reviews branch March 8, 2026 17:41