feat: spam protection and input validation for reviews #42

Merged
devallibus merged 3 commits into master from development on Mar 8, 2026

Conversation

@devallibus
Owner

Summary

  • Validate rating (integer 1-5), truncate comments (2000 char max), allowlist sources
  • Rate limit: max 5 reviews per IP per 10 minutes
  • Duplicate prevention: one review per shader per IP, permanently
  • Track client_ip via getRequestIP() with DB migration for existing tables
  • Client-side: 5s cooldown after submit, double-click guard, red error styling
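As a rough sketch of the shape these server-side checks take (identifier names here are illustrative, not the exact code in reviews-db.ts):

```ts
// Hedged sketch of the validation rules above; names are illustrative.
const ALLOWED_SOURCES = new Set(["web", "mcp", "cli"]);
const MAX_COMMENT_LENGTH = 2000;

function validateReview(rating: unknown, comment: string, source: string) {
  // Rating must be an integer in 1..5; floats and out-of-range values are rejected.
  if (typeof rating !== "number" || !Number.isInteger(rating) || rating < 1 || rating > 5) {
    throw new Error("Rating must be an integer between 1 and 5");
  }
  // Sources outside the allowlist are rejected outright.
  if (!ALLOWED_SOURCES.has(source)) {
    throw new Error(`Unknown review source: ${source}`);
  }
  // Comments are truncated to 2000 chars rather than rejected.
  return { rating, comment: comment.slice(0, MAX_COMMENT_LENGTH), source };
}
```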

Files changed

File                      Change
reviews-db.ts             Validation, rate limiting, duplicate check, client_ip column + migration
-reviews.ts               Pass client IP from request to addReview
ReviewsSection.tsx        Cooldown, double-click guard, error styling
reviews-db.test.node.ts   8 new tests for validation and rate limiting

Test plan

  • bun run check — all tests pass
  • 15 reviews-db tests pass including new validation/rate-limit tests
  • Manual: submit a review, then submit again for the same shader — should be rejected
  • Manual: verify error messages show in red

🤖 Generated with Claude Code

Server-side:
- Validate rating is integer 1-5, reject floats and out-of-range
- Truncate comments to 2000 chars, allowlist sources (web/mcp/cli)
- Rate limit: max 5 reviews per IP per 10 minutes
- Duplicate prevention: one review per shader per IP (permanent)
- Track client_ip with migration for existing DBs
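A minimal sketch of how the rate-limit and duplicate checks could be expressed with bun:sqlite (table and column names are assumptions based on this description; created_at is assumed to be epoch milliseconds):

```ts
import { Database } from "bun:sqlite";

const RATE_LIMIT = 5;              // max reviews per IP...
const WINDOW_MS = 10 * 60 * 1000;  // ...per 10-minute window

function checkSpamLimits(db: Database, clientIp: string, shaderName: string) {
  // Burst control: count this IP's reviews inside the window.
  const recent = db
    .query("SELECT COUNT(*) AS n FROM reviews WHERE client_ip = ? AND created_at > ?")
    .get(clientIp, Date.now() - WINDOW_MS) as { n: number };
  if (recent.n >= RATE_LIMIT) {
    throw new Error("Too many reviews submitted, try again later");
  }
  // Duplicate prevention: one review per shader per IP, with no expiry.
  const dupe = db
    .query("SELECT 1 FROM reviews WHERE client_ip = ? AND shader_name = ?")
    .get(clientIp, shaderName);
  if (dupe) throw new Error("You already reviewed this shader");
}
```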

Client-side:
- 5-second cooldown after successful submit
- Guard against double-click submission
- Error messages styled in red, success in muted
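A sketch of the client-side guards as a small React hook (the hook and state names are invented for illustration; the real code lives in ReviewsSection.tsx):

```tsx
import { useRef, useState } from "react";

function useSubmitGuard(cooldownMs = 5000) {
  const inFlight = useRef(false);                // blocks double-click re-entry
  const [coolingDown, setCoolingDown] = useState(false);

  async function guarded(submit: () => Promise<void>) {
    if (inFlight.current || coolingDown) return; // ignore repeated clicks
    inFlight.current = true;
    try {
      await submit();
      setCoolingDown(true);                      // start the 5-second cooldown
      setTimeout(() => setCoolingDown(false), cooldownMs);
    } finally {
      inFlight.current = false;
    }
  }
  return { guarded, coolingDown };
}
```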

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@devallibus
Owner Author

Findings

  1. Medium: the duplicate-review guard uses client_ip as the reviewer identity, so legitimate users who share an egress IP get blocked from reviewing the same shader. In reviews-db.ts:77 through reviews-db.ts:95, any prior row with the same client_ip + shader_name rejects the new submission forever. The IP comes straight from -reviews.ts:17 through -reviews.ts:18. On shared NATs, mobile carriers, offices, or schools, the second real user will get "You already reviewed this shader" even though they never posted.

  2. Medium: the new tests are not isolated from prior runs, so the targeted web test is non-idempotent. The database path is persistent in reviews-db.ts:5 through reviews-db.ts:9, and the PR adds fixed-IP cases at reviews-db.test.node.ts:135 and reviews-db.test.node.ts:145. On a normal rerun, bun run test:web failed for me because an earlier local run had already consumed the hard-coded IP's quota, so the "duplicate review" assertion received "Too many reviews submitted..." instead. The same test only passed after I forced a fresh DATA_DIR.

Open Questions

  • There is still no direct test for the advertised rate-limit path: six distinct reviews from one IP inside the 10-minute window. The new cases in reviews-db.test.node.ts:145 through reviews-db.test.node.ts:156 never hit that threshold.

Verification
bun run build:web passed. bun run test:web failed against the default workspace DB, and passed when run with an isolated temporary DATA_DIR.

Signed,
GPT-5.4 HIGH

- Tests now use a temp DATA_DIR per run (idempotent across reruns)
- Duplicate prevention relaxed to 24-hour window to avoid blocking
  users on shared IPs (NAT, offices, mobile carriers)
- Added rate-limit threshold test (6th review from same IP is rejected)
- Fixed hard-coded IPs in tests that leaked across runs
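A sketch of what the isolated threshold test could look like, assuming bun:test and an addReview(shader, rating, comment, source, ip) signature (both are assumptions; the real tests are in reviews-db.test.node.ts):

```ts
import { expect, test } from "bun:test";
import { mkdtempSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Point DATA_DIR at a fresh temp directory *before* loading the module,
// so its persistent DB path resolves inside it and reruns start clean.
process.env.DATA_DIR = mkdtempSync(join(tmpdir(), "reviews-test-"));
const { addReview } = await import("./reviews-db");

test("6th review from the same IP within the window is rejected", () => {
  const ip = "203.0.113.7";
  // Distinct shaders keep the duplicate check out of the way.
  for (let i = 1; i <= 5; i++) {
    addReview(`shader-${i}`, 5, "great", "web", ip);
  }
  expect(() => addReview("shader-6", 5, "great", "web", ip)).toThrow(/Too many reviews/);
});
```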

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@devallibus
Owner Author

I would stop using IP as reviewer identity and keep it only as a coarse abuse signal.

The cleanest anonymous approach is an anonymous reviewer token:

  • On first visit or first submit, mint a random opaque token.
  • Store it in an HttpOnly, SameSite=Lax cookie.
  • In the DB, store only a hash of that token, not the raw value.
  • Enforce duplicate-prevention on (shader_name, reviewer_token_hash), not (shader_name, client_ip).
  • Keep IP-based limits only for burst control, like max 20 submits per hour per IP.

That fixes the shared-NAT problem without forcing accounts. Two users in the same office stop colliding, while one browser still cannot spam the same shader repeatedly unless they clear cookies.
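As a sketch of the token flow, assuming node:crypto and a hand-rolled Set-Cookie header (the cookie name and helper signatures are invented):

```ts
import { createHash, randomBytes } from "node:crypto";

// Mint an opaque token and the HttpOnly cookie that carries it.
function mintReviewerToken(): { token: string; cookie: string } {
  const token = randomBytes(32).toString("base64url"); // unguessable, meaningless
  const cookie =
    `reviewer_token=${token}; HttpOnly; SameSite=Lax; Path=/; Max-Age=31536000`;
  return { token, cookie };
}

// Only this hash is persisted, so a DB leak does not expose usable tokens.
function hashToken(token: string): string {
  return createHash("sha256").update(token).digest("hex");
}
```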

I would also split controls by purpose:

  • Identity control: reviewer token hash
  • Abuse control: IP/window rate limit
  • Escalation control: CAPTCHA only after suspicious behavior, not by default

If you want this to be harder to evade without full auth, add a second layer:

  • Token cookie + normalized user-agent hash
  • Optional Turnstile after N failed or fast attempts
  • Server-side uniqueness or app-level duplicate check so races cannot double-insert

Concretely, I would model it like this:

  • Add reviewer_token_hash TEXT
  • Index (reviewer_token_hash, shader_name, created_at)
  • Keep client_ip indexed separately for rate limiting only
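An idempotent migration in that shape might look like this (sketch only; the table name and existing schema are assumptions):

```ts
import { Database } from "bun:sqlite";

function migrateReviewerToken(db: Database) {
  // Add the column only if a previous run has not already done so.
  const cols = db.query("PRAGMA table_info(reviews)").all() as { name: string }[];
  if (!cols.some((c) => c.name === "reviewer_token_hash")) {
    db.run("ALTER TABLE reviews ADD COLUMN reviewer_token_hash TEXT");
  }
  // Identity lookups: duplicate check per reviewer per shader.
  db.run(
    "CREATE INDEX IF NOT EXISTS idx_reviews_token " +
      "ON reviews (reviewer_token_hash, shader_name, created_at)"
  );
  // client_ip stays indexed separately, for burst rate limiting only.
  db.run("CREATE INDEX IF NOT EXISTS idx_reviews_ip ON reviews (client_ip, created_at)");
}
```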

If cookies are missing or blocked, degrade gracefully:

  • Allow submit
  • Use only IP rate limit
  • Do not do duplicate-prevention in that fallback path, because that is where false positives get ugly

Signed,
GPT-5.4 HIGH

Replace IP-based duplicate detection with an HttpOnly cookie token.
The server mints a random token on first review, hashes it with SHA-256,
and stores the hash. Duplicate check is now reviewer_token_hash + shader
within 24 hours — different users on the same IP are no longer blocked.

IP is retained only for burst rate limiting (5 reviews / 10 min).
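A sketch of the resulting duplicate check, under the same schema assumptions as above (created_at as epoch milliseconds):

```ts
import { Database } from "bun:sqlite";

const DUP_WINDOW_MS = 24 * 60 * 60 * 1000; // 24-hour duplicate window

function isDuplicateReview(db: Database, tokenHash: string, shaderName: string): boolean {
  const row = db
    .query(
      "SELECT 1 FROM reviews WHERE reviewer_token_hash = ? " +
        "AND shader_name = ? AND created_at > ?"
    )
    .get(tokenHash, shaderName, Date.now() - DUP_WINDOW_MS);
  return row != null; // no matching row means this reviewer has not hit this shader recently
}
```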

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
devallibus merged commit 39158c9 into master on Mar 8, 2026
3 checks passed