Scope
Managed files:
- /client/index.html
- /client/public/robots.txt
- /client/public/sitemap.xml
- /client/public/llms.txt
- /client/src/components/SeoManager.tsx
What We Optimize
- Route-aware metadata for public pages
- Canonical URLs and Open Graph/Twitter tags
- Crawl boundaries for private/authenticated pages
- Machine-readable sitemap (sitemap.xml)
- AI-crawler context (llms.txt)
Public Indexable Routes
- /welcome
- /features
- /pricing
- /about
- /contact
- /terms
- /privacy
- /cookies

All other routes are served noindex, nofollow via SeoManager.
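The public/private split above can be sketched as a simple lookup table. This is an illustrative sketch, not SeoManager's actual implementation; the titles, descriptions, and the getRouteMeta helper are hypothetical.

```typescript
// Hypothetical route -> metadata table a component like SeoManager could
// consult on navigation. Entries shown are illustrative, not exhaustive.
type RouteMeta = {
  title: string;
  description: string;
  robots: "index, follow" | "noindex, nofollow";
};

const PUBLIC_META: Record<string, RouteMeta> = {
  "/welcome": {
    title: "DealDash — Welcome",
    description: "Overview of DealDash.",
    robots: "index, follow",
  },
  "/pricing": {
    title: "DealDash — Pricing",
    description: "Plans and pricing.",
    robots: "index, follow",
  },
  // ...remaining public marketing routes
};

// Anything not in the public table falls back to the private defaults,
// so new app routes are noindex, nofollow unless explicitly listed.
const PRIVATE_META: RouteMeta = {
  title: "DealDash",
  description: "",
  robots: "noindex, nofollow",
};

function getRouteMeta(pathname: string): RouteMeta {
  return PUBLIC_META[pathname] ?? PRIVATE_META;
}
```

The fallback-to-private default is the important design choice: forgetting to register a route fails closed (blocked from indexing) rather than open.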
Why SeoManager Exists
DealDash is an SPA. Static head tags alone are not enough for route-level quality signals. SeoManager updates title, description, robots, Open Graph, Twitter, and canonical tags by route.
Location: /client/src/components/SeoManager.tsx
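The per-route head update described above can be sketched as a plain function. This is a minimal sketch of the technique, not SeoManager's code; it writes into a small document-shaped object (rather than the real DOM) so the logic is self-contained, and the field names are assumptions.

```typescript
// Minimal stand-in for the parts of the document head that a route-aware
// SEO component rewrites on navigation.
type HeadDoc = {
  title: string;
  meta: Map<string, string>; // meta name/property -> content
  canonical: string | null;
};

type SeoOptions = {
  title: string;
  description: string;
  robots: string;
  url: string; // clean, absolute route URL
};

// Apply route-level metadata: title, description, robots, Open Graph and
// Twitter mirrors, and the canonical link.
function applySeo(doc: HeadDoc, opts: SeoOptions): void {
  doc.title = opts.title;
  doc.meta.set("description", opts.description);
  doc.meta.set("robots", opts.robots);
  // Open Graph / Twitter tags mirror the same values so shared links
  // render consistently.
  doc.meta.set("og:title", opts.title);
  doc.meta.set("og:description", opts.description);
  doc.meta.set("og:url", opts.url);
  doc.meta.set("twitter:title", opts.title);
  doc.meta.set("twitter:description", opts.description);
  // Canonical always points at the clean route URL.
  doc.canonical = opts.url;
}
```

In the real component the same writes would target document.title and the head's meta/link elements, triggered by the router's location change.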
Crawl Policy
robots.txt allows public marketing pages and blocks private app surfaces:
- /api/*
- authenticated product routes (/dashboard, /deals, /contacts, etc.)
- auth pages (/login, /auth/*)
- share viewer route (/s/*)

Location: /client/public/robots.txt
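A robots.txt matching that policy might look like the sketch below. This is illustrative, not the repository's actual file: the domain is a placeholder, and path-prefix Disallow rules are used since prefix matching is the portable core of the robots.txt convention.

```txt
# Illustrative sketch of /client/public/robots.txt
User-agent: *
Disallow: /api/
Disallow: /dashboard
Disallow: /deals
Disallow: /contacts
Disallow: /login
Disallow: /auth/
Disallow: /s/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt only sets crawl boundaries; the noindex, nofollow signal for private pages still comes from the meta tags SeoManager emits.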
AI Search Baseline
llms.txt publishes concise product context and approved public URLs for AI systems.
Location: /client/public/llms.txt
- Keep claims factual and non-promotional.
- Link only public, stable routes.
- Do not include internal endpoints or private route examples.
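Applying those rules, an llms.txt could look like the following sketch. The summary text and URLs are placeholders, not the file's real contents; the shape (an H1, a short blockquote summary, then sections of links) follows the emerging llms.txt convention.

```txt
# DealDash

> DealDash is a deal-management web application. (Placeholder summary —
> keep the real one factual and non-promotional.)

## Pages

- [Features](https://example.com/features): product capabilities
- [Pricing](https://example.com/pricing): plans and pricing
- [About](https://example.com/about): company information
```

Only public, stable marketing routes belong here; private or authenticated URLs must never appear.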
Verification Checklist
- Open each public marketing page.
- Inspect <title>, canonical, and robots tags in DevTools.
- Confirm private routes resolve to noindex, nofollow.
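The checklist can be partially automated. Below is a hedged sketch of an audit helper: it takes the tags scraped from a rendered page (via DevTools or a headless browser) and reports which expectations fail. The auditPage name and field shapes are assumptions, not existing project code.

```typescript
// Tag values scraped from a rendered page.
type PageTags = {
  title?: string;
  robots?: string;
  canonical?: string;
};

// Check a page's tags against the policy in this document:
// public marketing pages are indexable with a canonical URL;
// private pages must be noindex, nofollow.
function auditPage(tags: PageTags, expectPublic: boolean): string[] {
  const issues: string[] = [];
  if (!tags.title) issues.push("missing <title>");
  if (expectPublic) {
    if (tags.robots !== "index, follow") {
      issues.push("public page must be index, follow");
    }
    if (!tags.canonical) {
      issues.push("public page must have a canonical URL");
    }
  } else if (tags.robots !== "noindex, nofollow") {
    issues.push("private page must be noindex, nofollow");
  }
  return issues;
}
```

Because DealDash is an SPA, the tags must be read after client-side rendering; fetching the raw HTML is not enough.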
Update Rule
If a public marketing route is added or removed, update all three:
- /client/src/components/SeoManager.tsx
- /client/public/sitemap.xml
- /client/public/llms.txt