Infrastructure
17 checks — DNS, redirects, IPv6, crawlability, URL variants, and domain intelligence rolled into one auditable list.
F · IPv6 Readiness: IPv6 records exist but unreachable [FIX]
Advertising IPv6 (AAAA records) without a reachable server is worse than publishing no AAAA at all: IPv6-preferring clients try the broken path first and reach the site only after falling back to IPv4, which adds delay to every connection.
Modern browsers prefer IPv6 when AAAA records exist. Clients that implement Happy Eyeballs race an IPv4 connection and recover within a few hundred milliseconds; older clients can stall for seconds per connection while the IPv6 attempt times out. Either fix IPv6 reachability or remove the AAAA records.
Source: RFC 8305 (Happy Eyeballs)
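A quick way to reproduce this finding is to resolve only the AAAA records and attempt a TCP handshake to each address. A minimal sketch using Python's standard library; the port and timeout are arbitrary assumptions:

```python
import socket

HOST = "benefit-estimator.netlify.app"  # the audited host
PORT = 443                              # assumed HTTPS port
TIMEOUT = 3                             # seconds; arbitrary choice

# Ask only for IPv6 (AAAA) addresses, then try a plain TCP connect to each.
# An address that never completes the handshake reproduces this finding.
for *_, sockaddr in socket.getaddrinfo(HOST, PORT, socket.AF_INET6, socket.SOCK_STREAM):
    try:
        with socket.create_connection(sockaddr[:2], timeout=TIMEOUT):
            print(f"{sockaddr[0]}: reachable over IPv6")
    except OSError as exc:
        print(f"{sockaddr[0]}: unreachable ({exc})")
```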
B · DNSSEC: Unsigned (DNSSEC not deployed) [REVIEW]
B · CAA Records: No CAA records (any CA may issue certificates) [REVIEW]
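To illustrate the fix, a CAA record restricting issuance to a single CA looks like the zone-file lines below. The domain and the choice of Let's Encrypt are placeholders; substitute your own domain and whichever CA actually issues your certificates:

```
example.com.  3600  IN  CAA  0 issue "letsencrypt.org"
example.com.  3600  IN  CAA  0 iodef "mailto:security@example.com"
```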
B · Reverse DNS: 0/4 IPs match cert SAN [REVIEW]
B · Crawlability: no robots.txt, no sitemap [REVIEW]
robots.txt is optional but recommended. It tells search-engine crawlers which paths they may crawl and can point them at your sitemap.
No robots.txt: crawlers request /robots.txt and get a 404. Nothing breaks, but they fall back to default crawl behavior with no directives and no sitemap reference.
A minimal robots.txt (`User-agent: *`, `Allow: /`, and a `Sitemap:` line, written out below) covers the basics. Without it, crawlers behave fine but lose the sitemap signal and can't be selectively blocked from crawl traps.
Source: robotstxt.org
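Spelled out as a file, that minimal robots.txt is just three lines; the sitemap URL is a placeholder for your own:

```
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```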
A sitemap helps search engines discover and index your pages more efficiently.
No sitemap.xml — Google relies on crawl-graph discovery alone, slowing indexing of deep or fresh URLs.
A sitemap accelerates Google's discovery of new and updated content. Most CMSes auto-generate one; static-site frameworks need a build-step plugin. Reference it from robots.txt and submit in Search Console to confirm Google can fetch it.
Source: sitemaps.org / Google Search Central
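For reference, a minimal single-URL sitemap per the sitemaps.org schema; the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```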
B · TLS Certificate Expiry & Recommendations: 312 days until leaf cert expires, 2 issues to address [REVIEW]
Certificate validity: 312 days remaining on the leaf certificate.
Recommended actions
- Enable DNSSEC on your domain for DNS spoofing protection
- Enable OCSP stapling on your TLS server to remove a CA roundtrip and protect user privacy (see the config sketch below)
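If you terminate TLS yourself, the stapling change might look like the nginx sketch below; note that on Netlify (this site's host, per the CDN check) TLS is managed by the platform, so this only applies to self-hosted setups. Certificate paths and resolver addresses are placeholders:

```nginx
server {
    listen 443 ssl;
    ssl_certificate     /etc/ssl/fullchain.pem;   # must include the intermediate chain for stapling
    ssl_certificate_key /etc/ssl/privkey.pem;

    ssl_stapling on;                      # staple the OCSP response into the TLS handshake
    ssl_stapling_verify on;               # verify the stapled response against the chain
    resolver 1.1.1.1 8.8.8.8 valid=300s;  # nginx needs a resolver to reach the OCSP responder
}
```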
B · CDN & Delivery: Netlify [REVIEW]
B · Operational Status Page: No status page link detected [REVIEW]
B · Health Check Endpoint: No conventional health endpoint found [REVIEW]
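This check presumably probes a handful of conventional paths; the audit's actual list isn't published, so the paths below are assumptions. A minimal sketch using Python's standard library:

```python
import urllib.error
import urllib.request

BASE = "https://benefit-estimator.netlify.app"

# Conventional health-check paths (assumed; the audit's real list is unknown).
for path in ("/healthz", "/health", "/status", "/ping"):
    try:
        with urllib.request.urlopen(BASE + path, timeout=5) as resp:
            status = resp.status
    except urllib.error.HTTPError as exc:
        status = exc.code          # endpoint exists but returned an error status
    except OSError:
        status = "unreachable"
    print(f"{path}: {status}")
```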
A · DNS Records: 2 A records, 8 ms lookup [PASS]
| Type | Value |
|---|---|
| A | 52.52.192.191, 13.52.188.95 |
| AAAA | 2600:1f1c:446:4900::259, 2600:1f1c:446:4900::258 |
| CNAME | — |
| NS | — |
| MX | — |
| TXT | — |
| CAA | Lookup not available with standard resolver |
SPF helps prevent email spoofing. Add a TXT record starting with 'v=spf1'.
Without SPF, receiving servers can't validate sending IPs — your domain is easier to spoof in phishing.
SPF complements DMARC. Both should be published. SPF records list authorized sending IPs (e.g., `v=spf1 include:_spf.google.com ~all` for Google Workspace). After publishing, verify in Google Postmaster Tools or mxtoolbox.
Source: RFC 7208 (SPF)
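Concretely, the published records could look like the zone-file lines below. The Google Workspace include comes from the example above; the DMARC record is an added assumption, paired with SPF as recommended, and every name is a placeholder:

```
example.com.         3600  IN  TXT  "v=spf1 include:_spf.google.com ~all"
_dmarc.example.com.  3600  IN  TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc@example.com"
```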
A+ · Subdomain Takeover: No subdomain takeover risk detected [PASS]
A+ · Multi-Resolver DNS Speed: Mean 7 ms across 3 resolvers (spread 7 ms) [PASS]
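The measurement is straightforward to reproduce. A sketch using dnspython, assuming three common public resolvers; the report doesn't say which three it actually queried:

```python
import time

import dns.resolver  # dnspython; assumed available

HOST = "benefit-estimator.netlify.app"
RESOLVERS = {"Cloudflare": "1.1.1.1", "Google": "8.8.8.8", "Quad9": "9.9.9.9"}

timings_ms = {}
for name, ip in RESOLVERS.items():
    resolver = dns.resolver.Resolver(configure=False)  # ignore /etc/resolv.conf
    resolver.nameservers = [ip]
    start = time.perf_counter()
    resolver.resolve(HOST, "A")
    timings_ms[name] = (time.perf_counter() - start) * 1000

mean = sum(timings_ms.values()) / len(timings_ms)
spread = max(timings_ms.values()) - min(timings_ms.values())
for name, ms in timings_ms.items():
    print(f"{name}: {ms:.0f} ms")
print(f"mean={mean:.0f} ms, spread={spread:.0f} ms")
```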
A+ · Redirect Chain: No redirects, direct access [PASS]
| # | URL | Status | Time | Protocol | Server |
|---|---|---|---|---|---|
| 1 | https://benefit-estimator.netlify.app | 200 | 17 ms | HTTP/1.1 | Netlify |
A+ · URL Variants: www/non-www, trailing slash, HTTP→HTTPS [PASS]
www / non-www, trailing slash, and HTTP → HTTPS variants all resolve consistently.
A+ · HTTP Probe Timing: Total 94 ms (DNS, TCP, TLS, TTFB, content transfer breakdown) [PASS]
Connection waterfall: DNS → TCP → TLS → TTFB → transfer (chart not reproduced).
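The per-phase breakdown can be approximated with a raw socket and a single GET. A standard-library sketch, assuming HTTP/1.1 over port 443 (the report itself measured 94 ms total):

```python
import socket
import ssl
import time

HOST = "benefit-estimator.netlify.app"

t0 = time.perf_counter()
addr = socket.getaddrinfo(HOST, 443, type=socket.SOCK_STREAM)[0][4][:2]
t1 = time.perf_counter()                              # DNS resolved
sock = socket.create_connection(addr, timeout=5)
t2 = time.perf_counter()                              # TCP handshake done
tls = ssl.create_default_context().wrap_socket(sock, server_hostname=HOST)
t3 = time.perf_counter()                              # TLS handshake done
tls.sendall(f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n".encode())
tls.recv(1)                                           # wait for the first response byte
t4 = time.perf_counter()                              # time to first byte
print(f"dns={(t1-t0)*1000:.0f} ms  tcp={(t2-t1)*1000:.0f} ms  "
      f"tls={(t3-t2)*1000:.0f} ms  ttfb={(t4-t3)*1000:.0f} ms  total={(t4-t0)*1000:.0f} ms")
```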
A+ · CDN Cache Observability: Cache state Age=60489 s (object served from cache for ≈16.8 hours) [PASS]
Domain Intelligence: Domain intelligence data not available [INFO]
Both RDAP and WHOIS lookups failed. This is expected for a *.netlify.app subdomain, which is not an independently registered domain.