Infrastructure
· 17 checks — DNS, redirects, IPv6, crawlability, URL variants, and domain intelligence rolled into one auditable list.

B · DNSSEC · Unsigned (DNSSEC not deployed) · REVIEW
B · CAA Records · No CAA records (any CA may issue certificates) · REVIEW
B · Reverse DNS · 0/4 IPs match cert SAN · REVIEW
B · Crawlability · No robots.txt, no sitemap · REVIEW
robots.txt is optional but recommended. It tells search engine crawlers which pages to index.
No robots.txt — crawlers request /robots.txt and get a 404. Nothing breaks, but crawling falls back to default behavior with no directives and no sitemap reference.
A minimal robots.txt declaring `User-agent: *`, `Allow: /`, and a `Sitemap:` line pointing at your sitemap covers the basics. Without it, crawlers behave fine but lose the sitemap signal and can't be selectively blocked from crawl traps.
Source: robotstxt.org
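Spelled out as a file, that minimal robots.txt looks like this (the sitemap URL is a placeholder for your own):

```text
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```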
A sitemap helps search engines discover and index your pages more efficiently.
No sitemap.xml — Google relies on crawl-graph discovery alone, slowing indexing of deep or fresh URLs.
A sitemap accelerates Google's discovery of new and updated content. Most CMSes auto-generate one; static-site frameworks need a build-step plugin. Reference it from robots.txt and submit in Search Console to confirm Google can fetch it.
Source: sitemaps.org / Google Search Central
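For reference, a minimal single-URL sitemap.xml (URL and date are placeholders) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```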
B · HTTP Probe Timing · Total 1068 ms — DNS, TCP, TLS, TTFB, content transfer breakdown · REVIEW
Connection waterfall
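The phase breakdown above can be reproduced by hand. A minimal sketch using only Python's stdlib, timing the DNS, TCP-connect, and TTFB phases of a plain-HTTP fetch (an https target would add an `ssl` wrap step whose duration is the TLS handshake time; function name and output keys are our own):

```python
import socket
import time

def probe(host, port, path="/"):
    """Split a plain-HTTP fetch into DNS, TCP-connect, and TTFB phases (ms)."""
    t0 = time.perf_counter()
    # DNS (or hosts-file) lookup
    addr = socket.getaddrinfo(host, port, socket.AF_INET, socket.SOCK_STREAM)[0][4]
    t1 = time.perf_counter()
    # TCP three-way handshake
    sock = socket.create_connection(addr, timeout=5)
    t2 = time.perf_counter()
    # Send a bare GET and wait for the first response byte (TTFB)
    req = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(req.encode())
    sock.recv(1)
    t3 = time.perf_counter()
    sock.close()
    return {"dns_ms": (t1 - t0) * 1e3,
            "tcp_ms": (t2 - t1) * 1e3,
            "ttfb_ms": (t3 - t2) * 1e3}
```

Content-transfer time would be measured the same way, from first byte until the socket closes.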
B · TLS Certificate Expiry & Recommendations · 313 days until leaf cert expires — 2 issues to address · REVIEW
Certificate validity
Recommended actions
- Enable DNSSEC on your domain for DNS spoofing protection
- Enable OCSP stapling on your TLS server to remove a CA roundtrip and protect user privacy
B · CDN & Delivery · Netlify · REVIEW
B · Operational Status Page · No status page link detected · REVIEW
B · Health Check Endpoint · No conventional health endpoint found · REVIEW
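A conventional health endpoint is a few lines of code. A minimal sketch using Python's stdlib `http.server` (the `/healthz` path and `{"status": "ok"}` payload are common conventions, not a standard — pick whatever your monitoring expects):

```python
import json
import http.server

class HealthHandler(http.server.BaseHTTPRequestHandler):
    """Serve a liveness endpoint at /healthz; 404 everything else."""

    def do_GET(self):
        if self.path == "/healthz":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, fmt, *args):
        pass  # silence per-request logging
```

In practice the handler should also verify downstream dependencies (database, cache) before reporting `ok`.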
A · DNS Records · 2 A records, 29 ms lookup · PASS
| Record | Value |
|---|---|
| A | 35.157.26.135, 63.176.8.218 |
| AAAA | 2a05:d014:58f:6200::259, 2a05:d014:58f:6200::258 |
| CNAME | — |
| NS | — |
| MX | — |
| TXT | — |
| CAA | Lookup not available with standard resolver |
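To address the missing CAA records flagged above, a zone-file fragment like the following restricts issuance to a single CA (the CA domain and contact address are illustrative — substitute your own):

```text
; Only Let's Encrypt may issue certs; violations reported to the iodef address
example.com.  3600  IN  CAA  0 issue "letsencrypt.org"
example.com.  3600  IN  CAA  0 iodef "mailto:security@example.com"
```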
SPF helps prevent email spoofing. Add a TXT record starting with 'v=spf1'.
Without SPF, receiving servers can't validate sending IPs — your domain is easier to spoof in phishing.
SPF complements DMARC. Both should be published. SPF records list authorized sending IPs (e.g., `v=spf1 include:_spf.google.com ~all` for Google Workspace). After publishing, verify in Google Postmaster Tools or mxtoolbox.
Source: RFC 7208 (SPF)
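Published as a zone-file fragment, the Google Workspace example above looks like this (domain illustrative):

```text
example.com.  3600  IN  TXT  "v=spf1 include:_spf.google.com ~all"
```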
A+ · Subdomain Takeover · No subdomain takeover risk detected · PASS
A+ · Multi-Resolver DNS Speed · Mean 20 ms across 3 resolvers (spread 10 ms) · PASS
A+ · Redirect Chain · No redirects — direct access · PASS
| # | URL | Status | Time | Protocol | Server |
|---|---|---|---|---|---|
| 1 | https://benefit-estimator.netlify.app | 200 | 101 ms | HTTP/1.1 | Netlify |
A+ · IPv6 Readiness · IPv6 reachable (34 ms) · PASS
A+ · URL Variants · www/non-www, trailing slash, HTTP→HTTPS · PASS
- www / non-www: consistent
- HTTP → HTTPS: consistent
A+ · CDN Cache Observability · Cache state: Age=1s · PASS
Domain Intelligence · Domain intelligence data not available · INFO
RDAP and WHOIS lookups both failed.