Infrastructure
· 17 checks — DNS, redirects, IPv6, crawlability, URL variants, and domain intelligence rolled into one auditable list.

F · Crawlability — robots.txt present, sitemap with 0 URLs — FIX
Disallow: / for all user-agents blocks every search crawler from indexing any page — the site disappears from organic search results.
Common deployment mistake: a staging robots.txt with `User-agent: * / Disallow: /` ships to prod. The site falls out of search results within days. Verify your robots.txt is the production-intended version. If this is intentional (private site), no action needed.
Source: Google Search Central
Search engines may not be able to parse the sitemap. Fix XML validation errors.
An unparseable sitemap is silently ignored by Google — the URLs it advertises are never queued for crawl.
Google's sitemap parser is strict about XML validity. A single unescaped `&` or unclosed tag invalidates the whole file. Run your sitemap through a validator (Search Console's Sitemaps report flags it) and fix the offending entry. Most generators escape correctly; mistakes usually come from manually-written entries.
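The two failure modes above — malformed XML and zero entries — can be caught before deploying. A minimal sketch using only the standard library; the sample sitemap strings are illustrative, not this site's actual file:

```python
# Mirrors what a strict sitemap parser enforces: well-formed XML
# and at least one <url> entry.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def validate_sitemap(xml_text: str) -> tuple[bool, str]:
    """Return (ok, reason). Fails on malformed XML or zero <url> entries."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return False, f"XML parse error: {exc}"  # e.g. an unescaped '&'
    urls = root.findall(f"{SITEMAP_NS}url")
    if not urls:
        return False, "sitemap parses but contains 0 <url> entries"
    return True, f"{len(urls)} URL(s)"

good = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    '<url><loc>https://example.com/</loc></url>'
    '</urlset>'
)
bad = '<urlset><url><loc>https://example.com/?a=1&b=2</loc></url></urlset>'

print(validate_sitemap(good))  # the bare '&' in `bad` trips the parser
print(validate_sitemap(bad))
```

Running this on every generated sitemap in CI catches the "single unescaped `&`" case before Google ever sees it.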
Source: sitemaps.org / Google Search Central
An empty sitemap provides no value. Add <url> entries for your pages.
An empty sitemap signals 'no content to index' to Google — actively harmful versus having no sitemap at all.
Google compares URLs in the sitemap against URLs it has crawled. An empty sitemap on a site with thousands of pages signals abandonment. Either populate it correctly (most CMSes auto-generate) or delete the file and let Google crawl normally.
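For reference, a minimal well-formed sitemap looks like this — the URL and lastmod are placeholders, not this site's real pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2026-01-15</lastmod>
  </url>
</urlset>
```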
Source: Google Search Central / sitemaps.org
Add a 'Sitemap:' directive to robots.txt so search engines can discover your sitemap.
robots.txt omits the Sitemap: directive — crawlers fall back to fetching /sitemap.xml by convention, which usually works, but the explicit hint is missing.
Source: sitemaps.org
# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow: /
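For contrast, a production-intended robots.txt would permit crawling and carry the Sitemap: hint — a sketch; the sitemap URL below follows the usual convention and is assumed, not a confirmed file:

```text
# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow:

Sitemap: https://www.mybni.com/sitemap.xml
```

An empty `Disallow:` means "nothing is disallowed" — the opposite of `Disallow: /`.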
B · DNSSEC — Unsigned (DNSSEC not deployed) — REVIEW
B · CAA Records — No CAA records (any CA may issue certificates) — REVIEW
B · Reverse DNS — 0/4 IPs match cert SAN — REVIEW
C · IPv6 Readiness — No IPv6 support — REVIEW
IPv6 support is increasingly important for global accessibility: roughly 40% of internet users now have IPv6 connectivity.
No AAAA records — IPv6-preferring clients pay extra latency falling back to IPv4.
Source: Google IPv6 stats
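A sketch of how an AAAA-presence check works. In practice you would feed it `socket.getaddrinfo(host, None)` results; here the resolved addresses are pre-built `(family, address)` pairs (the function only looks at the first tuple element, so real getaddrinfo 5-tuples work too) so the example runs offline:

```python
import socket

def ipv6_ready(addrinfo: list[tuple]) -> bool:
    """True if any resolved address is IPv6 (i.e. an AAAA record exists)."""
    return any(family == socket.AF_INET6 for family, *_ in addrinfo)

# The four A records from the DNS table above, and no AAAA entries:
resolved = [
    (socket.AF_INET, "3.165.113.74"),
    (socket.AF_INET, "3.165.113.40"),
    (socket.AF_INET, "3.165.113.100"),
    (socket.AF_INET, "3.165.113.67"),
]
print(ipv6_ready(resolved))  # False — matches the 'No IPv6 support' finding
```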
B · TLS Certificate Expiry & Recommendations — 74 days until leaf cert expires, 2 issues to address — REVIEW
Certificate validity
Recommended actions
- Enable HSTS: Strict-Transport-Security: max-age=31536000; includeSubDomains
- Enable DNSSEC on your domain for DNS spoofing protection
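The expiry figures in this report reduce to simple date arithmetic. A sketch — the reference date below is hypothetical (chosen so the math reproduces the 228-day domain-expiry figure); only the December 29, 2026 expiry date comes from the report:

```python
from datetime import date

def days_until(expiry: date, today: date) -> int:
    """Days remaining before an expiry date."""
    return (expiry - today).days

def needs_renewal_alert(days_left: int, threshold: int = 30) -> bool:
    """Flag certificates/domains within the alert window."""
    return days_left <= threshold

today = date(2026, 5, 15)  # hypothetical scan date, for illustration only
domain_left = days_until(date(2026, 12, 29), today)
print(domain_left, needs_renewal_alert(domain_left))  # 228 False
```

The same helper covers the leaf certificate: at 74 days it is outside a 30-day alert window, hence REVIEW rather than FIX.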
B · Operational Status Page — No status page link detected — REVIEW
A · DNS Records — 4 A records, 35 ms lookup — PASS
| Record | Value |
|---|---|
| A | 3.165.113.74, 3.165.113.40, 3.165.113.100, 3.165.113.67 |
| AAAA | — |
| CNAME | — |
| NS | — |
| MX | — |
| TXT | — |
| CAA | Lookup not available with standard resolver |
SPF helps prevent email spoofing. Add a TXT record starting with 'v=spf1'.
Without SPF, receiving servers can't validate sending IPs — your domain is easier to spoof in phishing.
SPF complements DMARC. Both should be published. SPF records list authorized sending IPs (e.g., `v=spf1 include:_spf.google.com ~all` for Google Workspace). After publishing, verify in Google Postmaster Tools or mxtoolbox.
Source: RFC 7208 (SPF)
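Before publishing, a quick syntax sanity check helps. A minimal sketch — not a full RFC 7208 evaluator (no DNS lookups, no include expansion); the example records are illustrative:

```python
def spf_looks_valid(txt: str) -> tuple[bool, str]:
    """Shallow SPF syntax check: version tag plus a terminal policy."""
    parts = txt.split()
    if not parts or parts[0] != "v=spf1":
        return False, "missing v=spf1 version tag"
    last = parts[-1]
    # The record should end with an 'all' mechanism (optionally
    # qualified +, -, ~, ?) or a redirect= modifier.
    if last.lstrip("+-~?") != "all" and not last.startswith("redirect="):
        return False, "no terminal 'all' mechanism or redirect modifier"
    return True, "ok"

print(spf_looks_valid("v=spf1 include:_spf.google.com ~all"))  # (True, 'ok')
print(spf_looks_valid("spf1 a mx -all"))  # missing version tag
```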
A+ · Subdomain Takeover — No subdomain takeover risk detected — PASS
A+ · Multi-Resolver DNS Speed — Mean 28 ms across 3 resolvers (spread 5 ms) — PASS
A+ · Redirect Chain — No redirects, direct access — PASS
| # | URL | Status | Time | Protocol | Server |
|---|---|---|---|---|---|
| 1 | https://Www.mybni.com | 200 | 57 ms | HTTP/1.1 | AmazonS3 |
A+ · URL Variants — www/non-www, trailing slash, HTTP→HTTPS — PASS
- www / non-www — consistent
- HTTP → HTTPS — consistent
A+ · Domain Intelligence — mybni.com — via GoDaddy.com, LLC, 24 years 5 months old, hosted on AWS — PASS
| Field | Value |
|---|---|
| Domain expiry | December 29, 2026 (228 days) |
| Certificate expiry | 74 days (issued by Amazon) |
| Domain age | 24 years, 5 months (registered April 10, 2002) |
| DNSSEC | Not enabled (protects against DNS spoofing) |
| Hosting | AWS (ASN AS16509, 3.165.113.74) |
| Registrar | GoDaddy.com, LLC |
Recommended actions
- Enable DNSSEC to protect visitors from DNS spoofing
- Enable registrar lock (clientTransferProhibited) to block unauthorized domain transfers
DNSSEC protects against DNS spoofing attacks. It is not required, but enabling it adds an additional layer of security; contact your DNS provider to enable it.
Without DNSSEC, an attacker who can poison your DNS can hijack your domain — and SSL certs alone don't stop them.
DNSSEC adds cryptographic signatures to DNS records, preventing forged responses from poisoning resolver caches. Without it, an attacker who controls the network path can redirect your domain to a malicious server before any HTTPS handshake happens. Most modern registrars (Cloudflare, Google Domains, Route 53) enable it with one toggle.
Source: ICANN / RFC 4033
The domain can be transferred without an unlock step. Enable registrar lock (clientTransferProhibited) in your registrar's control panel to protect against unauthorized or accidental transfers.
Without registrar lock, an attacker who phishes your registrar credentials can transfer the domain in minutes — total brand hijack.
Registrar lock (clientTransferProhibited, clientUpdateProhibited, clientDeleteProhibited) requires extra verification before any transfer/update/delete. Every major registrar offers it free. Combined with 2FA on your registrar account, it's the strongest defense against domain hijacking.
Source: ICANN / domain-security best practice