Infrastructure
· 17 checks — DNS, redirects, IPv6, crawlability, URL variants, and domain intelligence rolled into one auditable list.

B · DNSSEC · Unsigned (DNSSEC not deployed) · REVIEW
B · CAA Records · No CAA records (any CA may issue certificates) · REVIEW
B · Reverse DNS · 0/4 IPs match cert SAN · REVIEW
B · Multi-Resolver DNS Speed · Mean 41 ms across 3 resolvers (spread 109 ms) · REVIEW
C · IPv6 Readiness (Action) · No IPv6 support · REVIEW
IPv6 support is increasingly important for global accessibility. About 40% of internet users have IPv6 connectivity.
No AAAA records — IPv6-preferring clients pay extra latency falling back to IPv4.
Source: Google IPv6 stats
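To reproduce this check locally, a minimal sketch using only Python's standard library (the hostname is the one audited here; `getaddrinfo` resolver behavior can vary slightly across platforms):

```python
import socket

def has_ipv6(host: str) -> bool:
    """True if the host resolves to at least one IPv6 (AAAA) address."""
    try:
        # AF_INET6 restricts the lookup to IPv6; no AAAA record raises gaierror.
        return bool(socket.getaddrinfo(host, None, socket.AF_INET6))
    except socket.gaierror:
        return False

print(has_ipv6("pullitpacks.com"))  # expected False per this audit
```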
C · Crawlability (Action) · No robots.txt; sitemap with 0 URLs · REVIEW
robots.txt is optional but recommended. It tells search engine crawlers which pages to index.
No robots.txt — crawlers fetch /robots.txt and get a 404. This is not breaking, but it means default crawl behavior with no directives and no sitemap reference.
A minimal robots.txt covers the basics: `User-agent: *`, `Allow: /`, and `Sitemap: https://example.com/sitemap.xml`, each on its own line. Without it, crawlers behave fine but lose the sitemap signal and can't be selectively blocked from crawl traps.
Source: robotstxt.org
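To reproduce the check itself, a small standard-library sketch (the target is this audit's site, where a 404 was observed):

```python
import urllib.error
import urllib.request

def check_robots(origin: str) -> str:
    """Fetch /robots.txt and report whether the site serves one."""
    try:
        with urllib.request.urlopen(f"{origin}/robots.txt", timeout=10) as resp:
            return f"robots.txt found ({len(resp.read())} bytes)"
    except urllib.error.HTTPError as err:
        return f"no robots.txt (HTTP {err.code})"

print(check_robots("https://pullitpacks.com"))
```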
Search engines may not be able to parse the sitemap. Fix XML validation errors.
An unparseable sitemap is silently ignored by Google — the URLs it advertises are never queued for crawl.
Google's sitemap parser is strict about XML validity. A single unescaped `&` or unclosed tag invalidates the whole file. Run your sitemap through a validator (Search Console's Sitemaps report flags it) and fix the offending entry. Most generators escape correctly; mistakes usually come from manually written entries.
Source: sitemaps.org / Google Search Central
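The validator step is easy to script; a sketch using Python's built-in XML parser, assuming the sitemap sits at the conventional /sitemap.xml path:

```python
import urllib.request
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def validate_sitemap(url: str) -> None:
    """Parse a sitemap and print either the URL count or the XML error."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        raw = resp.read()
    try:
        root = ET.fromstring(raw)
    except ET.ParseError as err:
        line, col = err.position  # location of the first offending character
        print(f"invalid XML at line {line}, column {col}: {err}")
        return
    print(f"valid sitemap with {len(root.findall(NS + 'url'))} <url> entries")

validate_sitemap("https://pullitpacks.com/sitemap.xml")  # assumed path
```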
An empty sitemap provides no value. Add <url> entries for your pages.
An empty sitemap signals 'no content to index' to Google — actively harmful compared with having no sitemap at all.
Google compares URLs in the sitemap against URLs it has crawled. An empty sitemap on a site with thousands of pages signals abandonment. Either populate it correctly (most CMSes auto-generate) or delete the file and let Google crawl normally.
Source: Google Search Central / sitemaps.org
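If no CMS is available to generate one, a correctly escaped sitemap takes a few lines of standard-library Python; the page list below is illustrative, not a crawl of the real site:

```python
import xml.etree.ElementTree as ET

def write_sitemap(urls: list[str], path: str = "sitemap.xml") -> None:
    """Write a minimal sitemap; ElementTree escapes &, <, > automatically."""
    root = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        ET.SubElement(ET.SubElement(root, "url"), "loc").text = u
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

# Example pages only; replace with the site's real URL inventory.
write_sitemap(["https://pullitpacks.com/", "https://pullitpacks.com/about"])
```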
No robots.txt found
This is fine for most sites — a missing robots.txt allows all crawling by default.
B · URL Variants · www/non-www, trailing slash, HTTP→HTTPS · REVIEW
| Variant | Status |
|---|---|
| www / non-www | Inconsistent — duplicate content risk |
| HTTP → HTTPS | Consistent |
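To see the inconsistency directly, a sketch that follows redirects from each variant; both should converge on one canonical origin (the www host may simply be unreachable, which the sketch also reports):

```python
import urllib.error
import urllib.request

def final_url(url: str) -> str:
    """Follow redirects and return where a client finally lands."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.geturl()
    except urllib.error.URLError as err:
        return f"unreachable ({err.reason})"

# Every variant should end up at a single canonical origin.
for variant in ("https://pullitpacks.com/", "https://www.pullitpacks.com/"):
    print(variant, "->", final_url(variant))
```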
B · TLS Certificate Expiry & Recommendations · 179 days until leaf cert expires — 3 issues to address · REVIEW
Recommended actions
- Enable HSTS: Strict-Transport-Security: max-age=31536000; includeSubDomains (a quick header check is sketched after this list)
- Enable DNSSEC on your domain for DNS spoofing protection
- Enable OCSP stapling on your TLS server to remove a CA roundtrip and protect user privacy
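To confirm the first item once deployed, a small standard-library sketch (the URL is this audit's target):

```python
import urllib.request

def hsts_header(url: str) -> str | None:
    """Return the Strict-Transport-Security header, if the server sends one."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.headers.get("Strict-Transport-Security")

value = hsts_header("https://pullitpacks.com")
print("HSTS:", value or "missing")  # the audit found it missing
```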
B · Operational Status Page · No status page link detected · REVIEW
A+ · DNS Records · 4 A records, 132 ms lookup · PASS
| Record | Value |
|---|---|
| A | 13.224.252.41, 13.224.252.36, 13.224.252.66, 13.224.252.127 |
| AAAA | — |
| CNAME | — |
| NS | ns-1085.awsdns-07.org, ns-1545.awsdns-01.co.uk, ns-235.awsdns-29.com, ns-780.awsdns-33.net |
| MX | 10 inbound-smtp.us-east-1.amazonaws.com |
| TXT | SPF v=spf1 include:amazonses.com ~all |
| CAA | Lookup not available with standard resolver |
A+ · Subdomain Takeover · No subdomain takeover risk detected · PASS
A+ · Redirect Chain · No redirects — direct access · PASS
| # | URL | Status | Time | Protocol | Server |
|---|---|---|---|---|---|
| 1 | https://pullitpacks.com | 200 | 147 ms | HTTP/1.1 | AmazonS3 |
A+ · Domain Intelligence · pullitpacks.com — via Amazon Registrar, Inc., 18 days old, hosted on AWS · PASS
| Field | Value |
|---|---|
| Domain expiry | 342 days (April 22, 2027) |
| Certificate expiry | 179 days (issued by Amazon) |
| Domain age | 18 days (registered April 22, 2026) |
| DNSSEC | Not enabled (protects against DNS spoofing) |
| Hosting | AWS (ASN AS16509) |
| Resolved IP | 13.224.252.127 |
| Registrar | Amazon Registrar, Inc. |
Recommended actions
- Enable DNSSEC to protect visitors from DNS spoofing
- Newly registered domain — build backlinks and content to establish SEO trust
- Enable registrar lock (clientTransferProhibited) to block unauthorized domain transfers
Informational (domain age): newly registered domains may face SEO trust challenges, since search engines generally give more authority to older domains, and newer domains can score lower in spam/security filters. This is not a problem to fix.
DNSSEC protects against DNS spoofing attacks. While not required, enabling DNSSEC adds an additional layer of security. Contact your DNS provider to enable it.
Without DNSSEC, an attacker who can poison your DNS can hijack your domain — and SSL certs alone don't stop them, since control of DNS answers is enough to pass domain validation and obtain a fresh certificate.
DNSSEC adds cryptographic signatures to DNS records, preventing forged responses from poisoning resolver caches. Without it, an attacker who controls the network path can redirect your domain to a malicious server before any HTTPS handshake happens. Most modern registrars (Cloudflare, Google Domains, Route 53) enable it with one toggle.
Source: ICANN / RFC 4033
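A rough presence check is possible with the third-party dnspython package; this sketch only looks for published DNSKEY records and does not validate the DS chain of trust at the parent zone:

```python
import dns.resolver  # third-party: pip install dnspython

def dnssec_enabled(domain: str) -> bool:
    """A zone publishing DNSKEY records is (at least partially) signed."""
    try:
        return len(dns.resolver.resolve(domain, "DNSKEY")) > 0
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return False

print(dnssec_enabled("pullitpacks.com"))  # expected False per this audit
```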
The domain can be transferred without an unlock step. Enable registrar lock (clientTransferProhibited) in your registrar's control panel to protect against unauthorized or accidental transfers.
Without registrar lock, an attacker who phishes your registrar credentials can transfer the domain in minutes — total brand hijack.
Registrar lock (clientTransferProhibited, clientUpdateProhibited, clientDeleteProhibited) requires extra verification before any transfer/update/delete. Every major registrar offers it free. Combined with 2FA on your registrar account, it's the strongest defense against domain hijacking.
Source: ICANN / domain-security best practice
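Whether the lock is in place is visible in public registration data. A sketch using the rdap.org redirector, assuming it can reach the registry's RDAP service; note that RDAP (per RFC 8056) spells EPP statuses out in lower case:

```python
import json
import urllib.request

def domain_statuses(domain: str) -> list[str]:
    """Fetch a domain's EPP status codes from public RDAP data."""
    with urllib.request.urlopen(f"https://rdap.org/domain/{domain}", timeout=10) as resp:
        return json.load(resp).get("status", [])

statuses = domain_statuses("pullitpacks.com")
# clientTransferProhibited appears as "client transfer prohibited" in RDAP.
print("transfer lock:", "client transfer prohibited" in statuses)
```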