Infrastructure
· 17 checks — DNS, redirects, IPv6, crawlability, URL variants, and domain intelligence rolled into one auditable list.
B · DNSSEC · Unsigned (DNSSEC not deployed) · REVIEW
B · CAA Records · No CAA records (any CA may issue certificates) · REVIEW
C · Reverse DNS · 0/6 IPs match cert SAN · REVIEW
B · Crawlability · robots.txt present, no sitemap · REVIEW
A sitemap helps search engines discover and index your pages more efficiently.
No sitemap.xml — Google relies on crawl-graph discovery alone, slowing indexing of deep or fresh URLs.
A sitemap accelerates Google's discovery of new and updated content. Most CMSes auto-generate one; static-site frameworks need a build-step plugin. Reference it from robots.txt and submit in Search Console to confirm Google can fetch it.
Source: sitemaps.org / Google Search Central
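For static sites without a CMS plugin, a minimal sitemap can be generated with the standard library alone. A sketch (the URLs below are illustrative, not taken from the audited site's structure):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml string from (loc, lastmod) pairs.

    lastmod may be None to omit the element; loc is required per the
    sitemaps.org protocol.
    """
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        if lastmod:
            ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Illustrative entries only.
print(build_sitemap([
    ("https://www.example.com/", "2024-05-01"),
    ("https://www.example.com/news/", None),
]))
```

Write the result to /sitemap.xml at the site root during the build step, then submit it in Search Console.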
Add a 'Sitemap:' directive to robots.txt so search engines can discover your sitemap.
robots.txt omits the Sitemap: directive — crawlers may still probe /sitemap.xml by convention, but the explicit hint is missing.
Source: sitemaps.org
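The directive can be appended anywhere in robots.txt. A minimal example (the sitemap URL shown is the conventional root location, assumed rather than confirmed for this site):

```
User-agent: *
Allow: /

# Explicit hint for crawlers; absolute URL required
Sitemap: https://www.der-betze-brennt.de/sitemap.xml
```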
```
# As a condition of accessing this website, you agree to abide by the following
# content signals:
# (a) If a Content-Signal = yes, you may collect content for the corresponding
# use.
# (b) If a Content-Signal = no, you may not collect content for the
# corresponding use.
# (c) If the website operator does not include a Content-Signal for a
# corresponding use, the website operator neither grants nor restricts
# permission via Content-Signal with respect to the corresponding use.
# The content signals and their meanings are:
# search: building a search index and providing search results (e.g., returning
# hyperlinks and short excerpts from your website's contents). Search does not
# include providing AI-generated search summaries.
# ai-input: inputting content into one or more AI models (e.g., retrieval
# augmented generation, grounding, or other real-time taking of content for
# generative AI search answers).
# ai-train: training or fine-tuning AI models.
# ANY RESTRICTIONS EXPRESSED VIA CONTENT SIGNALS ARE EXPRESS RESERVATIONS OF
# RIGHTS UNDER ARTICLE 4 OF THE EUROPEAN UNION DIRECTIVE 2019/790 ON COPYRIGHT
# AND RELATED RIGHTS IN THE DIGITAL SINGLE MARKET.
# BEGIN Cloudflare Managed content
User-agent: *
Content-Signal: search=yes,ai-train=no
Allow: /
User-agent: Amazonbot
Disallow: /
User-agent: Applebot-Extended
Disallow: /
User-agent: Bytespider
Disallow: /
User-agent: CCBot
Disallow: /
User-agent: ClaudeBot
Disallow: /
User-agent: CloudflareBrowserRenderingCrawler
Disallow: /
User-agent: Google-Extended
Disallow: /
User-agent: GPTBot
Disallow: /
User-agent: meta-externalagent
Disallow: /
# END Cloudflare Managed Content
```
No sitemap found
Adding a sitemap helps search engines discover your pages.
B · TLS Certificate Expiry & Recommendations · 40 days until leaf cert expires — 2 issues to address · REVIEW
Certificate validity
Recommended actions
- Extend HSTS max-age to at least 31536000 (1 year) to meet the preload list criteria
- Enable DNSSEC on your domain for DNS spoofing protection
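The preload criterion on max-age can be verified before submission. A sketch that parses a Strict-Transport-Security header value and checks the three preload requirements (the header values below are illustrative):

```python
def hsts_meets_preload(header_value: str) -> bool:
    """Check an HSTS header against hstspreload.org criteria:
    max-age >= 31536000 (1 year), includeSubDomains, and preload."""
    directives = [d.strip().lower() for d in header_value.split(";")]
    max_age = 0
    for d in directives:
        if d.startswith("max-age="):
            try:
                max_age = int(d.split("=", 1)[1])
            except ValueError:
                return False
    return (max_age >= 31536000
            and "includesubdomains" in directives
            and "preload" in directives)

print(hsts_meets_preload("max-age=31536000; includeSubDomains; preload"))
# → True
print(hsts_meets_preload("max-age=86400"))
# → False
```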
B · Operational Status Page · No status page link detected · REVIEW
B · Health Check Endpoint · No conventional health endpoint found · REVIEW
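A conventional health endpoint can be a trivial handler that monitors and load balancers poll. A stdlib-only sketch (the /healthz path is a common convention, not a standard — pick one and keep it stable):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

HEALTH_PATH = "/healthz"  # conventional, assumed; /health and /status are also common

def health_response(path: str):
    """Return (status_code, body) for a request path."""
    if path == HEALTH_PATH:
        return 200, b'{"status":"ok"}'
    return 404, b"not found"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        status, body = health_response(self.path)
        self.send_response(status)
        self.send_header("Content-Type",
                         "application/json" if status == 200 else "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("", 8080), Handler).serve_forever()
```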
A · DNS Records · 3 A records, 9 ms lookup · PASS
| Record | Value |
|---|---|
| A | 104.26.10.171, 172.67.71.188, 104.26.11.171 |
| AAAA | 2606:4700:20::681a:bab, 2606:4700:20::ac43:47bc, 2606:4700:20::681a:aab |
| CNAME | — |
| NS | — |
| MX | — |
| TXT | — |
| CAA | Lookup not available with standard resolver |
SPF helps prevent email spoofing. Add a TXT record starting with 'v=spf1'.
Without SPF, receiving servers can't validate sending IPs — your domain is easier to spoof in phishing.
SPF complements DMARC. Both should be published. SPF records list authorized sending IPs (e.g., `v=spf1 include:_spf.google.com ~all` for Google Workspace). After publishing, verify in Google Postmaster Tools or mxtoolbox.
Source: RFC 7208 (SPF)
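Published SPF records decompose into qualifier-prefixed mechanisms. A sketch that splits a record into (qualifier, mechanism) pairs, using the Google Workspace example from the text above:

```python
def parse_spf(record: str):
    """Split an SPF TXT record into (qualifier, mechanism) pairs.

    Returns None if the record does not start with v=spf1.
    Qualifiers per RFC 7208: + pass (default), - fail, ~ softfail, ? neutral.
    """
    parts = record.split()
    if not parts or parts[0].lower() != "v=spf1":
        return None
    result = []
    for term in parts[1:]:
        qualifier = "+"  # default qualifier is "pass"
        if term[0] in "+-~?":
            qualifier, term = term[0], term[1:]
        result.append((qualifier, term))
    return result

print(parse_spf("v=spf1 include:_spf.google.com ~all"))
# → [('+', 'include:_spf.google.com'), ('~', 'all')]
```

The trailing `~all` (softfail) is the usual starting point; tighten to `-all` once all legitimate senders are listed.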
A+ · Subdomain Takeover · No subdomain takeover risk detected · PASS
A+ · Multi-Resolver DNS Speed · Mean 11 ms across 3 resolvers (spread 16 ms) · PASS
A+ · Redirect Chain · No redirects — direct access · PASS
https://www.der-betze-brennt.de
90 ms · HTTP/1.1 FINAL
| # | URL | Status | Time | Protocol | Server |
|---|---|---|---|---|---|
| 1 | https://www.der-betze-brennt.de | 200 | 90 ms | HTTP/1.1 | cloudflare |
A+ · IPv6 Readiness · IPv6 reachable (2 ms) · PASS
A+ · URL Variants · www/non-www, trailing slash, HTTP→HTTPS · PASS
www / non-www: preferred variant is www
HTTP → HTTPS: consistent
A+ · Domain Intelligence · der-betze-brennt.de · PASS
TLS certificate: 40 days until expiry, issued by Google Trust Services
DNSSEC: status unknown (protects against DNS spoofing)
IPv6: 2606:4700:20::ac43:47bc
Registrar: unknown
Expiry timeline
Recommended actions
- Enable registrar lock (clientTransferProhibited) to block unauthorized domain transfers
The domain can be transferred without an unlock step. Enable registrar lock (clientTransferProhibited) in your registrar's control panel to protect against unauthorized or accidental transfers.
Without registrar lock, an attacker who phishes your registrar credentials can transfer the domain in minutes — total brand hijack.
Registrar lock (clientTransferProhibited, clientUpdateProhibited, clientDeleteProhibited) requires extra verification before any transfer/update/delete. Every major registrar offers it free. Combined with 2FA on your registrar account, it's the strongest defense against domain hijacking.
Source: ICANN / domain-security best practice
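The lock status is verifiable from raw WHOIS output, which lists EPP status codes. A sketch that reports which of the three lock statuses are absent (the WHOIS excerpt below is hypothetical, not real output for this domain):

```python
LOCK_STATUSES = {
    "clientTransferProhibited",
    "clientUpdateProhibited",
    "clientDeleteProhibited",
}

def missing_locks(whois_text: str):
    """Return the EPP lock statuses not found in raw WHOIS output, sorted."""
    present = {s for s in LOCK_STATUSES if s in whois_text}
    return sorted(LOCK_STATUSES - present)

# Hypothetical WHOIS excerpt for illustration only.
sample = """\
Domain Name: example.com
Domain Status: clientTransferProhibited https://icann.org/epp#clientTransferProhibited
"""
print(missing_locks(sample))
# → ['clientDeleteProhibited', 'clientUpdateProhibited']
```

An empty result means all three locks are set; anything listed should be enabled in the registrar's control panel.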