Infrastructure
· 17 checks — DNS, redirects, IPv6, crawlability, URL variants, and domain intelligence rolled into one auditable list.

D · CDN & Delivery — No CDN detected (FIX)
Consider using a CDN to improve global delivery speed and reduce origin load.
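CDN presence is usually inferred from response headers. The sketch below shows one way to do that check; the signature headers are commonly set by the named CDNs, but the list is illustrative, not exhaustive, and absence of all of them only suggests — does not prove — that no CDN fronts the origin.

```python
# Sketch: infer a CDN from HTTP response headers. Header names are
# compared case-insensitively, as HTTP header names are case-insensitive.
CDN_SIGNATURES = {
    "cloudflare": ("cf-ray", "cf-cache-status"),
    "cloudfront": ("x-amz-cf-id", "x-amz-cf-pop"),
    "fastly": ("x-served-by", "x-timer"),  # also set by some Varnish setups
    "akamai": ("x-akamai-transformed",),
}

def detect_cdn(headers):
    """Return the first CDN whose signature header appears, else None."""
    lowered = {k.lower() for k in headers}
    for cdn, markers in CDN_SIGNATURES.items():
        if any(m in lowered for m in markers):
            return cdn
    return None
```

In practice you would feed this the headers of a real response (e.g. from `urllib.request.urlopen`) and treat a `None` result as "no CDN detected", exactly as this check reports.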
B · DNSSEC — Unsigned (DNSSEC not deployed) (REVIEW)
C · Reverse DNS — 0/1 IPs match cert SAN (REVIEW)
C · IPv6 Readiness — No IPv6 support (REVIEW)
IPv6 support is increasingly important for global accessibility: roughly 40% of internet users now have IPv6 connectivity.
No AAAA records — IPv6-preferring clients pay extra latency falling back to IPv4.
Source: Google IPv6 stats
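The AAAA check can be reproduced with the standard library alone. In this sketch `classify()` is a pure function over `socket.getaddrinfo()` results so it can be tested without network access; `check_ipv6()` does the live lookup.

```python
# Sketch: report which IP families a host resolves to.
import socket

def classify(addrinfo):
    """Given socket.getaddrinfo() results, report which families resolved."""
    families = {entry[0] for entry in addrinfo}
    return {
        "ipv4": socket.AF_INET in families,
        "ipv6": socket.AF_INET6 in families,
    }

def check_ipv6(host):
    """Live lookup; returns both flags False if the name does not resolve."""
    try:
        return classify(socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP))
    except socket.gaierror:
        return {"ipv4": False, "ipv6": False}
```

A host with A records but no AAAA records comes back as `{"ipv4": True, "ipv6": False}` — the state this check flags.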
B · Crawlability — robots.txt present, no sitemap (REVIEW)
A sitemap helps search engines discover and index your pages more efficiently.
No sitemap.xml — Google relies on crawl-graph discovery alone, slowing indexing of deep or fresh URLs.
A sitemap accelerates Google's discovery of new and updated content. Most CMSes auto-generate one; static-site frameworks need a build-step plugin. Reference it from robots.txt and submit in Search Console to confirm Google can fetch it.
Source: sitemaps.org / Google Search Central
Add a 'Sitemap:' directive to robots.txt so search engines can discover your sitemap.
robots.txt omits the Sitemap: directive — crawlers can still try /sitemap.xml by convention, but the explicit hint is missing.
Source: sitemaps.org
User-agent: AddSearchBot
Disallow: /
User-agent: AhrefsBot
Disallow: /
User-agent: AI2Bot
Disallow: /
User-agent: AihitBot
Disallow: /
User-agent: AllenAI
Disallow: /
User-agent: Amazon-Kendra
Disallow: /
User-agent: Andibot
Disallow: /
User-agent: Anomura
Disallow: /
User-agent: anthropic-ai
Disallow: /
User-agent: Applebot-Extended
Disallow: /
User-agent: archive.org_bot
Disallow: /
User-agent: arquivo-web-crawler
Disallow: /
User-agent: arquivo.pt
Disallow: /
User-agent: Awario
Disallow: /
User-agent: BacklinkCrawler
Disallow: /
User-agent: Baiduspider
Disallow: /
User-agent: Barkrowler
Disallow: /
User-agent: BedrockBot
Disallow: /
User-agent: bigsur.ai
Disallow: /
User-agent: BLEXBot
Disallow: /
User-agent: Bloodhound
Disallow: /
User-agent: Bravebot
Disallow: /
User-agent: Brightbot 1.0
Disallow: /
User-agent: browsertrix
Disallow: /
User-agent: brozzler
Disallow: /
User-agent: Buddybot
Disallow: /
User-agent: BuiltWith
Disallow: /
User-agent: Bytespider
Disallow: /
User-agent: CCBot
Disallow: /
User-agent: ChatGPT Agent
Disallow: /
User-agent: Cincraw
Disallow: /
User-agent: Claude-SearchBot
Disallow: /
User-agent: Claude-User
Disallow: /
User-agent: Claude-Web
Disallow: /
User-agent: ClaudeBot
Disallow: /
User-agent: CloudVertexBot
Disallow: /
User-agent: cohere-ai
Disallow: /
User-agent: Cohere-Training-Data-Crawler
Disallow: /
User-agent: Cotoyogi
Disallow: /
User-agent: CrawlSpace
Disallow: /
User-agent: cydralspider
Disallow: /
User-agent: DataForSeoBot
Disallow: /
User-agent: Datenbank Crawler
Disallow: /
User-agent: DeepSeek
Disallow: /
User-agent: DeepSeekBot
Disallow: /
User-agent: Devin
Disallow: /
User-agent: Diffbot
Disallow: /
User-agent: dotbot
Disallow: /
User-agent: downloadexpress
Disallow: /
User-agent: DuckAssistBot
Disallow: /
User-agent: Echobot Bot
Disallow: /
User-agent: EchoboxBot
Disallow: /
User-agent: FactSet_SpyderBot
Disallow: /
User-agent: Fasterfox
Disallow: /
User-agent: FireCrawlAgent
Disallow: /
User-agent: Flamingo_SearchEngine
Disallow: /
User-agent: FriendlyCrawler
Disallow: /
User-agent: gammaSpider
Disallow: /
User-agent: Google-Extended
Disallow: /
User-agent: GoogleAgent-Mariner
Disallow: /
User-agent: GPTBot
Disallow: /
User-agent: ia_archiver
Disallow: /
User-agent: iaskspider/2.0
Disallow: /
User-agent: ICC-Crawler
Disallow: /
User-agent: ImagesiftBot
Disallow: /
User-agent: img2dataset
Disallow: /
User-agent: ISSCyberRiskCrawler
Disallow: /
User-agent: Jetslide
Disallow: /
User-agent: Kangaroo Bot
Disallow: /
User-agent: Kraken
Disallow: /
User-agent: LCC
Disallow: /
User-agent: LinerBot
Disallow: /
User-agent: Linkfluence Yak Bot
Disallow: /
User-agent: LinkWalker
Disallow: /
User-agent: magpie-crawler
Disallow: /
User-agent: MistralAI-User
Disallow: /
User-agent: MistralAI-User/1.0
Disallow: /
User-agent: MJ12bot
Disallow: /
User-agent: msnbot-media/1.1
Disallow: /
User-agent: msnbot-media/2.0b
Disallow: /
User-agent: MyCentralAIScraperBot
Disallow: /
User-agent: netEstate Imprint Crawler
Disallow: /
User-agent: news-please
Disallow: /
User-agent: NewsNow
Disallow: /
User-agent: NovaAct
Disallow: /
User-agent: ObjectsSearch
Disallow: /
User-agent: omgili
Disallow: /
User-agent: omgilibot
Disallow: /
User-agent: Operator
Disallow: /
User-agent: PanguBot
Disallow: /
User-agent: Panscient
Disallow: /
User-agent: panscient.com
Disallow: /
User-agent: PerplexityBot
Disallow: /
User-agent: PetalBot
Disallow: /
User-agent: PhindBot
Disallow: /
User-agent: Poseidon Research Crawler
Disallow: /
User-agent: proximic
Disallow: /*&share=*
User-agent: QualifiedBot
Disallow: /
User-agent: QuillBot
Disallow: /
User-agent: quillbot.com
Disallow: /
User-agent: Quora-Bot
Disallow: /
User-agent: Raven
Disallow: /
User-agent: SBIntuitionsBot
Disallow: /
User-agent: scrapy
Disallow: /
User-agent: SeekrBot
Disallow: /
User-agent: SemrushBot
Disallow: /
User-agent: SentiBot
Disallow: /
User-agent: Sidetrade indexer bot
Disallow: /
User-agent: SirdataBot
Disallow: /
User-agent: SummalyBot
Disallow: /
User-agent: TaraGroup Intelligent Bot
Disallow: /
User-agent: Terracotta
Disallow: /
User-agent: ThinkBot
Disallow: /
User-agent: thinkers-bot
Disallow: /
User-agent: TikTokSpider
Disallow: /
User-agent: Timpibot
Disallow: /
User-agent: TurnitinBot
Disallow: /
User-agent: VelenPublicWebCrawler
Disallow: /
User-agent: WebZinger
Disallow: /
User-agent: Webzio-Extended
Disallow: /
User-agent: wpbot
Disallow: /
User-agent: YandexAdditional
Disallow: /
User-agent: YandexAdditionalBot
Disallow: /
User-agent: YouBot
Disallow: /
User-agent: *
Disallow: /p/
Disallow: /p-api/
Disallow: /action/
Disallow: /sn/
Disallow: /syndication/
Disallow: /elastic/
Disallow: /search?
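Whether a robots.txt like the one above carries a Sitemap: directive can be confirmed mechanically. A minimal sketch (per the sitemaps.org protocol the directive is case-insensitive and independent of any User-agent group):

```python
# Sketch: extract Sitemap: directives from robots.txt text.
def sitemap_urls(robots_txt):
    """Return all sitemap URLs declared in the given robots.txt text."""
    urls = []
    for line in robots_txt.splitlines():
        # Split on the first colon only, so URLs (which contain ':') survive.
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap" and value.strip():
            urls.append(value.strip())
    return urls
```

The standard library offers the same capability via `urllib.robotparser.RobotFileParser.site_maps()` (Python 3.8+); an empty result corresponds to this check's finding.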
B · URL Variants — www/non-www, trailing slash, HTTP→HTTPS (REVIEW)
| Variant | Status |
|---|---|
| www / non-www | Inconsistent — duplicate content risk |
| HTTP → HTTPS | Consistent |
B · TLS Certificate Expiry & Recommendations — 53 days until leaf cert expires; 3 issues to address (REVIEW)
Recommended actions
- Enable HSTS: Strict-Transport-Security: max-age=31536000; includeSubDomains
- Enable DNSSEC on your domain for DNS spoofing protection
- Enable OCSP stapling on your TLS server to remove a CA roundtrip and protect user privacy
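The HSTS value recommended above can be validated programmatically. This sketch parses a Strict-Transport-Security header; the one-year (31536000-second) minimum mirrors the recommendation, and the preload requirements are those published by hstspreload.org.

```python
# Sketch: parse a Strict-Transport-Security header and flag weak values.
def parse_hsts(value):
    """Break an HSTS header value into its directives."""
    info = {"max_age": None, "include_subdomains": False, "preload": False}
    for d in (p.strip().lower() for p in value.split(";") if p.strip()):
        if d.startswith("max-age="):
            try:
                info["max_age"] = int(d.split("=", 1)[1])
            except ValueError:
                pass  # malformed max-age is treated as absent
        elif d == "includesubdomains":
            info["include_subdomains"] = True
        elif d == "preload":
            info["preload"] = True
    return info

def hsts_ok(value, min_age=31536000):
    """True when max-age meets the recommended one-year minimum."""
    info = parse_hsts(value)
    return info["max_age"] is not None and info["max_age"] >= min_age
```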
B · CDN Cache Observability — No CDN cache-status headers in the response (REVIEW)
B · Operational Status Page — No status page link detected (REVIEW)
B · Health Check Endpoint — No conventional health endpoint found (REVIEW)
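"Conventional" here means paths that are community conventions rather than standards — Kubernetes-style /healthz, /livez, /readyz, Spring Boot's /actuator/health. A sketch of how an auditor builds its probe list (the path set is an assumption about what this tool checks):

```python
# Sketch: build candidate health-endpoint URLs for a site.
from urllib.parse import urljoin

CANDIDATE_PATHS = ("/healthz", "/health", "/livez", "/readyz", "/actuator/health")

def candidates(base_url):
    """Return the URLs an auditor would probe for a health endpoint."""
    return [urljoin(base_url, p) for p in CANDIDATE_PATHS]
```

Each candidate would then be fetched, with a 200 response treated as a hit; none responding yields this check's "not found" result.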
A+ · DNS Records — 1 A record, 41 ms lookup (PASS)
| Record | Value |
|---|---|
| A | 207.174.61.1 |
| AAAA | — |
| CNAME | — |
| NS | ns-607.awsdns-11.net, ns-1609.awsdns-09.co.uk, ns-1516.awsdns-61.org, ns-41.awsdns-05.com |
| MX | 0 gala-de.mail.protection.outlook.com |
| TXT | knowbe4-site-verification=9f84b0069b8711b7d78efc51717ae64a Sendinblue-code:6ef1bdca0140128a0b0cdffd769c529f facebook-domain-verification=knv9p1x80uzi1mrdxtmduw4x8p37ty adobe-idp-site-verification=f6d72814729fce0d9ad0163609551d3d1a79c155760dde7bf5c6... google-site-verification=Xq0MsjCylnJTv3hUSwvopQGOd8FGkZwqm4U3o16Zbq0 MS=ms75715108 atlassian-domain-verification=Kl3ByU1tsfV9fKiC7TDDZYz1uCeTeUo0SSEh5SitZ9q99Ua74O... SPF v=spf1 ip4:194.12.192.0/19 ip4:193.16.163.16 ip4:193.16.163.17 include:spf.prote... facebook-domain-verification=fdy1h1nv2cft49up6o7r4zi2hhjrou google-site-verification=BSy9BPF8cc1ogO9nBdLGOa30-K4MVA1-Xriw8ChwtSA adobe-idp-site-verification=8d7568f4-8a0c-4ca6-ad27-b53d25e63b60 mwzsJ6RGsERiOHQA9yZPlidFQAvyji3/R8+PrERPxsJXs5fFZAP7dRVx12Zph0BkaR2x3XQzZ/htFBIB... |
| CAA | Lookup not available with standard resolver |
A single A record is a single point of failure — if that IP goes down, the site stays unreachable until DNS is updated and cached records expire. Multiple A records give clients an alternative address to fail over to.
Add multiple A records for round-robin failover, or use a managed DNS provider with health-checked failover (Route 53, Cloudflare, NS1). Short TTL (60-300s) lets clients recover faster on outages.
Source: SRE practice / DNS architecture
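What round-robin failover buys can be shown in a few lines: with several A records, a client that cannot connect to one address retries the next; with a single record there is nothing to fall back to. This sketch mimics the per-client shuffling that also spreads load across records.

```python
# Sketch: client-side round-robin with connection fallback.
import random

def pick_order(a_records, rng=random):
    """Shuffle records per client, as resolvers/browsers commonly do."""
    order = list(a_records)
    rng.shuffle(order)
    return order

def connect_with_fallback(a_records, is_reachable):
    """Return the first reachable address, or None if all are down."""
    for addr in pick_order(a_records):
        if is_reachable(addr):
            return addr
    return None
```

With one record, `connect_with_fallback(["207.174.61.1"], ...)` either succeeds or returns None — there is no second address to try, which is exactly the risk this finding describes.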
A+ · Subdomain Takeover — No subdomain takeover risk detected (PASS)
A · CAA Records — issue: amazon.com, digicert.com, letsencrypt.org, pki.goog, sectigo.com; issuewild: amazon.com, digicert.com, letsencrypt.org, sectigo.com (PASS)
A+ · Multi-Resolver DNS Speed — Mean 16 ms across 3 resolvers (spread 24 ms) (PASS)
A · Redirect Chain — 1 redirect, 323 ms total (PASS)
| # | URL | Status | Time | Protocol | Server |
|---|---|---|---|---|---|
| 1 | https://gala.de | 301 | 106 ms | HTTP/1.1 | — |
| 2 | https://www.gala.de/ | 200 | 217 ms | HTTP/1.1 | — |
A+ · Domain Intelligence — gala.de, hosted on AWS (PASS)
| Field | Value |
|---|---|
| Certificate expiry | 53 days |
| Certificate issuer | Let's Encrypt |
| DNSSEC | Status unknown (protects against DNS spoofing) |
| Hosting provider | AWS |
| ASN | AS16509 |
| IP address | 207.174.61.1 |
| Registrar | Unknown |
Recommended actions
- Enable registrar lock (clientTransferProhibited) to block unauthorized domain transfers
The domain can be transferred without an unlock step. Enable registrar lock (clientTransferProhibited) in your registrar's control panel to protect against unauthorized or accidental transfers.
Without registrar lock, an attacker who phishes your registrar credentials can transfer the domain in minutes — total brand hijack.
Registrar lock (clientTransferProhibited, clientUpdateProhibited, clientDeleteProhibited) requires extra verification before any transfer/update/delete. Every major registrar offers it free. Combined with 2FA on your registrar account, it's the strongest defense against domain hijacking.
Source: ICANN / domain-security best practice
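Lock status is visible in public WHOIS/RDAP output, so the check above can be automated. The status names below are the standard EPP codes just discussed; the text parsing is a deliberate simplification of real WHOIS formats.

```python
# Sketch: check WHOIS/RDAP text for EPP registrar-lock statuses.
LOCK_STATUSES = {
    "clienttransferprohibited",
    "clientupdateprohibited",
    "clientdeleteprohibited",
}

def locks_present(whois_text):
    """Return the set of EPP lock statuses found in WHOIS output."""
    found = set()
    for token in whois_text.replace(",", " ").split():
        t = token.strip().lower()
        if t in LOCK_STATUSES:
            found.add(t)
    return found

def transfer_locked(whois_text):
    """True when the domain cannot be transferred without an unlock step."""
    return "clienttransferprohibited" in locks_present(whois_text)
```

A domain showing only `Domain Status: ok` — the state this finding flags — returns False and should have locks enabled at the registrar.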