5 Ways to Prepare for Site Closure

Are you planning to shut down your website for a day or longer? According to advice from Google’s Search Advocate John Mueller, here are five ways to prepare.
Mueller shared this advice in tweets while linking to relevant Google help pages.
A word of warning upfront: there is no good way to temporarily shut down a website. You should avoid doing it if possible.
However, there are things you can do to keep the negative impact to a minimum.
Mueller’s recommendations include:
- Use an HTTP 503 status code
- Keep HTTP 503 up for no more than a day
- Ensure the robots.txt file returns a 200 or 404 status code
- Prepare for the consequences if the site is down longer than a day
- Expect reduced crawling from Googlebot
The following sections explain these recommendations in more detail, along with how to deal with the negative impact of taking a site offline.
1. HTTP 503 Status Code
When taking a website offline, make sure it delivers an HTTP 503 status code to web crawlers.
When web crawlers like Googlebot encounter a 503 status code, they understand that the site is unavailable and may become available later.
With a 503 code, crawlers know to check the site again instead of dropping it from Google’s search index.
Mueller explains how to view a 503 status code using Chrome:
1. They should use HTTP 503 for “closed” pages. You can check that in Chrome, right-click: Inspect, select “Network” at the top, then refresh the page. Check the top entry, it should be red and show 503 Status. pic.twitter.com/dkH7VE7OTb
— 🌽〈link href=//johnmu.com rel=canonical 〉🌽 (@JohnMu) September 19, 2022
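What serving a 503 looks like depends on your stack, and on a real site you would usually configure it in your web server or CDN rather than in application code. Purely as an illustration, here is a minimal sketch of a maintenance server using Python's standard library; the port and page markup are placeholders, not a production setup:

```python
# A minimal maintenance server: every page answers 503 so crawlers know the
# outage is temporary. Port and page markup are placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer

MAINTENANCE_PAGE = b"<html><body><h1>Down for maintenance</h1></body></html>"

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 = "temporarily unavailable", so pages aren't dropped right away.
        self.send_response(503)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(MAINTENANCE_PAGE)))
        self.end_headers()
        self.wfile.write(MAINTENANCE_PAGE)

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

Note one important exception, covered in section 3: robots.txt should not get the 503 treatment.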
2. Keep 503 Status Code No More than One Day
Googlebot will return to a site after first encountering a 503, but it won’t keep retrying indefinitely.
If Googlebot sees a 503 code every day, it will start dropping pages from the index.
Mueller says, ideally, you should keep the 503 status code for one day.
“Keep the 503 status – ideally – for at most one day. I know, not everything is limited to 1 day. A “permanent” 503 can result in pages being removed from search. Be frugal with 503 times. Don’t worry about the “retry after” setting.”
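If you’d rather script the check than use DevTools, a quick status probe is easy to write. A minimal sketch with Python’s standard library; the URL is a placeholder:

```python
# Probe a URL's status code without a browser. urlopen raises HTTPError for
# non-2xx responses, so the 503 is read off the exception.
from urllib.request import urlopen
from urllib.error import HTTPError

def check(url: str) -> None:
    try:
        with urlopen(url, timeout=10) as response:
            print(url, response.status)
    except HTTPError as err:
        print(url, err.code, "Retry-After:", err.headers.get("Retry-After"))

check("https://example.com/")  # expect 503 while closed, 200 after reopening
```

Running it once while the site is closed, and again after reopening, is a quick way to confirm the 503 window really did stay short.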
3. Robots.txt – 200 or 404 Status Code
While the pages of a closed website should return a 503 code, the robots.txt file should return either a 200 or 404 status code.
The robots.txt file shouldn’t return a 503, Mueller said; if it does, Googlebot will assume the entire site is blocked from crawling.
Additionally, Mueller recommends using Chrome DevTools to review your website’s robots.txt file:
2. The robots.txt file should return either 200 + a valid robots.txt file, or a 404. It should *not* return a 503. Don’t trust it if the page shows “404” – it could still be a 503. Check it. pic.twitter.com/nxN2kCeyWm
— 🌽〈link href=//johnmu.com rel=canonical 〉🌽 (@JohnMu) September 19, 2022
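Putting sections 1 and 3 together, the earlier maintenance-server sketch would carve out a single exception for /robots.txt. Again, this is only an illustration; the port and the robots.txt rules shown are placeholders:

```python
# The maintenance server again, with the one exception Mueller describes:
# /robots.txt returns 200 with a valid file, everything else stays 503.
from http.server import BaseHTTPRequestHandler, HTTPServer

MAINTENANCE_PAGE = b"<html><body><h1>Down for maintenance</h1></body></html>"
ROBOTS_TXT = b"User-agent: *\nAllow: /\n"  # placeholder rules

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            # Never 503 here: Googlebot could treat the whole site as blocked.
            self._respond(200, "text/plain", ROBOTS_TXT)
        else:
            self._respond(503, "text/html", MAINTENANCE_PAGE)

    def _respond(self, code, content_type, body):
        self.send_response(code)
        self.send_header("Content-Type", content_type)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```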
4. Prepare For Negative Effects
As we mentioned at the beginning of this article, there is no way to take a website offline and avoid all the negative consequences.
If your website will be offline for longer than a day, prepare accordingly.
Mueller says pages are likely to drop out of search results no matter which option you choose:
“Hmm.. What if you want to close a site for >1 day? There will be negative effects regardless of the option you choose (503, blocked, noindex, 404, 403) – pages will likely drop out of search results.”
When you “reopen” your website, check whether the critical pages are still indexed. If they aren’t, request indexing for them.
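A script can’t tell you what’s in Google’s index (that’s what Search Console’s URL Inspection tool is for), but it can catch common mistakes before you request reindexing: pages that still return an error, or a leftover noindex from the closure. A rough post-reopening check, with placeholder URLs and a deliberately loose meta-tag pattern:

```python
# Rough post-reopening check: each critical page should return 200 and
# carry no leftover noindex (meta tag or X-Robots-Tag header).
import re
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

CRITICAL_PAGES = [
    "https://example.com/",
    "https://example.com/products",
]

NOINDEX_META = re.compile(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', re.I)

for url in CRITICAL_PAGES:
    try:
        with urlopen(url, timeout=10) as response:
            html = response.read().decode("utf-8", errors="replace")
            noindex = "noindex" in (response.headers.get("X-Robots-Tag") or "") \
                or NOINDEX_META.search(html) is not None
            print(url, response.status, "NOINDEX LEFT OVER" if noindex else "ok")
    except HTTPError as err:
        print(url, "error", err.code)
    except URLError as err:
        print(url, "unreachable:", err.reason)
```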
5. Expect Reduced Crawling
An unavoidable side effect of serving a 503 code, even for a single day, is reduced crawling from Googlebot.
Mueller said on Twitter:
“A side-effect of even 1 day of 503s is that Googlebot (note: all of this is from a Google lens, I don’t know about other search engines) will slow down crawling. Small site? It doesn’t matter. Giant? The keyword is “crawl budget.””
Reduced crawling can affect a site in many ways. The main things to be aware of are that new pages may take longer to be indexed, and updates to existing pages may take longer to appear in search results.
Once Googlebot sees that your site is back online and you’re actively updating it, your crawl rate will likely return to normal.
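If you want to watch the crawl rate recover, Search Console’s Crawl Stats report is the authoritative view, but you can also count Googlebot hits in your own access logs. A sketch assuming the common combined log format; the log path is hypothetical, and strictly verifying Googlebot would require a reverse-DNS check rather than a user-agent match:

```python
# Counts requests from user agents containing "Googlebot" per day in an
# access log. Assumes combined/common log format and a chronological file.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path; adjust to your server
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [19/Sep/2022:10:00:00 ...]

hits_per_day = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" in line:
            match = DATE_RE.search(line)
            if match:
                hits_per_day[match.group(1)] += 1

for day, count in hits_per_day.items():  # insertion order follows the log
    print(day, count)
```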
Source: @JohnMu on Twitter
Featured Image: BUNDITINAY/Shutterstock