To protect confidential or proprietary content from appearing in search engine results, it’s important to understand the mechanics of how crawlers gather data—including analyzing links, evaluating pages, and adhering to the directives provided by webmasters. This guide details how to hide pages on Squarespace from search engines or site visitors.

Squarespace Content Visibility Options

Squarespace offers several methods for controlling page or site visibility, each with specific use cases and limitations. For complete privacy, password protection is ideal, whereas a noindex tag keeps a page out of search results while leaving it accessible to visitors. Choose the method that aligns with your needs for visibility and access.

| | Accessible to visitors | Accessible to search engines | Noindex tag | URL in sitemap | URL can redirect |
|---|---|---|---|---|---|
| Enabled page, default state | ✓ | ✓ | ✕ | ✓ | ✕ |
| Disabled page | ✕ | ✕ | ✕ | ✕ | ✓ |
| Password protected | ✓ with password | ✕ | ✕ | ✕ | ✕ |
| Member Site | ✓ with account | ✕ | ✕ | ✕ | ✕ |
| Course, Paywalled | ✓ with account | ✕ | ✕ | ✕ | ✕ |
| Course, Public, default state | ✓ | ✓ | ✕ | ✓ | ✕ |
| Noindex via page settings | ✓ | ✕* | ✓ | ✕ | ✕ |
| Noindex via code injection | ✓ | ✕* | ✓ | ✓ | ✕ |
| Robots.txt disallow | ✓ | ✕* | ✕ | ✓ | ✕ |

* Effectiveness may vary (learn more below)

⚠️ IMPORTANT: If you hide a site or make content inaccessible, it may only take a matter of days before Google starts to de-index pages. A page must be indexed to bring in organic search traffic. From a search engine perspective, some of the options below have the same SEO impact as permanently removing content.

“Not Linked” Pages

Pages in the Not Linked section of your Pages panel are hidden from your site’s navigation menus, but they are still public pages: search engines and visitors can discover them unless you take further action.

To hide pages from search engines and/or visitors, see the options below.

Disable Pages or an Entire Site

To completely remove content from both search engines and visitor view, disable it. This might apply to pages under construction or temporarily irrelevant content. From a search engine perspective, disabled content does not exist.

  • Disable a PAGE: Hover over the page you want to disable, and click the page’s Settings icon > toggle off Enable Page, click Save.

  • Disable a BLOG POST: Visit a post > access the post’s Settings > change it from Published to Draft mode to remove the post from your public blog.

  • Disable (or unpublish) a WEBSITE: Visit Settings > Site Availability > select Private, and click Save.

If appropriate, disabled URLs can be permanently or temporarily redirected. Learn more about page URLs, 301 redirects, and how to set up redirects on Squarespace.
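If you redirect a disabled URL, the rule goes in Squarespace’s URL Mappings panel. As a minimal sketch (both paths below are hypothetical placeholders), each rule is a single line in this format:

    /old-disabled-page -> /replacement-page 301

The trailing 301 marks the redirect as permanent; use 302 instead for a temporary redirect.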

Tip: In version 7.0, the disable/enable feature on individual sections within index pages is very useful for content that is routinely removed and then reinstated.

Password Protection

You can password-protect a website or page to hide it from search engines while allowing access to visitors who have the password. If the content stays hidden, it will eventually drop from search results.

  • PAGE password: Hover over the page you want to protect, and click the page’s Settings icon > scroll down to Password, add a password and click Save.

  • SITE password: Visit Settings > Site Availability > select Password Protected, add a password and click Save.

In these instances, all visitors will use the same password to view the website or page. You can access and style your site’s “Lock Screen” in System Pages.

Member Site or Course

Member sites create gated content that is only available to members with an account. Search engines won’t index these restricted areas. This is similar to password-protected pages (see above).

Courses create a subfolder of pages, similar to a blog collection. A course can be paywalled or open to the public. A paywalled course will not be crawled. By default, a public course is open for crawling and indexing.

If you did not intend for the individual lesson pages of a public course to be indexed, you can use Google’s URL removal tool (learn more below), then give the course a password (see above) or toggle on the noindex directive (see below).

Learn more about setting up Squarespace digital products.

Noindex Directive

The noindex directive in a meta robots tag within the HTML head of a page tells search engines to skip indexing that page. It is useful for pages that you don’t want to appear in search results, but still want accessible to a select audience, such as thank you pages or ad landing pages.

On Squarespace, a noindex tag can be added through SEO settings or code injection:

  • Via a page’s SEO settings: This is the preferred method because Squarespace removes the URL from your sitemap. Ideally, a sitemap only contains URLs that you want search engines to index.

    When implementing SEO on Squarespace, many (or even most) sites will want to noindex blog category and tag pages to optimize how search engines crawl the site; this option is available on the blog page’s SEO settings tab.

  • Via code injection: This option used to be the only method available. With this method, URLs are still included in the sitemap. Place this line of code in code injection:

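    <!-- Tells compliant crawlers not to index this page -->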
    <meta name="robots" content="noindex">
    • Single page: Hover over the page you want to noindex, and click the page’s Settings icon > click Advanced > use Page Header Code Injection.

    • All pages: Access Website Tools > Code Injection > use Header.

Note: Major search engines like Google and Bing typically honor noindex tags, but they are not a foolproof method because not all web crawlers follow these directives. If a noindex page is linked from other high-quality pages, the URL of the noindex page can still appear in search results until it is dropped.

Robots.txt file

A robots.txt file tells search engine crawlers which URLs they may crawl. On Squarespace, you cannot exclude specific directories or pages in robots.txt, but if desired, you can block your entire site from search engine crawlers (and AI crawlers). In that case, even though your sitemap is populated, crawlers read the robots.txt first and stop, so no pages on your website are crawled and the sitemap is effectively ignored.
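For illustration, a site-wide block uses the standard robots.txt rules below (a generic sketch; Squarespace generates the actual file for you, so its exact contents may differ):

    # Applies to every crawler ("*"); disallows the entire site
    User-agent: *
    Disallow: /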

Remember: Disallowed pages can still be indexed if they are linked from elsewhere. If a site has already been indexed, first remove the pages from the index and then block crawling.

Content Removal Tools

If content is already in Google’s index and you want it removed, you can use Google’s URL removal tool. The URL will usually be dropped within 24 hours. This is a temporary fix; removal requests expire after about six months. For permanent removal, combine it with deleting or disabling the content, blocking access, or adding a noindex tag.

See Google’s removal tool and Bing’s removal tool.

Hiding Files and PDFs From Search Engines

In general, uploading a file to a password-protected page will keep the file hidden from crawlers. But if another page links to your uploaded content, then it could be indexed.

If you need to hide a PDF or file that is already showing up in Google search, delete the file from Squarespace, set a page password, then re-upload the document. If a file needs more immediate removal, see Squarespace’s recommended process here.

Ideally, Squarespace would add a feature to the file upload process allowing users to block indexing of PDFs, videos, or images with a simple toggle (adding an X-Robots-Tag).
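For context, X-Robots-Tag is an HTTP response header, which is why it can noindex file types like PDFs, videos, and images that have no HTML head for a meta tag. A hypothetical response for an uploaded PDF would look something like this (illustrative only; Squarespace does not currently expose this setting):

    HTTP/1.1 200 OK
    Content-Type: application/pdf
    X-Robots-Tag: noindex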

Final Thoughts

Conduct regular audits of your site to ensure that the methods you’ve put in place to hide content are still effective. Remember that preventing search engines from indexing content does not mean it’s hidden from users who know the direct URL or password. Always combine the methods above with proper security measures to ensure that private content remains private.


Need help with Squarespace?

Explore our client reviews and schedule a Zoom.