Prevent Page Indexing on Google Sites

There is currently no way to prevent web crawlers from indexing selected pages on a Google Site.

This is a problem because we often want to create page "templates" on Google Sites for re-use, but these get indexed.

There is no guarantee that blocking these pages with robots.txt will prevent them from being indexed. According to Google's Search Central documentation, the only reliable way is the noindex meta tag, which is not currently available in Google Sites: https://developers.google.com/search/docs/crawling-indexing/robots/intro?hl=en
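For reference, the noindex directive described in the linked Search Central documentation is a single tag in a page's <head> (or an X-Robots-Tag: noindex HTTP response header). The sketch below shows it on an ordinary self-hosted page, purely to illustrate what the Google Sites editor currently gives us no way to add; the page title and content are made up for the example:

```html
<!doctype html>
<html>
  <head>
    <!-- The robots meta tag below tells crawlers not to index this page
         even if they are able to fetch it. Google Sites does not expose
         the page <head>, so there is currently no way to add this there. -->
    <meta name="robots" content="noindex">
    <title>Internal page template (do not index)</title>
  </head>
  <body>
    <p>Template content goes here.</p>
  </body>
</html>
```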

Solved
1 ACCEPTED SOLUTION

@icrew that unpublished the entire site, not a single page: there's no granularity to publishing in Google Sites.

@mark337 the way we work around this is to delete the pages we don't want published, publish the site, and then restore the version that still contains those pages from version history.


4 REPLIES

Perhaps it'd be best to not click "Publish" on the site templates, and only share them with the people that need access to them via Drive?

Thanks. Having already published, is there a way to unpublish a specific page? I can't see any documentation for that capability.

Click the down-arrow at the right side of the Publish button and choose "Unpublish".


