The SEO Guy

What site owners should know about Google's core updates

A core update is a broad change to Google's ranking algorithms and infrastructure that can cause significant fluctuations in search results. Google typically ships one or more changes to its search results every day. Most go unnoticed, but they help the results improve gradually.

Occasionally, a change is more noticeable. Google aims to confirm such changes when there is actionable information that site owners, content creators, or others can use. For instance, when the “Speed Update” rolled out, Google provided months of advance notice and guidance.

Several times a year, Google makes extensive changes to its search algorithms and infrastructure. These are referred to as “core updates.” They are intended to ensure that Google keeps presenting searchers with relevant and authoritative content. Core updates may affect Google Discover as well.

Google confirms broad core updates because they often have far-reaching effects. Some sites may see declines or gains during them. Site owners whose pages lose traffic will naturally look for a fix, and Google wants to make sure they don't try to correct the wrong things; often there is nothing to fix at all.

Let's take an in-depth look at the guidelines

The guidelines explain how to build your site so that it can be listed on Google. They fall into the following categories:

Webmaster Guidelines

The Webmaster Guidelines consist of general best practices that help your site surface in Google Search, plus quality guidelines that, if not followed, can result in your page or site being excluded from Search.

Following the General Guidelines will help Google discover, index, and rank your website. We highly advise you to pay special attention to the Quality Guidelines, which describe unethical practices that may result in a site being removed from the Google index or affected by other manual or algorithmic spam actions. If a site has been hit by a spam action, it may no longer appear in search results on Google.com or any of Google's partner sites.

General Guidelines

Assist Google in finding your pages

  1. Make sure that every page on the site can be reached by a link from another findable page. The referring link should include relevant text or, for images, an alt attribute. Crawlable links are <a> tags with an href attribute (see the sketch at the end of this section).
  2. Provide a sitemap file with links to your site's most important pages. Also provide a human-readable page that lists links to these pages, sometimes called a site map page or site index.
  3. Put a reasonable number of links on a page (a few thousand at most).
  4. Make sure your web server correctly supports the If-Modified-Since HTTP header. This lets your server tell Google whether your content has changed since the site was last crawled, which saves bandwidth and other overhead.
  5. Use a robots.txt file on your web server to manage your crawl budget by preventing the crawling of infinite spaces such as search result pages. Keep your robots.txt file up to date.
How to make it easier for Google to find your site:
  • Ask Google to crawl your pages.
  • Make sure that sites that should know about your pages are aware that your site is online.
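
As a rough sketch of the first point above, here is what a crawlable link looks like compared with a script-only one. The URLs and file names are invented for the example:

    <!-- Crawlable: an <a> tag with an href attribute and descriptive anchor text -->
    <a href="/coffee/espresso-machines">Browse our espresso machines</a>

    <!-- Crawlable image link: the alt attribute stands in for anchor text -->
    <a href="/coffee/espresso-machines">
      <img src="/img/espresso.jpg" alt="Espresso machines">
    </a>

    <!-- Hard to crawl: no href attribute, navigation handled only by JavaScript -->
    <span onclick="location.href='/coffee/espresso-machines'">Espresso machines</span>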

Assist Google in comprehending your pages

  1. Create a useful, information-rich site, and write pages that describe your content clearly and accurately.
  2. Think about the words people would type into a search engine to find your pages, and ensure that those words are on your site.
  3. Ensure that your alt attributes and <title> elements are specific, descriptive, and accurate (see the markup sketch after this list).
  4. Make sure your site has a clear conceptual page hierarchy.
  5. Use the best practices suggested for videos, images, and structured data.
  6. If you use a content management system such as Wix or WordPress, make sure it generates pages and links that search engines can crawl.
  7. Let Google crawl all of your site's assets that significantly affect how pages are rendered, such as CSS and JavaScript files, so that Google can fully understand your pages. Google's indexing system renders a web page as a user would see it, including images, CSS, and JavaScript files. Use the URL Inspection tool to see which parts of a page Googlebot can't crawl, and the robots.txt Tester tool to debug the directives in your robots.txt file.
  8. Let search bots crawl your site without session IDs or URL parameters that track their path through the site. These techniques are useful for tracking how individual users move around the site, but bots access it very differently. Using them may leave your site incompletely indexed, because bots may not be able to eliminate URLs that look different but point to the same page.
  9. Google can find HTML content hidden inside navigational elements such as tabs or expandable sections. Even so, make your most important information visible in the default page view.
  10. Do what you can to ensure that ad links on your pages don't affect search engine rankings. Use a robots.txt disallow rule, rel="nofollow", or rel="sponsored" to keep crawlers from following ad links (an example appears in the sketch after this list).
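
Items 3, 5, and 10 can be illustrated in one small page skeleton. This is only a sketch with made-up values; the product, URLs, and JSON-LD fields are examples, not a required template:

    <head>
      <!-- Item 3: a specific, descriptive <title> element -->
      <title>Hand-ground espresso: brewing guide and ratios</title>
    </head>
    <body>
      <!-- Item 3: a descriptive alt attribute on an image -->
      <img src="/img/portafilter.jpg" alt="Portafilter filled with ground espresso">

      <!-- Item 5: structured data (JSON-LD) describing the page content -->
      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": "Hand-ground espresso: brewing guide and ratios"
      }
      </script>

      <!-- Item 10: mark ad links so crawlers don't follow them -->
      <a href="https://ads.example.com/offer" rel="sponsored">Sponsored offer</a>
      <a href="https://partner.example.com/deal" rel="nofollow">Partner deal</a>
    </body>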

Assist visitors in using your pages

  1. Try to use text rather than images to display important names, content, or links. If you must use images for text, use the alt attribute to include a few descriptive words.
  2. Make sure that all links lead to real web pages. Use valid HTML.
  3. Make sure your page loads quickly. Sites that load quickly make users happy and improve the web. Google suggests that you test how well your page works using tools like PageSpeed Insights and Webpagetest.org.
  4. Make sure your site looks good on all shapes and sizes of desktops, tablets, and smartphones. Use the Mobile-Friendly Test to see how well your pages work on mobile devices and determine what needs fixing.
  5. Make sure that your site looks right in all browsers.
  6. If you can, use HTTPS to secure the connections to your site. Encrypting the user's interactions with your website is good practice for communication on the web.
  7. Make sure your pages are usable for visitors with visual impairments, for example, by testing them with a screen reader.

Keep URLs simple

  • Keep URLs simple. Organize your content so that URLs are logical and human-readable, and use meaningful words in URLs instead of ID numbers.
  • If your site is multi-regional, use a URL structure that can be geo-targeted, such as locale-specific URLs (a short hreflang sketch follows this list).
  • Overly complex URLs, especially ones with many parameters, can cause problems for crawlers by creating large numbers of URLs that point to the same or similar content. As a result, Googlebot may consume far more bandwidth than necessary or fail to index all of your site's content.
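
For the multi-regional point above, locale-specific URLs are usually paired with hreflang annotations so Google knows which URL serves which audience. A minimal sketch, assuming a hypothetical example.com with English and German versions of the same page:

    <!-- Placed in the <head> of each language version of the page -->
    <link rel="alternate" hreflang="en" href="https://example.com/en/widgets">
    <link rel="alternate" hreflang="de" href="https://example.com/de/widgets">
    <link rel="alternate" hreflang="x-default" href="https://example.com/widgets">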

For URL structure:

  • Use robots.txt to keep Googlebot away from problematic URLs (see the sketch after this list).
  • Consider blocking the crawling of dynamically generated URLs, such as search result pages or calendars with infinite spaces.
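
As a sketch of both points, a robots.txt file that keeps Googlebot out of infinite or duplicate-heavy spaces might look like the following. The paths are placeholders for whatever problematic URLs your site actually generates:

    User-agent: Googlebot
    # Keep the crawler out of internal search results (an infinite space)
    Disallow: /search
    # Block calendar pages that can be paged forever
    Disallow: /calendar/
    # Block URLs whose only difference is a tracking parameter
    Disallow: /*?sessionid=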

Avoid interstitials and dialogues

  • Interstitials are full-page overlays and dialogues are part-page overlays; both can obscure the underlying content.
  • Interrupting users with obtrusive interstitials may frustrate them and damage their faith in your website.

Unblock Googlebot

Blocking Googlebot from a site damages Google's ability to crawl and index its content and can lead to a drop in search rankings.
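
Accidental blocks usually live in robots.txt. The sketch below contrasts a rule that shuts every crawler out of the whole site (something to remove if you find it) with a narrower rule that only protects a private directory:

    # Blocks all crawlers, including Googlebot, from the entire site
    User-agent: *
    Disallow: /

    # A narrower rule that leaves the rest of the site crawlable
    User-agent: *
    Disallow: /private/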

Site's SafeSearch

Many searchers don't want explicit results. Google's SafeSearch filter lets users exclude explicit content from search results. Following this guide helps Google understand your site and content, which in turn helps SafeSearch screen it correctly.

SafeSearch optimization

These tips help Google identify explicit web pages, so users see the results they want or expect and aren't shocked by what appears. If your site also has non-explicit pages, the following approaches help Google's systems recognize that.

Mark explicit pages with metadata

The publisher marking explicit pages with metadata (or indicating it in HTTP headers) is a significant signal that helps Google's systems detect them.
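
The usual way to provide that metadata is a rating meta tag on each explicit page (or an equivalent HTTP header). A minimal sketch; check Google's current SafeSearch documentation for the exact values it supports:

    <head>
      <!-- Signals to SafeSearch that this page contains adult content -->
      <meta name="rating" content="adult">
    </head>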

When publishing documents and photographs online, you may unwittingly expose confidential information. Some document formats may include hidden or censored information that search engines can access.

Putting text in a tiny font, choosing a font color that matches the backdrop, or covering text with an image may render things invisible to the human eye, but search engines can still index and find it.

Some document types feature hidden information. They may include the document’s change history, showing redacted or amended text. They may keep cropped or censored photographs in full. Unseen information in a file may list who accessed or altered it.

All of this information can remain when a document is exported or converted to another format. It is important to remove hidden or sensitive information from a file before making it public.

How to handle unredacted or incorrectly redacted documents in Search

  • Remove documents from Search using the Removals tool for your verified site.
  • Remove several documents at once using a URL prefix. For verified sites, URL removal takes effect in less than a day, which quickly stops the document from appearing in search results.
  • Host the properly redacted document at a different URL.
  • Use the Removals or Outdated Content tools to change Google’s search results.

Google's AMP guidelines

AMP pages must follow all of Google's general guidelines for Google-friendly sites. This section adds the AMP-specific requirements for Google Search.

  • AMP pages must use AMP HTML.
  • AMP pages must have the same content and actions as their canonical counterparts.
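
The second requirement is normally backed up by linking the two versions of the page to each other. A minimal sketch, assuming the canonical page lives at example.com/article and the AMP version at example.com/article.amp:

    <!-- On the canonical page (https://example.com/article) -->
    <link rel="amphtml" href="https://example.com/article.amp">

    <!-- On the AMP page (https://example.com/article.amp) -->
    <link rel="canonical" href="https://example.com/article">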

Quality guidelines

The following guidelines describe forbidden tactics that may result in your page or website being excluded from search results. These practices may also result in manual action against your site.

  • Link schemes

  • Sneaky redirects

  • Cloaking

  • Thin content

  • Paid links

  • Doorway pages

  • Irrelevant keywords

  • Hidden text and links

  • Creating pages with malicious behavior

  • Scraped content

  • Automated queries

  • Affiliate programs without sufficient added value

  • User-generated spam

  • Report spam, paid links, or malware

  • Prevent abuse on your site and platform