
Google Search Console Complete Guide For SEO

Google Search Console is an essential tool for SEO, providing valuable insight into a website’s or page’s organic search performance.

Understanding how users search, gauging how your site performs in Google Search, and acting on recommendations for improvement are all pivotal aspects of an SEO strategy.

Formerly recognized as Google Webmaster Tools, Google Search Console remains the go-to resource for SEO professionals, offering indispensable insights and ensuring technical robustness.

What Is Google Search Console (GSC)?

Google Search Console, or GSC for short, serves as a complimentary offering from Google, enabling website owners to track the overall health and performance of their sites through data sourced directly from Google.

Within its array of features, GSC offers a range of invaluable reports, such as:

  1. Impressions and Clicks.
  2. Indexation status.
  3. Link analysis.
  4. Manual Actions overview.
  5. Core Web Vitals (CWV) assessment.

Furthermore, GSC empowers site owners with the ability to undertake various site-related actions, such as:

  1. Submitting a sitemap.
  2. Requesting the removal of specific URLs from the index.
  3. Inspecting URLs for potential indexing issues.

Moreover, GSC proactively keeps verified owners and users informed via email updates, highlighting any crawl errors, accessibility challenges, or performance discrepancies.

It’s worth noting that while GSC now offers an extended data retention period of up to 16 months, data collection begins only after verification of ownership for the relevant property has been completed.

How To Get Started With GSC

Beginning your journey with Google Search Console requires a functional Google account, be it a Gmail account or an email linked to Google Workspace (previously G Suite) for business. Additionally, you must be capable of either inserting code into your website or adjusting domain name servers via your hosting provider.

In this segment, we’ll address the following topics:

  1. Verifying site ownership within GSC.
  2. Adding a sitemap to GSC.
  3. Establishing ownership, user roles, and permissions.
  4. Understanding dimensions and metrics.

How To Verify Ownership

Before you dive into Google Search Console, be aware that, because of the sensitive data and controls involved, Google requires site owners to complete one of several verification steps to confirm ownership.

Here’s a brief guide to get started:

  1. Navigate to the Google Search Console page.
  2. Click on “Start Now” to begin the verification process.
  3. Choose the type of property you wish to verify.

Now, it’s essential to pause and delve into the two distinct property types in GSC: Domain and URL Prefix.

Domain

Verifying a Domain property first is advisable, as it covers all subdomains, both protocols (http:// and https://), and every subfolder on your site.

For this property type, there are two verification methods available: TXT and CNAME. Both necessitate access to your site’s Domain Name System (DNS) records, either by you or your site engineer, to effect changes.

For TXT verifications (recommended):

  1. Copy the text provided in the TXT record field within Google Search Console (GSC).
  2. Proceed to your domain’s DNS management platform, typically hosted by your hosting provider.
  3. Create a new DNS record for your domain, setting the Type to TXT.
  4. Paste the verification TXT obtained from GSC into the designated Record field.
  5. Save the Record.
  6. Allow some time for the changes to propagate across DNS servers.
  7. Return to GSC and click on “Verify” to confirm the addition of the TXT record to your DNS.

Please note that propagation of this DNS change can take anywhere from a few minutes to several days. If the record isn’t immediately verifiable, you can click “Verify Later” and check back once it has propagated.

For CNAME verifications:

  1. Copy the CNAME label from Google Search Console (GSC).
  2. Paste the copied CNAME label into the Name field of a new CNAME record within your site’s DNS configuration.
  3. Copy the CNAME Destination/Target content from GSC.
  4. Paste the CNAME Destination/Target content into the Record field within your DNS configuration.
  5. Save the Record.
  6. Allow some time for the changes to propagate across DNS servers.
  7. Return to GSC and click on “Verify” to confirm the addition of the CNAME record to your DNS.

As with TXT verifications, propagation of this DNS change can take anywhere from a few minutes to several days. If the record isn’t immediately verifiable, you can click “Verify Later” and check back once it has propagated.
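If you’d rather not keep clicking “Verify” while you wait, you can check propagation yourself. Below is a minimal sketch in Python, assuming the dnspython package is installed and using example.com as a stand-in for your own domain; the same idea applies to the CNAME method by querying the CNAME label GSC gave you.

```python
# Minimal propagation check for the GSC domain verification record.
# Assumptions: dnspython is installed (pip install dnspython) and
# "example.com" is a placeholder for your own domain.
import dns.resolver

DOMAIN = "example.com"  # replace with your domain

try:
    for answer in dns.resolver.resolve(DOMAIN, "TXT"):
        record = answer.to_text().strip('"')
        # GSC's TXT verification values start with this prefix.
        if record.startswith("google-site-verification="):
            print("Verification TXT record found:", record)
            break
    else:
        print("TXT records exist, but no google-site-verification value yet.")
except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
    print("No TXT records returned yet - the change may still be propagating.")
```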

After successfully verifying your domain, you can proceed to verify additional properties associated with this domain using the URL Prefix property type.

URL Prefix

This verification method comes into play when you encounter difficulties accessing your domain’s DNS records or when you aim to authenticate particular URL paths within an existing Domain verification.

The URL Prefix property type lets you verify a specific protocol, subdomain, or subfolder, such as https://www.example.com/ or https://www.example.com/blog/.

It’s important to note that this verification method yields data that pertains solely to the designated prefix.

While smaller websites may suffice with a single verification, larger sites may opt to monitor site performance and metrics separately for subdomains and subdirectories to ensure a comprehensive dataset.

Google Search Console (GSC) offers five options for verifying your site or specific sections using the URL Prefix method:

  1. HTML Page: This method enables you to upload the .html file directly to your site’s root directory using a free FTP client or your hosting platform’s cPanel file manager.
  2. HTML Tag: By inserting the provided HTML tag into the <head> section of your homepage, you can verify your site. Many Content Management System (CMS) platforms such as WordPress and Wix facilitate adding this tag through their interfaces (see the sketch after this list for a quick way to confirm the tag is in place).
  3. Google Analytics: If you’ve already verified your site on Google Analytics, you can leverage that verification to add your site to GSC.
  4. Google Tag Manager: Similarly, if you’re already utilizing Google’s Tag Manager system, you can verify your site using the tags already embedded on your site.
  5. DNS Configuration: If you’ve previously verified your site using the TXT or CNAME methods as described earlier, you can verify subsections of your site using that verification method. This method is suitable for verifying subdomains or subdirectories.
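Before clicking “Verify” for the HTML File or HTML Tag methods, it can save a round trip to confirm the file or tag is actually live. Here’s a small sketch, assuming the requests package is installed; the site URL and file name are placeholders for the values GSC gives you.

```python
# Pre-flight check for the HTML File and HTML Tag verification methods.
# Assumptions: requests is installed; SITE and HTML_FILE are placeholders.
import requests

SITE = "https://www.example.com"
HTML_FILE = "google1234567890abcdef.html"  # hypothetical file name from GSC

# HTML File method: the uploaded file should be reachable at the site root.
file_resp = requests.get(f"{SITE}/{HTML_FILE}", timeout=10)
print("Verification file status:", file_resp.status_code)  # expect 200

# HTML Tag method: the meta tag should be present in the homepage's <head>.
home_resp = requests.get(SITE, timeout=10)
if 'name="google-site-verification"' in home_resp.text:
    print("google-site-verification meta tag found on the homepage.")
else:
    print("Meta tag not found - check that it was added to the <head> section.")
```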

The verification process for these methods may take anywhere from a few minutes to several days to complete. If you need more time, click the “Verify Later” button and return to the process when it’s convenient; the site or section will remain listed in the “Not Verified” section of your account’s properties until verification is complete.

How To Add A Sitemap In GSC


Although Googlebot will eventually discover your website’s XML sitemap, you can speed up the process by directly submitting your sitemaps through Google Search Console (GSC).

To submit a sitemap to GSC, follow these steps (a scripted alternative using the Search Console API is sketched after the list):

  1. Copy the URL of the sitemap you wish to add. The format of most XML sitemaps resembles “https://www.domain.com/sitemap.xml.” However, sitemaps generated automatically by content management systems like WordPress might have a different syntax, such as “https://www.domain.com/sitemap_index.xml.”
  2. In Google Search Console, navigate to the “Sitemaps” section located in the left column.
  3. Paste your sitemap URL into the “Add a new sitemap” field at the top of the page, then click “Submit.”
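If you manage many properties or regenerate sitemaps automatically, the same submission can be scripted through the Search Console API. The sketch below assumes the API is enabled in a Google Cloud project and that a service account (service-account.json) has been granted access to the property; all names and URLs are placeholders.

```python
# Sketch: submit a sitemap through the Search Console API.
# Assumptions: google-api-python-client and google-auth are installed, and
# the service account has been added to the property in GSC.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters"],
)
service = build("searchconsole", "v1", credentials=creds)

SITE_URL = "sc-domain:example.com"  # or "https://www.example.com/" for a URL-prefix property
SITEMAP_URL = "https://www.example.com/sitemap.xml"

# Submit the sitemap, then list registered sitemaps to confirm it was accepted.
service.sitemaps().submit(siteUrl=SITE_URL, feedpath=SITEMAP_URL).execute()
for entry in service.sitemaps().list(siteUrl=SITE_URL).execute().get("sitemap", []):
    print(entry["path"], "- last submitted:", entry.get("lastSubmitted"))
```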

You can include as many sitemaps as needed for your site. Many websites utilize separate sitemaps for videos, articles, product pages, and images.

An added advantage of integrating your sitemaps into this interface is the ability to compare the number of pages submitted to Google with the number of indexed pages.

To view this comparison, click on the three vertical dots next to your sitemap and choose “See page indexing.”

On the resulting page, you’ll find the number of indexed pages (highlighted in green) and the pages labeled as “Not Indexed” (shown in gray), along with a list detailing the reasons why those pages are not indexed.

Setting Users, Owners, And Permissions

Controlling access to data and functionality within Google Search Console (GSC) is crucial. Certain features, such as the Removals tools, can pose significant risks if mishandled.

User permissions in GSC are designed to limit access to specific parts of the platform:

  1. Owner: There are two types of owners. One is a user who has verified their ownership through one of the listed verification methods, while the other is someone to whom ownership has been delegated by an owner. Owners have complete control over the property, including the ability to remove it entirely from GSC.
  2. Full: Users with full access have almost all the functionalities of an owner. However, if a full user removes a property, it only removes it from their list of sites within GSC, not from the platform entirely.
  3. Restricted: Restricted users can only view the data within GSC. They do not have the ability to make any changes to the account or property settings.

It’s worth mentioning that Google has recently bolstered Search Console security by introducing a new feature for managing ownership tokens. This feature can be found under Settings > Users and permissions > Unused ownership tokens.

Ownership tokens are essentially unique codes found in the HTML tags placed in your website’s <head> section, the HTML files you upload, or the DNS TXT record values used during website verification.

Consider a scenario where a website has multiple owners verified via HTML tags or uploaded files, and one of them departs the company. Even if you remove that user from Search Console, they could regain ownership as long as their old verification token remains in place and appears on the unused ownership tokens page. This new feature is vital because it lets website owners remove outdated verification tokens and prevent former owners from re-establishing access.

Dimensions And Metrics

Within Google Search Console, information is organized into Dimensions and Metrics. These reports offer insights into your site’s performance, covering aspects like page indexation, ranking, and traffic.

The Performance report categorizes data into Dimensions like Pages, Queries, Countries, and Devices, providing meaningful segments. Metrics within this report encompass data such as Impressions and Clicks.
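To see how Dimensions and Metrics fit together in practice, the sketch below pulls Performance-report data through the Search Console API’s searchanalytics.query method. The credentials setup mirrors the sitemap example earlier, and the property and date range are placeholders.

```python
# Sketch: query the Performance report programmatically (Dimensions: query,
# page; Metrics: clicks, impressions, CTR, position).
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=["https://www.googleapis.com/auth/webmasters"])
service = build("searchconsole", "v1", credentials=creds)
SITE_URL = "sc-domain:example.com"  # placeholder property

body = {
    "startDate": "2024-01-01",  # placeholder date range
    "endDate": "2024-03-31",
    "dimensions": ["query", "page"],
    "rowLimit": 100,
}
response = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()

for row in response.get("rows", []):
    query, page = row["keys"]  # one key per requested dimension, in order
    print(f"{query} | {page} | clicks={row['clicks']} impressions={row['impressions']} "
          f"ctr={row['ctr']:.2%} position={row['position']:.1f}")
```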

In the Pages report, Dimensions might detail reasons for pages not being indexed, while Metrics could quantify the number of affected pages per reason.

Regarding Core Web Vitals, Dimensions would represent performance levels like Poor, Needs Improvement, and Good, while Metrics would indicate the quantity of pages falling under each category.

Troubleshooting With GSC

Google Search Console (GSC) proves invaluable for SEO professionals as it allows us to diagnose and assess pages through Google’s lens. With features ranging from crawlability checks to page experience evaluations, GSC provides a diverse toolkit for troubleshooting site issues.

Crawling Issues

Before a page can rank in search results, it must be crawled and indexed, so it first has to be accessible to Googlebot for crawling.

Even if you aren’t experiencing obvious crawling problems, it’s advisable to routinely check the Crawl Stats report in Google Search Console (GSC). This report breaks down Googlebot’s requests by response and highlights issues such as server errors and “Not found” (404) responses.

To use the report effectively, click any response-code row to see the affected URLs. For pages returning 404 response codes, for example, click “Not found” to review those specific pages.

Indexation Issues

If your web pages aren’t indexed, they won’t appear in search results for your most crucial keywords. There are a couple of methods to identify which pages on your site aren’t indexed using Google Search Console (GSC).

Initially, you might instinctively check the Pages report, accessible in GSC’s left sidebar. While this report provides a wealth of information about both indexed and non-indexed pages, it can be somewhat misleading.

The “Not Indexed” section of the report includes pages intentionally left out of indexing, such as tag pages on your blog or pages designated for logged-in users only.

For a more accurate assessment of your site’s indexation issues, the Sitemaps report is the most reliable resource.

To access this information:

  1. Click on “Sitemaps” on the left-hand side of GSC.
  2. Locate your site’s primary sitemap and click on the three vertical dots next to it.
  3. Select “See page indexing.”

The resulting report resembles the Pages report but focuses on pages your site deems important enough to include in the sitemaps submitted to Google.

Here, you can scrutinize the “Reasons” column in the “Why pages aren’t indexed” table.

For instance, you might have multiple pages that have been crawled but are presently not indexed. To assess one of these pages:

  1. Click on the line item labeled “Crawled – currently not indexed” to view the list of pages.
  2. Hover over one of the listed pages until three icons appear after the URL.
  3. Click on the “Inspect URL” icon.

On this page, you have the option to manually Request Indexing. Additionally, you can find the “TEST LIVE URL” button situated on the right side of the page. Clicking this button leads to a page indicating whether your page is accessible to Google. To check the test results for the page, click on “View Tested Page.”

The resulting pop-in window displays the HTML captured from the page, a smartphone rendering of the page, and “More Information” regarding the page, including any page resource issues or JavaScript console errors.

The HTML code shown in the inspection tool reflects what Googlebot could crawl and render. This is particularly significant for JavaScript-based websites, where the content may not initially exist within the static HTML but is loaded via JavaScript (using REST API or AJAX).

Analyzing the HTML code allows you to assess whether Googlebot has properly accessed your content. If your content is absent, it suggests that Google may have encountered difficulties crawling your webpage, potentially impacting your search rankings negatively.

By exploring the “More Information” section, you can pinpoint any resources that Googlebot couldn’t load. For instance, you might be inadvertently blocking specific JavaScript files crucial for content loading via robots.txt.

Once you’ve verified everything appears in order, proceed to the main URL inspection page and click on “Request Indexing”. The search bar at the top of every page in Google Search Console (GSC) enables you to inspect any URL within your verified domain.
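If you need to check many URLs, the URL Inspection API exposes the same diagnostics as the in-console tool, although it cannot trigger “Request Indexing,” which remains a manual action. Below is a rough sketch reusing the same credential setup as the earlier API examples, with placeholder URLs; the response field names are assumptions based on the documented shape, so confirm them against the current API reference.

```python
# Sketch: inspect a URL's index status via the URL Inspection API.
# Note: the API reports status only; "Request Indexing" is console-only.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=["https://www.googleapis.com/auth/webmasters"])
service = build("searchconsole", "v1", credentials=creds)

body = {
    "inspectionUrl": "https://www.example.com/some-page/",  # placeholder page
    "siteUrl": "sc-domain:example.com",                     # placeholder property
}
result = service.urlInspection().index().inspect(body=body).execute()

# Field names assumed from the documented response; verify against current docs.
status = result.get("inspectionResult", {}).get("indexStatusResult", {})
print("Verdict:       ", status.get("verdict"))
print("Coverage state:", status.get("coverageState"))
print("Last crawled:  ", status.get("lastCrawlTime"))
```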

Additionally, it’s advisable to ensure Google can access your robots.txt file smoothly. Navigate to Settings > robots.txt to check for any issues. If Google encounters difficulties fetching your robots.txt, such as due to a firewall blocking access, it will temporarily halt crawling your website for 12 hours. Failure to resolve the issue may lead Google to treat your site as if there’s no robots.txt file at all.
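A quick way to spot-check this from outside your own network is to fetch robots.txt yourself. Here’s a tiny sketch, assuming the requests package is installed and example.com stands in for your domain.

```python
# Quick check that robots.txt is reachable (a 5xx here can pause crawling).
import requests

resp = requests.get("https://www.example.com/robots.txt", timeout=10)
print("robots.txt status code:", resp.status_code)  # 200 (or 404) is fine; 5xx is not
```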

Performance Issues

If you’re encountering challenges with getting your pages indexed by Google or ranking well in search engine results, it’s worthwhile to assess the Core Web Vitals (CWV) across your website.

These metrics gauge the actual user experience by leveraging data from the Chrome User Experience Report (CrUX).

CWV focuses on three key usability metrics:

  1. Cumulative Layout Shift (CLS): This metric evaluates how much the layout of your page shifts as elements load. If images cause text to jumble or shift on your page during loading, it can lead to a subpar user experience.
  2. Interaction to Next Paint (INP): INP replaced First Input Delay (FID) as the responsiveness metric and measures how long it takes your page to respond when a user interacts with it, such as clicking, tapping, or pressing a key.
  3. Largest Contentful Paint (LCP): This metric assesses how long it takes for the primary content on your page to fully render for the user.

To identify issues with your pages’ Core Web Vitals (CWV) scores in Google Search Console (GSC), follow these steps:

  1. Navigate to GSC and click on “Core Web Vitals” in the left column.
  2. Choose either the Mobile or Desktop graph and click on “Open Report.”
  3. Review the “Why URLs aren’t considered good” table and click on a specific line item.
  4. Select an Example URL to inspect.
  5. Click on the three vertical dots next to an Example URL in the pop-in window.
  6. Choose “Developer Resources – PageSpeed Insights.”

This action directs you to PageSpeed Insights, where you can access a Diagnostics report highlighting issues potentially affecting your CWV. This report presents a prioritized list of issues encountered, offering valuable insights into optimizing your website’s performance.
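The same field data can be pulled in bulk through the PageSpeed Insights API (v5), which is handy when you have many example URLs to review. Here’s a sketch, assuming requests is installed; the URL is a placeholder, and the loop simply prints whichever CrUX metrics the API returns rather than assuming specific key names.

```python
# Sketch: fetch CrUX field data for a page from the PageSpeed Insights API v5.
# Assumptions: requests is installed; the URL is a placeholder; an API key is
# only needed for heavier usage.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()
field_data = data.get("loadingExperience", {}).get("metrics", {})

for metric_name, values in field_data.items():
    # Each metric carries a percentile and a category such as FAST, AVERAGE, or SLOW.
    print(metric_name, "-", values.get("category"), "-", values.get("percentile"))
```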

Utilize these diagnostic findings to guide your developers, designers, and engineers in addressing these issues effectively.

Hint: Ensure that your website employs HTTPS, as it enhances website security and contributes to better ranking. Implementing this is relatively straightforward, as most hosting providers offer free SSL certificates that can be installed with a single click.
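One quick way to confirm the redirect side of that setup is to request the plain-HTTP version of your homepage and make sure it lands on HTTPS. A small sketch, assuming requests is installed and example.com is a placeholder:

```python
# Quick check that plain-HTTP requests redirect to HTTPS.
import requests

resp = requests.get("http://www.example.com/", allow_redirects=True, timeout=10)
print("Final URL:", resp.url)
print("Served over HTTPS:", resp.url.startswith("https://"))
```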

Security And Manual Actions

If you’re facing challenges with indexation and ranking, it’s plausible that Google may have encountered a security concern or initiated manual action against your site.

In Google Search Console (GSC), you can find reports on both Security and Manual Actions located in the left column of the page.

If you’re flagged for either of these issues, your problems with indexation and ranking won’t be resolved until they’re addressed.

5 Ways You Can Use GSC For SEO

You can harness Google Search Console (GSC) for SEO in your daily routines through five key methods.

1. Measuring Site Performance

The upper section of the Search Console Performance Report offers various insights into a website’s performance in search, encompassing features such as featured snippets.

The Performance Report allows exploration of four search types:

  1. Web
  2. Image
  3. Video
  4. News

The Web search type is displayed by default in the Search Console.

To view different search types, simply click the Search Type button and make your selection.

A valuable function is the option to compare the performance of two search types directly within the graph.

At the top of the Performance Report, four key metrics stand out:

  1. Total Clicks
  2. Total Impressions
  3. Average CTR (click-through rate)
  4. Average position

By default, the Total Clicks and Total Impressions metrics are pre-selected.

To view specific metrics on the bar chart, simply click within the dedicated tabs for each metric and choose the ones you wish to display.

Impressions

Impressions signify the frequency with which a website appears in search results. If a user views a URL without clicking on it, it registers as an impression.

Moreover, if a URL is positioned at the bottom of the search page and remains unseen because the user doesn’t scroll that far, it still counts as an impression.

High impressions are advantageous, as they indicate that Google is surfacing the site in search results.

The significance of the impressions metric is enhanced by considering the Clicks and Average Position metrics.

Clicks

The clicks metric indicates the frequency with which users clicked from search results to the website. A high number of clicks, coupled with a high number of impressions, is favorable.

However, a scenario of low clicks despite a high number of impressions suggests room for improvement. While not necessarily bad, it implies that the site may benefit from enhancements to attract more traffic.

The clicks metric gains greater significance when analyzed alongside the Average CTR and Average Position metrics.

Average CTR

The Average CTR (click-through rate) is a percentage that indicates the frequency with which users click from search results to the website.

A low CTR suggests areas for improvement to boost visits from search results, such as revising the page title or updating the meta description.

Conversely, a higher CTR signifies effective performance with users.

This metric becomes more meaningful when analyzed alongside the Average Position metric.

Average Position

The Average Position metric indicates where a website typically ranks in search results. A position between 1 and 10 is excellent, signaling prominent visibility.

When the average position falls in the twenties (20 – 29), it suggests that the result appears further down the search page, requiring users to scroll. While not detrimental, it’s suboptimal and may indicate the need for enhancements to break into the top 10.

An average position greater than 30 generally implies that the pages could benefit greatly from improvements.

Additionally, it could suggest that the site ranks for numerous keyword phrases, with some ranking low and a few exceptional ones ranking very high.

Examining all four metrics (Impressions, Clicks, Average CTR, and Average Position) together provides a comprehensive insight into the website’s performance.

The Performance Report serves as a crucial starting point for swiftly grasping the website’s search performance. It acts as a mirror, reflecting the site’s effectiveness or areas needing improvement.

2. Finding “Striking Distance” Keywords

Within the Performance section of Google Search Console, the Search Results report provides insights into Queries and their average Position.

Typically, queries where many companies rank within the top three positions are branded terms. However, those falling within ranks five to 15 are labeled as “striking distance” terms.

To leverage this data effectively, prioritize these striking distance terms based on impressions. Consider refreshing your content to incorporate these queries into the language of relevant pages.

To uncover these terms, open the Performance > Search results report, make sure the Average position metric is enabled, and filter or export the Queries table for positions between five and 15, as sketched below.
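If you prefer to work outside the console, one option is to export the Queries table and filter it locally. The sketch below assumes a CSV export named Queries.csv with the usual column headers (“Top queries”, “Clicks”, “Impressions”, “CTR”, “Position”); check the headers in your own export before running it.

```python
# Sketch: find "striking distance" queries (average position 5-15) in a GSC
# Queries export, sorted by impressions. Column names are assumptions taken
# from a typical export - verify them against your own file.
import csv

striking = []
with open("Queries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        position = float(row["Position"])
        if 5 <= position <= 15:
            striking.append((int(row["Impressions"]), row["Top queries"], position))

# Highest-impression opportunities first.
for impressions, query, position in sorted(striking, reverse=True):
    print(f"{query}: position {position:.1f}, {impressions} impressions")
```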

3. Request Faster Indexation Of New Pages

If you’re in urgent need of indexing for a priority page, the URL Inspection tool within Google Search Console offers a swift solution.

To request indexing, paste the page’s URL into the inspection search bar at the top of GSC, wait for the inspection result to load, and then click “Request Indexing.”

While this process prioritizes your URL for crawling and indexing by Googlebot, it doesn’t guarantee immediate indexing.

If your page remains unindexed despite this request, additional investigation may be required.

Please note: Each verified property in Google Search Console is restricted to 50 indexing requests per day.

4. Bulk Data Export

Google Search Console (GSC) historical data is constrained to a maximum of 16 months. However, you can address this limitation by exporting GSC data to BigQuery, where it can be stored indefinitely. This enables access to extensive historical data whenever needed. It’s essential to note that this action isn’t retroactive, so initiating it promptly is advisable.
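Once the bulk export is running, the data can be queried like any other BigQuery dataset. Here’s a rough sketch, assuming the google-cloud-bigquery package, default credentials, and an export landing in a dataset called searchconsole with a searchdata_url_impression table; the project, dataset, table, and column names are assumptions, so check them against your own export’s schema.

```python
# Sketch: summarize daily clicks and impressions from a GSC bulk export.
# Project, dataset, table, and column names below are assumptions - confirm
# them against the schema BigQuery shows for your export.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT
      data_date,
      SUM(clicks)      AS clicks,
      SUM(impressions) AS impressions
    FROM `your-project.searchconsole.searchdata_url_impression`
    GROUP BY data_date
    ORDER BY data_date
"""

for row in client.query(sql).result():
    print(row.data_date, row.clicks, row.impressions)
```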

5. Bonus: Integration With Other SEO Tools

While Google Search Console (GSC) offers a plethora of insights, the real magic unfolds when you integrate its data with various SEO tools and platforms available in the market.

By integrating GSC data into these tools, you gain a sharper understanding of your site’s performance and potential.

From desktop crawlers such as Screaming Frog and Sitebulb to enterprise server-driven crawlers like Lumar and Botify, incorporating your GSC information enhances the depth of your page audits, covering aspects like crawlability, accessibility, and page experience factors.

Integration with prominent SEO tools like Semrush and Ahrefs yields comprehensive ranking information, facilitates content ideation, and enriches link data.

Moreover, the convenience of consolidating all your data into a single view cannot be overstated. With time being a precious commodity, these integration options significantly streamline your workload.

GSC Is An Essential Tool For Site Optimization

Over nearly two decades, Google Search Console has undergone significant evolution. Originally launched as Google Webmaster Tools, it has continuously delivered valuable insights for SEO professionals and site owners alike.

If you’ve yet to integrate it into your daily activities, you’ll swiftly discover its indispensability for making informed decisions regarding your site optimizations moving forward.

Original article from SearchEngineJournal
