Google Search Console is an essential tool for SEO, providing valuable insight into a website’s or page’s organic performance.
Understanding how users search, how your site performs in search engines, and what to improve are all pivotal parts of an SEO strategy.
Formerly known as Google Webmaster Tools, Google Search Console remains the go-to resource for SEO professionals, offering indispensable insights and helping keep sites technically sound.
Google Search Console, or GSC for short, is a free service from Google that lets website owners track the overall health and performance of their sites using data sourced directly from Google.
Within its array of features, GSC offers a range of invaluable reports, such as Performance, Page indexing, Sitemaps, Crawl Stats, Core Web Vitals, Security Issues, and Manual Actions.
Furthermore, GSC lets site owners take various site-related actions, such as submitting sitemaps, requesting indexing of individual URLs, and temporarily removing URLs from search results.
Moreover, GSC proactively keeps verified owners and users informed via email updates, highlighting any crawl errors, accessibility challenges, or performance discrepancies.
It’s worth noting that while GSC now offers an extended data retention period of up to 16 months, data collection begins only after verification of ownership for the relevant property has been completed.
Getting started with Google Search Console requires a working Google account, either a Gmail account or an email linked to Google Workspace (previously G Suite) for business. You must also be able to add code to your website or modify DNS records through your domain or hosting provider.
In this section, we’ll cover verifying ownership of your site, the two property types, submitting sitemaps, and managing users and permissions.
When venturing into Google Search Console, it’s crucial to acknowledge that due to the sensitive nature of the data and processes involved, Google mandates that site owners undertake one of several verification steps to confirm ownership.
Here’s a brief guide to get started: sign in at search.google.com/search-console with your Google account, click “Add Property,” and choose your property type.
Now, it’s essential to pause and delve into the two distinct property types in GSC: Domain and URL Prefix.
Verifying your domain first in GSC is advisable, as it covers all subdomains, both protocol variants (http:// and https://), and every subfolder on your site.
For this property type, there are two verification methods available: TXT and CNAME. Both necessitate access to your site’s Domain Name System (DNS) records, either by you or your site engineer, to effect changes.
For TXT verifications (recommended): copy the TXT record Google provides, add it to your domain’s DNS configuration, then return to GSC and click “Verify.”
Please note that propagation of this DNS change can take anywhere from a few minutes to several days. If the change isn’t immediately verifiable, you can click “Verify Later” and check back later.
For CNAME verifications: select CNAME from the record type dropdown, add the CNAME label and destination Google provides to your DNS configuration, then return to GSC and click “Verify.”
As with TXT records, this DNS change can take anywhere from a few minutes to several days to propagate; if it isn’t immediately verifiable, click “Verify Later” and check back later.
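If you want to confirm the record has propagated before clicking “Verify,” you can look it up yourself. Below is a minimal sketch in Python, assuming the dnspython package (pip install dnspython); the domain and token value are placeholders for your own.

```python
import dns.resolver


def has_verification_record(domain: str, token: str) -> bool:
    """Return True if any TXT record on the domain contains the token."""
    try:
        answers = dns.resolver.resolve(domain, "TXT")
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return False
    for rdata in answers:
        # TXT record data arrives as a tuple of byte strings.
        record = b"".join(rdata.strings).decode()
        if token in record:
            return True
    return False


# Placeholder domain and token; use the value Google gives you.
print(has_verification_record("example.com", "google-site-verification=abc123"))
```

Running this periodically saves you from repeatedly clicking “Verify” while DNS caches catch up.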
After successfully verifying your domain, you can proceed to verify additional properties associated with this domain using the URL Prefix property type.
This verification method comes into play when you encounter difficulties accessing your domain’s DNS records or when you aim to authenticate particular URL paths within an existing Domain verification.
The URL Prefix verification lets you authenticate a specific protocol (e.g., https://example.com/), a subdomain (e.g., https://shop.example.com/), or a subfolder (e.g., https://example.com/blog/).
It’s important to note that this verification method yields data that pertains solely to the designated prefix.
While smaller websites may suffice with a single verification, larger sites may opt to monitor site performance and metrics separately for subdomains and subdirectories to ensure a comprehensive dataset.
Google Search Console (GSC) offers five options for verifying your site or specific sections using the URL Prefix method: an HTML file upload, an HTML meta tag, your Google Analytics tracking code, your Google Tag Manager container snippet, or a DNS record.
Verification for these methods can likewise take anywhere from a few minutes to several days. If you need more time, click “Verify Later” and revisit the process; until then, the site or section appears under the “Not Verified” section of your account’s properties.
Although Googlebot will eventually discover your website’s XML sitemap, you can speed up the process by directly submitting your sitemaps through Google Search Console (GSC).
To submit a sitemap to GSC, open the Sitemaps report from the left sidebar, enter your sitemap URL under “Add a new sitemap,” and click “Submit.”
You can include as many sitemaps as needed for your site. Many websites utilize separate sitemaps for videos, articles, product pages, and images.
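Sitemap submission can also be scripted. Here’s a hedged sketch using the google-api-python-client library and the Search Console API; it assumes you’ve completed an OAuth flow and saved authorized-user credentials (the token.json path and URLs are placeholders).

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Placeholder: authorized-user credentials saved from your own OAuth flow.
creds = Credentials.from_authorized_user_file(
    "token.json", scopes=["https://www.googleapis.com/auth/webmasters"]
)
service = build("searchconsole", "v1", credentials=creds)

# Submit a sitemap; for a Domain property, siteUrl takes the form
# "sc-domain:example.com" instead of a full URL prefix.
service.sitemaps().submit(
    siteUrl="https://www.example.com/",
    feedpath="https://www.example.com/sitemap.xml",
).execute()

# List submitted sitemaps to confirm the submission registered.
sitemaps = service.sitemaps().list(siteUrl="https://www.example.com/").execute()
for entry in sitemaps.get("sitemap", []):
    print(entry["path"], entry.get("lastSubmitted"))
```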
An added advantage of integrating your sitemaps into this interface is the ability to compare the number of pages submitted to Google with the number of indexed pages.
To view this comparison, click on the three vertical dots next to your sitemap and choose “See page indexing.”
On the resulting page, you’ll find the number of indexed pages (highlighted in green) and the pages labeled as “Not Indexed” (shown in gray), along with a list detailing the reasons why those pages are not indexed.
Controlling access to data and functionality within Google Search Console (GSC) is crucial. Certain features, such as the Removals tools, can pose significant risks if mishandled.
User permissions in GSC are designed to limit access to specific parts of the platform: Owners have full control over a property, Full users can view all data and take some actions, and Restricted users have view-only access to most data.
It’s worth mentioning that Google has recently bolstered Search Console security by introducing a new feature for managing ownership tokens. This feature can be found under Settings > Users and permissions > Unused ownership tokens.
Ownership tokens are the unique codes used during website verification: the value in a google-site-verification meta tag in your site’s head, in an HTML file you upload, or in a DNS TXT record.
Consider a scenario where a website has multiple verified owners via HTML tag uploads, and one of them departs the company. If you remove that user from the Search Console, the concern arises that they could still regain access if their token remains in the unused ownership tokens page. This new feature is vital, as website owners now have the ability to remove outdated verification tokens, thereby preventing unauthorized access by former owners.
Within Google Search Console, information is organized into Dimensions and Metrics. These reports offer insights into your site’s performance, covering aspects like page indexation, ranking, and traffic.
The Performance report categorizes data into Dimensions like Pages, Queries, Countries, and Devices, providing meaningful segments. Metrics within this report encompass data such as Impressions and Clicks.
In the Pages report, Dimensions might detail reasons for pages not being indexed, while Metrics could quantify the number of affected pages per reason.
Regarding Core Web Vitals, Dimensions would represent performance levels like Poor, Needs Improvement, and Good, while Metrics would indicate the quantity of pages falling under each category.
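The same Dimension/Metric split is visible in the Search Analytics portion of the Search Console API. Here’s a minimal sketch, reusing the service object and credentials from the sitemap example above; the dates and site URL are placeholders.

```python
# Dimensions segment the rows; Metrics come back on every row.
body = {
    "startDate": "2024-01-01",
    "endDate": "2024-01-31",
    "dimensions": ["query", "page"],  # Dimensions
    "rowLimit": 100,
}
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/", body=body
).execute()

for row in response.get("rows", []):
    query, page = row["keys"]  # one key per requested dimension
    # Metrics: clicks, impressions, ctr, and position.
    print(query, page, row["clicks"], row["impressions"], row["ctr"], row["position"])
```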
Google Search Console (GSC) proves invaluable for SEO professionals as it allows us to diagnose and assess pages through Google’s lens. With features ranging from crawlability checks to page experience evaluations, GSC provides a diverse toolkit for troubleshooting site issues.
Before a webpage can rank in search results, it must be crawled and indexed; and before it can be indexed, it must be accessible for crawling.
Even if you aren’t experiencing crawling problems, it’s advisable to routinely check the Crawl Stats report in Google Search Console (GSC). This report highlights encountered issues such as server errors (5xx), pages returning 404, DNS problems, and trouble fetching robots.txt.
To use this report effectively, open Settings > Crawl stats and click “Open Report,” review the host status and the “By response” breakdown, and click any response type to see the affected URLs.
For pages displaying 404 response codes, click on “Not found” to review these specific pages.
If your web pages aren’t indexed, they won’t appear in search results for your most crucial keywords. There are a couple of methods to identify which pages on your site aren’t indexed using Google Search Console (GSC).
Initially, you might instinctively check the Pages report, accessible in GSC’s left sidebar. While this report provides a wealth of information about both indexed and non-indexed pages, it can be somewhat misleading.
The “Not Indexed” section of the report includes pages intentionally left out of indexing, such as tag pages on your blog or pages designated for logged-in users only.
For a more accurate assessment of your site’s indexation issues, the Sitemaps report is the most reliable resource.
To access this information, open the Sitemaps report, click the three vertical dots next to your submitted sitemap, and select “See page indexing.”
The resulting report resembles the Pages report but focuses on pages your site deems important enough to include in the sitemaps submitted to Google.
Here, you can scrutinize the “Reasons” column in the “Why pages aren’t indexed” table.
For instance, you might have multiple pages that have been crawled but are presently not indexed. To assess one of these pages, click the “Crawled - currently not indexed” reason, select a URL from the list, and click the inspect icon to open the URL Inspection view.
On this page, you have the option to manually Request Indexing. Additionally, you can find the “TEST LIVE URL” button situated on the right side of the page. Clicking this button leads to a page indicating whether your page is accessible to Google. To check the test results for the page, click on “View Tested Page.”
The resulting pop-in window displays the HTML captured from the page, a smartphone rendering of the page, and “More Information” regarding the page, including any page resource issues or JavaScript console errors.
The HTML code shown in the inspection tool reflects what Googlebot could crawl and render. This is particularly significant for JavaScript-based websites, where the content may not initially exist within the static HTML but is loaded via JavaScript (using REST API or AJAX).
Analyzing the HTML code allows you to assess whether Googlebot has properly accessed your content. If your content is absent, it suggests that Google may have encountered difficulties crawling your webpage, potentially impacting your search rankings negatively.
By exploring the “More Information” section, you can pinpoint any resources that Googlebot couldn’t load. For instance, you might be inadvertently blocking specific JavaScript files crucial for content loading via robots.txt.
Once you’ve verified everything appears in order, proceed to the main URL inspection page and click on “Request Indexing”. The search bar at the top of every page in Google Search Console (GSC) enables you to inspect any URL within your verified domain.
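Inspection itself can also be done programmatically through the URL Inspection API in Search Console API v1 (note that requesting indexing is available only in the interface, not the API). A hedged sketch, reusing the service object from the earlier sketches; the URLs are placeholders.

```python
result = service.urlInspection().index().inspect(
    body={
        "inspectionUrl": "https://www.example.com/some-page/",
        "siteUrl": "https://www.example.com/",
    }
).execute()

# The verdict and coverage state mirror what the GSC interface shows.
status = result["inspectionResult"]["indexStatusResult"]
print(status.get("verdict"))        # e.g. PASS or NEUTRAL
print(status.get("coverageState"))  # e.g. "Crawled - currently not indexed"
```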
Additionally, it’s advisable to ensure Google can access your robots.txt file smoothly. Navigate to Settings > robots.txt to check for any issues. If Google encounters difficulties fetching your robots.txt, such as due to a firewall blocking access, it will temporarily halt crawling your website for 12 hours. Failure to resolve the issue may lead Google to treat your site as if there’s no robots.txt file at all.
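A quick way to check what Google would see is to fetch the file yourself. Here is a minimal sketch using the requests package; the URL is a placeholder.

```python
import requests

resp = requests.get("https://www.example.com/robots.txt", timeout=10)
# 200 means the file is reachable; a 5xx response or a blocked/timed-out
# request is the kind of failure that can pause crawling, as described above.
print(resp.status_code)
print(resp.text[:200])
```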
If you’re encountering challenges with getting your pages indexed by Google or ranking well in search engine results, it’s worthwhile to assess the Core Web Vitals (CWV) across your website.
These metrics gauge the actual user experience by leveraging data from the Chrome User Experience Report (CrUX).
CWV focuses on three key usability metrics: loading performance (Largest Contentful Paint), interactivity (Interaction to Next Paint, which replaced First Input Delay), and visual stability (Cumulative Layout Shift).
To identify issues with your pages’ Core Web Vitals (CWV) scores in Google Search Console (GSC), open the Core Web Vitals report, click “Open Report” for Mobile or Desktop, click a row in the issues table, then select an example URL and follow its PageSpeed Insights link.
This action directs you to PageSpeed Insights, where you can access a Diagnostics report highlighting issues potentially affecting your CWV. This report presents a prioritized list of issues encountered, offering valuable insights into optimizing your website’s performance.
Utilize these diagnostic findings to guide your developers, designers, and engineers in addressing these issues effectively.
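Because GSC’s CWV report is built on CrUX field data, you can also pull the same numbers directly from the CrUX API. A hedged sketch, assuming you have a CrUX API key; the key and origin below are placeholders.

```python
import requests

endpoint = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
resp = requests.post(
    endpoint,
    params={"key": "YOUR_API_KEY"},   # placeholder API key
    json={"origin": "https://www.example.com"},
    timeout=10,
)
metrics = resp.json()["record"]["metrics"]
for name in (
    "largest_contentful_paint",
    "interaction_to_next_paint",
    "cumulative_layout_shift",
):
    if name in metrics:
        # p75 is the 75th-percentile value that determines the
        # Good / Needs Improvement / Poor bucket.
        print(name, metrics[name]["percentiles"]["p75"])
```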
Hint: Ensure that your website employs HTTPS, as it enhances website security and contributes to better ranking. Implementing this is relatively straightforward, as most hosting providers offer free SSL certificates that can be installed with a single click.
If you’re facing challenges with indexation and ranking, it’s plausible that Google may have encountered a security concern or initiated manual action against your site.
In Google Search Console (GSC), you can find reports on both Security and Manual Actions located in the left column of the page.
If you’re flagged for either of these issues, your problems with indexation and ranking won’t be resolved until they’re addressed.
You can harness Google Search Console (GSC) for SEO in your daily routines through five key methods.
The upper section of the Search Console Performance Report offers various insights into a website’s performance in search, encompassing features such as featured snippets.
The Performance Report allows exploration of four search types: Web, Image, Video, and News.
The Web search type is displayed by default in the Search Console.
To view different search types, simply click the Search Type button and make your selection.
A valuable function is the option to compare the performance of two search types directly within the graph.
At the top of the Performance Report, four key metrics stand out: Total Clicks, Total Impressions, Average CTR, and Average Position.
By default, the Total Clicks and Total Impressions metrics are pre-selected.
To view specific metrics on the bar chart, simply click within the dedicated tabs for each metric and choose the ones you wish to display.
Impressions signify the frequency with which a website appears in search results. If a user views a URL without clicking on it, it registers as an impression.
Moreover, if a URL is positioned at the bottom of the search page and remains unseen because the user doesn’t scroll that far, it still counts as an impression.
High impressions are advantageous as they indicate Google’s visibility of the site in search results.
The significance of the impressions metric is enhanced by considering the Clicks and Average Position metrics.
The clicks metric indicates the frequency with which users clicked from search results to the website. A high number of clicks, coupled with a high number of impressions, is favorable.
However, a scenario of low clicks despite a high number of impressions suggests room for improvement. While not necessarily bad, it implies that the site may benefit from enhancements to attract more traffic.
The clicks metric gains greater significance when analyzed alongside the Average CTR and Average Position metrics.
The Average CTR (click-through rate) is a percentage that indicates the frequency with which users click from search results to the website.
A low CTR suggests areas for improvement to boost visits from search results, such as revising the page title or updating the meta description.
Conversely, a higher CTR signifies effective performance with users.
This metric becomes more meaningful when analyzed alongside the Average Position metric.
The Average Position metric indicates where a website typically ranks in search results. A position between 1 and 10 is excellent, signaling prominent visibility.
When the average position falls in the twenties (20 – 29), it suggests that the result appears further down the search page, requiring users to scroll. While not detrimental, it’s suboptimal and may indicate the need for enhancements to break into the top 10.
Average positions of 30 or higher generally imply that the pages could greatly benefit from improvements.
Additionally, it could suggest that the site ranks for numerous keyword phrases, with some ranking low and a few exceptional ones ranking very high.
Examining all four metrics (Impressions, Clicks, Average CTR, and Average Position) together provides a comprehensive insight into the website’s performance.
The Performance Report serves as a crucial starting point for swiftly grasping the website’s search performance. It acts as a mirror, reflecting the site’s effectiveness or areas needing improvement.
Within the Performance section of Google Search Console, the Search Results report provides insights into Queries and their average Position.
Typically, the queries where a site ranks within the top three positions are branded terms, while those falling within positions five to 15 are labeled “striking distance” terms.
To leverage this data effectively, prioritize these striking distance terms based on impressions. Consider refreshing your content to incorporate these queries into the language of relevant pages.
To uncover these terms, open the Performance > Search results report, make sure Average Position is enabled, then filter or sort the Queries table for positions between five and 15, ordering by impressions; a scripted version of this workflow is sketched below.
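For larger sites, the same filter is easy to script against the Search Analytics API. A minimal sketch, reusing the service object and credentials from the earlier examples; the dates and site URL are placeholders.

```python
body = {
    "startDate": "2024-01-01",
    "endDate": "2024-03-31",
    "dimensions": ["query"],
    "rowLimit": 5000,
}
rows = service.searchanalytics().query(
    siteUrl="https://www.example.com/", body=body
).execute().get("rows", [])

# Keep "striking distance" queries (average position 5-15),
# then prioritize by impressions.
striking = [r for r in rows if 5 <= r["position"] <= 15]
striking.sort(key=lambda r: r["impressions"], reverse=True)

for r in striking[:20]:
    print(f'{r["keys"][0]}: pos {r["position"]:.1f}, {r["impressions"]} impressions')
```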
If you’re in urgent need of indexing for a priority page, the URL Inspection tool within Google Search Console offers a swift solution.
To request indexing, paste the full URL into the search bar at the top of GSC, wait for the inspection to complete, and click “Request Indexing.”
While this process prioritizes your URL for crawling and indexing by Googlebot, it doesn’t guarantee immediate indexing.
If your page remains unindexed despite this request, additional investigation may be required.
Please note: Each verified property in Google Search Console is restricted to 50 indexing requests per day.
Google Search Console (GSC) historical data is constrained to a maximum of 16 months. However, you can address this limitation by exporting GSC data to BigQuery, where it can be stored indefinitely. This enables access to extensive historical data whenever needed. It’s essential to note that this action isn’t retroactive, so initiating it promptly is advisable.
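Once the export is flowing, the data is stored in ordinary BigQuery tables. Here is a hedged sketch using the google-cloud-bigquery package; the project ID and dataset name are placeholders, and searchdata_site_impression is assumed to be the table the bulk export creates.

```python
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")  # placeholder project

sql = """
    SELECT query, SUM(impressions) AS impressions, SUM(clicks) AS clicks
    FROM `your-project-id.searchconsole.searchdata_site_impression`
    WHERE data_date >= '2023-01-01'
    GROUP BY query
    ORDER BY clicks DESC
    LIMIT 25
"""

# Runs the query and iterates over the result rows.
for row in client.query(sql).result():
    print(row.query, row.impressions, row.clicks)
```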
While Google Search Console (GSC) offers a plethora of insights, the real magic unfolds when you integrate its data with various SEO tools and platforms available in the market.
By integrating GSC data into these tools, you gain a sharper understanding of your site’s performance and potential.
From desktop crawlers such as Screaming Frog and Sitebulb to enterprise server-driven crawlers like Lumar and Botify, incorporating your GSC information enhances the depth of your page audits, covering aspects like crawlability, accessibility, and page experience factors.
Integration with prominent SEO tools like Semrush and Ahrefs yields comprehensive ranking information, facilitates content ideation, and enriches link data.
Moreover, the convenience of consolidating all your data into a single view cannot be overstated. With time being a precious commodity, these integration options significantly streamline your workload.
Over nearly two decades, Google Search Console has undergone significant evolution. Originally launched as Google Webmaster Tools, it has continuously delivered valuable insights for SEO professionals and site owners alike.
If you’ve yet to integrate it into your daily activities, you’ll swiftly discover its indispensability for making informed decisions regarding your site optimizations moving forward.
Original news from SearchEngineJournal