If you want more information about your site’s natural search performance, Google gives it to you. Take some of it with a grain of salt, but there is no reason not to listen.
Google Search Console is the channel Google uses to communicate with webmasters and search engine optimization professionals about the status of their sites in its search results.
Setting up Google Search Console can be a bit overwhelming. Verification usually requires help from a developer, unless you have access to a tag manager. It’s well worth it, however, for all the tools you gain access to.
After you “verify” your site, you’ll see messages from Google about how your site looks in search results, as well as its performance, indexing, and crawl data.
The Messages section is how Google communicates ways your site could perform better in organic search. Messages rarely tell you everything you want to know, but they do provide a starting point: each one links to additional reports or help files.
For example, Google may send a message about an increase in 404 errors. Google cannot say why your site is returning errors or how to fix them, but simply knowing that errors have increased, and which URLs are affected, makes the problem much easier to diagnose.
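As a sketch of that kind of diagnosis, the snippet below groups the 404 URLs from a hypothetical error export by site section. The column names, URLs, and counts are invented for illustration; a real Search Console export will differ:

```python
import csv
import io
from collections import Counter

# Hypothetical excerpt of a crawl-errors export (URL, response code).
errors_csv = """url,code
https://example.com/blog/old-post,404
https://example.com/blog/another-post,404
https://example.com/shop/discontinued-item,404
https://example.com/blog/moved-post,404
"""

def count_404s_by_section(csv_text):
    """Group 404 URLs by their first path segment to spot patterns."""
    counts = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["code"] == "404":
            path = row["url"].split("/", 3)[-1]       # part after the domain
            section = path.split("/", 1)[0] or "(root)"
            counts[section] += 1
    return counts

print(count_404s_by_section(errors_csv))  # most 404s cluster under /blog/
```

A cluster of errors under one section, as here, often points to a single broken template or a botched migration rather than many unrelated problems.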
You will also see messages about other crawling issues; notifications about the impact of new algorithm updates or search features on your site; and new features available in Search Console.
What do your search results look like? The Search Appearance section is not about ranking order but about the visual presentation of your listings. Use this data to take advantage of the opportunities Google offers to deliver results that are more likely to grab searchers’ attention and earn higher click-through rates.
- Structured data and rich cards. Google likes structured data because it makes it easy to display precise, relevant information in search results. These reports identify the structured data on your pages and the rich cards that can appear in search results.
- Data highlighter. Use this tool to show Google where the different types of data are on your pages. For example, if the price on your product pages is always in the same place, use the highlighter to select that text and mark it as a price. Google does the rest for you and replicates the markup on other similar pages.
- HTML improvements. The title tag and meta description of each page can be displayed as the text of your search results listing. This report flags quality issues with your metadata, such as duplicate, missing, or overly long titles and descriptions.
- Accelerated mobile pages. If you use AMP to deliver faster results to mobile users, Google will let you know what issues your pages may be experiencing.
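Structured data of the kind those reports look for is usually embedded as JSON-LD in the page’s HTML. Here is a minimal sketch, assuming a schema.org Product with a price; every value is a placeholder:

```python
import json

# A minimal schema.org Product snippet. All values are illustrative.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed as a JSON-LD script tag in the page's <head>.
snippet = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(product, indent=2)
)
print(snippet)
```

The Data Highlighter described above lets you achieve a similar effect without touching the page source, but on-page JSON-LD like this travels with the page wherever it is served.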
The Search Traffic section is one of the main reasons to check your site in Search Console. It offers insight into organic search performance that you can’t get anywhere else.
- Search Analytics. Average impression and ranking data, by keyword and landing page, make this tool alone worth setting up Search Console for. It is the only place to find (theoretically) reliable keyword-level performance data for organic search.
The numbers you see in this section will never exactly match your web analytics. Search Console only reports on Google’s own search results, not those of other engines, and even then, Google-referred traffic reported in Search Console will not match Google-referred traffic in your analytics package. The trends, however, should be the same. And don’t miss the new beta version of this tool, which offers access to 16 months of data!
- Links to your site and internal links. These reports help you understand which external sites link to yours and which of your pages they point to, as well as how your own pages link to one another. They are invaluable when diagnosing a link-quality issue that may be algorithmically weakening your site’s rankings.
- Manual actions. If your site has been penalized for web spam, you’ll find the details here, as well as in the Messages section.
- International targeting. Analyze the language and country signals your site sends to Google via hreflang annotations. You can also tell Google which country your site targets, which is especially useful for sites that don’t have a country-specific ccTLD such as .de or .kr.
- Mobile usability. Since more than half of its searches happen on smartphones, Google has a soft spot for mobile usability. This report identifies issues that mobile searchers may encounter on your pages.
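As an illustration of the hreflang annotations mentioned above, this sketch generates the reciprocal `link rel="alternate"` elements for a page’s language variants. The URLs and locale codes are hypothetical:

```python
# Hypothetical language/region variants of one page.
variants = {
    "en": "https://example.com/page",
    "de": "https://example.com/de/page",
    "ko": "https://example.com/kr/page",
    "x-default": "https://example.com/page",
}

def hreflang_links(variants):
    """Emit one <link> element per variant, for the page's <head>."""
    return "\n".join(
        '<link rel="alternate" hreflang="{}" href="{}" />'.format(lang, url)
        for lang, url in variants.items()
    )

print(hreflang_links(variants))
```

Every variant must carry the full set of links, including one pointing back to itself; the International Targeting report is where mismatched or one-way annotations surface.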
Your pages cannot rank in organic search or drive traffic without first being indexed. Google uses the Google Index section to communicate the indexing status of your site.
- Index Status. Think of this report as showing the number of your pages eligible to rank in Google.
- Remove URLs. Treat this tool with care, because it does exactly what it says: removes pages from the index. That’s useful when private or sensitive information has been accidentally published, or when a section of content adds no value to organic search. Unfortunately, it’s also easy to accidentally remove content that should stay indexed.
- URL Inspection. The URL Inspection tool is new in the beta version of Search Console. It aggregates crawling and indexing information for individual URLs in one place.
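One common indexing culprit that inspection surfaces is a stray noindex directive left over from staging. As a rough offline sketch, this scans a page’s HTML for a robots meta tag; the sample page is invented:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Detect <meta name="robots" content="...noindex..."> in raw HTML."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

# Invented sample page with a leftover noindex directive.
page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
finder = RobotsMetaFinder()
finder.feed(page)
print(finder.noindex)  # True
```

A true result here means the page is asking Google not to index it, no matter how many links point at it.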
It all starts with the crawl. If Google is having trouble crawling your site, pages that aren’t crawlable will have a hard time ranking.
- Crawl errors. Google also displays this information on the dashboard. Here you’ll find server header issues, from 404 not-found errors to 500 internal server errors. When these climb, you probably have a larger crawl problem that can drag rankings and performance down.
- Crawl stats. How much time has Google spent crawling your site over the past 90 days? You’ll find the answer here. Spikes may indicate server issues, or the launch of new content being crawled for the first time.
- Fetch as Google. The best way to see your content as Google sees it is to fetch and render the page in this tool. You’ll see how Google interprets the page, alongside what users see in their browsers. You also have the option to submit the page to Google’s index, a handy feature when you launch new content.
- robots.txt Tester. The robots.txt file can restrict or allow crawler access to different areas of your site. Like the “Remove URLs” tool, changes to robots.txt must be handled with care. Test every proposed change in this tool to avoid costly mistakes, such as blocking search engines from crawling your entire site.
- Sitemaps. Roll out the red carpet for Google by submitting your XML sitemaps for crawling. This tool lets you submit sitemaps and then reports how many of the URLs in them have been indexed.
- URL parameters. If parameter-driven URLs create page versions you’d rather Google skip, register those parameters in this tool. For example, search engines don’t need to crawl every sorted variation of the same page; if a parameter controls that sorting, submitting it here makes Google’s crawl more efficient and preserves crawl equity.
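The robots.txt testing described above can be approximated offline with Python’s standard library. The rules and URLs below are illustrative, not a recommended configuration:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: block checkout and site-search pages.
robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Would a crawler be allowed to fetch these URLs?
print(parser.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/checkout/step-1"))  # False
```

Running a check like this against every important URL template before deploying a robots.txt change is exactly the kind of safety net the Tester provides in the browser.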
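As a sketch of the XML sitemap format those submissions use, this builds a minimal sitemap for two placeholder URLs with the standard library:

```python
import xml.etree.ElementTree as ET

# Placeholder URLs and change frequencies.
urls = [
    ("https://example.com/", "daily"),
    ("https://example.com/products/", "weekly"),
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for loc, freq in urls:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "changefreq").text = freq

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

Real sitemaps usually come from a CMS or a crawler, but the structure is this simple: one `url` entry per page, capped at 50,000 URLs per file under the sitemaps.org protocol.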
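And as an illustration of the duplication the URL Parameters tool addresses, this sketch drops a hypothetical sort parameter so that every sorted variant collapses onto one canonical URL:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameter names that only reorder existing content.
IGNORED_PARAMS = {"sort", "order"}

def canonicalize(url):
    """Strip ignored query parameters, keeping the rest in order."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonicalize("https://example.com/shop?category=widgets&sort=price"))
# https://example.com/shop?category=widgets
```

The tool asks you to make the same judgment per parameter: does it change the content, or merely rearrange it? Only the latter kind is safe to have Google skip.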