Chapter 1: Getting Started With the Google Search Console
In this chapter I’ll show you how to use the Search Console. First, you’ll learn how to add your site to the GSC. Then, I’ll help you make sure your site settings are good to go.
This was great for security. But it was a bummer for website owners.
Suddenly, priceless keyword data vanished from Google Analytics.
Instead, all we got was this:
The good news? There’s a simple way to get some of that keyword data back: Link Google Analytics with your Google Search Console account.
Here’s how:
Open up your Google Analytics. Then, click the “Admin” button at the bottom of the left menu.
Click on the “Property Settings” link.
Scroll down until you see the “Adjust Search Console” button. Click it!
Click on “Add”.
Scroll down until you find your website, check the box, and hit “Save”.
You’re done! Analytics and Search Console are now linked.
Let’s see what you get…
Landing pages with impression and click data:
Impression, click, CTR, and position data by country:
But most importantly… keyword data:
Boom!
Step #4: Check For Security Issues
Finally, check to see if you have any security issues that might be hurting your site’s SEO.
To do that, click “Security Issues”.
And see what Google says:
As you can see here, there aren’t any security problems with my site. But it’s still worth checking.
Step #5: Add a Sitemap
I’ll be honest:
If you have a small site, you probably don’t NEED to submit a sitemap to Google.
But for bigger sites (like ecommerce sites with thousands of pages) a sitemap is KEY.
That said: I recommend that you go ahead and submit a sitemap either way.
Here’s how to do it:
First up, you need to create a sitemap. If you’re running WordPress with the Yoast plugin, you should already have one.
If you don’t have a sitemap yet, head over to Yoast. Then, set the XML sitemaps setting to “On” (under “General/Features”):
Click the “See the XML Sitemap” link, which will take you to your sitemap:
Don’t use Yoast? Go to yoursite.com/sitemap.xml. If you have a sitemap, it’s usually here. If not, you want to create one.
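Under the hood, a sitemap is just an XML file that lists your URLs. If you can’t use a plugin, here’s a minimal sketch of generating one with Python’s standard library (the URLs below are placeholders, not real pages):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap for a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url  # each page gets a <url><loc> entry
    return ET.tostring(urlset, encoding="unicode")

# Placeholder pages -- swap in your real URLs
pages = [
    "https://yoursite.com/",
    "https://yoursite.com/about/",
]
xml = build_sitemap(pages)
print(xml)
```

Save the output as sitemap.xml at your site’s root and you have something to submit.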
So let’s submit a sitemap to Google.
It’s SUPER easy to do in the new GSC.
Grab your sitemap URL. Then, hit the “Sitemaps” button.
Paste in your URL and click “Submit”.
And that’s it:
Told you it was easy 🙂
Chapter 2: How to Optimize Your Technical SEO With the GSC
In this chapter I’ll share the tactics I use to SLAM DUNK my technical SEO.
As you know, when you fix these technical SEO problems, you’ll usually find yourself with higher rankings and more traffic.
And the Google Search Console has a TON of features to help you easily spot and fix technical SEO issues.
Here’s how to use them:
Use The “Index Coverage” Report To Find (And Fix) Problems With Indexing
If everything on your website is set up right, Google will:
a) Find your page and
b) Quickly add it to their index
But sometimes, things go wrong.
Things you NEED to fix if you want Google to index all of your pages.
And that’s where the Index Coverage report comes in.
Let’s dive in.
What is the Index Coverage Report?
The Index Coverage report lets you know which pages from your site are in Google’s index. It also lets you know about technical issues that prevent pages from getting indexed.
It’s part of the new GSC and replaces the “Index Status” report in the old Search Console.
Note: The Coverage report is pretty complicated.
And I could just hand you a list of features and wish you luck.
(In fact, that’s what most other “ultimate guides” do).
Instead, I’m going to walk you through an analysis of a REAL site (this one), step-by-step.
That way you can watch me use the Index Coverage Report to uncover problems… and fix them.
How to Find Errors With The Index Coverage Report
At the top of the Index Coverage report we’ve got 4 tabs:
Error
Valid with warnings
Valid
Excluded
Let’s focus on the “Error” tab for now.
As you can see, this site has 54 errors. The chart shows how that number has changed over time.
If you scroll down, you get deets on each of these errors:
There’s a lot to take in here.
So to help you make sense of each “reason”, here are some quick definitions:
“Submitted URL seems to be a Soft 404”
This means the page looks like a “not found” page, but returns a 200 status code in the header instead of a 404.
(I’ve found this one to be a little buggy)
“Redirect error”
There’s a redirect for this page (301/302).
But it ain’t working.
“Submitted URL not found (404)”
The page wasn’t found and the server returned the correct HTTP status code (404).
All good. (Well, if you ignore the fact that the page is broken…)
“Submitted URL has crawl issue”
This could be 100 different things.
You’ll have to visit the page to see what’s up.
“Server errors (5xx)”
Googlebot couldn’t access the server. It might have crashed, timed out, or been down when Googlebot stopped by.
And when you click on an error status, you get a list of pages with that particular problem.
404 errors should be easy to fix. So letâs start with those.
Click a URL on the list. This opens up a side panel with 4 options:
But first, let’s visit the URL with a browser. That way, we can double check that the page is really down.
Yup. It’s down.
Next, pop your URL into the URL inspection field at the top of the page.
And Googlebot will rush over to your page.
Sure enough, this page is still giving me a 404 “Not found” status.
How do we fix it?
Well, we have two options:
Leave it as is. Google will eventually deindex the page. This makes sense if the page is down for a reason (like if you don’t sell that product anymore).
You can redirect the 404 page to a similar product page, category page, or blog post.
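Whichever route you take, it helps to triage a long list of flagged URLs first. Here’s a hypothetical helper that sketches the logic separating real 404s from soft 404s (the “not found” phrases are just examples):

```python
def classify(status_code, body):
    """Rough triage: real 404, soft 404, or OK."""
    not_found_phrases = ("page not found", "nothing here", "404")
    looks_missing = any(p in body.lower() for p in not_found_phrases)
    if status_code == 404:
        return "real 404"          # correct status code; the page is just gone
    if status_code == 200 and looks_missing:
        return "soft 404"          # says "not found" but returns 200
    return "ok"

print(classify(404, "Page not found"))        # real 404
print(classify(200, "Oops, page not found"))  # soft 404
print(classify(200, "Welcome to my store"))   # ok
```

In practice you’d fetch each URL and feed its status code and HTML into a check like this.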
How to Fix “Soft 404” Errors
Now it’s time to fix these pesky “Soft 404” errors.
Again, check out the URLs with that error.
Then, visit each URL in your browser.
Looks like the first page on the list is loading fine.
Let’s see if Google can access the page OK. Again, we’ll use the URL Inspection tool.
This time we’ll hit the “Test Live URL” button. This sends Googlebot to the page. It also renders the page so you can see your page like Googlebot sees it.
Looks like Google found the page this time.
Now letâs see how Google rendered the page. Click “View Tested Page”, then the “Screenshot” tab:
Looks pretty much the same as how visitors see it. That’s good.
Next, click the More Info tab, and check for any page resources that Google wasn’t able to load correctly.
Sometimes there’s a good reason to block certain resources from Googlebot. But sometimes these blocked resources can lead to soft 404 errors.
In this case though, these 5 things are all meant to be blocked.
Once you’ve made sure any indexing errors are resolved, click the “Request Indexing” button:
This tells Google to index the page.
The next time Googlebot stops by, the page should get indexed.
How to Fix Other Errors
You can use the same exact process I just used for “Soft 404s” to fix any error you run into:
Load up the page in your browser
Plug the URL into “URL Inspection”
Read over the specific issues that the GSC tells you about
Fix any issues that crop up
Here are a few examples:
Redirect errors
Crawl errors
Server errors
Bottom line? With a bit of work, you can fix pretty much any error that you run into in the Coverage report.
How to Fix “Warnings” In The Index Coverage Report
I don’t know about you…
…but I don’t like to leave anything to chance when it comes to SEO.
Which means I don’t mess around when I see a bright orange “Warning”.
So let’s hit the “Valid with warnings” tab in the Index Coverage Report.
This time there’s just one warning: “Indexed, though blocked by robots.txt”.
So what’s going on here?
Let’s find out.
The GSC is telling us the page is getting blocked by robots.txt. So instead of hitting “Fetch As Google”, click on “Test Robots.txt Blocking”:
This takes us to the robots.txt tester in the old Search Console.
As it turns out, this URL IS getting blocked by robots.txt.
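You can reproduce this kind of check locally with Python’s standard-library robots.txt parser (the rules and URLs below are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt -- substitute your site's actual rules
robots_txt = """
User-agent: *
Disallow: /tag/
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)  # parse the rules without fetching anything

print(rp.can_fetch("Googlebot", "https://yoursite.com/tag/seo/"))  # False
print(rp.can_fetch("Googlebot", "https://yoursite.com/blog/"))     # True
```

Handy for batch-testing a whole list of URLs against your rules before you touch anything.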
So what’s the fix?
Well, if you want the page indexed, you should unblock it from robots.txt (duh).
But if you don’t want it indexed, you have two options:
Add the “noindex, follow” tag to the page. And unblock it from robots.txt
Get rid of the page using the URL Removal Tool
Let’s see how to use the URL Removal Tool:
How To Use The URL Removal Tool In Search Console
The URL Removal Tool is a quick and easy way to remove pages from Google’s index.
Unfortunately, this tool hasn’t moved over to the new Google Search Console yet. So you’ll need to use the old GSC to use it.
Expand the “Legacy tools and reports” tab in the new GSC sidebar, then click “Removals”, where you’ll be taken to the old GSC.
Finally, paste in the URL you want to remove:
Double (and triple!) check that you entered the right URL, then click “Submit Request”.
Note: A removal is only active for 90 days. After that, Googlebot will attempt to recache the page.
But considering the page is blocked through robots.txt…
…this page will be gone for good!
Check Indexed Pages For Possible Issues
Now let’s move on to the “Valid” tab.
This tells us how many pages are indexed in Google.
What should you look for here? Two things:
1. Unexpected drop (or increase) of indexed pages
Notice a sudden drop in the number of indexed pages?
That could be a sign that something’s wrong:
Maybe a bunch of pages are blocking Googlebot.
Or maybe you added a noindex tag by mistake.
Either way:
Unless you purposely deindexed a bunch of pages, you definitely want to check this out.
On the flip side:
What if you notice a sudden increase in indexed pages?
Again, that might be a sign that something is wrong.
(For example, maybe you unblocked a bunch of pages that are supposed to be blocked).
2. An unexpectedly high number of indexed pages
There are currently 41 posts at Backlinko.
So when I take a look at the “Valid” report in Index Coverage, I’d expect to see about that many pages indexed.
But if it’s WAY higher than 41? That’s a problem. And I’m going to have to fix it.
Oh, in case you’re wondering… here’s what I do see:
So no need to worry about me 🙂
Make Sure Excluded Stuff Should Be Excluded
Now:
There are plenty of good reasons to block search engines from indexing a page.
Note: When I say “low quality”, I don’t mean the page is garbage. It could be that the page is useful for users… but not for search engines.
That said:
You definitely want to make sure Google doesn’t exclude pages that you WANT indexed.
In this case, we have a lot of excluded pages…
And if you scroll down, you get a list of reasons that each page is excluded from Google’s index.
So let’s break this down…
“Page with redirect”
The page is redirecting to another URL.
This is totally fine. Unless backlinks (or internal links) still point to that URL, Google will eventually stop trying to index it.
“Alternate page with proper canonical tag”
Google found an alternative version of this page somewhere else.
That’s exactly what a canonical URL is supposed to do. So that’s A-OK.
“Crawl Anomaly”
Yikes! Could be a number of things. So we’ll need to investigate.
In this case, it looks like the pages listed are returning a 404.
“Crawled – currently not indexed”
Hmmm…
These are pages that Google has crawled, but (for some reason) hasn’t indexed.
Google doesn’t give you the exact reason they won’t index the page.
But from my experience, this error means: the page isn’t good enough to warrant a spot in the search results.
So, what should you do to fix this?
My advice: work on improving the quality of any pages listed.
For example, if it’s a category page, add some content that describes that category. If the page has lots of duplicate content, make it unique. If the page doesn’t have much content on it, beef it up.
Basically, make the page worthy of Google’s index.
“Submitted URL not selected as Canonical”
This is Google telling you:
“This page has the same content as a bunch of other pages. But we think another URL is better.”
So they’ve excluded this page from the index.
My advice: if you have duplicate content on a number of pages, add the noindex meta robots tag to all duplicate pages except the one you want indexed.
“Blocked by robots.txt”
These are pages that robots.txt is blocking Google from crawling.
It’s worth double-checking these pages to make sure what you’re blocking is meant to be blocked.
If it’s all good? Then robots.txt is doing its job and there’s nothing to worry about.
“Duplicate page without canonical tag”
The page is part of a set of duplicate pages, and doesn’t include a canonical URL.
In this case it’s pretty easy to see what’s up.
We’ve got a number of PDF documents. And these PDFs contain content from other pages on the site.
Honestly, this isn’t a big deal. But to be on the safe side, you should ask your web developer to block these PDFs using robots.txt. That way, Google ONLY indexes the original content.
“Discovered – currently not indexed”
Google has discovered these pages, but hasn’t crawled or indexed them yet.
“Excluded by ‘noindex’ tag”
All good. The noindex tag is doing its job.
So that’s the Index Coverage report. I’m sure you’ll agree: it’s a VERY impressive tool.
Chapter 3: Get More Organic Traffic with the Performance Report
In this chapter we’re going to deep dive into my favorite part of the GSC: the “Performance” report.
Why is it my favorite?
Because I’ve used this report to increase organic traffic to Backlinko again and again.
I’ve also seen lots of other people use the Performance Report to get similar results.
So without further ado, let’s get started…
What Is The Performance Report?
The “Performance” report in Google Search Console shows you your site’s overall search performance in Google. This report not only shows you how many clicks you get, but also lets you know your CTR and average ranking position.
And this new Performance Report replaces the “Search Analytics” report in the old Search Console (and the old Google Webmaster Tools).
Yes, a lot of the data is the same as the old “Search Analytics” report. But you can now do cool stuff with the data you get (like filter to only show AMP results).
But my favorite addition to the new version is this:
In the old Search Analytics report you could only see search data from the last 90 days.
(Which sucked)
Now?
We get 16 MONTHS of data:
For an SEO junkie like me, 16 months of data is like opening presents on Christmas morning.
(In fact, I used to pay for a tool to automatically pull and save my old Google Webmaster Tools data. Now, thanks to the beta version of the new GSC, it’s a free service)
How To Supercharge Your CTR With The Performance Report
Note: Like I did in the last chapter, I’m going to walk you through a real-life case study.
Last time, we looked at an ecommerce site. Now we’re going to see how to use the GSC to get more traffic to a blog (this one).
Specifically, you’re going to see how I used the Performance Report to increase this site’s CTR by 63.2%.
So let’s fire up the Performance report in the new Search Console and get started…
1. Find Pages With a Low CTR
First, highlight the “Average CTR” and “Average Position” tabs:
You want to focus on pages that are ranking #5 or lower… and have a bad CTR.
So let’s filter out positions 1-4.
To do that, click on the filter button, and check the “Position” box.
You’ll now see a filter box above the data. So we can go ahead and set this to “Greater than” 4.9:
Now you have a list of pages that are ranking #5 or below.
According to Advanced Web Ranking, position #5 in Google should get a CTR of around 4.35%:
You want to filter out everything that’s beating that expected CTR of 4.35%. That way you can focus on pages that are underperforming.
So click the filter button again and check the “CTR” box.
(Make sure you leave the “Position” box ticked)
Then, set the CTR filter to “Smaller than” 4.35.
So what have we got?
A list of keywords that are ranking 5 or lower AND have a CTR less than 4.35%.
In other words:
Keywords you could get more traffic from.
We just need to bump up their CTR.
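Incidentally, the same filters are easy to reproduce on a CSV export of the Performance report. A quick sketch with made-up numbers:

```python
# Made-up (query, clicks, impressions, position) rows standing in for a GSC export
rows = [
    ("best seo tools",     320, 4_000, 3.2),
    ("best helmet brands",  43, 1_504, 6.1),
    ("seo checklist",       90, 1_200, 8.4),
]

EXPECTED_CTR = 4.35  # rough average CTR for position #5, per Advanced Web Ranking

# Keep rows ranking below position 4.9 whose CTR is under the expected 4.35%
underperformers = [
    (query, round(100 * clicks / impressions, 1), position)
    for query, clicks, impressions, position in rows
    if position > 4.9 and 100 * clicks / impressions < EXPECTED_CTR
]
print(underperformers)
```

With these invented rows, only the helmet keyword passes both filters.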
So:
Let’s see if we can find a keyword with a lower-than-expected CTR.
When I scroll down the list… this keyword sticks out like a sore thumb.
1,504 impressions and only 43 clicks… ouch! I know that I can do better than 2.9%.
Now that we’ve found a keyword with a bad CTR, it’s time to turn things around.
2. Find the page
Next, you want to see which page from your site ranks for the keyword you just found.
To do that, just click on the query with the bad CTR. Then, click “Pages”:
Easy.
3. Take a look at ALL the keywords this page ranks for
There’s no point improving our CTR for one keyword… only to mess it up for 10 other keywords.
So here’s something really cool:
The Performance report can show you ALL the keywords that your page ranks for.
And it’s SUPER easy to do.
Just click on “+ New” in the top bar and hit “Page…”.
Then enter the URL you want to view queries for.
Bingo! You get a list of keywords that page ranks for:
You can see that the page has shown up over 42,000 times in Google… but only got around 1,500 clicks.
So this page’s CTR is pretty bad across the board.
(Not just for this particular keyword)
4. Optimize your title and description to get more clicks
I have a few go-to tactics that I use to bump up my CTR.
But my all-time favorite is: Power Words.
What are power words?
Power words show that someone can get quick and easy results from your content.
And they’ve been proven again and again to attract clicks in the SERPs.
Here are a few of my favorite Power Words that you can include in your title and description:
Today
Right now
Fast
Works quickly
Step-by-step
Easy
Best
Quick
Definitive
Simple
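If you want to audit a batch of titles at once, a quick checker is easy to sketch (the helper and word list here are just illustrative):

```python
# A subset of the power words above (multi-word phrases like "right now" omitted)
POWER_WORDS = {
    "today", "fast", "easy", "best", "quick",
    "definitive", "simple", "step-by-step",
}

def power_words_in(title):
    """Return the power words a title already contains."""
    words = title.lower().replace(":", " ").split()
    return sorted(POWER_WORDS & set(words))

print(power_words_in("SEO This Year: The Definitive Guide"))  # ['definitive']
print(power_words_in("My Thoughts on Helmets"))               # []
```

Run it over an export of your titles and start with the pages that come back empty.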
So I added a few of these Power Words to the page’s title and description tag:
5. Monitor the results
Finally, wait at least 10 days. Then log back in.
Why 10 days?
It can take a few days for Google to reindex your page.
Then, the new page has to be live for about a week for you to get meaningful data.
With that, I have great news:
With the new Search Console, comparing CTR over two date ranges is a piece of cake.
Just click on the date filter:
Select the date range. Iâm going to compare the 2 week period before the title change, to the 2 weeks after:
Finally, filter the data to show search queries that include the keyword you found in step #1 (in this case: “best helmet brands”).
Boom!
We’ve increased our CTR by 63.2%. And just as important: we’re now beating the average CTR for position #5.
Pro tip: You’ll find that different title formats work better in different niches. So you might have to experiment to find the perfect format for YOUR industry. The good news: Search Console gives you the data you need to do just that.
How To Find “Opportunity Keywords” With GSC’s Performance Report
If the last example didnât convince you of just how awesome the new Performance Report is, then I guarantee this one will.
What Is An Opportunity Keyword?
An opportunity keyword is a phrase that ranks between positions 8-20 AND gets a decent number of impressions.
Why is this such a big opportunity?
1. Google already considers your page to be a decent fit for the keyword (otherwise you wouldn’t be anywhere close to page 1). When you give your page some TLC, you can usually bump it up to the first page.
2. You’re not relying on iffy keyword volume data from third-party SEO tools. The impression data you get from the GSC tells you EXACTLY how much traffic to expect.
Mining For Gold With Google Search Console’s Performance Report
Finding these gold nugget keywords in the Performance report is a simple, 3-step process.
1. Set the date range to the last 28 days:
2. Filter the report to show keywords ranking “Greater than” 7.9
3. Finally, sort by “Impressions”. And you get a huge list of “Opportunity Keywords”:
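On an export, those three steps boil down to one filter and one sort. A sketch with invented rows:

```python
# Invented (keyword, impressions, position) rows from a Performance export
rows = [
    ("link building tools", 12_000, 11.3),
    ("seo audit",            9_500,  4.2),
    ("anchor text",          7_800, 14.8),
    ("what is a backlink",     600, 19.1),
]

# Keep keywords in roughly positions 8-20, then sort highest impressions first
opportunities = sorted(
    (row for row in rows if 7.9 < row[2] <= 20),
    key=lambda row: row[1],
    reverse=True,
)
for keyword, impressions, position in opportunities:
    print(f"{keyword}: {impressions} impressions at position {position}")
```

Everything near the top of that output is low-hanging fruit.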
The best part? Some of these keywords give you a shot to rank as a Featured Snippet.
After all: why rank #1 when you can rank #0?
Find High-Impression Keywords
I already showed you how to optimize keywords that rank 8-20.
But…
I also like to look for keywords that aren’t ranking well, yet still get some impressions. Here’s an example:
That keyword is sitting at position 50-ish… yet the page was still seen nearly 200 times.
Which tells me: if that many people are visiting the 5th page, wait until I hit the first page.
It’s gonna be nuts!
Chapter 4: Cool GSC Features
In this chapter I’m going to show you some of the coolest features in the Google Search Console.
First, I’ll teach you how you can use the Search Console to fix your schema.
Then, I’ll show you one of the quickest (and EASIEST) wins in SEO.
Power Up Important Pages With Internal Links
Make no mistake:
Internal links are SUPER powerful.
Unfortunately, most people use internal linking all wrong.
That’s the bad news.
The good news?
The Search Console has an awesome feature designed to help you overcome this problem.
This report shows you the EXACT pages that need some internal link love.
To access this report, hit “Links” in the GSC sidebar.
And you’ll get a report that shows you the number of internal links pointing to every page on your site.
This report is already a goldmine.
But it gets better…
You can find the EXACT pages that internally link to a specific page. Just click on one of the URLs under the “Internal Links” section:
And you’ll get a list of all the internal links pointing to that page:
In this case, we only have 6 internal links pointing to our Local SEO Guide. That’s not good.
So:
Once you find a page that doesn’t have enough internal link juice, add some internal links that point to that page.
Time spent: under a minute.
Assessment: Win!
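If you’d rather audit internal links offline, the counting itself is simple. Here’s a sketch using Python’s standard-library HTML parser on some stand-in markup (the class name, domain, and URLs are all hypothetical):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class InternalLinkCounter(HTMLParser):
    """Collect <a href> links that stay on the given domain."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.internal = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative paths and same-domain absolute URLs both count as internal
        if host == self.domain or (not host and href.startswith("/")):
            self.internal.append(href)

# Stand-in page HTML
html = """
<a href="/local-seo-guide/">Local SEO</a>
<a href="https://yoursite.com/blog/">Blog</a>
<a href="https://other.com/">Elsewhere</a>
"""
counter = InternalLinkCounter("yoursite.com")
counter.feed(html)
print(counter.internal)  # ['/local-seo-guide/', 'https://yoursite.com/blog/']
```

Feed every page of your site through a counter like this and you can total up internal links per target URL.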
Pro Tip: Supercharge Key Posts With Internal Links From Powerhouse Pages
What’s a Powerhouse Page?
It’s a page on your site with lots of quality backlinks.
More backlinks = more link juice to pass on through internal links.
You can easily find Powerhouse Pages in the Google Search Console.
Just hit the “Links” button again. And you’ll see a section titled “Top linked pages”.
Click “More” for a full list.
By default, the report is ordered by the total number of backlinks. But I prefer to sort by number of linking sites:
These are your Powerhouse Pages.
And all you need to do is add some internal links FROM those pages TO the ones you want to boost.
Easy, right?
Chapter 5: Advanced Tips and Strategies
Now it’s time for some advanced tips and strategies.
In this chapter you’ll learn how to use Google Search Console to optimize crawl budget, fix issues with mobile usability, and improve your mobile CTR.
Mastering Crawl Stats
If you have a small site, you probably don’t need to worry about crawl stats.
But if you have a huge site… that’s a different story.
In that case, it’s worth looking into your crawl budget.
What Is Crawl Budget?
Your Crawl Budget is the number of pages on your site that Google crawls every day.
You can still see this number in the old “Crawl Stats” report.
In this case, Google crawls an average of 22,257 pages per day. So that’s this site’s Crawl Budget.
Why Is Crawl Budget Important For SEO?
Say you have:
200,000 pages on your website
and
A crawl budget of 2,000 pages per day
It could take Google 100 days to crawl your site.
So if you change something on one of your pages, it might take MONTHS before Google processes the change.
Or, if you add a new page to your site, Google’s going to take forever to index it.
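The math behind that estimate is simple division:

```python
total_pages = 200_000
crawl_budget = 2_000  # pages Google crawls on your site per day

# Days for Googlebot to make one full pass over the site
days_for_full_crawl = total_pages / crawl_budget
print(days_for_full_crawl)  # 100.0
```

Swap in your own page count and the daily average from the Crawl Stats report to see your worst-case lag.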
So what can you do to get the most out of your Crawl Budget?
Three things…
1. First, stop wasting Crawl Budget on unnecessary pages
This is a biggie for Ecommerce sites.
Most ecommerce sites let their users filter through products… and search for things.
This is great for sales.
But if you’re not careful, you can find yourself with THOUSANDS of extra pages that look like this:
Unless you take action, Google will happily waste your crawl budget on these junk pages.
What’s the solution?
URL Parameters.
To set these up, click the “URL Parameters” link in the old GSC. Then hit “Add Parameter”.
Let’s say that you let users filter products by color. And each color has its own URL.
For example, the color URLs look like this:
yourstore.com/product-category/?color=red
You can easily tell Google not to crawl any URLs with that color parameter:
Repeat this for ALL the parameters you don’t want Google to crawl.
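Before you configure anything, it can help to audit how many parameterized URLs are lurking in a crawl export or sitemap. A quick sketch with Python’s standard library (the parameter names are hypothetical):

```python
from urllib.parse import urlparse, parse_qs

# Parameters you've decided Google shouldn't crawl (hypothetical list)
BLOCKED_PARAMS = {"color", "sort", "sessionid"}

def should_crawl(url):
    """True if the URL carries none of the blocked query parameters."""
    params = parse_qs(urlparse(url).query)
    return BLOCKED_PARAMS.isdisjoint(params)

print(should_crawl("https://yourstore.com/product-category/?color=red"))  # False
print(should_crawl("https://yourstore.com/product-category/"))            # True
```

Counting how many URLs fail this check tells you how much of your crawl budget the parameters are eating.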
And if you’re somewhat new to SEO, check in with an SEO specialist to make sure this is implemented correctly. When it comes to parameters, it’s easy to do more harm than good!
2. See how long it takes Google to download your page
The crawl report in Search Console shows you the average time it takes Google to download your pages:
See that spike? It means that it suddenly took Google A LOT longer to download everything.
And this can KILL your Crawl Budget.
In fact, we have this quote straight from the horse’s mouth…
“Making a site faster improves the users’ experience while also increasing crawl rate. For Googlebot a speedy site is a sign of healthy servers, so it can get more content over the same number of connections. On the flip side, a significant number of 5xx errors or connection timeouts signal the opposite, and crawling slows down.”
Bottom line? Make sure your site loads SUPER fast. You already know that this can help your rankings.
As it turns out, a fast-loading site squeezes more out of your crawl budget too.
3. Get more backlinks to your site
As if backlinks couldn’t be any more awesome, it turns out that they also help with your crawl budget.
“The best way to think about it is that the number of pages that we crawl is roughly proportional to your PageRank. So if you have a lot of incoming links on your root page, we’ll definitely crawl that. Then your root page may link to other pages, and those will get PageRank and we’ll crawl those as well. As you get deeper and deeper in your site, however, PageRank tends to decline.”
The takeaway:
More backlinks = bigger crawl budget.
Get The Most Out of “URL Inspection”
I already covered the URL Inspection tool in Chapter 3.
But that was one part of a big process. So let’s take a look at URL Inspection as a standalone tool.
Specifically, I’m going to show you 3 cool things you can do with the URL Inspection tool.
1. Get new content indexed (in minutes)
URL Inspection is the FASTEST way to get new pages indexed.
Just published a new page?
Just pop the URL into the box and press Enter.
Then hit “Request Indexing”…
…and Google will normally index your page within a few minutes.
2. Use “URL Inspection” to reindex updated content
If you’re a regular Backlinko reader, you know that I LOVE updating old content.
I do it to keep my content fresh. But I also do it because it increases organic traffic (FAST).
For example, in this case study, I reveal how relaunching an old post got me 260.7% more organic traffic in just 14 days.
And you better believe I always use “URL Inspection” to get my new content indexed ASAP.
Otherwise, I have to wait around for Google to recrawl the page on its own.
As Sweet Brown famously said: “Ain’t nobody got time for that!”
3. Identify Problems With Rendering
So what else can the “URL Inspection” tool do?
“Test Live URL” shows you how Google and users see your page.
You just need to hit the “View Tested Page” button.
Then hit “Screenshot”. And you’ll see exactly how Google sees your page.
Make Sure Your Site Is Optimized For Mobile (Unless You Like Losing Traffic)
As you might have heard, more people are searching with their mobile devices than with desktops.
Bottom line? Your site’s content and UX have to be 100% optimized for mobile.
But how do you know if Google considers your site mobile optimized?
Well, the Google Search Console has an excellent report called “Mobile Usability”. This report tells you if mobile users have trouble using your site.
Here’s an example:
As you can see, the report is telling us about two mobile usability issues: “Text too small to read” and “Clickable elements too close together”.
All you need to do is click on one of the issues. And the GSC will show you:
1. Pages with this issue
2. How to fix the problem
Then, it’s just a matter of taking care of that issue.