If you are looking for an SEO checklist that will help you to increase your site's organic traffic and rank on Google, you have just found it.
We have put together the ultimate checklist that you need to drive SEO success in 2021, covering 41 best practice points and tasks that you need to know about.
From the SEO basics to must-knows when analyzing your off-page signals, use this as a reference point for ensuring that your site is adhering to best-practice and that you're not being held back by issues that you have missed.
Here are the main categories we will cover in this guide:
We've broken this checklist down into sections that cover the main focus areas of SEO: the basics, keyword research, technical SEO, on-page SEO and content, and off-page factors.
There's a good chance that your site already covers many of these points, and if it does, great!
However, we also know that all websites have opportunities to improve and are confident that you will find at least some best-practice areas that you have overlooked.
Some of these points might not be relevant to you, and that is OK!
Work through the list, reference these against your site, resolve issues, and maximize opportunities where you can. SEO success doesn't come from simply following a checklist, but to outrank your competitors, you need to make sure you are at least covering most of these points.
If you haven't got the basics covered, your site will struggle to rank for competitive terms.
The following points are very much housekeeping tasks but form the basics of implementing a successful SEO strategy.
Google Search Console is an essential tool that provides you with invaluable insights into your site's performance as well as a wealth of data that you can use to grow your site's organic visibility and traffic.
You can learn more about why it is so important to use, how to set it up, and more in our definitive guide.
Bing Webmaster Tools is the equivalent platform, providing the same kind of data and insights for Bing's search engine.
These all-important tools allow you to view the search terms and keywords that users are finding your site on the SERPs for, submit sitemaps, identify crawl errors, and much more.
If you have not got these set up, do so now, and thank us later.
Without the right data, you can't make the right decisions.
Google Analytics is a free marketing analytics tool that allows you to view data and insights about how many people are visiting your site, who they are, and how they are engaging with it.
Our definitive guide will walk you through everything you need to know about the tool as a beginner, including how to set it up and the reports that you will find the most useful, but one thing is for sure: you can't run a successful SEO strategy without it.
You will also need to connect Google Analytics and Google Search Console to import data from the latter.
If you are using WordPress as your CMS (which there is a pretty good chance that you are, given that it now powers 35% of the web), you should install and configure an SEO plugin to provide the functionality and features that you need to properly optimize your site.
In Semrush's recently published WordPress SEO checklist, we share SEO plugin suggestions for you. Which plugin you choose pretty much comes down to personal preference, but the three highlighted there are all great options.
If you are using a different CMS to WordPress, speak with your developer to see whether you need to install a dedicated SEO plugin or module or whether the features that you need are included out of the box.
Plug in SEO, as an example, is one of the most popular Shopify SEO apps.
The purpose of a sitemap is to help search engines decide which pages should be crawled and which version of each page is the canonical one.
It is simply a list of URLs that specify your site's main content to make sure that it gets crawled and indexed.
A sitemap tells the crawler which files you think are important in your site, and also provides valuable information about these files: for example, for pages, when the page was last updated, how often the page is changed, and any alternate language versions of a page.
Google supports a number of different sitemap formats, but XML is the most commonly used. You will usually find your site's sitemap at https://www.domain.com/sitemap.xml.
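For reference, a bare-bones XML sitemap looks something like the sketch below; the URLs and dates are placeholders, not a recommendation for your site:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled and indexed -->
  <url>
    <loc>https://www.domain.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.domain.com/red-shoes/</loc>
    <lastmod>2021-01-10</lastmod>
  </url>
</urlset>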
If you are using WordPress and one of the plugins mentioned above, you will find that generating a sitemap is standard functionality.
Otherwise, you can generate an XML sitemap with one of the many sitemap generator tools that are available. In fact, we recently updated our ultimate guide to sitemaps, which includes our top recommendations.
Once you have generated your sitemap, make sure that this is submitted to Google Search Console and Bing Webmaster Tools.
Make sure to also reference your sitemap in your robots.txt file.
Quite simply, your site's robots.txt file tells search engine crawlers the pages and files that web crawlers can or can't request from your site.
Most commonly, it is used to prevent certain sections of your site from being crawled and is not intended to be used as a way to de-index a webpage and stop it showing on Google.
You can find your site's robots.txt file at https://www.domain.com/robots.txt.
Check whether you already have one in place.
If you don't, you need to create one — even if you are not currently needing to prevent any web pages from being crawled.
Several WordPress SEO plugins allow users to create and edit their robots.txt file, but if you are using a different CMS, you might need to manually create the file using a text editor and upload it to the root of your domain.
You can learn more about how to use robots.txt files in this beginner's guide.
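To illustrate, a very simple robots.txt file that blocks one directory from being crawled and references your sitemap (as recommended above) could look like this; the /admin/ path is purely hypothetical:

# Applies to all crawlers
User-agent: *
# Keep crawlers out of this (hypothetical) section
Disallow: /admin/

# Point crawlers at your sitemap
Sitemap: https://www.domain.com/sitemap.xml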
In rare instances, you might find that your site has been negatively affected by having a manual action imposed upon it.
Manual actions are typically caused by a clear attempt to violate or manipulate Google's Webmaster Guidelines — this includes things like user-generated spam, structured data issues, unnatural links (both to and from your site), thin content, hidden text and even what is referred to as pure spam.
Most sites haven't been affected by a manual action and never will be.
That said, you can check for these in the manual actions tab in Google Search Console.
You will be notified if your site receives a manual action, but if you are working on a new project or taking over a site, checking for one should always be among the first things you do.
It is not as uncommon as you may think that a website isn't actually able to be indexed by Google.
In fact, you'd be surprised at how often a sudden de-indexing of a site is caused by developers accidentally leaving noindex tags in place when moving code from a staging environment to a live one.
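If you suspect this has happened, view the source of an affected page and look for a robots meta tag like the one below within the page's head; removing it (or changing noindex to index) allows the page to be indexed again once it has been recrawled:

<meta name="robots" content="noindex, nofollow">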
You can use the Semrush site audit tool to ensure that your website can actually be crawled and indexed.
Simply go ahead and start a crawl; if this is blocked, search engines won't be able to crawl or index your site either.
Double-checking that the main pages you want indexed can actually be indexed can save a lot of troubleshooting later down the line. Unsure if your site is correctly indexed by Google? Let Jason Barnard take you through the steps of properly indexing your website.
Without a solid keyword research process, you are not going to rank for the right terms, and if you are not ranking for the right terms, your traffic isn't going to convert at the rate it could.
Here is a checklist of the essential keyword research tasks you need to ensure you are covering to see success from your SEO efforts.
One of the quickest ways to get started with keyword research is to find the terms that are working for your competitors.
In our opinion, no time spent doing competitor analysis is wasted time.
Run your own domain (and your key competitors) through the Semrush Domain Overview tool, and you will be able to quickly identify those competitors who are competing in the same space as you are and how your visibility compares.
You need to know what your main 'money' keywords are. If you hadn't guessed, these are the ones that are going to drive you leads, sales, and conversions.
You will also find these referred to as head terms and pillar page keywords.
Generally, these are the high volume, high competition keywords that really summarize what you offer, either at a topic or category level. Let's take the term 'NFL Jerseys' as an example — FYI, that is considered a head term.
You can use the Keyword Overview tool to conduct keyword research around your products and services, and identify your head terms.
If you are looking for a comprehensive guide to learn how to pull together a content strategy based on priority terms, check out this great tutorial.
A keyword strategy without long-tail keywords isn't really a keyword strategy.
In fact, long-tail keywords, despite typically being lower volume than head terms, deliver a higher conversion rate.
You need to make sure that your SEO strategy targets long-tail keyword variants as well as head terms. That means both optimizing your site's pages so they rank for a variety of related terms (you will want to include closely related long-tail terms throughout your page's content) and creating supporting content that sits alongside your key content.
You can use the keyword magic tool to help you find long-tail keywords.
Simply enter your main keywords and choose your country, and the tool will return a list of keywords where you can modify the match type by broad, phrase, exact, or related keywords.
For further long-tail ideas, the topic research tool can also help you to find keyword variations to target or use throughout your content.
Once you have identified your target keywords, you need to map these to pages on your site and also identify any gaps.
This guide to keyword mapping outlines it as:
In its simplest form, keyword mapping is a framework for the keywords you have chosen to target that mirrors your site’s structure. Driven by research, the ultimate goals of the map are to help you discover where to optimize, what content to build, and where you can add new pages to attract more traffic.
— Andrea Lehr
It is important that you put in the time to ensure that you're targeting the right pages with the right keywords, and the process outlined in the guide can help you to get this right first time and use this to power your strategy.
You need to make sure that your page's content matches the searcher's intent.
This means taking the time to analyze the pages that rank for your target terms and making sure that your content aligns.
Let's say you are looking to target a term at a nationwide level. You might have identified a high search volume and a realistic keyword difficulty, but if the SERPs return local results, you are not going to see yourself ranking in prominent positions.
If you don't understand the intent of the content that Google is ranking, you won't be able to ensure that yours aligns.
Learn more in this guide on how to identify intent in search; it includes a handy visual that you can use as a starting point to classify search features by intent type.
Knowing the questions that your audience is asking can help you to better answer them through your site's content. And let's not forget that many people are now referring to search engines as answer engines.
You will find a list of questions that relate to any entered keyword in the keyword overview tool, which also lets you see the monthly search volume for each one.
This is a great starting point and can provide great inspiration, especially if you enter more specific keywords as a starting point.
In addition, you could use a tool like AlsoAsked.com that scrapes and returns 'People Also Asked' results to find further ideas and questions to answer with your content.
A brand new website is going to struggle to rank for competitive keywords until it has built up authority. It is sometimes hard to hear, but it's the truth.
And for that reason, you need to understand how difficult it is going to be to rank for your target keywords, if only to manage your own (or your client's or boss's) expectations as to when you are likely to begin to see results.
Again, head over to the keyword overview tool, input your target keywords, and you can see the keyword difficulty; this is how hard it will be for a new website to rank in first page positions.
Technical SEO helps you to create solid foundations and ensure that your site can be crawled and indexed.
Here are the most common areas of best-practice that you need to pay attention to.
It is 2021, and HTTPS has been known as a ranking factor since 2014.
There is no excuse for not using HTTPS encryption on your site, and if you are still running on HTTP, it is time to migrate.
You can really easily confirm that your site sits on HTTPS by taking a look at your browser's URL bar.
If you see a padlock, you are using HTTPS. If you don't, you are not.
It is really important that you are only allowing Google to index one version of your site.
https://www.domain.com
https://domain.com
http://www.domain.com
http://domain.com
These are all different versions of your site and should all point to a single one.
Whether you choose a non-www or www version is up to you, but the most common one is https://www.domain.com.
All other versions should 301 redirect to the primary one, and you can check this by entering each variant into your browser bar.
If you are redirected, there are no issues, but if you find that you can access different versions, implement redirects ASAP.
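How you implement these redirects depends on your server setup, so treat the following as a rough sketch rather than a drop-in fix. It assumes Apache with mod_rewrite enabled and forces every HTTP and non-www request over to https://www.domain.com; if you are on Nginx or another server, speak to your developer.

# .htaccess: force HTTPS and the www version of the domain
RewriteEngine On
# Redirect if the request is not HTTPS...
RewriteCond %{HTTPS} off [OR]
# ...or if the host doesn't start with www
RewriteCond %{HTTP_HOST} !^www\. [NC]
# Capture the host without any leading www
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
# Send everything to the single preferred version with a 301
RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]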
You can quickly identify any crawl errors that exist through Google Search Console.
Head to the coverage report, and you will see both errors and excluded pages, as well as those with warnings and those which are valid.
Take the time to resolve any errors that you find, and explore the cause of excluded URLs in more detail (in many cases, there is a reason why this is happening that you need to resolve; from 404 errors to incorrectly canonicalized pages).
Slow sites make for poor user experience.
In fact, Google has recently confirmed an upcoming page experience update for 2021 that is set to place an even greater focus upon user experience as a ranking factor than is currently the case.
You need to make sure your site loads quickly and acknowledge that users continue to expect more.
No one's waiting around for a slow site.
You will find slow-loading pages flagged when you run a crawl using the site audit tool, and you can gain more specific insights using Google's PageSpeed Insights tool, as well as advice on how to improve load times in our recent guide.
Broken links are another signal of poor user experience. No one wants to click a link and find that it doesn't take them to the page they were expecting.
A list of broken internal and outbound links can be found in your Site Audit report, and you should fix any identified issues either by updating the target URL or removing the link.
Most sites migrated from HTTP to HTTPS quite some time back, yet it is still common to find that internal links point to HTTP pages, not the current version.
Even when there is a redirect in place to direct users to the new page, these are unnecessary, and you should aim to update these as soon as you can.
If there is only a small number of incorrect links, update these manually in your CMS. However, if these are site-wide (which they often are), you need to update page templates or run a search and replace on the database.
Speak to your developer if you are unsure.
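If you are on WordPress and comfortable on the command line, WP-CLI's search-replace command is one way to handle the database side of this in bulk. The domains below are placeholders; run it with --dry-run first and take a database backup before doing it for real:

wp search-replace 'http://www.domain.com' 'https://www.domain.com' --dry-run
wp search-replace 'http://www.domain.com' 'https://www.domain.com'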
Mobile-friendliness is a key factor in Google's upcoming page experience update, but the real reason why you should care so much about ensuring that your site is mobile-friendly is that Google switched to mobile-first indexing for all sites from mid-2019.
If you are not serving a mobile-friendly experience, you will find that your organic visibility suffers because of this.
You can test your site's mobile-friendliness with Google's mobile-friendly testing tool.
An SEO-friendly URL structure makes it easier for search engines to crawl your pages and understand what they are about. Your page URLs should be simple and descriptive.
Here is what an SEO-friendly URL looks like:
https://www.domain.com/red-shoes/
As opposed to a query string that isn't descriptive:
https://www.domain.com/category.php?id=32
As Google continues to build a more semantic web, structured data markup becomes increasingly valuable.
If you are not already using structured data, you need to be.
In fact, the Schema.org vocabulary includes formats for structuring data for people, places, organizations, local businesses, reviews, and so much more.
Structured data helps your organic listings stand out on the SERPs; product markup, for example, can add review stars and pricing to enhance the result.
Check out the Semrush guide on structured data for beginners to learn more about how you can leverage this for success, or head to Google's structured data testing tool to analyze whether or not your site currently uses this at all.
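To give a feel for what this looks like in practice, here is a minimal, purely illustrative JSON-LD Product snippet of the kind that can add review stars and a price to a listing; the product name, rating, and price are invented:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Red Running Shoes",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  },
  "offers": {
    "@type": "Offer",
    "price": "59.99",
    "priceCurrency": "USD"
  }
}
</script>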
Ideally, pages shouldn't be any further into your site than 3 clicks deep.
If a page is deeper than that, it is a sign that you need to spend time reworking your site structure to flatten it. Put simply, the deeper a page sits, the less likely users or search engines are to find it.
You will find pages that need more than 3 clicks to be reached clearly highlighted in the issues section of your site audit report.
302 redirects indicate that a redirect is temporary, whereas 301s signal that the move is permanent.
It is fairly common to find 302s used in place of 301s, and while Google has confirmed that 302s pass PageRank, the fact remains that if a 302 redirect isn't expected to be removed at any point in the future, it needs updating to a 301.
You will find any 302 redirects clearly highlighted in the Site Audit report as pages that have temporary redirects.
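On Apache, for example, fixing this is often as small a change as the status code in the directive; the URLs below are placeholders:

# Temporary: only appropriate if the redirect will be removed later
Redirect 302 /old-page/ https://www.domain.com/new-page/
# Permanent: use this when the move isn't going to be reversed
Redirect 301 /old-page/ https://www.domain.com/new-page/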
Your site shouldn't send users or search engines via multiple redirects (a redirect chain), nor should redirects create a loop.
Simply, redirects should go from page A to page B.
The site audit report will highlight any issues that exist in relation to redirect chains and loops, and you need to resolve these by updating all redirects in a chain to point to the end target, or by removing and updating the one causing the loop.
Without great content and a great on-page experience, you will struggle to rank your site and increase your organic traffic. This applies to website pages and blog posts. With so much competition, blogs have to ensure they are taking every possible step to outrank other blogs for important search terms.
Make sure that your site ticks the boxes below and focus on creating great content for users, not search engines.
Optimized title tags are the very basics of SEO. In fact, they are often the first thing any SEO would take a look at to help a page to rank.
You see, a title tag informs search engines what a page is about and should be unique.
Title tags shouldn't be duplicated across pages, and title tags that are too long will be truncated on the SERPs (you will see three dots where part of the title has been cut off).
You also need to ensure that title tags aren't missing (where the title tag is blank).
All of these issues can be found flagged in your Site Audit report and can be fixed by updating and improving your page's title tags.
While meta descriptions haven't been used as a direct ranking factor for many years, this is usually what shows beneath your site's title tag on the SERPs.
Quite simply, it is your meta description that encourages a user to click on your listing over someone else's and can either positively or negatively affect your organic CTR.
If you don't have a meta description in place, Google will display part of your page's content, but this could include navigation text and other elements and be far from enticing. If you have duplicates, there is a good chance that you are not presenting a unique description that encourages clicks.
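Pulling the last two points together, a well-optimized head section looks something along these lines; the page and wording are invented purely for illustration:

<head>
  <!-- Unique, descriptive title that won't be truncated on the SERPs -->
  <title>NFL Jerseys | Official & Replica Shirts | Domain.com</title>
  <!-- Unique meta description written to encourage the click -->
  <meta name="description" content="Shop officially licensed NFL jerseys for every team. Free delivery on orders over $50 and easy 30-day returns.">
</head>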
A page's H1 tag is your content's main heading, and there should only be one in place per page.
The site audit report will flag pages that have more than one H1 tag in place, and you should take the time to resolve these to ensure only one exists on each page.
The most common reason why multiple H1 tags exist is that your site's logo is wrapped in one, as well as the main heading on the page.
Primarily, H1 tags should include a page's main target keyword, so make sure that you are tagging the right content.
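A common fix is simply to stop wrapping the logo in a heading tag and reserve the single H1 for the page's main, keyword-focused heading; a rough illustration:

<!-- Logo: no heading tag needed -->
<div class="site-logo"><a href="/">Domain.com</a></div>
<!-- The one and only H1, containing the page's main target keyword -->
<h1>NFL Jerseys</h1>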
If you are not properly optimizing your page titles and meta tags, you are missing out on an opportunity to rank not just for your main keywords, but also variations.
Head to the performance report in Google Search Console and identify keywords on each page that have a significant number of impressions but low clicks and a low average position.
This usually means that your page is deemed relevant for the queries, and is ranking at least somewhere, but that you have not optimized the page by including these variations in your content or tags.
Rework and re-optimize your page with these in mind, and you will almost certainly see an uplift in clicks and ranking position.
For some time now, SEOs have been seeing huge gains by pruning site content. Essentially, this means getting rid of content that doesn't rank, doesn't add value, and shouldn't really be on your site.
We can't stress enough that time spent running a content audit and pruning thin, duplicate, or low-quality content is massively underrated.
If any content isn't adding value to your site, it needs to go. It is as simple as that. Kevin Indig's guide to using Semrush for SEO pruning is a great starting point and will give you the insights you need to undertake this process effectively.
You should pay attention to image optimization. From properly naming images with a descriptive file naming convention through to optimizing the size and quality, it is an area of SEO that is often neglected.
At the very minimum, you should ensure that the main images on each page of your site use alt attributes (ALT tags) that properly describe the content of the image.
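In HTML terms, that simply means giving each important image a descriptive alt attribute, something like the hypothetical example below:

<img src="/images/red-running-shoes.jpg" alt="Pair of red running shoes with white laces">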
Internal links are arguably one of the most neglected link building tactics in SEO marketing. Spending time improving your site's internal linking strategy almost always drives noticeable gains quite quickly.
Some marketers see really quick wins from adding even one or two internal links from authoritative pages elsewhere on your site.
You can read this guide to executing an internal links strategy that works and begin to identify pages that need to be linked to from other pages or which hold authority that could be distributed elsewhere across your site.
You will also find a list of pages that have only one internal link pointing to them as part of the site audit report.
Related topic: What is Anchor Text? Everything You Need to Know for SEO
Keyword cannibalization is one of the most misunderstood SEO concepts. Despite what many think, it is an issue that is all about the intent of a page, not simply the keywords that are used in the content and in title tags.
It is a simple fact that if your site is suffering from cannibalization, you'll struggle to rank for competitive terms as the search engines have a hard time figuring out which page to show.
This cannibalization guide walks you through the most common ways to find and fix these issues.
Pages on your site should always be linked to from at least one other page.
After all, if Google isn't able to crawl a page through other links on your site, that page is likely not inheriting the authority it otherwise could and not ranking as well as it could be.
If you are serving pages in your sitemap that are not accessible via at least one link from another page, you will find these flagged as 'orphaned pages in sitemaps' in your Site Audit report.
Go ahead and link to these pages from at least one other relevant page.
Content naturally ages and becomes outdated. It's just something that happens as time moves on.
But updating old content is one of the easiest tasks that you can implement to see big wins.
In fact, Danny Goodwin of Search Engine Journal comments:
Updating your content can result in better search rankings, more links, more traffic, social shares, and new customers discovering you. Just don’t forget about the “marketing” in “content marketing”!
If the content on your page contains outdated information or could simply do with being brought up to date with a fresh perspective, it is time well spent.
After all, content that is outdated usually doesn't offer the best experience for users, so why would Google continue to rank it unless it's brought up to date?
If you want to drive SEO success in 2021, you simply can't turn a blind eye to off-page SEO factors, and while these are often thought of simply as link building, there is more to it than that — which I will show below.
If you don't have insight into your competitor's link profile, how can you plan a strategy to outrank them?
Just as it is important to spend time analyzing your competitor's content, you should also invest resources into digging deep into their link profile.
Run any URL through the backlink analytics tool to analyze your competitor's link profile and start to understand the overall quality and authority of the links that point to their site.
Are you missing out on links that your competitors are benefiting from, but you are not?
Conducting a link intersect analysis will often surface quick-win opportunities that competitors are already enjoying.
Using the backlink analytics tool, you can enter up to five different domains to gain insight into which domains are linking to which of your competitors.
If there is, let's say, a resource page that links to all others in your space except you, a great starting point would be to reach out to ask to be added.
If you have got a PR team that is landing coverage in the press, there is a good chance that you will find articles that mention your business but don't link.
These are known as unlinked brand mentions. The brand monitoring tool can help you to quickly identify mentions of your brand that don't link, and this is a great guide that looks at exactly how to approach asking for a piece of unlinked coverage to have a link added.
There are always new link building opportunities that you can explore and act upon, but finding these often takes time.
We are very much of the opinion that you should always be trying to build great links to your site, but also that using the right tools can help make the task of finding these opportunities a little bit easier.
We love the link building tool as a really simple and straightforward way to see a continual stream of new opportunities that you can look into and websites that you can reach out to.
In just minutes, you will have a whole host of new opportunities and can put in place a solid strategy to make great gains on the competition.
Off-page factors go way beyond just links, despite these being a key ranking factor.
If you are running a local business, make sure that you are listed (and ranking) on Google My Business; otherwise, you will literally be handing visits to a competitor.
There is no denying that it takes time to optimize your GMB listing to a decent standard, and you need to keep it tidied up, but if you are serving customers at their location or they are visiting you, there is no reason why you won't stand to gain from the time you put in here.
To get started with improving your brand's local visibility through GMB, take a look at this guide.
There you go: a 41-step SEO checklist that both beginners and more advanced SEOs can follow and hopefully find at least a few ways to improve their site's optimization.
If you have any great checklist steps that you work into your own workflow, we'd love to know what they are in the comments below.