If you’re serious about optimizing your website for search, then you need to understand technical SEO.
Simply put, technical SEO is the process of making sure that search engines can discover, parse, and understand the content on your site. The better you get at technical SEO, the more likely your pages are to rank well.
Technical SEO is a very fun and complex line of work. Here are 19 of the more common issues to look out for.
Duplicate content is a big no-no with the search engines. That’s why you need to make sure that your content is unique.
Take the time to run your content through Copyscape to ensure it's at least 70% unique, and make sure to take common template elements into account.
Also, if you’re running an ecommerce site, you might be tempted to just copy product descriptions from the manufacturer’s website and paste them onto your own web pages. Resist that temptation.
Yes, it’s a pain to come up with unique descriptions. That’s especially the case if you’re selling thousands of products.
That’s why you should outsource the task to qualified professionals. You’ll spend money but end up with a better rank.
Don’t forget to optimize your title tag and descriptions.
If you're unfamiliar with the title tag, it lives up to its name. The content of the tag is the title of the web page. The meta description lives in the page's code and shows up only in the SERP.
What are some technical SEO issues you can run into with title tags and descriptions?
From a technical perspective, these issues usually occur because of a misconfiguration on the website, or because the tags simply were not filled in on a static page.
For example, there might be a set of categories, tags or pagination that has no title tag rules associated with it. In this case, you will need to block the URLs, redirect them, or dynamically optimize them.
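If you want to spot these problems programmatically, a quick script can pull the title and meta description out of a page and flag gaps. Here's a minimal sketch using Python's standard library; the 60-character truncation threshold and the sample HTML are illustrative assumptions, not official limits.

```python
# Sketch: audit one page's title tag and meta description.
# The HTML string stands in for a page you would actually fetch.
from html.parser import HTMLParser

class TitleDescriptionParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit_page(html):
    """Return a list of title/description problems for one page."""
    parser = TitleDescriptionParser()
    parser.feed(html)
    problems = []
    if not parser.title.strip():
        problems.append("missing title tag")
    elif len(parser.title) > 60:  # rough truncation guess, not a Google rule
        problems.append("title may be truncated in the SERP")
    if parser.description is None:
        problems.append("missing meta description")
    return problems

sample = "<html><head><title></title></head><body>Hi</body></html>"
print(audit_page(sample))  # ['missing title tag', 'missing meta description']
```

Run this across every URL in your sitemap and you have a basic title/description audit.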
You probably already know about the importance of internal linking when it comes to SEO. But have you heard about the dangers of broken internal links?
When a search bot crawls your site and finds broken links, that’s a strike against you. Earn enough strikes, and you could lose rank.
If you have a website with thousands of pages, you might think that it’s difficult to find any internal broken links. Fortunately, there are tools for that.
Then, go to work fixing them.
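Here's a rough sketch of the first half of that job in Python: extracting the internal links from a page so each one can then be checked for a 404. The actual fetching is left out so the example runs offline, and example.com is a placeholder.

```python
# Sketch: collect internal links from a page's HTML so each one can be
# checked for a 404. In practice you would then request each URL (or use
# a crawler such as Screaming Frog) and log any non-200 responses.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                absolute = urljoin(self.base_url, href)
                # keep only links on the same host (internal links)
                if urlparse(absolute).netloc == urlparse(self.base_url).netloc:
                    self.links.append(absolute)

def internal_links(html, base_url):
    collector = LinkCollector(base_url)
    collector.feed(html)
    return collector.links

html = '<a href="/about">About</a> <a href="https://other.com/">Out</a>'
print(internal_links(html, "https://example.com/"))
# ['https://example.com/about']
```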
This is a fairly heavily debated topic. How many links should you have on a page?
Yoast says 100 links, and I think that is about right.
If you are going over that, you need a better structure. Often what people will do is create a tiered footer sitemap.
That means there will be links to many HTML sitemaps in the footer. For example, there might be links to 20 sitemaps, each with 100 links on them. This will allow you to link to all your pages and keep the link count in line.
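If you want to audit this yourself, counting the links on a page is trivial. This sketch uses a naive regex (a real audit should use a proper HTML parser) and the 100-link threshold discussed above.

```python
# Sketch: flag pages whose on-page link count exceeds a threshold
# (100 here, per the guideline above).
import re

def count_links(html):
    # counts opening <a ...> tags; crude, but fine for a first pass
    return len(re.findall(r"<a\s", html, flags=re.IGNORECASE))

def needs_restructuring(html, limit=100):
    return count_links(html) > limit

page = "<a href='/1'>1</a>" * 120
print(count_links(page), needs_restructuring(page))  # 120 True
```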
What do you know about your ratio?
In this case, I’m referring to the text-to-HTML ratio on your web pages. If a page has way more HTML code than actual text, search engines will consider it thin content.
Why? Because users can’t see HTML. They only see content.
Users are visiting your website to check out its content. They aren’t there for the HTML.
As a general rule, you want a good amount of unique text on each page.
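There's no official ratio that search engines publish, but you can approximate your own pages' text-to-HTML ratio with a few lines of Python:

```python
# Sketch: compute a rough text-to-HTML ratio. Note this simple version
# also counts text inside <script> and <style> blocks as "visible",
# so treat the number as an approximation.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        self.text.append(data)

def text_to_html_ratio(html):
    extractor = TextExtractor()
    extractor.feed(html)
    visible = "".join(extractor.text).strip()
    return len(visible) / len(html) if html else 0.0

html = "<html><body><div><p>Some actual content here.</p></div></body></html>"
print(f"{text_to_html_ratio(html):.0%}")
```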
Way too many SEO strategists neglect to put alt tags on their images. That’s a mistake.
Alt tags are a throwback to the early days of web browsers and are very important for accessibility.
Today, search engines use alt tags to gather information about an image, and screen readers use them so that visually impaired users can have the image description read to them.
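A quick way to audit this is to list every image that lacks an alt attribute. One caveat baked into this sketch: it also flags empty alt text, which is actually legitimate for purely decorative images.

```python
# Sketch: list image sources that have no alt attribute (or an empty
# one). Empty alt="" is fine for decorative images, so review the
# output rather than fixing it blindly.
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "?"))

def images_missing_alt(html):
    checker = AltChecker()
    checker.feed(html)
    return checker.missing_alt

html = '<img src="a.png" alt="Chart"><img src="b.png">'
print(images_missing_alt(html))  # ['b.png']
```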
While I’m on the subject of images, let me also point out that broken images can hurt your rank.
Also, it creates a poor user experience. If your website visitors are expecting to see an image, and instead they see nothing or, worse yet, a broken image icon, they’ll bounce away to a competitor’s site.
Once again, you’ll probably need to enlist the aid of a tool to find broken images on your site.
Screaming Frog is your friend here. You can do a crawl of the site and find all the image URLs that are 404ing and then go fix them.
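If you'd rather script the check yourself, the logic looks like this: collect the image URLs, then test each one's HTTP status. The fetcher below is a stub so the example runs offline; in a real audit you'd issue HEAD requests with urllib or a crawler.

```python
# Sketch of the Screaming Frog-style check: pull image URLs out of the
# HTML, then test each one's HTTP status. The fetch_status callable is
# a stub here; swap in urllib.request for a real crawl.
from html.parser import HTMLParser

class ImageCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

def broken_images(html, fetch_status):
    collector = ImageCollector()
    collector.feed(html)
    return [src for src in collector.srcs if fetch_status(src) == 404]

# stub standing in for a real HEAD request
statuses = {"/ok.png": 200, "/gone.png": 404}
html = '<img src="/ok.png"><img src="/gone.png">'
print(broken_images(html, lambda url: statuses.get(url, 404)))
# ['/gone.png']
```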
It’s a global market, and people all over the world are using the web.
That’s why you should declare the language on the page in the HTML tag.
In some cases, though, you might have copied HTML content from another site and pasted it onto your own. The HTML tag in that content might specify a different language than the one you’re using.
That can hurt your rank.
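Verifying the declaration takes only a few lines. This sketch reads the lang attribute off the html tag so you can compare it against the language you expect:

```python
# Sketch: confirm the <html> tag declares the language you expect.
# Pages assembled from copied markup sometimes carry someone else's lang.
from html.parser import HTMLParser

class LangFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.lang = None

    def handle_starttag(self, tag, attrs):
        if tag == "html":
            self.lang = dict(attrs).get("lang")

def declared_language(html):
    finder = LangFinder()
    finder.feed(html)
    return finder.lang  # None means no lang attribute was declared

print(declared_language('<html lang="fr"><head></head></html>'))  # fr
```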
The robots.txt is incredibly important for SEO. This is the file that prevents Google from crawling certain sections of your site.
It can be valuable in preventing duplicate content issues or restricting access to pages or sections you don’t want to be indexed.
That said, if you don’t have sections of your site you want to keep blocked, you likely won’t need one. Also, keep in mind Google’s guidelines mention that you shouldn’t use a robots.txt to block pages from the search results.
Make sure nothing is blocked, and you have a link to your sitemap in the file.
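Python's standard library can parse a robots.txt and tell you exactly what's blocked. The file content below is inlined for illustration; in practice you'd fetch it from /robots.txt on your own domain.

```python
# Sketch: verify that robots.txt isn't blocking pages you want crawled.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))      # False
```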
There is something called a rel canonical that tells Google to consolidate pages. This helps prevent any duplicate content issues by indicating the preferred version of the page.
You’ll want to ensure that the rel canonical in the code points at the same page you want crawled and indexed by Google.
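Checking this programmatically is straightforward: extract the canonical URL and compare it to the page's own URL. The example URL below is a placeholder.

```python
# Sketch: make sure the rel canonical points at the page itself (for
# the version you want indexed), not at some other URL.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

def canonical_url(html):
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical

page_url = "https://example.com/widgets/"
html = '<link rel="canonical" href="https://example.com/widgets/">'
print(canonical_url(html) == page_url)  # True: this page canonicalizes to itself
```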
Often, people will mistakenly noindex a website when they push a development environment live.
As the name suggests, this will keep Google from indexing, and therefore ranking, your site.
Look in the code and press Ctrl+F to ensure there is no noindex tag.
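Or automate that Ctrl+F. This sketch scans for a robots meta tag carrying noindex; note the regex assumes the name attribute comes before content, so treat it as a rough check rather than a full parser.

```python
# Sketch: scan the page source for a robots meta tag carrying noindex.
import re

def has_noindex(html):
    # assumes name=... appears before content=...; crude but quick
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, flags=re.IGNORECASE) is not None

live = '<meta name="robots" content="index, follow">'
staging = '<meta name="robots" content="noindex, nofollow">'
print(has_noindex(live), has_noindex(staging))  # False True
```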
Pagination is a frequent source of technical SEO issues on websites.
Recently, Google got rid of its support for rel=next/rel=prev markup. To accommodate this change, check your source code and make sure you aren't relying on that configuration.
Lately, I've been recommending noindexing pages beyond the main category page, or adding a rel canonical pointing back to the main page. I prefer the noindex approach.
Recently, Google made the switch to mobile-first indexing, which means how your site looks and loads on mobile is more critical than ever.
In Google Search Console, there is a report called the mobile usability report. It’s essential to check this report for errors and fix them.
Often, sitemaps will have errors when you submit them to Google Search Console.
You might have submitted a page that returns a 404, submitted the wrong file type (Google accepts a few), or submitted it the wrong way.
Make sure you have a normal XML sitemap, an image sitemap, and a video sitemap at a minimum. Also, make sure you don't put more than 50,000 URLs in a single file, or you will need to create a sitemap index.
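Generating a valid sitemap is simple enough to do with the standard library, which also makes the 50,000-URL limit easy to enforce. A minimal sketch (the URLs are placeholders, and a production file would also want the XML declaration and lastmod dates):

```python
# Sketch: build a minimal XML sitemap with Python's standard library,
# capped at the 50,000-URL-per-file limit mentioned above.
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls[:50000]:  # hard limit per sitemap file
        entry = SubElement(urlset, "url")
        SubElement(entry, "loc").text = url
    return tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(xml)
```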
Accelerated Mobile Pages are excellent for speed, but often create errors. Make sure you check your configuration and ensure there are no errors in Google Search Console.
This happens quite a lot.
Often, people try to create headless content management systems, which means they use a content management system to manage the website while delivering the content through another method. The problem is that these systems often have no HTML base for search engines to crawl.
I see this one all the time. Someone will forget to block their development site, and it gets in the index.
Make sure to perform a search for a piece of content on your site in Google. If you see your development site in the index, you have an issue.
Lazy loading pages is a great way to speed up a site.
But if your lazy loading setup doesn't load enough content up front, you'll hurt that page's chances of ranking.
Make sure your page loads the most important HTML content quickly so that it can be indexed.
You would be surprised how often there are different versions of a page running live with a .html, .php, a forward slash, or no forward slash. Or, perhaps only one is live, but the others are 302 redirecting or still delivering a 200 OK for some reason.
It is important to make sure only one version of the page is live and that it has the correct status code.
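A scripted check starts by enumerating the variants of each URL; you'd then request each one and confirm that exactly one returns a 200 while the rest 301 to it. The variant list below covers the .html, .php, and trailing-slash cases mentioned above.

```python
# Sketch: enumerate common variants of one URL so each can be requested
# and its status code verified (one 200, the rest 301 to it; no 302s,
# no duplicate 200s).
from urllib.parse import urlsplit, urlunsplit

def url_variants(url):
    scheme, netloc, path, query, fragment = urlsplit(url)
    stem = path.rstrip("/")
    if stem.endswith((".html", ".php")):
        stem = stem.rsplit(".", 1)[0]
    paths = {stem + "/", stem, stem + ".html", stem + ".php"}
    return sorted(urlunsplit((scheme, netloc, p, query, fragment)) for p in paths)

for variant in url_variants("https://example.com/pricing/"):
    print(variant)
# https://example.com/pricing
# https://example.com/pricing.html
# https://example.com/pricing.php
# https://example.com/pricing/
```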
Technical SEO issues are very common, and this is just a small list of some of the major problems to look out for.
To keep on top of any issues, make sure to run your website crawls monthly and continuously work on improving the site.
In SEO, you never want to fall behind from a technical perspective.