
Now, after reading the title, you may think, “What's new here? I see similar articles on different blogs every month.” I can say without a doubt that you'll still find this post useful: it is based on original research.
Every SEO specialist checks sites with the help of some SEO service. I work at Serpstat, one of the most popular all-in-one SEO platforms. Every year our team analyzes our users' site audit results to find out which SEO errors are truly the most common.
In this article, I'll share the results we got over the last year.
Serpstat research: Results we’ve got
During 2018, our users ran 204K audits and checked 223M pages through Serpstat. Our team analyzed this data and compiled the statistics.
You can see all the statistics in the infographic below; here I just want to highlight some facts.
The research showed that most sites had problems with meta tags, markup, and links. The most common errors concern headlines, HTTPS certificates, and redirects. Issues with hreflang, multimedia, content, indexing, HTTP status codes, AMP (Accelerated Mobile Pages), and loading time were the least common.
We also analyzed country-specific domains to get more precise information. These statistics show that 70% of “.com” domains have their most common problems with links, loading time, and indexing. The same holds for “.uk” and “.ca” domains.
The most common mistakes and how to fix them
1. Meta tags
Meta tags matter even though they aren't visible to website users. They tell search engines what a page is about and play a part in snippet creation, so they affect your website's ranking. Errors in meta tags can hurt user signals.
According to our research, you should first check the length of your title and description tags.
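As a starting point, a length check like the one below can flag titles and descriptions that are missing, too short, or too long. The 30–65 and 70–160 character ranges are common rules of thumb, not official limits, so adjust them to your own guidelines.

```python
# Common rule-of-thumb character ranges (not official search engine limits).
TITLE_RANGE = (30, 65)
DESCRIPTION_RANGE = (70, 160)

def check_length(text, limits):
    """Return 'missing', 'too short', 'too long', or 'ok'."""
    if not text or not text.strip():
        return "missing"
    low, high = limits
    length = len(text.strip())
    if length < low:
        return "too short"
    if length > high:
        return "too long"
    return "ok"

def audit_meta(title, description):
    """Audit a page's title and meta description lengths."""
    return {
        "title": check_length(title, TITLE_RANGE),
        "description": check_length(description, DESCRIPTION_RANGE),
    }
```

For example, `audit_meta("Buy shoes", "")` reports a too-short title and a missing description.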
2. Links, markups, and headings
External links (their number and quality) affect your site's position in the SERP, as search engines evaluate link profiles very carefully. You should also keep internal link factors in mind (nofollow attributes and URL optimization).
The Serpstat team also found that errors with markup and headings are quite common, even though both are very important for websites. Markup and headings contain attributes that label and structure the data on the page. They also help search engines and social networks crawl and display the site correctly.
The most common errors in this category involve:
- Nofollow external link attributes
- Missing Twitter card markups
- H1 doubling the title tag
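Two of the issues above are easy to detect automatically. The sketch below uses only the standard library to flag a missing Twitter card and an H1 that duplicates the title tag; the audit rules are simplified for illustration.

```python
from html.parser import HTMLParser

class MarkupAudit(HTMLParser):
    """Collect the title, first-level heading, and Twitter card presence."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self._in_tag = None
        self.has_twitter_card = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._in_tag = tag
        if tag == "meta" and attrs.get("name") == "twitter:card":
            self.has_twitter_card = True

    def handle_endtag(self, tag):
        if tag == self._in_tag:
            self._in_tag = None

    def handle_data(self, data):
        if self._in_tag == "title":
            self.title += data
        elif self._in_tag == "h1":
            self.h1 += data

def audit_markup(html):
    """Return a list of detected markup/heading issues."""
    parser = MarkupAudit()
    parser.feed(html)
    issues = []
    if not parser.has_twitter_card:
        issues.append("missing twitter card")
    if parser.title.strip() and parser.title.strip() == parser.h1.strip():
        issues.append("h1 duplicates title")
    return issues
```

A page whose `<h1>` repeats the `<title>` verbatim and that lacks a `twitter:card` meta tag would return both issues.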
3. HTTPS certificate
An HTTPS certificate is one of the important ranking factors, as it ensures a secure connection between the website and the browser. If your website handles personal information, don't forget to pay attention to it.
The most common mistake here is an HTTPS website referring to HTTP resources or pages.
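A simple way to spot such insecure references is to scan `href` and `src` attributes for plain `http://` URLs. This is a rough sketch using the standard library; a production audit would also cover CSS, inline styles, and canonical tags.

```python
from html.parser import HTMLParser

class InsecureRefFinder(HTMLParser):
    """Collect href/src values that still point at plain HTTP."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value and value.startswith("http://"):
                self.insecure.append(value)

def find_insecure_refs(html):
    finder = InsecureRefFinder()
    finder.feed(html)
    return finder.insecure
```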
4. Redirects, hreflang attribute, multimedia
Redirects send users from the requested URL to another one you choose. According to our statistics, the most common error in this group concerns the hreflang attribute: if you have a multilingual interface, you need to apply hreflang to the same content in different languages so search engines can understand which version of your texts to show each user.
Multimedia elements don't affect SEO directly. However, they can cause bad user signals and indexing errors, and images also affect the website's loading time. That's why multimedia still matters.
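For hreflang, each language version of a page should carry the full set of alternate links, including a self-reference. The helper below sketches generating those tags; the example URLs are invented.

```python
def hreflang_tags(versions, x_default=None):
    """Build reciprocal hreflang link tags.

    versions: mapping of language code -> absolute URL of that version.
    x_default: optional fallback URL (e.g., a language selector page).
    """
    tags = [
        '<link rel="alternate" hreflang="{}" href="{}">'.format(lang, url)
        for lang, url in sorted(versions.items())
    ]
    if x_default:
        tags.append(
            '<link rel="alternate" hreflang="x-default" href="{}">'.format(x_default)
        )
    return tags
```

The same set of tags should then be placed on every listed version, so each page references the others and itself.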
You can find more information about the errors in this section in the infographic.
5. Indexing
Search engines learn what sites are about while indexing them. If a site is closed to indexing, users can't find it in the SERP. Some weak spots that often lead to errors include:
- Canonical tags that reference a different page
- Non-indexed pages (noindex)
- iframe tags
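The first two weak spots above can be flagged with a small parser: a `noindex` robots directive, and a canonical tag pointing at a different page. This is a minimal stdlib sketch, not a full indexability check.

```python
from html.parser import HTMLParser

class IndexingAudit(HTMLParser):
    """Detect a noindex directive and capture the canonical URL."""
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "robots":
            if "noindex" in (attrs.get("content") or ""):
                self.noindex = True
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

def indexing_issues(html, page_url):
    audit = IndexingAudit()
    audit.feed(html)
    issues = []
    if audit.noindex:
        issues.append("noindex")
    if audit.canonical and audit.canonical != page_url:
        issues.append("canonical points elsewhere")
    return issues
```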
6. HTTP status codes, AMP, and content
HTTP status codes are the responses a server delivers to a user's request. Errors with them are serious problems that negatively affect a site's position in SERPs.
AMP (Accelerated Mobile Pages) are pages optimized for mobile devices. You can use this technology to improve the site's loading time. Poor content also causes ranking positions to deteriorate.
The most common problems here are:
- 404 error codes
- Missing AMP
- Generated content
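When reviewing crawl results, it helps to triage URLs by status code family so 4xx and 5xx problems stand out. A simple sketch, with invented URL/status pairs:

```python
def triage(status_by_url):
    """Group crawled URLs by HTTP status code family."""
    report = {"ok": [], "redirect": [], "client_error": [], "server_error": []}
    for url, status in status_by_url.items():
        if 200 <= status < 300:
            report["ok"].append(url)
        elif 300 <= status < 400:
            report["redirect"].append(url)
        elif 400 <= status < 500:
            report["client_error"].append(url)
        else:
            report["server_error"].append(url)
    return report
```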
7. Loading time
Long loading times can worsen the site's usability and waste the crawl budget. The Serpstat team found that the most common problems here are associated with browser caching and with image, JavaScript, and CSS optimization.
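On the browser caching side, one quick check is whether static assets send a long-lived `Cache-Control` header. The sketch below parses `max-age` and flags weakly cached assets; the header values and the one-day threshold are illustrative choices, not a standard.

```python
def max_age(cache_control):
    """Extract max-age (in seconds) from a Cache-Control header, or None."""
    for part in (cache_control or "").split(","):
        part = part.strip()
        if part.startswith("max-age="):
            try:
                return int(part.split("=", 1)[1])
            except ValueError:
                return None
    return None

def poorly_cached(headers_by_asset, min_seconds=86400):
    """Return assets cached for less than min_seconds (default: one day)."""
    flagged = []
    for asset, header in headers_by_asset.items():
        age = max_age(header)
        if age is None or age < min_seconds:
            flagged.append(asset)
    return flagged
```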

Source: Serpstat
How to correct these errors
To find all the above-mentioned errors on your own site, you can create a project in the Serpstat Audit tool. There you can check the whole site or just a separate page. The module checks 20 pages per second and finds more than 50 errors that can potentially harm your site.
In its reports, Serpstat sorts errors by importance and category and gives the list of pages on which these problems were found. In addition, it offers recommendations on how to resolve each specific problem. Some findings are not errors in the strict sense (“information”); they are shown only to make you aware of potential problems.
Summary
Many errors can damage your site and its rankings. Fortunately, you can find them all at once with the help of audit tools.
First, pay attention to the most common weaknesses:
- Meta tags
- Markups
- Links
- Headings
- HTTPS certificate
- Redirects
- Hreflang attribute
- Multimedia
- Indexing
- HTTP status codes
- AMP
- Loading time
- Content
Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter.
What's different now: current considerations and best practices
A lot of the issues above are still the daily bread of technical SEO, but a few areas have materially changed in recent years due to Google's evolving systems, crawling behavior, and SERP layouts. Use this section as a modern checklist to prioritize the fixes that tend to have the biggest practical impact today.
1. Titles and descriptions: still important, but not fully under your control
What changed: search engines (especially Google) may rewrite titles and meta descriptions in snippets more often than they used to.
Best practice:
- Treat title tags as a strong suggestion, not a guarantee. Make them accurately reflect the on-page content and the primary intent.
- Keep titles unique, avoid boilerplate duplication, and use real page-specific context (product name, category, location, etc.).
- Write meta descriptions for CTR and clarity, even if they're sometimes rewritten; good descriptions still help when they are shown and keep your listings consistent across other platforms.
2. Structured data: validate more than “it exists”
What changed: rich result eligibility has become stricter over time, and some schema types are limited to certain contexts. Incorrect markup is increasingly ignored rather than partially helpful.
Best practice:
- Focus on schema that maps to your content and is eligible for rich results (e.g., Product, Review snippets where allowed, Article, FAQ only if supported and eligible for your site type, Breadcrumb, Organization).
- Ensure markup matches the visible content (no “invisible” reviews, pricing, or FAQs).
- Validate in Google's Rich Results Test and watch the Search Console enhancement reports.
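Before using an external validator, you can at least confirm that a page's JSON-LD blocks parse and declare a `@type`. The stdlib sketch below does that; the sample markup in the usage is invented, and real validation should still go through the Rich Results Test.

```python
import json
from html.parser import HTMLParser

class JsonLdCollector(HTMLParser):
    """Collect the raw contents of application/ld+json script blocks."""
    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.raw_blocks = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("type") == "application/ld+json":
            self._in_jsonld = True
            self.raw_blocks.append("")

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld:
            self.raw_blocks[-1] += data

def structured_data_types(html):
    """Return the @type of each JSON-LD block; None for blocks that don't parse."""
    collector = JsonLdCollector()
    collector.feed(html)
    types = []
    for raw in collector.raw_blocks:
        try:
            types.append(json.loads(raw).get("@type"))
        except (ValueError, AttributeError):
            types.append(None)
    return types
```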
3. HTTPS is table stakes; mixed signals are the real risk
What changed: HTTPS itself is no longer a differentiator; incorrect migrations and mixed versions are still common ranking and canonicalization problems.
Best practice:
- Enforce one canonical version (https plus your preferred host) with consistent internal links, canonicals, hreflang, sitemaps, and redirects.
- Avoid redirect chains and make sure HTTP to HTTPS is one hop.
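The one-hop rule is easy to verify if you model your redirects as a map from source to target URL. The sketch below counts hops and catches loops; the URLs are invented for illustration.

```python
def count_hops(url, redirect_map, limit=10):
    """Follow redirects from url; return the hop count, or None on a loop/overrun."""
    hops = 0
    seen = {url}
    while url in redirect_map:
        url = redirect_map[url]
        hops += 1
        if url in seen or hops > limit:
            return None  # redirect loop or excessively long chain
        seen.add(url)
    return hops

redirects = {
    "http://example.com/": "https://example.com/",          # good: one hop
    "https://example.com/old": "https://example.com/interim",
    "https://example.com/interim": "https://example.com/new",  # chain: two hops
}
```

Anything returning more than 1 for an HTTP entry point is worth collapsing into a single redirect.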
4. Redirects and canonicals: consolidation quality matters more than ever
What changed: with more large sites and more duplicated or parameterized URLs, search engines rely heavily on canonical and redirect consistency to decide what gets indexed.
Best practice:
- Use redirects for retired URLs and canonicals for near-duplicates; don't mix signals (e.g., canonical to A but redirect to B).
- Keep sitemaps clean: only canonical, indexable, final URLs.
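The "clean sitemap" rule can be expressed as a filter: keep only URLs that are indexable, self-canonical, and not redirected. The page records below are invented for illustration.

```python
def sitemap_urls(pages, redirect_map):
    """Filter page records down to sitemap-worthy URLs.

    pages: list of dicts with 'url', 'indexable', and 'canonical' keys.
    redirect_map: dict of source URL -> redirect target.
    """
    keep = []
    for page in pages:
        if not page["indexable"]:
            continue  # noindexed pages don't belong in the sitemap
        if page["canonical"] != page["url"]:
            continue  # non-canonical variants should be excluded
        if page["url"] in redirect_map:
            continue  # redirected URLs aren't final
        keep.append(page["url"])
    return keep

pages = [
    {"url": "/a", "indexable": True, "canonical": "/a"},
    {"url": "/a?sort=price", "indexable": True, "canonical": "/a"},
    {"url": "/old", "indexable": True, "canonical": "/old"},
    {"url": "/private", "indexable": False, "canonical": "/private"},
]
```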
5. Indexing: “crawlable” doesn't mean “index-worthy”
What changed: modern systems are better at not indexing pages that seem thin, duplicative, or unhelpful, even if they're technically accessible.
Best practice:
- Treat indexing as a quality pipeline: consolidate duplicates, improve thin pages, and prune low-value URLs (parameters, internal search pages, near-empty tag pages).
- Monitor Search Console's indexing reports for patterns (e.g., “Discovered – currently not indexed”, “Crawled – currently not indexed”) and pair fixes with content and intent improvements.
6. Core Web Vitals: speed is now “experience consistency”, not just load time
What changed: Google's page experience signals emphasize measurable, user-centric performance (Core Web Vitals). The bigger shift is operational: teams now need ongoing performance monitoring, not one-time optimization.
Best practice:
- Optimize the biggest contributors: images (modern formats plus proper sizing), JavaScript bloat, render-blocking CSS, and third-party scripts.
- Measure with real-world data (CrUX, where available) in addition to lab tests.
- Don't sacrifice functionality, but avoid heavy UI elements that delay interactivity or cause layout shifts.
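For monitoring, it helps to bucket field metrics the way Google's reports do. The thresholds below are Google's published good/needs-improvement boundaries for LCP, CLS, and INP at the time of writing; double-check current values in the CrUX documentation before relying on them.

```python
# (good, needs_improvement) boundaries; values above the second are "poor".
THRESHOLDS = {
    "lcp_ms": (2500, 4000),   # Largest Contentful Paint, milliseconds
    "cls": (0.1, 0.25),       # Cumulative Layout Shift, unitless
    "inp_ms": (200, 500),     # Interaction to Next Paint, milliseconds
}

def rate(metric, value):
    """Classify one metric value as good / needs improvement / poor."""
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= needs_improvement:
        return "needs improvement"
    return "poor"

def rate_page(metrics):
    """Classify every reported metric for a page."""
    return {name: rate(name, value) for name, value in metrics.items()}
```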
7. Mobile-first behavior is assumed; parity issues still hurt
What changed: mobile rendering and indexing have been the norm for a while; the practical issue today is content and structured data parity between mobile and desktop.
Best practice:
- Ensure the mobile version includes the same primary content, internal links, meta robots directives, hreflang, and structured data as desktop.
- Avoid “collapsed” or lazy-loaded content that crawlers cannot reliably render.
8. hreflang: still easy to break, and conflicts are common
What changed: nothing new conceptually, but the ecosystem has more JS frameworks, CDN edge rewrites, and parameter variants that can quietly break hreflang reciprocity.
Best practice:
- Make hreflang bidirectional, consistent with canonicals, and aligned with indexable URLs only.
- Use `x-default` where appropriate for language selectors or global pages.
- Audit at scale; hreflang problems are rarely isolated.
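The reciprocity rule can be checked mechanically: every URL listed as an hreflang alternate must list the referring URL back. The sketch below works over a crawled annotation map (URL to language-to-alternate mapping); the example data is invented.

```python
def missing_return_links(annotations):
    """Return (source, target) pairs where the target doesn't link back.

    annotations: dict mapping each page URL to its {lang: alternate URL} set.
    """
    problems = []
    for source, alternates in annotations.items():
        for target in alternates.values():
            if target == source:
                continue  # self-references are expected
            back_links = annotations.get(target, {}).values()
            if source not in back_links:
                problems.append((source, target))
    return problems

annotations = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "de": "https://example.com/de/"},
    "https://example.com/de/": {"de": "https://example.com/de/"},  # missing en link back
}
```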
9. Links: quality signals and internal architecture are the safer wins
What changed: external link evaluation is more nuanced, and spammy patterns are easier to discount. Meanwhile, internal linking remains one of the most controllable levers for discovery and prioritization.
Best practice:
- Invest in internal linking that reflects real hierarchy and user journeys (hubs, breadcrumbs, related items).
- Use descriptive anchor text naturally; avoid sitewide keyword stuffing.
- Keep orphan URLs to a minimum and ensure important pages are reachable within a reasonable click depth.
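Click depth and orphan pages both fall out of a breadth-first search over the internal link graph (page to pages it links to). A sketch with an invented graph:

```python
from collections import deque

def click_depths(links, start="/"):
    """Breadth-first search from the start page; returns page -> click depth."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

def orphans(links, start="/"):
    """Pages known to the crawl but never reached from the start page."""
    known = set(links) | {t for targets in links.values() for t in targets}
    return sorted(known - set(click_depths(links, start)))

links = {
    "/": ["/shoes", "/about"],
    "/shoes": ["/shoes/red"],
    "/orphaned-landing": ["/"],  # links out, but nothing links to it
}
```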
10. AI and generated content: the risk is about usefulness, not the method
What changed: large-scale content generation is more common, and search engines are better at detecting low-value pages produced at scale.
Best practice:
- If content is generated, apply strong editorial standards: add unique expertise, original data, real examples, clear ownership, and updates.
- Avoid mass-producing near-duplicate pages targeting slight keyword variations unless each page fulfills a distinct user intent.
11. AMP: no longer a default recommendation for most sites
What changed: AMP's role in visibility has declined significantly as standard mobile performance and Core Web Vitals became the focus.
Best practice:
- Treat AMP as optional; keep it only if it demonstrably benefits your workflow or performance and is maintained without errors.
- Focus first on improving the canonical (non-AMP) experience.




