
Why is My SEO not Working? 5 of the Most Common SEO Roadblocks

One of the most common things we hear during a website audit is, “Our SEO efforts in the past just haven’t yielded any results.” Building and designing a website and creating content only to see no rankings or clicks from search engines is frustrating for any business. In this article, we cover the most common roadblocks that could be rendering your SEO efforts completely in vain.

Why is my SEO not working? Poor strategy, penalties, tech issues, and more

There could be many reasons why your current SEO efforts and optimizations aren’t yielding results. The issue could be one of quality, value, or even something at a base technical level. The 5 most common roadblocks we see that impede all SEO results are:

1. Manual or algorithmic penalties from Google
2. Lack of trust or authority
3. Technical crawling issues
4. Pages not getting into Google’s index
5. Lack of an SEO strategy

Read on for more detail about these issues and actionable solutions to help diagnose and remedy them.

1. Manual or Algorithmic Penalties from Google

Manual actions, often referred to as “penalties” by SEO professionals, are actions that Google takes against websites that are clearly breaking one of the many guidelines laid out in its Search Essentials documentation. If Google has taken manual action against your website, you could be in real trouble. Websites with manual actions can be barred from ranking highly, or even removed from Google Search entirely. Some common manual penalties include:

  • Pure spam
  • Hacked site
  • Unnatural links (links that are obtained through manipulative or deceptive tactics, such as buying links, participating in link schemes, or using automated programs to create links)
  • Cloaking (serving different content at the same URL: one version to search engine bots and another to human visitors)
  • Doorway pages (pages built for search engines rather than human visitors, designed to rank for niche keywords and then funnel users to another page)

The examples listed above, as well as most other manual actions, are not something a website owner would do “by accident”. These tactics require a fairly deep knowledge of how to “game” Google and belong to a realm of SEO that tries to trick the algorithm into ranking a site higher. Unfortunately, a very common scenario is that a business worked with an SEO agency that used one or more of these tactics and left them high and dry. It’s also important to keep in mind that your website could have been hacked by spammers without your knowledge.

Algorithmic penalties, on the other hand, are a bit trickier to diagnose. Google’s algorithms are updated constantly, and most of the updates aren’t made public. The big ones, however, are announced in the Google Search Status Dashboard and are normally targeted at a specific aspect of websites, such as product reviews, helpful content, or page experience. If you are “hit” by one of these updates, it can lead to a significant or even total loss of traffic, and your website won’t recover until you fix the underlying issue.

How to Identify

Manual Penalties: Go to Google Search Console to see whether Google has taken a manual action against your site. Navigate to “Manual actions” under the “Security & Manual Actions” section in the left-hand menu. Any detected issues will be listed there, along with a description of each issue and the pages affected.

a manual penalty alert in Google Search Console for spammy, user-generated content.

Algorithmic Penalties: Google does not send a notification or alert for algorithmic penalties. The most common way to identify one starts with noting the date when your organic traffic declined, then matching that date against any known algorithm updates rolling out at the time. Use the Google Search Status Dashboard to see a list of all confirmed updates.

How to Remedy:

If a manual action has been taken against your website, you should prepare yourself for a long road to recovery. The remedy is to identify which behavior Google is flagging and remove all traces of it from your website. Then you can request a review of the issue through Google Search Console. The same idea applies to algorithmic penalties, except you don’t have the option to request a review; you simply have to wait until Google has re-crawled your new and improved website enough to climb out of the trenches.

2. Lack of Trust or Authority

Trust and authority are important factors for internet users for a variety of reasons. Website visitors want to know that you are a trusted voice in your industry, and not just any random person putting nonsense on the internet. And what’s important for users is generally also important for Google search algorithms. In the real world, you can hear about companies from word-of-mouth or physical ads, but Googlebot relies on mentions from across the web and links pointing back to your website.

Brand-new companies and websites naturally lack any sort of online presence, and therefore lack authority and trustworthiness. As a website gains popularity, it gets linked to and mentioned all over the web, and those links and mentions are a great starting point for understanding your trust and authority. If yours is a brand-new website, this is likely why your SEO efforts don’t seem to be working.

How to Identify:

One simple way to check your online presence is with a “site:” search in Google. Go to Google Search and type in [-site:www.yourdomain.com “your brand name”]. This directs Google to show all results for your brand name that are not on your own website. The goal is to see lots of mentions from social media sites, industry-related websites, directories, forums, and so on. If there are no results, a lack of presence could be a major roadblock for your website.

a site-colon search performed on Google Search

Checking backlinks to your website is another way to gauge your website’s trust and authority. A free way to do this is, again, through Google Search Console. On the left rail, you’ll see a tab labeled “Links” that gives you a quick overview of links to your website. Using tools like SERanking, SEOClarity, or Majestic to run a backlink analysis will give you much more helpful information, such as the “strength” of individual links.

The link report in Google Search Console

How to Remedy:

If you suspect a lack of authority or trustworthiness is what’s plaguing your website, you’ll want to focus on building up your online presence. This involves a mix of tactics such as:

  • Creating profiles and listings on all directories related to your industry (restaurants→Yelp, home services→Houzz, hotels→TripAdvisor)
  • If you’re a local business, find as many opportunities as you can to appear on local websites such as business directories, local news outlets, and online magazines.
  • Links are naturally gathered with time, but there are things you can do to increase the velocity of gained links. Creating content that people will want to share with their network, sponsoring events, and taking part in conferences and gatherings are all perfectly safe ways to earn backlinks.

Warning: never purchase or trade backlinks. This is strictly against Google guidelines and can result in a manual or algorithmic penalty. 

3. Technical Crawling Issues

Before you can rank in Google, you have to get into Google’s index. And before you can get into Google’s index, Google has to be able to access and crawl your website. This first, crucial step in the search process is often overlooked by website owners. If Google can’t access (or “crawl”) your website, then no matter how much other optimization you do, you still will not show up in a Google search.

It is also important to remember that Google crawls via links on your website. So if you have web pages that are either not linked to (“orphaned”), or are four, five, or more links away from your homepage, this makes it very difficult for Google to access those pages, and it lowers the probability of high ranking.
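To make the idea of link distance concrete, here is a minimal sketch (in Python, using a made-up link graph) of how a crawler measures crawl depth by following links outward from the homepage. Any page it never reaches is an orphan:

```python
from collections import deque

# Hypothetical link graph: each page maps to the pages it links to.
links = {
    "/": ["/services", "/about"],
    "/services": ["/services/seo"],
    "/services/seo": ["/case-study"],
    "/about": [],
    "/case-study": [],
    "/orphan-page": [],  # nothing links here
}

def crawl_depths(graph, start="/"):
    """Breadth-first walk from the homepage; returns each reachable page's depth."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = crawl_depths(links)
print(depths)  # "/orphan-page" is absent: it can't be reached by following links
```

In this toy site, “/case-study” sits three clicks from the homepage, and “/orphan-page” never appears at all, which is exactly the situation that makes pages hard for Googlebot to find.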

How to Identify:

There are a few ways to check whether your website can be crawled efficiently. First, you’ll want to check your robots.txt file. Robots.txt gives directives to web crawlers (like Googlebot) to “allow” or “disallow” crawling of your webpages. You can view a robots.txt file by typing [www.yourdomain.com/robots.txt] into your browser. If you see “Disallow: /” anywhere in the file, crawling may be blocked across your entire website for some or all web robots.

the robots.txt file for johnmu.com
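If you’d rather check programmatically, Python’s standard library can parse a robots.txt file for you. Here is a small sketch (the robots.txt contents below are made-up examples) that tests whether Googlebot is allowed to fetch a URL:

```python
from urllib.robotparser import RobotFileParser

def googlebot_allowed(robots_txt: str, url: str = "https://www.example.com/") -> bool:
    """Return True if this robots.txt permits Googlebot to crawl the URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

# Two hypothetical files: one blocks the whole site, one allows everything.
blocked = "User-agent: *\nDisallow: /\n"
open_all = "User-agent: *\nDisallow:\n"
print(googlebot_allowed(blocked))   # False: a blanket "Disallow: /" blocks all crawlers
print(googlebot_allowed(open_all))  # True: an empty Disallow permits crawling
```

In practice you would fetch your live file (www.yourdomain.com/robots.txt) and pass its contents in, but the parsing logic is the same.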

Another thing you’ll want to do is download a web crawler for yourself, such as Screaming Frog. This will give you the ability to mimic a web crawler like Googlebot and see if any crawling issues appear. Screaming Frog will also show you the “crawl depth” of all of your webpages.

crawl depth report generated by Screaming Frog

How to Remedy:

Since crawling issues are beyond the scope of the average business owner, working with a professional SEO and web development team to permit crawling and reorganize your folders and subfolders is your best bet for remedying current crawling issues and avoiding future ones.

4. Pages not Getting into Google’s Index

After a web page is crawled, Googlebot analyzes the page and has a decision to make: does this go into the index? You can think of Google’s index as an enormous library of web pages that Google has at its disposal to serve up in search results. A lot of websites we have worked with face this exact roadblock, and therefore have no ability to rank in Google Search.

Much like crawling, there is a way to intentionally instruct Google not to include a page in its index. This is done with a meta robots tag, and we have seen it accidentally implemented on many websites, resulting in a stalled SEO strategy.
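For illustration, the tag in question looks like <meta name="robots" content="noindex"> in a page’s head. Here is a rough sketch, using Python’s built-in HTML parser on a made-up page, of how you might scan a page’s HTML for it:

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Flags a page whose meta robots tag includes "noindex"."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

# Hypothetical page source that asks search engines to skip it.
html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
finder = NoindexFinder()
finder.feed(html)
print(finder.noindex)  # True: this page tells search engines not to index it
```

A quick scan like this over your key pages can catch a stray noindex before it quietly stalls your rankings.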

One of the major things Google analyzes when deciding whether to index a page is duplicate content. When Google finds pages on the internet with very similar or exactly duplicated content, it will choose only the one that is “most representative of the group” for the index. Indexing decisions also weigh overall page quality, including user experience, readability, content quality, site speed, mobile-friendliness, and more.

How to Identify:

Primarily, you want to ensure that the pages you want visible and ranking in Google are in the index. Google Search Console is once again your best bet for understanding your indexing situation. Under the “indexing” tab, you will see a complete overview of how many pages on your website have been indexed by Google.

Indexing report in Google Search Console

Furthermore, it will tell you the reasons certain pages are not indexed, whether that is an intentional “noindex” tag or a quality issue (which is normally the case with “Discovered – currently not indexed” and “Crawled – currently not indexed”).

Google Search Console showing why certain pages aren't indexed

How to Remedy:

First, understand why your pages aren’t getting indexed. If it’s a technical issue, like a meta robots tag, redirects, or pages not found, you can work with a developer to solve it. If you are seeing pages that have been discovered or crawled but not indexed, it is most likely a site-quality issue. Ask yourself:

  • Are these pages offering a good user experience? Site speed, intrusive pop-ups, and readable design all play a part in user experience.
  • Is there enough content on this page for Google to understand what the page is about?
  • Is the content on this page unique? If not, am I adding value that can’t be found elsewhere on the internet?
  • Do I have exact or near duplicates of this page on my own website?
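To illustrate the last two questions, here is a toy sketch (in Python, with invented page copy) of estimating how similar two pages’ text is. Pages that score very high are near-duplicates that may compete with each other for a single spot in the index:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough text-similarity ratio between 0.0 (unrelated) and 1.0 (identical)."""
    return SequenceMatcher(None, a, b).ratio()

# Two hypothetical service pages that differ only in the city name.
page_a = "We install, repair, and maintain residential HVAC systems in Springfield."
page_b = "We install, repair, and maintain residential HVAC systems in Shelbyville."
print(round(similarity(page_a, page_b), 2))  # high ratio: mostly duplicated copy
```

This is a crude measure (real duplicate detection is far more sophisticated), but it makes the point: boilerplate pages that swap out a single keyword look nearly identical to a search engine.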

5. Lack of an SEO Strategy

Many organizations and businesses that come to us have a limited SEO background and therefore lack a strategy from the start. SEO does not have an “on/off” button. Google is constantly crawling, updating its algorithms, and refining what it considers valuable to searchers. SEO is about understanding the fundamentals and keeping up with Google’s moving goalposts. One of the top reasons we see SEO efforts fail is a misunderstanding of the fundamentals of SEO or the use of outdated optimization tactics.

One of the best examples of this is a brand new website owner getting frustrated for not ranking for a term like “pastry cream”. This is a “head term”, and takes years of SEO efforts to rank for. Instead, brand new websites should target “long-tail” keywords initially, giving them the ability to gain visibility and leads from search relatively quickly. You can read more about content strategy and long-tail keywords in our Content Strategy Guide.

How to Identify:

Luckily, it’s pretty easy to identify a lack of a practical SEO strategy. You can ask yourself questions like:

  • What keywords and keyword groups am I targeting?
  • Which keywords are in “defend”, “improve”, and “expand” positions?
  • What KPIs and goals am I monitoring?
  • Who are my largest competitors and what am I doing on my website that goes above and beyond what they are doing?

If you don’t have specific answers to each of these questions, you likely don’t have an SEO strategy in place that will deliver results.

How to Remedy:

It takes time, effort, and investment to see leads pour in from organic search. Quick cheats and spammy hacks may yield short-lived SEO gains, but they are not sustainable and can do serious damage to your brand. An SEO strategy with realistic goals and a clear path to meet them is therefore essential if you are investing in SEO as part of your overall digital marketing strategy.

Reach out today to let the SEO experts at Atigro work with you to build a tailored SEO strategy geared towards success.
