There’s nothing more frustrating than spending countless sleepless nights building a massive website for your business, only for Google to bury it 20, 30 or 50 pages deep in its results. Common sense should tell you that very few people, if any, will ever click through that many pages. If Google isn’t showing your site any rank love, you first need to find out why. Some webmasters constantly point the finger at Google, but the problem typically lies with ‘fixable’ off-site and/or on-site elements.
A study published by SearchEngineWatch.com found that Google traffic drops off by a staggering 95% at the second page alone! Now imagine just how much traffic your website loses if it’s stuck 20 or more pages deep in the results. The same study also reveals that the site in the first organic position receives 32.5% of a particular keyword’s traffic, while the second position receives 17.6%. The bottom line is that webmasters need to constantly fight their way up through the rankings to claim the biggest piece of the traffic pie.
Reason #1) It Contains Too Many Broken Links
Does your site contain an abundance of broken links? In an effort to maintain a positive overall user experience, Google factors this into its ranking algorithm, placing sites with too many broken links further down in the rankings. Broken internal links are more problematic than broken external links, so focus your efforts on fixing them first.
When a visitor clicks on a link, they expect to land on the site or page described by the anchor text. If they’re greeted by a 404, 500 or any other error code instead, they’ll probably leave with a bad taste in their mouth.
The good news is that identifying and fixing broken links is a relatively easy and painless process (assuming your site doesn’t contain a massive database of 10,000+ pages). BrokenLinkCheck.com is one helpful tool for identifying broken links, or you can use the W3 Validator Tool, both of which are completely free. Try to get into the habit of checking your site for broken links on a regular basis to prevent any loss in ranking.
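If you’d rather script the check yourself, here’s a rough sketch using only Python’s standard library. The HTML snippet, URLs and function names below are invented for illustration; a real checker would also need crawl delays and per-domain throttling.

```python
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_link(url, timeout=10):
    """Return the HTTP status code for url, or None if the request fails entirely."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code        # e.g. 404, 500
    except URLError:
        return None          # DNS failure, timeout, refused connection, etc.
```

Feed each page’s HTML into `LinkExtractor`, then run `check_link` over the collected URLs; anything that comes back as a 4xx/5xx code or `None` goes on your fix list.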
Reason #2) It Contains Too Much Duplicate Content
A second possible reason why Google could be knocking your site out of the rankings is because it contains too much duplicate content. Google wants fresh, original content displayed in their search results, and they’ll most certainly factor this into their ranking algorithm.
Let’s first go over the definition of ‘duplicate content,’ as the term often confuses webmasters. In short, duplicate content is any content that appears at multiple web addresses (URLs). The content can live on multiple pages within your own site, or it can be found on other domains. While preventing 100% of duplicate content is nearly impossible, you should always be aware of your site’s structure and where your content lives.
Reason #3) It has a High Bounce Rate
Are you keeping track of your site’s bounce rate? If not, you should be, especially with all of the recent algorithm updates Google has been rolling out. Allowing your website to suffer from a high bounce rate essentially tells Google that you aren’t providing your visitors with whatever they’re looking for. And if you’re building your website for search engines rather than the end user, you’ll probably experience a drop in rankings.
If you’ve never heard of bounce rate before, here’s a quick explanation: bounce rate is the percentage of visitors who exit a website without clicking through to another page. A high bounce rate can be caused by coding errors, incompatibility with certain web browsers, too many ads, poor site design, or a general lack of engagement between the visitor and the website.
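As a quick worked example of the math (the session counts here are made up, not from any real analytics account):

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Bounce rate: the share of sessions that viewed only one page, as a percentage."""
    if total_sessions == 0:
        return 0.0
    return 100.0 * single_page_sessions / total_sessions

# Hypothetical month: 1,000 sessions, 450 of which left without a second pageview.
print(bounce_rate(450, 1000))  # 45.0
```

Tools like Google Analytics compute this for you, but knowing the formula makes it easier to sanity-check what your dashboard is reporting.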
Reason #4) Backlink Building: You’re Doing It Wrong
With Google rolling out new algorithm updates like Panda, Penguin and the latest Hummingbird, some old-school webmasters need to rethink their backlink building strategies. In the past, you could get away with cranking out hundreds or even thousands of irrelevant, low-quality backlinks. In fact, this technique could easily rank even the most basic landing page. Now, however, using the same tactics will likely trigger a filter in Google’s algorithm, sending your site tumbling down the rankings.
Today, webmasters need to take a more natural approach towards building backlinks for their sites. Focus on building quality content that people actually want to share and link back to. When this happens, you’ll notice your backlinks gradually increasing from relevant sites.
Reason #5) Google Bots Aren’t Allowed To Crawl It
A fifth possible reason why your website isn’t ranking well is that Googlebot isn’t allowed to crawl it. WordPress, for instance, has an option to ‘discourage search engines from indexing this site.’ If you’re running a WordPress blog or website, log in as the admin and click on the ‘Settings’ tab on the left, followed by ‘Reading.’ At the bottom, make sure the box for discouraging site indexing is unchecked.
If you’re running a static HTML website, open up your robots.txt file to ensure Googlebot is allowed to crawl and index your site.
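You can sanity-check your robots.txt rules without waiting for Google to re-crawl by using Python’s built-in `urllib.robotparser`. The rules and URLs below are an invented example, not your actual file:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks one directory for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/admin/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
```

If a page you want ranked comes back `False`, loosen the matching `Disallow` rule; and watch out for a blanket `Disallow: /`, which blocks your entire site.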