Top 5 Technical Issues That Impact Your Organic Search

August 26th, 2015 by DaBrian Marketing Group

Optimizing your site to perform well in organic search is not a set-it-and-forget-it process. There are many factors that determine your site’s positioning in the search engines. In fact, backlinko.com reports (and lists them for you) that Google uses over 200 different ranking factors! While many business owners tend to focus on visible on-page components such as content and keywords (which are very important), several technical issues often go unnoticed because they are not visible to end users. The issues listed below greatly impact your business’s organic search results and should be a top priority when creating and maintaining your site.

(Image: “Organic Search,” from http://seattleorganicseo.com/)

1. Navigation

Before your website appears in a search engine, it must be crawled, and then indexed. During this process, search engines will collect information about your site’s content. Just as users prefer sites that are easy to navigate, so do search engines.

If you wish to perform better in search engines, it would greatly benefit your business to create and maintain a sitemap. A sitemap is a file or webpage that lists all the pages on your site. Search engines evaluate sitemaps to understand the structure of the site, and they do so regularly, so it is a best practice to update your sitemap whenever new pages are created.
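As an illustration, a minimal XML sitemap is just a structured list of URLs; the domain, paths, and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to discover -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-08-26</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2015-08-01</lastmod>
  </url>
</urlset>
```

You can then submit the sitemap’s URL to the search engines (for Google, through its webmaster tools) so they know where to find it.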

Your navigation bar is not just for the user, either; search engines factor it into their organic search rankings. While (quality) links are always good for your site, not all links are created equal in a search engine’s eyes. Links placed in your navigation bar are deemed more important, so your navigation bar should only contain the links that are most valuable to your business: the ones that generate the most traffic and lead to more sales and conversions. Many website owners make the mistake of placing too many links in their navigation. This not only worsens the user experience (which worsens your position in organic search), but it also dilutes the authority and value of each link.

Descriptive anchor text, the text that appears highlighted in a hypertext link, is another factor associated with navigation. Descriptive anchor text means linking to your pages using content-related words rather than something generic such as “click here” or “read more.” Generic links tell search engines that the page should be about clicking things, or reading. This appears spammy and can negatively impact your performance in organic search.
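In HTML, the difference looks like this (the URL is a placeholder):

```html
<!-- Generic anchor text: tells search engines nothing about the target page -->
<a href="https://www.example.com/seo-services/">Click here</a>

<!-- Descriptive anchor text: the link itself describes the page's topic -->
<a href="https://www.example.com/seo-services/">our SEO services</a>
```

Both links point to the same page, but only the second one tells users and search engines what that page is about.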

2. Site Speed

One of the 200+ factors that Google and other search engines take into account is site speed, or how fast your site finishes loading. Search engines have become focused on providing the best user experience for searchers, so to them a faster load time means a better experience and will correlate with an improvement in your rankings. Common culprits behind slow site speed include large pictures, Flash animation, bulky code, and external media. While some of these things may look nice, it would be wise to reduce or remove them in order to increase site speed. To evaluate your site’s speed, use Google’s PageSpeed Insights tool.
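Two common quick wins, as a sketch, are serving appropriately sized images and deferring non-critical scripts; the filenames below are placeholders:

```html
<!-- Serve a compressed, appropriately sized image instead of a full-resolution original -->
<img src="hero-800w.jpg" width="800" height="400" alt="Product photo">

<!-- Defer non-critical JavaScript so it doesn't block the page from rendering -->
<script src="analytics.js" defer></script>
```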

3. Mobile Capability

On April 21st of this year, Google released an algorithm change requiring sites to be mobile friendly if they wished to perform well in organic search on mobile phones. Before the update, a study from Adobe reported that 45% of businesses did not have a mobile-friendly site. To be mobile friendly, search engines such as Google recommend using a responsive design. This means that all of your site’s content remains the same for all users; it simply appears differently depending on the device the user is on. For more information about the update, please read my blog on Google’s algorithm change posted near the time of the release.
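In practice, a responsive design starts with a viewport meta tag plus CSS media queries that adapt the layout to the screen width. A minimal sketch (the class name and breakpoint are placeholders):

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* On narrow screens, stack the sidebar below the main content */
  @media (max-width: 600px) {
    .sidebar { width: 100%; float: none; }
  }
</style>
```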

4. Duplicate Content

Most business owners are aware that copying content from other sites is not only unethical, but will result in a ranking penalty from search engines. However, many are not aware of the internal duplicate content happening on their site right now. When a user visits a specific page, a URL is generated based upon the path they take. For example, a user may search Amazon looking for a cellphone; once there, they decide they want a smartphone, so they click on the smartphone section and then on their desired phone (let’s say an iPhone 6). Their URL will look something like this: Amazon.com/cellphone/smartphones/iphones/iphone6. Now let’s say another user knows exactly what they want. They go to Amazon and type “iPhone 6” into the search bar. Their URL will be shorter and look something like this: Amazon.com/iphone6. While this is the same page, it has two separate URLs, and search engines will treat them as two separate pages with duplicate content. Google lists solutions and explanations on how to deal with duplicate content.
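One common fix is a canonical tag placed in the `<head>` of each duplicate page, pointing search engines at the preferred URL. Reusing the hypothetical Amazon example above:

```html
<!-- Tells search engines which URL is the "official" version of this page -->
<link rel="canonical" href="https://www.amazon.com/iphone6">
```

With this tag in place, search engines consolidate the duplicate URLs and credit the canonical one in their rankings.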

5. Include a robots.txt File

A robots.txt file is used to direct search engines when crawling. It’s usually used to prevent search engines from crawling specific pages and files. So why would you want to prevent search engines from crawling any part of your site? Let’s say, for example, that you are in the process of updating your site. You’ve begun adding pages, but they aren’t complete yet. If a search engine were to crawl those pages, they would not perform well in organic search due to their lack of content, placing you in an uphill battle from the start. A similar issue arises when you are testing different layouts for a page: the variations have identical content, but you wish to see which layout users prefer. A robots.txt file will tell search engines not to crawl these pages until you are finished or have picked a layout. Another reason would be to prevent search engines from penalizing you for duplicate content. For example, if you have a print version of a page, you can tell search engines not to crawl it.
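A simple robots.txt file, placed at the root of your domain, might look like this (the paths are placeholders covering the examples above):

```
# Applies to all crawlers
User-agent: *
# Pages still under construction
Disallow: /under-construction/
# Print versions that duplicate existing pages
Disallow: /print/
```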

Learn more about robots.txt files.

 

Performing well in organic search is essential for a business that wishes to improve digitally. A study done by BrightEdge shows that the majority of web traffic (51%) comes from organic search. That is a huge opportunity to miss out on. While having great content may be the priority in some business owners’ minds, none of it means anything if your website has the technical issues listed above.

Contact Us today to make sure your site is technically sound to perform well in organic search!
 
By: David McDowell

DaBrian Marketing Group (DMG) is a full service digital marketing agency focused on providing innovative, strategic marketing solutions for businesses that want to obtain digital awareness, cultivate meaningful customer relationships, and gain insights to achieve their goals.


Also published on Medium.

