28th Sep 2016
With Further’s first ecommerce seminar fast approaching we decided to write a post about common SEO mistakes on ecommerce websites.
All too often we see ecommerce casualties crawl to our doors needing serious medical attention. So let’s get out the SEO medikit and patch these issues up.
Seeing triple: duplicate product URLs
This is a common variety of SEO illness, especially on “old” ecommerce platforms – the common cold as it were.
Product output pages exist on multiple URLs. This is usually caused by a poorly developed integration of product categories.
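For example, the same product might be reachable via several category paths (hypothetical URLs, following the example domain used later in this post):

```text
your-ecom-domain.co.uk/mens-shoes/red-trainers.php
your-ecom-domain.co.uk/sale/red-trainers.php
your-ecom-domain.co.uk/new-in/red-trainers.php
```

Three URLs, one product – and search engines may treat each as a separate, duplicate page.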
Seeing triple by Foto43
The cure will depend on what ecommerce solution you’re running. Some ecommerce solutions are custom builds, while others are based on off-the-shelf software with little scope for configuration.
Ideally a product will live at a single, constant URL – that way, if the product is referenced from multiple categories, there won’t be any issues with duplicate URLs.
“Prevention is better than cure”
Ideally the ecommerce setup should not cause this in the first place. If it’s possible to modify the platform, or have this developed, then that is the best course of action.
While quick fixes can be put in place they will not solve the underlying problems. Understandably custom builds are costly and having to go back to your web developer and request changes (at extra cost) is a bitter pill to swallow.
A rel canonical meta tag can be added as a “hint” to search engines as to which product URL is the master.
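As a minimal sketch, the tag goes in the `<head>` of every duplicate page and points at the master URL (the URL here is a hypothetical example, not from any real site):

```html
<!-- Placed in the <head> of every duplicate product page.
     The href is the single "master" URL you want indexed. -->
<link rel="canonical" href="https://your-ecom-domain.co.uk/mens-shoes/red-trainers.php" />
```

Remember that, like the parameter settings discussed below, rel canonical is treated by search engines as a hint rather than a directive.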
Query parameters: Know your limits
Query parameters such as filters can cause a serious case of indigestion, with search engines crawling every filter combination and swelling their data-hungry bellies.
your-ecom-domain.co.uk/category-url.php?page=1&brandid=34&sort=high&price=100-500&colour=red
While you might think it sounds like a good idea for search engines to index every combination, it’s not. While the search engine is crawling duplicate pages, wouldn’t you rather it was keeping your more important pages up to date?
Once again this comes back to your ecommerce platform. One method, albeit a temporary fix, is to block crawler access via the robots.txt file. Careful use of robots.txt wildcards allows you to block access to certain query strings.
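A rough sketch of such wildcard rules, using the parameter names from the example URL above (adjust the parameter names to match your own platform, and test before deploying – an over-broad rule can block real pages):

```text
User-agent: *
# Block any URL whose query string contains these filter/sort parameters
Disallow: /*?*brandid=
Disallow: /*?*sort=
Disallow: /*?*price=
Disallow: /*?*colour=
```

The `*` wildcard matches any sequence of characters, so `/*?*brandid=` catches the parameter wherever it appears in the query string.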
Google recommends not using robots.txt to block crawler access to duplicate content, although this advice is aimed more at preventing incorrect usage.
Google Webmaster Tools has its own parameter handling tool, which lets you choose whether Googlebot should “Ignore” certain parameters. As with rel canonical, these requests are treated as suggestions rather than directives.
A case of identity crisis.
Most manufacturers supply a stock description for their products. These are provided in feeds, making them easy to add to your product database. However, this means competitors are using them too, which makes your product pages look just like your competition’s.
Don’t look like a clone, Dolly – Image credit: qyphon
If you have any spare in-house resources, then get your team to re-write your product descriptions. Not only will this make your descriptions unique, it also gives you the opportunity to add a more personal writing style to your site. This helps with user experience.
Writing product descriptions is a time-consuming process, and if you don’t have the resources to update your duplicated descriptions then you can find affordable content-writing services. This will enable your products to stand out in the search crowd.
Don’t assume your ecommerce solution is optimal for search. SEO is a continual process of improving your site – even large brands continue to develop their websites to ensure they are keeping up with the competition and best practices.