Hello -
I was trying to create a sitemap for my website using a few different tools that crawl the site and process all the pages they find. I have been told on a number of occasions that crawling bots will, nine times out of ten, ignore any links that require javascript to be executed. That has held true with these sitemap tools: their crawlers cannot reach many pages on my site because the SmithCart code puts many links behind javascript.
Now I know that the Google and Bing bots are far more sophisticated than most, and that they often change how and what they crawl. This does, however, lead me to a few questions about the cart software's stance on SEO.
1. What is SmithCart's stance on category pages? I have noticed that many of the new styles for the category menu simply ignore the keyword URL and category description data. Are you moving away completely from having unique indexed category pages, in favor of just promoting product pages?
2. The Smith.CategoryMenu module renders href="javascript:__doPostBack" links for each of the categories. Have you confirmed that the major search engines can successfully follow these links and index all categories, and therefore the entire SmithCart installation? (See the sketch below for the kind of markup I mean.)
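For reference, here is a rough sketch of the two styles of link I am contrasting; the control ID and category name are made up for illustration, not copied from the module's actual output:

    <!-- Postback-style link: the target page only exists after javascript runs -->
    <a href="javascript:__doPostBack('dnn$ctr401$CategoryMenu$lnkCategory$0','')">Widgets</a>

    <!-- Plain anchor pointing at the category's keyword URL: crawlable with no javascript at all -->
    <a href="/Products/Widgets">Widgets</a>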
Thanks for answering these questions. I have heard from many sources that using javascript links for your main site navigation is a major SEO mistake. I wanted to hear directly from the source why you chose this approach, the rationale behind the choice, and whether your own research has shown otherwise.
Alex