Hey all,
First post here; I did search for similar posts but couldn't find a good answer. Let me start off by saying I'm the person who writes product descriptions for our Magento ecommerce site, and I have very little technical know-how. Nonetheless, I've been tasked with solving this problem:
Robots are eating up all our bandwidth. In Nov 2016 we went way over our bandwidth allotment and finally figured out it was robots (Googlebot specifically). I disallowed robots in robots.txt, but then the site became hard for customers to find in search. So I allowed robots again and slowed Google's crawl rate to the slowest it's allowed to go. But robots are still eating up our bandwidth: we're at 83% used with 15 more days still to go in the month. We get 75 GB from Nexcess. I need to fix this!
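In case it matters, the blocking I did was the standard block-everything robots.txt (I'm going from memory here, so treat this as approximate):

    User-agent: *
    Disallow: /

and to let robots back in I changed "Disallow: /" to just "Disallow:" with nothing after it, which as far as I understand means allow everything.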
– Is 75 GB of bandwidth not enough? It's a lingerie store with online shopping.
– Searching Google for our site says we have 289,000 pages (we had 348,000 before I flushed the cache through Magento). Why do we have so many pages? Is this the problem? Similar sites don't have nearly so many. What have I done wrong? Does this have to do with canonical URLs (something I don't quite understand)? We only have 10,000 simple products. (See my guess below.)
– Google Webmaster Tools indicates our sitemap is not working. Would getting a proper sitemap fix the problem?
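My guess on the page count, and please correct me if I'm off base: a lot of those indexed pages seem to be our category pages with filter parameters tacked on, something like

    /bras.html?color=52&size=7&price=20-30

(that URL is just a made-up example, not our real one). From what I've read, Magento's layered navigation can produce a separate URL for every filter combination, and a canonical tag in the page head pointing back at the plain category page, like

    <link rel="canonical" href="https://www.example.com/bras.html" />

(example.com being a placeholder), is supposed to tell Google those are all the same page. Is that the sort of thing I should be looking into?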
What am I doing wrong, and how can I fix it? The business cannot afford to pay bandwidth overage charges again like we did in the fall.
Sorry if this is a duplicate / total moron question.
Thanks
A