Perhaps this is an interesting question to ponder. I have a retail site with about 3000 items for sale, each stored as a row in a table. I have a page that, when called, shows all the details about a specific item.
Now, I have all of my items for purchase in a special XML file that the googlebot opens, scans, and catalogs, so it can index every item for sale. This is a great way for Google to index all the items for searches. An XML snippet looks like this:
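(The original snippet didn't come through; for reference, a representative entry might look like this. This is only an illustrative sketch assuming the standard Google Sitemap layout, with a hypothetical domain and item ID, not the actual file:)

```
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/item.fwx?id=12345</loc>
  </url>
  <!-- ...one <url> entry per item, about 3000 in all... -->
</urlset>
```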
Thus each of the 3000 items in this XML file has its own <URL>-</URL> entry. The problem I am having is that somewhere the googlebot encounters an error (or errors) that gets logged in the FoxWeb error pages. However, the error pages do not tell me which item IDs are causing the errors.
So is there a way to identify when the googlebot is scanning my site, and is there a way to log the item IDs that fail during the scan?
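One approach I've been considering, in case it helps frame the question: at the top of the item page, check the User-Agent for "Googlebot" and append the requested ID to my own log file before any of the item lookup runs, so whichever ID triggers the error is the last one logged. A rough sketch, assuming FoxWeb's ASP-style Request object and a query parameter named "id" (both assumptions on my part):

```
* Top of the item page script - hedged sketch, not tested
lcAgent = Request.ServerVariables("HTTP_USER_AGENT")
lcId    = Request.QueryString("id")

IF "Googlebot" $ lcAgent
   * Append a timestamped line per crawled item; if the page then
   * errors out, the last line in crawl.log is the offending ID
   STRTOFILE(DTOC(DATE()) + " " + TIME() + " id=" + lcId + ;
             CHR(13) + CHR(10), "crawl.log", .T.)
ENDIF
```

Does something along these lines sound workable, or is there a cleaner way to get the failing IDs out of FoxWeb itself?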
Any help would be appreciated.