Hello, I was wondering if there is a solution to the problem search spiders have with full-AJAX sites; for example, Googlebot does not seem to index AJAX sites very well. Is there a workaround for this?
Set cookies, track sessions and log actions as much as you like, as long as the content itself isn't dependent on them. Search engines will have no trouble indexing your content.

[B]Dos[/B]
Do use AJAX to save content
When a user enters information in a form field and hits the save button, you can use AJAX as much as you like. Search engines will never push the save button, so they remain unaware of the AJAX behind it.
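A minimal sketch of such an AJAX save, assuming a hypothetical `/save` endpoint and `fetch()` support; the field names are placeholders:

```javascript
// Serialize form fields into a URL-encoded body for the POST request.
function serialize(fields) {
  return Object.entries(fields)
    .map(([k, v]) => encodeURIComponent(k) + "=" + encodeURIComponent(v))
    .join("&");
}

// Save via AJAX; the "/save" endpoint is an assumption for illustration.
// Crawlers never trigger this, so indexing is unaffected.
function saveContent(fields) {
  return fetch("/save", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: serialize(fields),
  });
}
```

Wired to the form's save button, this replaces a full page reload without hiding any indexable content.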
Do use AJAX to do form field validation
When validating form fields, you can use AJAX to check the input without disturbing search engines. Search engines do not fill out forms, so this won't be a problem.
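As a sketch, a client-side check can be combined with an AJAX round-trip to the server; the `/validate` endpoint and its `{ valid: boolean }` response shape are assumptions:

```javascript
// Simple client-side check; not a full RFC 5322 email validator.
function looksLikeEmail(value) {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value);
}

// Server-side validation over AJAX, fired on blur or as the user types.
// The "/validate" endpoint is hypothetical; crawlers never trigger this.
function validateRemote(field, value) {
  return fetch("/validate?field=" + encodeURIComponent(field) +
               "&value=" + encodeURIComponent(value))
    .then((res) => res.json());
}
```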
Do use AJAX to display status messages
Displaying status messages of any kind in response to user actions is no problem for search engines: they do not execute the JavaScript involved, and status messages are not content worth indexing anyway.
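A tiny sketch of this pattern; the `status` element id and CSS class names are assumptions:

```javascript
// Build a status message fragment after a user action. Crawlers don't
// run this JavaScript, so the message never reaches the index -- which
// is fine, since it isn't content worth indexing.
function statusHtml(kind, text) {
  return '<div class="status ' + kind + '">' + text + '</div>';
}

// Browser usage (element id "status" is an assumption):
// document.getElementById("status").innerHTML = statusHtml("ok", "Saved!");
```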
[B]Don'ts[/B]
Don't use AJAX for displaying static text content
By static content I mean the main text content of a page, not incidental information like the number of currently active sessions. The main text content of a page is the single most important thing for search engines, so never use AJAX to load it.
Don't use AJAX for paging a table or list
If the table is filled with numbers that have no relevance to search engines, you can skip this point. But if your table or list contains book reviews, chances are you want them indexed. If your paging is AJAX-only, search engines will index nothing beyond the first page of the table.
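The usual fix is progressive enhancement: give every pager link a real href that crawlers can follow, and let JavaScript intercept the click to load the same URL via AJAX. A sketch, where the `a.pager` selector, the `#list` container and the URL scheme are all assumptions:

```javascript
// Build one URL that serves both audiences: crawlers follow it in the
// href, JavaScript fetches the same URL over AJAX.
function pageUrl(base, page) {
  return base + "?page=" + page;
}

if (typeof document !== "undefined") {      // browser-only enhancement
  document.querySelectorAll("a.pager").forEach(function (link) {
    link.addEventListener("click", function (event) {
      event.preventDefault();               // stop the full page reload
      fetch(link.href)                      // same URL a crawler would fetch
        .then(function (res) { return res.text(); })
        .then(function (html) {
          document.getElementById("list").innerHTML = html;
        });
    });
  });
}
```

With JavaScript disabled (or for a crawler), the links simply work as ordinary paging links, so every page gets indexed.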
Don't use AJAX for navigational purposes
This is not AJAX specific; the same rule applies to plain JavaScript links as well. Search engines don't follow JavaScript links, so they get stuck on the entry page and leave again without indexing the rest of your site.
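The same progressive-enhancement idea applies to navigation: give every link a real href and layer the JavaScript on top. In this hypothetical markup, `loadPage()` stands in for whatever AJAX loader the site uses:

```html
<!-- Crawlers cannot follow this link at all: -->
<a href="javascript:loadPage('about')">About</a>

<!-- Crawler-friendly: a real href, with the AJAX loader layered on top.
     If JavaScript runs, loadPage() handles it; if not, the href works. -->
<a href="/about" onclick="loadPage('about'); return false;">About</a>
```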