As far as we know, Google has been indexing Ajax and .js files since November 2011, or even before, but that blog post is at least their official announcement.
It seemed the way to go, taking into account that many websites over the last five years have been using Ajax to retrieve dynamic content in tabs, navigation and other areas.
But we often face questions like these from our clients:
Will this content affect my ranking?
Will the links inside my Ajax content, or on other websites, pass PR to my site?
The big question is really something like:
Does Google use this content to autodiscover more URLs to crawl, or does it just take this dynamic HTML as part of the page that fetches it?
So after asking myself this, I've been reading quite a lot about the subject. SEOmoz's Can Google Really Access Content in JavaScript? states that, after some tests, HTML pushed to the client with document.write is only scanned when the call is made in the HTML itself and not in an external .js file. However, we have seen links inside external JavaScript files get autodiscovered.
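To make the two cases concrete, here is a minimal sketch of the external variant (the file name and URL are just placeholders I made up for illustration); the SEOmoz test says the same call is only scanned when it sits inline in the page's own HTML:

```ts
// tabs.js — an external file included via <script src="/js/tabs.js"></script>.
// Per the SEOmoz test, a link written from here is not scanned the way the
// same document.write call would be if placed inline in the page's HTML.
// Still, the hard-coded URL below is exactly the kind of string a crawler
// could autodiscover by reading the .js file itself.
document.write('<a href="/products/widget.html">Our widget</a>');
```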
So while my opinion is that external .js files can be used as an autodiscovery source, I don't think that links inside these external files pass any link juice or PR at all. But this is just an opinion, and the best thing would be to test it somehow with a couple of sites that use this type of content.
According to the SEOmoz article, Google takes the external Ajax content B, C and D as part of our main page A. That makes total sense: this content is part of the page, even though it may not be loaded initially but on a client request. So would links inside that Ajax content be as valid as a link to a site placed in my main HTML?
I believe so. But I can't be sure without running some tests, and that can take a bit of time.
In this same article, qwconsulting describes a method of getting Ajax content from an external page:
I used the out of the box approach on the AJAX tabs which has hyperlinks with an href going to an external page. The AJAX script loads in the content onto the page that contains the tabs instead of linking to that external page. In that scenario, the external pages were indexed separately.
And no: set up like this, the external content B, C and D doesn't add anything to your page, since those pages are indexed separately. I believe that if those pages are on the same domain they will still count as valid content weight for main page A.
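To picture that setup, here is a rough modern sketch of such tabs (the selectors and markup are hypothetical, and at the time this would typically have been done with jQuery rather than fetch): the tab links point at real external pages via href, and a script loads that page into the panel instead of navigating.

```ts
// Tab links point at real external pages, but a click is intercepted and the
// target page's HTML is injected into the tab panel instead of navigating.
const panel = document.querySelector<HTMLElement>('#tab-panel');

document.querySelectorAll<HTMLAnchorElement>('.ajax-tab a').forEach((link) => {
  link.addEventListener('click', async (event) => {
    event.preventDefault();                  // don't follow the link
    const response = await fetch(link.href); // pull the external page
    if (panel) {
      panel.innerHTML = await response.text(); // inject its HTML into the tab
    }
  });
});
```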
But all these theories have to be tested, and currently I don't own any interesting site with some PR and indexed pages to run a test on. So if you do, I propose:
- Point a link from your page A (PR 3+) to a new page B on a new domain with some unique content. Of course, we are talking about putting some good a href links to page B with the proper “this is my product / what I do” keywords.
- Wait some months, measure, and see if it gets indexed.
- Do the same, but this time within the domain itself. Use two domains for testing, A and B, related to the same niche product.
On A, put the relevant content in Ajax tabs served from the same domain. The content is not on the page initially; it comes into the page when some navigation is triggered (see the sketch after this list).
On page B, use similar content, but this time on the page itself, old-style HTML.
Use the same style: H1, bold text, lists, all proper web writing style following Google recommendations.
- Add analytics, wait some months, measure and see which did better.
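For the page-A side of that test, a minimal sketch of the Ajax tab could look like this (the element IDs and the /tabs/product-details.html URL are made up for illustration); page B would simply ship the same content in its initial HTML:

```ts
// Page-A variant: the tab content lives at a same-domain URL and is only
// fetched when the visitor opens the tab, so it is absent from the initial HTML.
const detailsTab = document.querySelector<HTMLElement>('#details-tab');
const detailsPanel = document.querySelector<HTMLElement>('#details-panel');

detailsTab?.addEventListener('click', async () => {
  if (detailsPanel && !detailsPanel.dataset.loaded) {
    const response = await fetch('/tabs/product-details.html'); // same domain
    detailsPanel.innerHTML = await response.text();
    detailsPanel.dataset.loaded = 'true'; // only fetch once
  }
});
```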
What's your opinion about it?
Do you have any sites with Ajax content from which you can draw some conclusions in Analytics?