What is semalt.com?

Lately I’ve been seeing entries from the website semalt.com in my blog’s referral traffic, with referrers like competitors_review.php. It’s funny that someone sees this one-article-per-week blog as a “competitor”; I think only my father, my brother and a dozen other people read it per day.

Well, back to the point: after reading about semalt.com, I think it’s not suspicious at all; it’s just another SERP/SEO analysis tool like Sistrix.
I’ve read a fairly comprehensive description of what they do on their blog.

So, in my humble opinion, all these warnings around the internet probably exist because, at the time of testing, they hit some blogs with their spider and didn’t have a proper About page.

I registered for their 7-day trial and I’m still testing it; maybe at the end of this week I will write something about them. Or I will just drop them an email and interview them.

SEO Indexing Other than HTML (js, pdf, txt, rss) – Essay & Research

As far as we know, Google has been indexing Ajax and .js files since November 2011, or even earlier, but that blog post is at least their official announcement.
It seemed the way to go, taking into account that many websites over the last 5 years have been using Ajax to retrieve dynamic content in tabs, navigation and other areas.

[Image: google_spider]

But we often face these questions from our clients:
Will this content affect my ranking?
Will the links inside my Ajax content or other websites pass PageRank to my site?
The big question is really something like:
Does Google use this content to autodiscover more URLs to crawl, or does it just take this dynamic HTML as part of the page that fetches it?

So, after asking myself this, I’ve been reading quite a lot about the subject. SEOmoz’s “Can Google really access content in Javascript?” states that, after some tests, the document.write method of pushing HTML on the client side is only scanned when the call is made in the same HTML document and not in an external .js file. However, we have seen links inside external JavaScript files that have been autodiscovered.
So, while my opinion is that external .js files can be used as an autodiscovery source, I don’t think that links inside these external files pass any link juice or PageRank at all. But this is just an opinion, and the best thing would be to test it somehow with a couple of sites that use this type of content.
According to the SEOmoz article, Google takes the external Ajax content B, C and D as part of our main page A. That makes total sense: this content is part of the page, even though it may not be loaded initially but only on a client request. So would links inside that Ajax content be as valid as a link placed in my main HTML?
I believe so, but I cannot be sure without running some tests, and that can take a bit of time.
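To make the two cases concrete, here is a minimal sketch of my own (the file name and URLs are purely illustrative, not from the SEOmoz tests): the same document.write call, once inline in the page and once in an external file, which is exactly the distinction their tests draw.

```typescript
// Case 1: inline <script> in the HTML page itself.
// According to the SEOmoz tests, links written this way do get scanned.
document.write('<a href="http://example.com/product">my product</a>');

// Case 2: the same call, but living in an external file such as /js/footer-links.js,
// included via <script src="/js/footer-links.js"></script>.
// The URL may be autodiscovered from the file, but whether the link passes
// any PageRank is precisely the open question discussed above.
export function writeFooterLink(): void {
  document.write('<a href="http://example.com/other-article">related article</a>');
}
```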
In the same article, qwconsulting tries a method of getting Ajax content from an external page:

I used the out of the box approach on the AJAX tabs which has hyperlinks with an href going to an external page. The AJAX script loads in the content onto the page that contains the tabs instead of linking to that external page. In that scenario, the external pages were indexed separately.

So no: in that scenario the external content B, C and D doesn’t add anything to your page; those pages are indexed separately. I believe that if these pages are on the same domain, they will count as valid content weight for main page A.
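For reference, this is roughly what that “out of the box” tab setup looks like. It is a sketch of my own; the selectors and markup are assumed rather than taken from the article.

```typescript
// Tabs whose anchors point to real external pages (B, C, D), but whose content
// is fetched and injected into page A instead of navigating away.
function initAjaxTabs(container: HTMLElement): void {
  container.querySelectorAll<HTMLAnchorElement>("a.tab").forEach((tab) => {
    tab.addEventListener("click", async (event) => {
      event.preventDefault();                      // stay on page A
      const response = await fetch(tab.href);      // href = the external page B, C or D
      const panel = container.querySelector<HTMLElement>(".tab-panel");
      if (panel) {
        panel.innerHTML = await response.text();   // content shown inside page A
      }
    });
  });
}
```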

But all these theories have to be tested, and currently I do not own any interesting site with some PR and indexed pages to run a test on. So if you do, I propose:

  1. Point a link from your page A (PR 3+) to a new page B on a new domain with some unique content. Of course, we are talking about putting some good a href links to page B with the proper “this is my product / what I do” keywords.
  2. Wait some months, measure, and see if it gets indexed.
  1. Do the same, but this time within the same domain. Use two domains for testing, A & B, related to the same niche product.
    On A, put the relevant content in Ajax tabs loaded from the same domain (see the sketch after this list). The content is not on the page initially; it comes into the page when some navigation is triggered.
    On page B, use similar content, but this time on the page, in old-style HTML.
    On both, use the same style: H1, bold text, lists, all proper web writing style following Google’s recommendations.
  2. Add analytics, wait some months, measure and see which one did better.
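A rough sketch of how site A in this test could load its tab content (the fragment path and element id are invented for the example); site B would simply ship the equivalent content in its initial HTML, with no script at all.

```typescript
// Site A: the content lives in a same-domain fragment and only enters the DOM
// when the visitor triggers the tab navigation.
async function loadTab(tabId: string): Promise<void> {
  const response = await fetch(`/fragments/${tabId}.html`); // same domain as page A
  const target = document.getElementById("tab-content");
  if (target) {
    target.innerHTML = await response.text();
  }
}

// Site B: no JavaScript needed; the same H1, bold text and lists are served
// directly in the page's static HTML.
```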

What’s your opinion about it?
Do you have any sites with Ajax content from which you can draw some conclusions in Analytics?

Google and the link networks

I’ve been thinking about writing this post for a long time, and it has been sitting in my drafts as an unpublished entry for some days now; it’s not easy to explain exactly how I feel about this.
This would not be the first link network targeted by Google. Google has penalized link networks several times over the years, most recently the BuildMyRank.com link network.
But after reading what Seoroundtable published about www.sape.ru, and other entries about similar networks, I thought I’d write my opinion about it.

Matt Cutts @mattcutts: @dannysullivan @seocom just another day at the office for me. :) Okay, gotta look at some really naughty Russian link selling software now.
9:27 PM – 13 Feb 13

Actually, some people who know the Russian market commented:

Actually we don’t care so much about Google, for Russians Yandex is the king. If Google wants to kill his Russian branch – no problem, nobody cares

It certainly bugs me that there are “link marketplaces” out there manipulating the search results by selling links. Some questions and thoughts arise:

  • Why are they caring about this, especially in a country where they have around 30% of the search market?
    (Yandex seems to be the search king of Russia)
  • They have known about this for so long, but suddenly they decide to take action?
  • I’m really surprised that professional SEOs are still using this kind of link marketplace. There are valid ways to get links organically and avoid doing something stupid like this, which can damage your client’s reputation and your professionalism as an SEO.

And another thought that came to me after this is that Google used to be a search engine but now looks like a giant marquee for its own advertising and products. Try searching for “Hotel in Berlin” or any popular service and you will see it for yourself.
Google in 2013:
[Screenshot: google-2013]

And this is how a search for “Hotels New York” looked just a couple of months ago, when they got into the hotels business:
[Screenshot: hotel-new-york-2012]

Thanks! At least after the Google Hotel Finder box we could still see some natural results.
Now this is how we see the same search in March 2013:
[Screenshot: hotel-new-york-2013]

So what I don’t get is the double message: fighting so hard to keep their search rankings clean while, at the same time, putting their whole advertising marquee in front of your face, almost stopping all of us from seeing the organic results.
Why are the organic results so important if you have to scroll just to see them in the first place?

Keep the conversation coming:
Did Google Just Penalize Another Link Network? SAPE Links
SAPE Links Penalized by Google? – Sape employee blog post

Here is the reason why we will use Internet Explorer: Google keywords ‘not provided’

Lately there has been quite a revolution in the SEO world. It started a long time ago, but it was amplified by the events of the last two weeks, when Raven announced that they stopped scraping search data, and it continues to this day as everyone realizes the following fact:

Each day we will work more blindly in the SEO World

The reason is that Google is pushing Secure Sockets Layer (SSL, or https) quite hard, so this traffic is starting to appear under the “(not provided)” keyword in Analytics. So what are the keywords that users typed into Google to reach our site in the search results?

 Only Google knows

The fact is that the (not provided) traffic will keep on growing. First, because any user logged in on any Google device will have their searches encrypted and counted in the (not provided) bucket. And secondly, because Firefox and other browsers decided to encrypt all searches done on Google, regardless of whether you are logged into a Google account or not.
After the 10th of December update, Google Chrome will have the same encryption as Firefox, which means you will find a closed door when you look for keyword data in any web analytics tool.

To put it simply, the (not provided) traffic segment will keep on growing, giving us less and less information about the search keywords that brought users to our websites.
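To illustrate the mechanism with a small sketch of my own (the function and example referrers below are invented, not taken from any analytics product): the keyword normally travels in the q parameter of the Google referrer, and encrypted search simply stops sending it, so the tools can only report (not provided).

```typescript
// Extract the search keyword from a Google referrer, falling back to the
// dreaded "(not provided)" when the q parameter is missing.
function keywordFromReferrer(referrer: string): string {
  try {
    return new URL(referrer).searchParams.get("q") ?? "(not provided)";
  } catch {
    return "(not provided)";
  }
}

// Plain http search still exposes the query:
keywordFromReferrer("http://www.google.com/search?q=hotels+new+york"); // "hotels new york"

// Encrypted (https / logged-in) search sends a bare referrer:
keywordFromReferrer("https://www.google.com/"); // "(not provided)"
```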

So what will Search Engine Optimization experts do when they are forced to work almost blind?

We will do what we’ve done with every obstacle in life: we will adapt and survive (Darwin said it very well).

There is one exception where the keywords are revealed: when you use those keywords for advertising in Google AdWords. Then they have to provide the information.

And there is much more to SEO than tracking search engine results. So if you are a fan of tracking all the keywords and making all those little adjustments to your content and titles to bring in more traffic, I personally suggest one thing: enjoy it while it lasts, because after this you will have to learn a whole new way to do it. And maybe focus more on original content or social engagement. Or maybe just combine it with SEM and make Google earn a few more bucks.

Google has been trying to make it very clear for a long time, saying “ranking should not be used as an SEO metric” (really?), and now it seems they are forcing their statement to be right, because there is no other choice:

The first thing you have to know is that if you make your living from Google, your business depends on Google, and if you don’t play by Google’s rules… you are going to be thrown out of the game.

Related posts that were an inspiration for writing this one:
Google Chrome not provided