
Google's November Shuffle - Update


This last update was very interesting. It was nice to see a real dance again; it has been too long! However, some of the new filters had a seriously negative effect on many top-ranked sites. There are many theories going around, but none of them hold up once you review all the evidence.

 

I have learned (or, at this point, I should say I believe) that less importance is given to links within a certain 'distance' of the linked page. This could be called a proximity set, a neighborhood, or a linked vicinity. How this distance is determined I am not sure; I am looking into IP discrimination, but I do not think that is it.

 

What this all means is that a site with many internal pages will suffer: the internal pages will no longer contribute back to, say, the home page. It also seems that 'on-page' characteristics are weighted more heavily than they were before... I think this is a good thing.

 

What does this all mean? Well, for those of you who took a serious fall: hang tough. Whenever Google installs filters that alter the SERPs in a drastic way, they 'modify' them and come back to a more reasonable level of 'change'. Therefore, I would make minimal changes (and spend minimal money) over the next 30-45 days.

 

In the meantime, review your online characteristics (these have been discussed in many threads here at TCH). Work on getting more backlinks, and focus on backlinks from a more diverse population (i.e. more unrelatedness).

 

I will offer more when I learn more.


So basically what you're saying is that pages within a website that link to the home page (like "links.html" having a link back to the home page) aren't being given quite as much emphasis as they were before?

 

And now Google is measuring links by "how far away" they are from the website?

 

I don't mind these changes. Heck, they make sense anyway!

So basically what you're saying is that pages within a website that link to the home page (like "links.html" having a link back to the home page) aren't being given quite as much emphasis as they were before?
Yes.

 

And now Google is measuring links by "how far away" they are from the website?

Well, I think it is more 'how close' a backlink is rather than 'how far away'. Backlinks that reside within a certain 'distance' will contribute less. Now we must determine what we mean by 'distance'.

 

I remember reading a research paper a while ago on this subject. It discussed 'closeness' with regard to IP address. This obviously concerned me when one considers virtual hosting. Bill and I had more than one conversation on the matter. I wanted to understand what the octets of an IP address represent and how they relate to one another. From our multiple conversations (I am a bit slow ;) ), we concluded that it would be very difficult to establish a true 'neighborhood' via IP address.

 

One could postulate a whois inquiry (though I think that would take too much processing). It also would not make sense, because an early trend (and one that continues today) is to have the web developer register the domain. Would one developer then constitute a 'neighborhood'? That would be crazy.

 

I think it must be more of a complex analysis of link structure (i.e. the number of inter-related nodes within a certain subset). On to more research...
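To make the idea concrete, here is one way a "number of inter-related nodes within a subset" could be measured. This is purely a sketch of my hunch; the graph, the site names, and the metric are all invented for illustration, not anything Google has published.

```python
def internal_link_density(links, subset):
    """Fraction of the subset's outbound links that stay inside the subset.
    A high value would suggest a tightly inter-linked 'neighborhood'."""
    internal = external = 0
    for page in subset:
        for target in links.get(page, []):
            if target in subset:
                internal += 1
            else:
                external += 1
    total = internal + external
    return internal / total if total else 0.0

# Hypothetical cluster: three sites that mostly link among themselves.
graph = {
    "a.example": ["b.example", "c.example"],
    "b.example": ["a.example", "c.example"],
    "c.example": ["a.example", "b.example", "outside.example"],
}
density = internal_link_density(graph, {"a.example", "b.example", "c.example"})
# 6 of the cluster's 7 outbound links stay inside it, so density is about 0.86
```

A filter could then discount links exchanged within any subset whose density exceeds some threshold.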


I meant to add one more change Google has made to its searches: stemming.

 

From their 'The Basics of Google Search' page:

 

Word Variations (Stemming)

Google now uses stemming technology. Thus, when appropriate, it will search not only for your search terms, but also for words that are similar to some or all of those terms. If you search for "pet lemur dietary needs", Google will also search for "pet lemur diet needs", and other related variations of your terms. Any variants of your terms that were searched for will be highlighted in the snippet of text accompanying each result.

 

 

This too affects the SERPs. More importantly, it means that you can use a more diverse vocabulary and still get the benefit of a certain keyword density.
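As a toy illustration of what stemming does: Google's actual algorithm is unpublished, and real systems use something like the Porter stemmer; the crude rules below are a drastic simplification I made up just to show the "games"/"gaming" case collapsing to one root.

```python
def toy_stem(word):
    """Very crude suffix-stripping stemmer, for illustration only."""
    if word.endswith("ies"):
        return word[:-3] + "y"
    if word.endswith("ing") and len(word) > 5:
        stem = word[:-3]
        # heuristically restore a trailing 'e' ('gam' -> 'game')
        if stem[-1] not in "aeiou" and stem[-2] in "aeiou":
            stem += "e"
        return stem
    if word.endswith("s") and not word.endswith("ss"):
        return word[:-1]
    return word

toy_stem("games")   # -> "game"
toy_stem("gaming")  # -> "game"
```

Both variants map to the same root, which is why a page could rank for a phrase it never spells out exactly.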


A couple things...

 

1) Stemming. So this means, in regards to my website, "games" and "gaming" would be stemmed words? So I don't need to include both "games" and "gaming" in my keywords because they stem from the same root? Whoa... I'll have to wait for more results on this.

 

2) I have noticed some interesting PR results on my website. My main page is 6. My subsections (save for 3) are all 5. Then any link in each subsection is 4, and succeeding links lose one more each per click (save for a few exceptions).

 

Basically, with every "click" you have to take from my main page (PR 6) to get to a certain page on my site, you lose a PR of 1 per click. So if you have to click two links to get to Page X from my main page, Page X will (generally) have a PR of 4.
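The "one PR per click" pattern is really a statement about click depth, i.e. the shortest link distance from the home page. A breadth-first search computes it; the little site map here is made up for illustration, not taken from OMGN.

```python
from collections import deque

def click_depth(links, start):
    """Shortest number of clicks from `start` to every reachable page (BFS)."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

site = {
    "index.html":   ["games.html", "vote.php"],
    "games.html":   ["reviews.html"],
    "reviews.html": [],
    "vote.php":     [],
}
click_depth(site, "index.html")
# {'index.html': 0, 'games.html': 1, 'vote.php': 1, 'reviews.html': 2}
```

Under the pattern described above, a PR 6 home page would give depth-1 pages PR 5 and depth-2 pages PR 4.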

 

I imagine this may be what "distance" is referring to. If it is, then the distance formula needs more work, because my "Vote" page has a PR of 4, and I know there are a lot of websites out there that link directly to it and not my home page.

1) Stemming. So this means, in regards to my website, "games" and "gaming" would be stemmed words?
Correct.

 

2) I have noticed some interesting PR results on my website.

What you are seeing is nothing 'new'; this is how PR gets transferred. Your home page is 'feeding' all your internal pages. The actual PR that is transferred depends on the number of outbound links on the 'feeding' page. In a simple structure, one sees the linear top-to-bottom relationship you have described.

 

This is not the 'distance' that we were discussing above. Remember that PR can be 'gained' or 'given'. In your point 2), we are discussing PR being passed down through a site from an index page.

 

The 'distance' issue we discussed before was with regard to internal pages feeding PR to the index page (i.e. the other direction). Now if your internal pages only have PR from the home page, then obviously they cannot contribute any real significance back. However, if internal pages gain PR from 'other' sources, then they can indeed feed that PR back to the home page. What happens now (I think) in large sites is that only a certain number (i.e. a threshold) of internal pages (or maybe even 'neighborhood' pages) are considered when recalculating PR for a given page.
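This 'feeding' in both directions falls straight out of the published PageRank formula: each page divides its rank among its outbound links, damped by a factor (0.85 in the original paper). Here is a minimal power-iteration sketch on an invented three-page site; it is the textbook algorithm, not Google's production version.

```python
def pagerank(links, damping=0.85, iters=50):
    """Classic PageRank by power iteration.
    `links` maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            if outs:
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new[target] += share
        rank = new
    return rank

# Home feeds two internal pages; both link back, feeding PR home again.
site = {
    "home":     ["section1", "section2"],
    "section1": ["home"],
    "section2": ["home"],
}
ranks = pagerank(site)
# 'home' ends up with the highest rank; the two sections split the rest evenly
```

If a filter capped how many internal pages may feed a given page, the back-contribution in this loop would shrink, which is exactly the effect being speculated about above.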

 

As to your 'vote' page... looking.

 

If you are calling vote.php your 'vote' page, you only have one backlink to it. Thus your observation matches what one would expect.


Thanks for the update, Scott. Your insight is the most valuable I've seen and I appreciate what you share with us. (Especially since you could and do get real money for the same kind of info.)


Scott,

Your take is one that I haven't seen anywhere else. As you mentioned, there are lots of theories.

 

Given your theory, how would you explain that several sites (some of mine included) didn't drop in rankings but rather dropped out of the Google index entirely?

 

These are sites without spammy BS, PR of 5 or better, and some have thousands of pages of content... real content.

 

Why impose a filter that drops high content sites without spammed out pages?


Well, my vote.php page has only one backlink to it within my own website, however, there are other non-OMGN websites that link directly to the vote page, bypassing my home page.

Well, my vote.php page has only one backlink to it within my own website, however, there are other non-OMGN websites that link directly to the vote page, bypassing my home page.

DarqFlare:

 

From Google's point of view, you have only one relevant backlink (internal or external) to your vote page. You may have more pages linking to it, but they do not meet the PR threshold that Google uses to determine link 'relevancy' (as it relates to a real backlink). You can check this by using link:http://www.yoursite.com/whatever.html on Google.

Why impose a filter that drops high content sites without spammed out pages?
Well, obviously you would not want to. However, remember that we are talking about a huge (beyond huge) database, and when you apply the same filter(s) across the entire data set, unfortunate consequences will arise. This is where fine-tuning of the filter(s) comes into play; I have already seen alterations occur (i.e. lost sites coming back).

 

Given your theory, how would you explain that several sites (some of mine included) didn't drop in rankings but rather dropped out of the Google index entirely?

Well, if it is based on a complex analysis of link structure, it would be logical that some sites would be hit and others not. The logical next step would be to find the similarity of 'structure' among the sites that were hit. How close is 'too close'? To add to the complexity, I believe that PR scores can trigger or dodge certain filters. It seems definitely best to have a PR of 6, or better yet 7, to avoid some of these filters.

 

I should add that the 'leading' theory now regarding this filter (and its inconsistency across searches) is that it discriminates via the actual search phrase. Some have proposed (with mounting evidence) that 'paid word phrases' are being discriminated against. These would be phrases used in the Pay Per Click (PPC) arena. So, Google looks at its AdWords list and picks a certain number of the popular phrases to trigger the new filter. As I said, there is some mounting evidence that this is occurring; however, I am not convinced that this is 'it' (as in the sole trigger/filter). I honestly do not like postulating at this point; it is just too new for 'talk'. More time needs to be spent looking at and comparing real results, looking for real clues, and then disproving any hunches... one or two will hold at the end of the day (or in this case, the next 30 or so days).

 

Sorry I am not giving any 'real' information... I wish I could. :P


Even hypothesizing is helpful!

 

So you think the level for a filter-dodging PR is about 6 or 7? I better get my site up to 7 then, haha!

