Jan 17, 2011

Cracking the Google Code… Under the GoogleScope

Google’s sweeping changes confirm the search giant has launched
an all-out assault against artificial link inflation & declared
war on search engine spam in a continuing effort to provide the
best search service in the world… and if you thought you had
cracked the Google Code and had Google all figured out… guess
again.

Google has raised the bar against search engine spam and
artificial link inflation to unrivaled heights with the filing
of United States Patent Application 20050071741 on December 31,
2003. On March 31, 2005, it became available online for the
first time.

The filing unquestionably provides SEOs with valuable insight
into Google’s tightly guarded search intelligence and confirms
that Google’s information retrieval is based on historical data.

What exactly do these changes mean to you? Your credibility and
reputation online are going under the Googlescope! Google’s
patent abstract reads as follows:

A system identifies a document and obtains one or more types of
history data associated with the document. The system may
generate a score for the document based, at least in part, on
the one or more types of history data.

Google’s patent specification reveals a significant amount of
information both old and new about the possible ways Google can
(and likely does) use your web page updates to determine the
ranking of your site in the SERPs.

Unfortunately, the patent filing does not prioritize or
conclusively confirm any specific method one way or the other.

Here’s how Google scores your web pages.

In addition to evaluating and scoring web page content, the
ranking of web pages is admittedly still influenced by the
frequency of page or site updates. What’s new and interesting is
what Google takes into account in determining the freshness of a
web page.

For example, if a stale page continues to procure incoming
links, it will still be considered fresh, even if the page’s
Last-Modified header (which reports when the file was most
recently changed) hasn’t been updated and the content itself
remains stale.
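
If you’re curious what date a page is currently reporting, you can
read that header yourself. Here’s a minimal sketch using Python’s
standard library; the URL is only a placeholder, and many
dynamically generated pages don’t send the header at all.

# Minimal sketch: read a page's Last-Modified header with a HEAD request.
# The URL is a placeholder; swap in the page you want to check.
from urllib.request import Request, urlopen

url = "http://www.example.com/"
request = Request(url, method="HEAD")

with urlopen(request) as response:
    last_modified = response.headers.get("Last-Modified")

# Many dynamically generated pages omit the header entirely.
print(last_modified or "No Last-Modified header returned")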

According to the patent filing, Google records and scores the
following web page changes to determine freshness:

·The frequency of all web page changes
·The actual amount of the change itself… whether it is a substantial change or merely redundant and superfluous
·Changes in keyword distribution or density
·The actual number of new web pages that link to a web page
·The change or update of anchor text (the text that is used to link to a web page)
·The number of new links to low-trust web sites (for example, a domain may be considered low trust for having too many affiliate links on one web page)

Although there is no specific number of links indicated in the
patent, it might be advisable to limit affiliate links on new web
pages. Caution should also be used in linking to pages with
multiple affiliate links.
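
The patent doesn’t publish a formula, but conceptually the signals
listed above could be blended into a single freshness score. The
weights and signal names in this sketch are purely my own
illustrative assumptions, not anything Google has disclosed.

# Purely illustrative sketch of blending the freshness signals listed above
# into one score. The weights and signal names are assumptions; the patent
# does not disclose an actual formula.

def freshness_score(signals):
    # signals: dict of values normalized to the 0..1 range
    weights = {
        "update_frequency": 0.30,    # how often the page changes
        "change_magnitude": 0.25,    # how substantial each change is
        "new_inbound_links": 0.25,   # new pages linking to this page
        "anchor_text_changes": 0.10, # updates to inbound anchor text
        "low_trust_links": -0.20,    # links to low-trust sites count against you
    }
    return sum(weights[name] * signals.get(name, 0.0) for name in weights)

example = {
    "update_frequency": 0.8,
    "change_magnitude": 0.4,
    "new_inbound_links": 0.6,
    "anchor_text_changes": 0.2,
    "low_trust_links": 0.5,
}
print(round(freshness_score(example), 3))  # prints 0.41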

Developing your web pages for page freshness.

Now I’m not suggesting that it’s always beneficial or advisable
to change the content of your web pages regularly, but it is
very important to keep your pages fresh, and that may not
necessarily mean a content change.

Google states that decayed or stale results might be desirable
for information that doesn’t necessarily need updating, while
fresh content is good for results that require it.

How do you unravel that statement and differentiate between the
two types of content?

An excellent example of this methodology is the roller coaster
ride seasonal results might experience in Google’s SERPs based
on the actual season of the year.

A page related to winter clothing may rank higher in the winter
than the summer… and the geographical area the end user is
searching from will now likely be considered and factored into
the search results.

Likewise, specific vacation destinations might rank higher in
the SERPs in certain geographic regions during specific seasons
of the year. Google can monitor and score pages by recording
click through rate changes by season.

Google is no stranger to fighting spam and is taking serious new
measures to crack down on offenders like never before.

Section 0128 of Google’s patent filing suggests that you
shouldn’t change the focus of multiple pages at once.

Here’s a quote from their rationale:

“A significant change over time in the set of topics associated
with a document may indicate that the document has changed
owners and previous document indicators, such as score, anchor
text, etc., are no longer reliable.

Similarly, a spike in the number of topics could indicate spam.
For example, if a particular document is associated with a set
of one or more topics over what may be considered a ‘stable’
period of time and then a (sudden) spike occurs in the number of
topics associated with the document, this may be an indication
that the document has been taken over as a ‘doorway’ document.

Another indication may include the sudden disappearance of the
original topics associated with the document. If one or more of
these situations are detected, then [Google] may reduce the
relative score of such documents and/or the links, anchor text,
or other data associated with the document.”

Unfortunately, this means that Google’s sandbox phenomenon
and/or the aging delay may apply to your web site if you change
too many of your web pages at once.

From the case studies I’ve conducted, it’s more likely the rule
than the exception.

What does all this mean to you?

Keep your pages themed, relevant and most importantly
consistent. You have to establish reliability! The days of
spamming Google are drawing to an end.

If you require multi-page content changes, implement the changes
in segments over time. Continue to use your original keywords on
each page you change to maintain theme consistency.

You can easily make significant content changes by implementing
lateral keywords to support and reinforce your vertical
keyword(s) and phrases. This will also help eliminate keyword
stuffing.

Make sure you determine if the keywords you’re using require
static or fresh search results and update your web site content
accordingly. On this point RSS feeds may play a more valuable
and strategic role than ever before in keeping pages fresh and
at the top of the SERPs.

The bottom line here is that webmasters must look ahead, plan and
manage their domains more tightly than ever before or risk
plummeting in the SERPs.

Does Google use your domain name to determine the ranking of
your site?

Google’s patent references specific types of ‘information
relating to how a document is hosted within a computer network’
that can directly influence the ranking of a specific web site.
This is Google’s way of determining the legitimacy of your
domain name.

Therefore, the credibility of your host has never been more
important to ranking well in Google’s SERPs.

Google states that it may check name server information in
multiple ways.

Bad name servers might host known spam sites, adult and/or
doorway domains. If you’re hosted on a known bad name server
your rankings will undoubtedly suffer… if you’re not blacklisted
entirely.

What I found particularly interesting is the criteria Google may
consider in determining the value of a domain or identifying it
as a spam domain. According to the patent, Google may now record
the following information:

·The length of the domain registration… is it greater than one year or less than one year?
·The address of the web site owner… possibly for returning more relevant local search results and attaching accountability to the domain.
·The admin and technical contact info… this info is often changed several times or completely falsified on spam domains; again, this check is for consistency!
·The stability of your host and their IP range… is your IP range associated with spam?

Google’s rationale for domain registration is based on the
premise that valuable domains are often secured many years in
advance while domains used for spam are rarely secured for more
than a year.
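
If you want to see how your own domain looks on this criterion,
checking its registration dates is easy to script. The sketch below
assumes the third-party python-whois package; field availability
varies by registry, so treat it as a rough check rather than a
verdict.

# Rough check of a domain's registration window, assuming the third-party
# python-whois package (pip install python-whois). Registries differ in what
# they return, so treat missing values as "unknown" rather than "bad".
import whois

record = whois.whois("example.com")  # placeholder domain

created = record.creation_date
expires = record.expiration_date

# Some registries return lists of dates; keep the earliest and latest.
if isinstance(created, list):
    created = min(created)
if isinstance(expires, list):
    expires = max(expires)

if created and expires:
    years_registered = (expires - created).days / 365.0
    print("Registered for roughly %.1f years" % years_registered)
else:
    print("Registration dates not available for this registry")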

If in doubt about a host’s integrity, I recommend checking their
mail server at www.dnsstuff.com to see if they’re in the spam
database. Watch for red flags!

If your mail server is listed, you may have a problem ranking
well in Google!
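
dnsstuff.com does this through a web form, but the same kind of
blacklist lookup can be scripted directly against a public DNS
blocklist. The sketch below uses the standard DNSBL convention of
reversing the IP’s octets and querying them against Spamhaus’s zen
list; the IP shown is only a documentation placeholder, and note
that some blocklists refuse queries coming from large public DNS
resolvers.

# Minimal DNSBL lookup using the standard convention: reverse the IP's
# octets and query them as a subdomain of the blocklist zone. A successful
# resolution means the IP is listed; a lookup failure means it is not.
import socket

def is_listed(ip, blocklist="zen.spamhaus.org"):
    reversed_ip = ".".join(reversed(ip.split(".")))
    query = "%s.%s" % (reversed_ip, blocklist)
    try:
        socket.gethostbyname(query)
        return True
    except socket.gaierror:
        return False

print(is_listed("203.0.113.7"))  # placeholder IP from a documentation range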

Securing a reputable host can and will go a long way in
promoting your web site to Google.

The simplest strategy may be registering your domain several
years in advance with a reputable provider thereby demonstrating
longevity and accountability to Google. Google wants to see that
you’re serious about your site and not a flash in the pan spam
shop.


Google’s aging delay has teeth… and it’s taking a bite out of
spam!

It’s no big secret that Google relies heavily on links when it
comes to ranking web sites.

According to their patent filing, Google may record the
discovery date of a link and link changes over time.

In addition to the volume, quality & anchor text of links,
Google’s patent illustrates possible ways Google might use
historical information to further determine the value of links.

For example, the life span of a link and the speed at which a
new web site gets links.

“Burst link growth may be a strong indicator of search engine
spam”.

This is the first concrete evidence that Google may penalize
sites for rapid link acquisition. Whether the “burst growth”
rule applies to high-trust/authoritative sites and directory
listings remains unknown. I personally haven’t experienced this
phenomenon. What is certain, though, is the inevitable end of
results-oriented link farming.

I would point out here that regardless of whether burst link
growth will be tolerated for authoritative sites or authoritative
link acquisition, webmasters will have to get smarter and work
harder to secure authoritative links as their counterparts become
reluctant to exchange links with low-trust sites. Now PageRank
really has value!

Relevant content swaps may be a nice alternative to the standard
link exchange and allow you some control of the link page
elements.

So what else does Google consider in determining the aging delay?

·The anchor text and the discovery date of links are recorded, thus establishing the countdown period of the aging delay.
·Links with a long-term life span may be more valuable than links with a short life span.
·The appearance and disappearance of links over time.
·Growth rates of links, as well as the link growth of independent peer pages. Again, this suggests that rapid link acquisition and the quality of peer pages are monitored.
·Anchor text over a given period of time, for keyword consistency.
·Inbound links from fresh pages… these might be considered more important than links from stale pages.
·Google doesn’t expect new web sites to have a large number of links, so purchasing large numbers of brokered links will likely hurt you more than help you. Google indicates that it is better for link growth to remain constant and naturally paced. In addition, the anchor text should be varied as much as possible.
·New web sites should not acquire too many new links; it may be tolerated if the links are from trusted sites, but otherwise it may be considered spam.
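
The patent doesn’t define what counts as a “burst”, but the idea is
easy to illustrate: compare each period’s new-link count against the
site’s own recent baseline. In the sketch below, the window size,
multiplier and weekly counts are all arbitrary assumptions chosen for
illustration.

# Illustrative burst detector for weekly new-link counts. The multiplier and
# window are arbitrary assumptions; the patent gives no actual threshold.

def find_link_bursts(weekly_new_links, multiplier=3.0, window=4):
    bursts = []
    for i in range(window, len(weekly_new_links)):
        baseline = sum(weekly_new_links[i - window:i]) / float(window)
        if baseline > 0 and weekly_new_links[i] > multiplier * baseline:
            bursts.append(i)  # week index where growth looks unnatural
    return bursts

history = [5, 6, 4, 7, 6, 5, 120, 8, 6]  # made-up weekly counts
print(find_link_bursts(history))         # the 120-link week stands out: [6]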

So how do you build your link popularity / Page Rank and avoid
penalties?

When it comes to linking, you should clearly avoid the hocus
pocus or magic bullet linking schemes. If you participate in
quick fix link exchange scams, use automated link exchange
software or buy hundreds of links at once, chances are Google
will interpret your efforts as a spam attempt and act
accordingly.

Don’t get caught in this trap… the recovery period could be
substantial since your host and IP range are also considered!

When you exchange links with other web sites, do it slowly and
consistently.

Develop a link management and maintenance program. Schedule
regular times every week to build the links to your site and
vary the anchor text that points to your site.

Obviously, the links to your site should utilize your keywords.
To avoid repetition, use lateral keywords and keyword phrases in
the anchor text, since Google wants to see varied anchor text!

Your site’s click-through rate may now be monitored through
bookmarks, cache, favorites, and temporary files.

It’s no big secret that Google has always been suspected of
rewarding sites with higher click-through rates (very similar to
what Google does with their AdWords program), so it shouldn’t
come as a great surprise that Google still considers site
stickiness and CTR tracking in their criteria.

What’s interesting, though, is that Google is interested in
tracking the behavior of web surfers through bookmarks, cache,
favorites, and temporary files (most likely with the Google
Toolbar and/or the Google Desktop Search tool). Google’s patent
filing indicates Google might track the following information:

·Click-through rates, monitored for seasonal changes, fast increases or other traffic spikes, in addition to increasing or decreasing trends.
·The volume of searches over time, recorded and monitored for increases.
·Information regarding a web page’s rankings, recorded and monitored for changes.
·Click-through rates, monitored to find out whether stale or fresh web pages are preferred for a search query.
·The traffic to a web page, recorded and monitored for changes… like Alexa.
·User behavior, which may be monitored through bookmarks, cache, favorites, and temporary files.
·Bookmarks and favorites, which could be monitored for both additions and deletions.
·Overall user behavior, monitored for trends and changes.
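
Seasonality is the easiest of these signals to picture in code:
compare a query’s current click-through rate against the same month
a year earlier rather than against last week. The monthly figures
and the threshold in this sketch are invented purely for
illustration.

# Invented monthly CTR figures for a seasonal query (think "winter coats").
# Comparing against the same month last year separates seasonal swings from
# genuine trend changes.

last_year = [0.09, 0.08, 0.05, 0.03, 0.02, 0.02, 0.02, 0.02, 0.04, 0.06, 0.08, 0.10]
this_year = [0.10, 0.08, 0.05, 0.03, 0.02, 0.02, 0.02, 0.03, 0.04, 0.07, 0.12, 0.11]

for month in range(12):
    year_over_year = this_year[month] - last_year[month]
    if abs(year_over_year) > 0.02:  # arbitrary threshold for a "real" shift
        print("Month %d moved %+.2f beyond normal seasonality" % (month + 1, year_over_year))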

Since Google is capable of tracking the click-through rates to
your web site, you should make sure that your web pages have
attractive titles and utilize calls to action so that web
surfers click on them in the search results.

It’s also important to keep your visitors around, so make your
web pages interesting enough that web surfers spend some time on
your web site. It might also help if your web site visitors added
your web site to their bookmarks.

As you can see, Google’s new ranking criteria have evolved far
beyond reliance on criteria that can be readily or easily
manipulated. One thing is for certain with Google: whatever
direction search innovation is heading, you can trust Google to
be pioneering the way and setting new standards.
