Meanwhile, for completeness, here are the links to the previous articles on the topic, in reverse chronological order:
– July 2007, Google, penalized or floating website?
– March 2007, Google: from dancer to surfer
But when did these fluctuations start?
The first events were recorded at the end of February 2007 and had run their course by 20 March 2007. The second wave began in the last days of June 2007 and does not yet seem to have stopped.
Let's look, point by point, at what the websites affected by this peculiar swinging behaviour in the Californian search engine's results have in common.
How can you tell whether a website is in a floating phase?
1) Unfortunately, the easiest way is also the most “painful”: simply look at the visits chart produced by your web-statistics software. On that chart it is easy to spot every sine wave the site has gone through. Take this example:
Here we can see visitor behaviour (and therefore, unfortunately, Google traffic) over the past 5 months: the site initially drops for 1 day, recovers the next day, then stays down for about 3 months, comes back up for 15 days, and finally enters the daily fluctuation phase that now characterizes it.
From this chart you can also see and quantify the frequency of the sine wave: it used to be long, up to a week before the PageRank export, but it is now tighter, down to a few days.
2) The pages of the site are properly indexed in Google; to check, run a search for site:nomesito.com.
3) The site is not penalized; to check, use the small queries listed in the earlier post (Understand if a site is penalized on Google).
4) The site no longer appears in Google's top positions (first 2 pages) for the keywords that brought it traffic before the fluctuation. For these keywords the site usually sits within the first 10-30 of the omitted results on the final page (around position 950).
5) Not all keywords have vanished, only the secondary and long-tail ones (the long tail being the combination/extension of multiple keywords: for example, a long-tail variant of “Rome hotels” could be “hotel in Rome near the station with a view of the Colosseum”).
6) For its historical or major keywords, the site does not seem to suffer “penalties” in the SERP (search results), apart from some “natural” drop of a few positions (5 or 6 at most). This is why the site loses about 90% of its Google-generated traffic rather than 100%.
7) The fluctuation applies neither to the site as a whole nor to a single page, but precisely to each keyword. Indeed, the same page may rank naturally for some keywords and fluctuate for others.
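The one-day dips and recoveries described in point 1 can be spotted programmatically. Here is a minimal sketch that flags them in a series of daily visit counts; the function name, the 50% threshold, and the sample numbers are my own assumptions for illustration, not data from any real site:

```python
# Hypothetical sketch: flag "floating" behaviour in a daily-visits series.
# The visit counts below are invented; in practice they would come from
# your web-statistics software.

def fluctuation_days(visits, drop_threshold=0.5):
    """Return indices of days where traffic falls by more than
    drop_threshold (e.g. 50%) versus the previous day and then
    recovers the day after -- the one-day dips described above."""
    flagged = []
    for i in range(1, len(visits) - 1):
        prev, cur, nxt = visits[i - 1], visits[i], visits[i + 1]
        dropped = cur < prev * (1 - drop_threshold)
        recovered = nxt > cur * (1 + drop_threshold)
        if dropped and recovered:
            flagged.append(i)
    return flagged

daily_visits = [900, 880, 120, 910, 870, 100, 890, 920, 95, 905]
print(fluctuation_days(daily_visits))  # -> [2, 5, 8]
```

A long fluctuation period would show up as sparse flagged days; the tighter daily oscillation described above would flag every second or third day.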
What is shown in the SERP instead of the site?
After some comparison and analysis with colleagues and friends around the web, I can say that in place of the “downgraded” sites, the search results usually propose rather peculiar entries, alongside sites that genuinely “merit” their ranking:
– Directory pages containing a backlink to the fluctuating site (even though it is the fluctuating site itself that is listed in the directory)
– Spam sites (spam engines, sites using black-hat techniques, etc.)
– Sites with duplicate content
– Sites with no valid content of interest to the end user (for example, an e-commerce site that offers similar products but not the one searched for)
What kind of sites have been affected?
Unlike previous fluctuations, which were associated with new sites of little popularity and low TRUST, the late-June fluctuation struck several very authoritative sites with hundreds of backlinks and content that is valid both for SEO and for users.
Moreover, there seems to be no pattern in the nature of the affected sites. The downgraded websites included:
– Showcase / institutional sites
– Sites with and without AdSense
– Everything from industrial to adult content
When do these fluctuations occur?
As fate would have it, the beginning or end of the fluctuations coincides with the PageRank export/update. Far harder is finding the link between these two actions that Google carries out “simultaneously”.
Why do these sites suffer such penalties?
Hard to say, but what I can say is that the SERP-cleaning theory is no longer credible, given that the timescales are too long and it would not guarantee a good service to the end user.
More probably, the fluctuations are due to the inclusion/activation of new algorithms that Google tests, retries, and re-evaluates against the results obtained in the SERP.
But how can a website resolve this situation?
My suggestion is always the same: continue promoting the site as if nothing had happened (I know it is difficult, but for now there are no immediate remedies), trying to create original content and to attract spontaneous links, including from sites with high TRUST (we will see later how to recognize and capture a site's trust).
The ideal would be to become independent of Google by building a community that is always present on the site, through a blog, a forum, or other innovative services that foster user loyalty.