Saturday, November 12, 2011

Previous Panda Updates

Here’s the Panda update schedule so far, as we’ve tracked and had confirmed by Google:

Wednesday, July 20, 2011

Panda Update - 2011

Amit Singhal, Google’s head of search, published a blog post on the Google Webmaster Central blog named More guidance on building high-quality sites.

Amit’s goal with this post is to provide webmasters impacted by this Panda update, which rolled out internationally about a month ago, with some direction and guidance on which kinds of sites Google likes and which it dislikes.

Amit said that he cannot document publicly the “actual ranking signals” but will share questions you should ask yourself and consider when trying to understand why a site was impacted by this update. Those questions include:

  1. Would you trust the information presented in this article?
  2. Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  3. Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  4. Would you be comfortable giving your credit card information to this site?
  5. Does this article have spelling, stylistic, or factual errors?
  6. Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  7. Does the article provide original content or information, original reporting, original research, or original analysis?
  8. Does the page provide substantial value when compared to other pages in search results?
  9. How much quality control is done on content?
  10. Does the article describe both sides of a story?
  11. Is the site a recognized authority on its topic?
  12. Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  13. Was the article edited well, or does it appear sloppy or hastily produced?
  14. For a health-related query, would you trust information from this site?
  15. Would you recognize this site as an authoritative source when mentioned by name?
  16. Does this article provide a complete or comprehensive description of the topic?
  17. Does this article contain insightful analysis or interesting information that is beyond obvious?
  18. Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  19. Does this article have an excessive amount of ads that distract from or interfere with the main content?
  20. Would you expect to see this article in a printed magazine, encyclopedia or book?
  21. Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  22. Are the pages produced with great care and attention to detail vs. less attention to detail?
  23. Would users complain when they see pages from this site?
Source: http://searchengineland.com/impacted-by-googles-panda-update-google-asks-you-to-consider-this-76050

Friday, April 22, 2011

Social media and Google’s Panda update

The introduction of Google’s recent Panda update changed its search engine algorithm with a view to diminishing the rankings of sites that provide “low-quality content”. The “Panda” update, so called because Google named it after one of its engineers, or the “Farmer” update, as Danny Sullivan of Search Engine Land has been calling it because its apparent target is content farms, has been received very cautiously by the SEO community.

As a result, some of the web’s most popular sites have seen a huge drop in traffic. This marks a major change in Google’s rankings, which has affected around twelve per cent of Google’s overall search results. The SEO industry is still digesting its impact and implications.

An RSS feed, a dynamically generated summary of information or published news from either a blog or a dedicated news site, provides a glimpse of each article: a headline and, generally, the first few lines of the story’s introduction.

Some media sites have described sites that republish “authoritative” sites’ RSS feeds as “spam sites”. Not exactly, perhaps, but one could, for example, plug in the RSS feed from Search Engine Watch and present it as your own content.

Quite simply, you can grab the feed URL, paste it into feed2js.org, customise it, then grab the generated JavaScript code and publish it on your website. As soon as new articles are added to the SEW website, and therefore to the RSS feed, the content is displayed on the “low-quality” website that uses it, as the sketch below illustrates.
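To make the mechanics concrete, here is a rough TypeScript sketch of that republishing pattern. To be clear, this is not the code feed2js.org actually generates; the feed URL, the regex-based parsing and the reliance on Node 18+ for the built-in fetch are my own assumptions, purely for illustration.

```typescript
// Illustrative sketch only: pull a third-party RSS feed and republish its
// headlines as HTML on your own page. This is NOT feed2js.org's generated
// code; the feed URL and parsing approach are assumptions for the example.

async function republishFeed(feedUrl: string): Promise<string> {
  const response = await fetch(feedUrl);   // fetch the "authoritative" site's feed
  const xml = await response.text();

  // Naively pull the <title> and <link> out of each <item> in the feed.
  const items = [...xml.matchAll(
    /<item>[\s\S]*?<title>([\s\S]*?)<\/title>[\s\S]*?<link>([\s\S]*?)<\/link>[\s\S]*?<\/item>/g
  )];

  // Build an HTML list that refreshes as soon as the source feed updates.
  const links = items
    .map(([, title, link]) => `<li><a href="${link.trim()}">${title.trim()}</a></li>`)
    .join("\n");
  return `<ul>\n${links}\n</ul>`;
}

// Hypothetical usage with an assumed feed URL:
republishFeed("https://searchenginewatch.com/feed/")
  .then((html) => console.log(html))
  .catch((err) => console.error("Could not fetch feed:", err));
```

The point is simply that once such a script is in place, the scraping site needs no further effort: every new headline on the source feed shows up on its pages automatically.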

Read More: http://www.sitepronews.com/2011/04/14/social-media-and-googles-panda-update/

Google Jagger Algo Update

Jagger history

The Jagger 1 update pre-shocks actually started with a string of back-link updates that began in September 2005 and continued into the middle of October 2005.

In mid-October, Google updated its PageRank database for public view. Usually updated once a quarter, the PR update always creates a stir.

While most SEO professionals heavily play down the importance of PR in ranking, the legacy of its importance is so deep-rooted in the minds of most webmasters that it is difficult to shake off as an insignificant ranking parameter.

[PageRank is Google's measure of the "popularity" of a web page, based on the number and quality of incoming links. The Editor.]

It is believed that the second phase of the Jagger update — Jagger 2 — is now complete and replicated to all the data centers of Google. However, you may still notice some fluctuations in the rankings as things stabilize for each update.

We are now at the threshold of the third phase of the Jagger update, which is expected to begin sometime in the second week of November 2005.

The changes

From what we have studied so far, Google has re-engineered several aspects of its algorithm. Among other aspects that will become clear as things roll out, we believe it has altered the impact of the following:

1. Value of incoming links
2. Value of anchor text in incoming links
3. Content on page of incoming links
4. Keyword repetitions in anchor text
5. Age of the incoming links
6. Nature of sites linking to you
7. Directory links
8. Speed and volume of incoming links created
9. Value of reciprocal links
10. Impact of outbound links / links page on your website
11. Sandbox effect / age of your site, domain registration date
12. Size of your site’s content
13. Addition and frequency of fresh content update
14. Canonical / sub domains, sub-sub domains
15. Multiple domains on same IP numbers
16. Duplicate content on same site or on multiple domains
17. Over-optimization, excessive text markup
18. Irrational use of CSS

We are studying various aspects of the Jagger algo update and are closely monitoring the impact of changes in each of the above mentioned parameters and many more not mentioned here.

We shall be discussing the impact of each of these aspects in the next parts of this article, which are likely to be written once the Jagger 3 update and our study of it are complete.

In the meantime, we’d like to offer a word of caution: if you have suffered a drop in your website rankings, do not make any drastic changes to your website until the Jagger 3 update is fully implemented and stabilized.

There is a delicate balance and inter-dependence of all these parameters that can bring back your ranks once the Jagger 3 update is completed.

Read More: http://www.pandia.com/sew/112-on-the-google-jagger-algo-update-part-1.html

SEO - URL Canonicalization Problem

We often come across websites that are accessible via both the www and the non-www versions of their URLs. While the two URLs apparently look the same, search engines can treat them as separate sites altogether. This is known as a canonical issue.

"Canonicalization is the process of picking the best url when there are several choices, and it usually refers to home pages."

Matt further adds:

"For example, most people would consider these the same urls:

* www.example.com

* example.com/

* www.example.com/index.html

* example.com/home.asp

But technically all of these urls are different. A web server could return completely different content for all the urls above. When Google “canonicalizes” a url, we try to pick the url that seems like the best representative from that set."

Since search engines can treat the www and the non-www versions of the same URL as different websites, there is a risk of duplicate content issues arising from the same content being served at both URLs.
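For illustration, one common remedy is to pick a preferred host and permanently redirect the other to it, optionally backing that up with a rel="canonical" tag. The TypeScript/Express sketch below is only a hypothetical example, not something prescribed by the article: the hostname and port are made up, and an equivalent rewrite rule in your web server configuration achieves the same thing.

```typescript
import express from "express";

const app = express();
const CANONICAL_HOST = "www.example.com"; // assumption: the host you picked as canonical

// Send a 301 from any non-canonical host (e.g. example.com) to the canonical
// one, so both URL variants consolidate into a single indexed site.
app.use((req, res, next) => {
  if (req.hostname !== CANONICAL_HOST) {
    return res.redirect(301, `https://${CANONICAL_HOST}${req.originalUrl}`);
  }
  next();
});

app.get("/", (_req, res) => {
  // A <link rel="canonical"> tag is another common hint for the preferred URL.
  res.send('<link rel="canonical" href="https://www.example.com/" /><h1>Home</h1>');
});

app.listen(3000);
```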


Read More: http://www.searchenginejournal.com/canonicalization-seo-should-i-use-www-or-not/6950/

Wednesday, February 9, 2011

Google Update 2010 - “Mayday” Update Impacts Long Tail Traffic

Google made between 350 and 550 changes in its organic search algorithms in 2009. This is one of the reasons I recommend that site owners not get too fixated on specific ranking factors. If you tie construction of your site to any one perceived algorithm signal, you’re at the mercy of Google’s constant tweaks. These frequent changes are one reason Google itself downplays algorithm updates. Focus on what Google is trying to accomplish as it refines things (the most relevant, useful results possible for searchers) and you’ll generally avoid too much turbulence in your organic search traffic.

Wired On Google’s Algorithm

Exclusive: How Google’s Algorithm Rules the Web from Wired has an excellent and detailed look at the evolution of Google’s search algorithm over the years. It is a pretty long write-up, so I wanted to highlight the key points in bullet format for you all.

Key Advances:

* Backrub in September 1997
* New Algorithm in August 2001
* Local Connectivity Analysis in February 2003
* Fritz in Summer 2003
* Personalized Results in June 2005
* Bigdaddy Update in December 2005
* Universal Search in May 2007
* Real Time Search in December 2009

Personally, “Fritz” is a name I had never heard before. In the SEO world, this is when the “Google Dance” died. In the past, Google updated their index about every 30 days. Google stopped that in 2003 and began indexing and updating it several times per day. Maybe my memory is foggy, but I do not remember the name “Fritz” coming up back then to describe this behavior. It is also true that Google was much more shy about sharing details of their search algorithm back then.

Google Algorithm January 2011 update - Duplicate Content

Matt Cutts says in his blog:

“My [previous] post mentioned that ‘we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.’ That change was approved at our weekly quality launch meeting last Thursday and launched earlier this week.”

This means that the duplicate content filters have been tightened and Google is using more sophisticated ways of identifying original content.

Tuesday, February 8, 2011

The Impact of Google Caffeine

Google Caffeine is the new Google indexing system, with the aim of providing better results for its 128 million worldwide users. So what is Google Caffeine and how will it affect Google's users and website owners?

What is Google Caffeine?

Google Caffeine is Google's updated system of assessing websites to decide how to rank them in its search engine. Although some have referred to it as a new search engine, it is really more of an improvement of the existing one. Users will not notice any major visual differences, but there will be some variations in results and the way that results are presented. It is, however, likely to have an impact on web designers and those who work in SEO (search engine optimisation).

What is the purpose of Google Caffeine?

The purpose is simple: to give users more accurate results. Google is always trying to give users the results that are most relevant to what they are looking for. Google wants to prevent spam sites from reaching the top of its results pages and return the best and most relevant sites instead. Through a number of measures, Google believes it has improved its search engine with Caffeine to assist users further. It is also attempting to surface news stories more quickly and make them more prominent when appropriate.

Saturday, January 8, 2011

How Does Google Caffeine Impact Your Website?

How does Google Caffeine impact your website?

Quite simply, faster indexing and search results. When webmasters create or modify a website's existing content, these changes will be reflected more quickly in the Google SERPs, so it's a win-win for both the website owner and the searcher.

According to Bestrank's Caitlin Self, there are a few ways Caffeine can impact your SEO efforts:

* Page Load Time: The emphasis on speed means that page load times are going to come into play more than in the past. While it is standard to have a low page load time, somewhere around two seconds or so, keeping it low is even more important with the extra caffeine jolt (see the sketch after this list).

* Keywords and Phrases: The relevancy of keywords is now more important than it has been in the past. This just means more work for your SEO campaign. While this poses a challenge to some SEO professionals, it may also filter out some of the sites that are more about spam and SEO than they are about the quality of content they produce for their visitors.

* Ads: It seems that your PPC management will become more important as well. Caffeine puts more emphasis on organic search results, rather than paid advertisements, so the ads that you do pursue are even more individually important. The emphasis on page load time further affects your PPC management, since load time affects your Quality Score for AdWords.
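As a quick, informal way to check the page load time point above, here is a small TypeScript sketch. It is my own illustration rather than anything from Caitlin's post: it assumes Node 18+ for the built-in fetch, uses a placeholder URL, and only measures server response plus download time, not full browser rendering.

```typescript
// Rough timing check against the ~2 second page load guideline mentioned above.
// Hypothetical helper, not from the article; the URL is a placeholder.

async function timePage(url: string): Promise<number> {
  const start = Date.now();
  const response = await fetch(url);
  await response.text();            // make sure the whole body is downloaded
  return Date.now() - start;        // elapsed milliseconds
}

timePage("https://www.example.com/").then((ms) => {
  const verdict = ms <= 2000 ? "within" : "slower than";
  console.log(`Fetched in ${ms} ms (${verdict} the ~2 second guideline)`);
});
```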

But is Google actually indexing your content faster? SEO guru Bruce Clay weighed in on the discussion in a recent interview with WebProNews. According to Bruce, PPC ads will be better and more targeted, which means that ROI will increase. As ROI increases, bids will also increase, which would ultimately generate more revenue for Google. All that said, searchers would win as well, since they would be getting better results.

Bruce also believes that the advantage of Caffeine lies in the near real-time results and several behind-the-scenes factors that Google has yet to officially announce (stay tuned!).