How can one fight "content farms", the websites that pollute search engine results? Google thinks it has found a solution: rely directly on its users to report sites with little or no content, or sites that abuse search engine optimization and thereby disrupt the normal operation of the service.
Google has made available to users of Chrome, its web browser, an extension called "Personal Blocklist" that lets users report sites that pollute their results pages. Those sites are then excluded from that user's search results. The extension could also be used by Google to identify and penalize "parasite" sites.
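The mechanics described here are simple: keep a per-user list of blocked domains and drop matching entries from the result list before it is displayed. A minimal sketch of that idea, assuming a plain list of result URLs (the domains and function names below are illustrative, not the extension's actual code):

```python
from urllib.parse import urlparse

# Illustrative per-user blocklist (hypothetical domains).
blocked_domains = {"examplecontentfarm.com", "spammy-aggregator.net"}

def filter_results(result_urls, blocked=blocked_domains):
    """Drop any search result whose host matches a blocked domain."""
    kept = []
    for url in result_urls:
        host = urlparse(url).netloc.lower()
        # Block the domain itself and any of its subdomains.
        if any(host == d or host.endswith("." + d) for d in blocked):
            continue
        kept.append(url)
    return kept

results = [
    "https://www.grammy.com/news",
    "https://examplecontentfarm.com/how-to-cook-a-turkey",
]
print(filter_results(results))  # only the grammy.com result remains
```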
"The information on the blocked sites are transmitted to Google, and we will study the results of these returns to see if they have an interest to improve our ranking of search results," writes Matt Cutts, the head of the fight against spam at Google. This experiment marks a fairly clear break from the normal practices of search engine, which automatically classifies sites based on multiple criteria (number of links pointing to a site, frequency of update ...).
This is also the argument the search engine puts to sites that feel poorly placed in its results pages: the algorithm that determines the ranking is automated, it is "neutral", and the rules are the same for everyone. It was already possible to report spam to Google, but the feature was kept relatively discreet and closely guarded, in particular to prevent it from being used by site managers hoping to degrade their competitors' rankings.
INCREASINGLY INVASIVE SPAM
But in recent months the search engine has been criticized over the relevance of its results, as content farms and other sites regarded as pests by users have multiplied and improved their rankings. The New York Times reported that the U.S. retailer JC Penney had dominated the search results for hundreds of queries for months, without any legitimacy, by using SEO techniques that Google theoretically penalizes.
The search engine eventually penalized the site's ranking. To fight spam in its results, Google had already modified its algorithm several times last year, but the results appear insufficient. "The amount of outright spam has declined over time, and attention is now turning to content farms, sites whose content is thin or of low quality," wrote Matt Cutts in late January.
"We hear the message: our users are asking us to take stronger action against content farms and sites that consist mostly of spam or low-quality content."
PRODUCING LARGE QUANTITIES OF PAGES AT LOW COST
The recipe for a content farm is simple: build a site that publishes large quantities of pages designed to contain as many keywords as possible, with SEO optimized to dominate the results pages of search engines, a major source of traffic.
Some of these sites use writers, paid by the piece, who produce short, tightly focused articles in large quantities. This is the case of Demand Media, for example, which pays authors $15 or $20 per article: the company publishes around 4,000 articles per day across all of its sites. The company, which funds itself through advertising on its pages, is preparing its IPO for mid-2011 at an estimated valuation of 1.3 billion dollars.
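Taken at face value, the article's own figures imply a sizeable but bounded editorial budget; a quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope arithmetic using the figures cited above.
articles_per_day = 4000
cost_low, cost_high = 15, 20  # dollars per article

print(f"Estimated writer spend: ${articles_per_day * cost_low:,} "
      f"to ${articles_per_day * cost_high:,} per day")
# -> roughly $60,000 to $80,000 per day in writer payments
```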
Other sites, like the Huffington Post, follow similar recipes, mixing volunteer contributors with paid journalists and bloggers. The result is very effective: a search for Lady Gaga's Grammy performance turns up two Huffington Post articles, very short but containing a large number of links, on the first page of Google's results.
They rank ahead of most specialized sites and news sites, and even of the official site of the Grammy Awards. The Huffington Post was bought by AOL for $315 million in early February.
FULLY AUTOMATED SITES
Yet compared with other, fully automated services, the SEO work done by Demand Media and the Huffington Post, based on spotting topics of the moment, reacting quickly and making efficient use of an army of contributors, looks almost artisanal.
To create large quantities of pages quickly, some sites have automated their production, feeding their pages by aggregating content that already exists online, or RSS feeds. Where the Huffington Post targets the millions of Internet users interested in Lady Gaga, these sites take the opposite approach, seeking to attract users making infrequent searches, for which it is easy to reach the front page of search engine results.
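The automation described here can be sketched in a few lines: pull items from public RSS feeds and turn each one into a standalone, keyword-titled page aimed at a long-tail query. The feed URL and file layout below are made up for illustration, and the snippet uses the third-party feedparser library:

```python
import feedparser  # third-party: pip install feedparser
from pathlib import Path

# Hypothetical source feed; a real aggregator would pull from many.
FEED_URL = "https://example.com/news/rss.xml"

def build_pages(feed_url, out_dir="generated_pages"):
    """Turn each feed item into a keyword-titled HTML page (illustrative)."""
    Path(out_dir).mkdir(exist_ok=True)
    feed = feedparser.parse(feed_url)
    for entry in feed.entries:
        title = entry.get("title", "untitled")
        slug = "-".join(title.lower().split())[:80]
        html = (
            f"<html><head><title>{title}</title></head>"
            f"<body><h1>{title}</h1>"
            f"<p>{entry.get('summary', '')}</p></body></html>"
        )
        Path(out_dir, f"{slug}.html").write_text(html, encoding="utf-8")

build_pages(FEED_URL)
```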
The site 123people.com, which aggregates information published online about anyone, operates on this principle. In a Google search for John Doe, as for most searches on a first name and surname, 123people displays a result on the first page. With a structure designed to optimize its SEO, the site is visible on millions of very specific queries, even if it does not manage to reach the top results for celebrity names.
To purge their results of such sites, seen by search engines and many users as pollution, other engines have taken a more radical approach: Blekko has simply excluded a short list of these platforms from its results. Sensing the tide turning, Jason Calacanis, CEO of Mahalo, a sort of cross between a collaborative encyclopedia and a search engine whose pages had flooded search results at its launch, said in early February that he now needed to favor quality over quantity.
"Our page 'how to cook a turkey is very good, but it is competing with 17 pages eHow [property of Demand Media] devoted to the different ways to cook a turkey," he said. "We will make high-quality content, and we're back in the Google ranking day by day until we're number one in the search 'how to cook a turkey".