Monday, October 27, 2008

Legal precedents

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing claimed that Google's tactics to prevent spamdexing constituted tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[47][48]

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website had been removed from Google's index prior to the lawsuit, and the site's traffic dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.
http://en.wikipedia.org/wiki/Search_Engine_Optimization

International markets

The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[42] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[43] As of 2006, Google held about a 40% share of the market in the United States, but an 85-90% market share in Germany.[44] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[44]

In Russia, the situation is reversed. The local search engine Yandex controls 50% of the paid advertising revenue, while Google has less than 9%.[45] In China, Baidu continues to lead in market share, although Google has been gaining share as of 2007.[46]

Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top-level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.
http://en.wikipedia.org/wiki/Search_Engine_Optimization

As a marketing strategy

Eye-tracking studies have shown that searchers scan a search results page from top to bottom and left to right (for left-to-right languages), looking for a relevant result. Placement at or near the top of the rankings therefore increases the number of searchers who will visit a site.[35] However, more search engine referrals do not guarantee more sales. SEO is not necessarily an appropriate strategy for every website, and other Internet marketing strategies can be much more effective, depending on the site operator's goals.[36] A successful Internet marketing campaign may drive organic traffic to web pages, but it may also involve the use of paid advertising on search engines and other pages, building high-quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to enable site owners to measure their successes, and improving a site's conversion rate.[37]

SEO may generate a return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Because of this uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[38] It is considered wise business practice for website operators to free themselves from dependence on search engine traffic.[39] The top-ranked SEO blog SEOmoz.org[40] has reported, "Search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites.
http://en.wikipedia.org/wiki/Search_Engine_Optimization

White hat versus black hat

SEO techniques can be classified into two broad categories: techniques that search engines recommend as part of good design, and techniques that search engines do not approve of, among them spamdexing, whose effects the search engines attempt to minimize. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[29] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.[30]

An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. Because the search engine guidelines[31][17][18][19] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines; it is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to divert the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[32] although the two are not identical.

Black hat SEO attempts to improve rankings in ways that the search engines disapprove of, or that involve deception. One black hat technique uses hidden text, either colored to match the background, placed in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.
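
To make the cloaking technique concrete, here is a minimal Python sketch (a hypothetical handler, not code from any actual site, shown to explain the trick rather than endorse it) that branches on the User-Agent header crawlers send:

    # Cloaking sketch: serve one page to crawlers, another to humans.
    # The bot signatures and page bodies below are illustrative only.
    CRAWLER_SIGNATURES = ("Googlebot", "Slurp", "msnbot")

    def select_page(user_agent: str) -> str:
        """Return keyword-stuffed HTML to crawlers, the real page to visitors."""
        if any(sig in user_agent for sig in CRAWLER_SIGNATURES):
            return "<html><body>widgets widgets cheap widgets ...</body></html>"
        return "<html><body>The page human visitors actually see.</body></html>"

    print(select_page("Googlebot/2.1 (+http://www.google.com/bot.html)"))

Because the deception is mechanical, it is also mechanically detectable: a crawler that re-fetches the page with a browser-like User-Agent will see the mismatch between the two responses.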

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or through a manual site review. One infamous example was Google's February 2006 removal of both BMW Germany and Ricoh Germany for the use of deceptive practices.[33] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's index.
http://en.wikipedia.org/wiki/Search_Engine_Optimization

Preventing indexing

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots meta tag. When a search engine visits a site, the robots.txt in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts, and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
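
As a sketch of how a compliant crawler reads these rules, Python's standard urllib.robotparser module can evaluate a robots.txt file; the domain and paths below are made up for illustration:

    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt blocking the shopping-cart and
    # internal-search pages mentioned above.
    rules = [
        "User-agent: *",
        "Disallow: /cart/",
        "Disallow: /search",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    print(rp.can_fetch("*", "http://example.com/cart/checkout"))    # False
    print(rp.can_fetch("*", "http://example.com/products/widget"))  # True

The per-page alternative mentioned above is a robots meta tag such as <meta name="robots" content="noindex"> placed in the page's head.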
http://en.wikipedia.org/wiki/Search_Engine_Optimization

Getting indexed

The leading search engines, Google, Yahoo!, and Microsoft, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or a cost per click.[22] Such programs usually guarantee inclusion in the database but do not guarantee specific rankings within the search results.[23] Yahoo!'s paid inclusion program has drawn criticism from advertisers and competitors.[24] Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review.[25] Google offers Google Webmaster Tools, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links.[26]
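
As a rough sketch of the Sitemap side of this, a minimal XML Sitemap of the kind submitted through Google Webmaster Tools can be produced with Python's standard library; the example URLs are placeholders:

    import xml.etree.ElementTree as ET

    def build_sitemap(pages):
        """Build a minimal urlset document listing one <loc> per page."""
        urlset = ET.Element(
            "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        )
        for page in pages:
            url = ET.SubElement(urlset, "url")
            ET.SubElement(url, "loc").text = page
        return ET.tostring(urlset, encoding="unicode")

    # Include a page that link-following crawlers would not discover.
    print(build_sitemap([
        "http://example.com/",
        "http://example.com/unlinked-orphan-page.html",
    ]))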

Search engine crawlers may look at a number of different factors when crawling a site, and not every page is indexed by the search engines. The distance of a page from the root directory of a site may also be a factor in whether or not it gets crawled.
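
One crude way to picture that distance factor is to count how many path segments separate a URL from the site root. A toy Python sketch (the claim that deeper pages are crawled less reliably comes from the text above; the function itself is purely illustrative):

    from urllib.parse import urlparse

    def path_depth(url: str) -> int:
        """Number of path segments between the page and the site root."""
        return len([seg for seg in urlparse(url).path.split("/") if seg])

    print(path_depth("http://example.com/"))                 # 0
    print(path_depth("http://example.com/a/b/c/page.html"))  # 4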
http://en.wikipedia.org/wiki/Search_Engine_Optimization

Webmasters and search engines

By 1997, search engines recognized that webmasters were making efforts to rank well in their results, and that some webmasters were even manipulating their rankings by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.[12]

Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEOs. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web),[13] was created to discuss and minimize the damaging effects of aggressive web content providers.

SEO companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[14] Wired magazine reported that the same company sued blogger Aaron Wall for writing about the ban.[15] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[16]

Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, chats, and seminars. In fact, with the advent of paid inclusion, some search engines now have a vested interest in the health of the optimization community. Major search engines provide information and guidelines to help with site optimization.[17][18][19] Google has a Sitemaps program[20] to help webmasters learn whether Google is having any problems indexing their website; it also provides data on Google traffic to the website. Google's guidelines are a list of suggested practices Google has provided as guidance to webmasters. Yahoo! Site Explorer provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index, and view link information.[21]

http://en.wikipedia.org/wiki/Search_Engine_Optimization