
 Search Engine Optimization

Search engine optimization (SEO) is a set of methodologies aimed at improving the visibility of a website in search engine listings. The term also refers to an industry of consultants that carry out optimization projects on behalf of client sites.

History

SEO began in the mid-1990s, as the first search engines were cataloging the early Web. Many site owners quickly learned to appreciate the value of a new listing in a search engine, as they observed sharp spikes in traffic to their sites.

Site owners soon began submitting their site URLs to the engines on a regular basis, and began modifying their sites to accommodate the needs of search engine spiders, the software programs sent out to explore the Web. Elements such as meta tags became a common feature of sites that sought high-ranking listings in search engine result pages (the so-called "SERPs").
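
As a rough illustration, here is a minimal Python sketch of how a spider might read such keyword meta tags. The parser class, example page and keyword list are invented for the example and do not reflect any particular engine's code.

    # Minimal sketch: extract keyword meta tags the way an early spider might.
    from html.parser import HTMLParser

    class MetaKeywordParser(HTMLParser):
        """Collects the content of any <meta name="keywords"> tag."""
        def __init__(self):
            super().__init__()
            self.keywords = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and attrs.get("name", "").lower() == "keywords":
                # Keyword lists were typically comma-separated.
                self.keywords += [k.strip() for k in attrs.get("content", "").split(",")]

    page = '<html><head><meta name="keywords" content="travel, cheap flights, hotels"></head></html>'
    parser = MetaKeywordParser()
    parser.feed(page)
    print(parser.keywords)  # ['travel', 'cheap flights', 'hotels']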

Consultant firms arose to serve the needs of these site owners, and attempted to develop an understanding of the search engines' internal logic, or algorithms. The goal was to develop a set of practices for copywriting, site coding, and submissions that would ensure maximum exposure for a website.

Controversy

As the industry developed, search engines quickly became wary of unscrupulous SEO firms that attempted to generate traffic for their customers at any cost, the most common problem being a decline in the relevance of search results. One frequent practice, called keyword spamming, involved the insertion of random text at the bottom of a webpage, colored to match the background of the page. The inserted text usually consisted of frequently searched words (such as sex), with the goal of attracting high rankings, and thus access to large streams of traffic. The search engines responded with a continuous series of countermeasures, designed to filter out the "noise" generated by these artificial techniques. In turn, several SEO firms developed ever-more-subtle techniques to influence rankings.
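
To illustrate the kind of countermeasure involved, here is a toy Python sketch that flags inline-styled text whose color matches the page background, i.e. text invisible to human visitors. The regular expression, example page and color handling are deliberately simplistic and invented for the example; real filters are far more sophisticated.

    # Toy sketch: flag text whose inline color matches the page background.
    import re

    def find_hidden_text(html, background="#ffffff"):
        hidden = []
        # Look for elements whose inline style sets the text color.
        for match in re.finditer(r'style="[^"]*color:\s*([^;"]+)[^"]*"[^>]*>([^<]+)<', html):
            color, text = match.group(1).strip(), match.group(2).strip()
            if color.lower() == background.lower():
                hidden.append(text)
        return hidden

    page = '<body bgcolor="#ffffff"><p style="color: #ffffff">sex sex sex cheap flights</p></body>'
    print(find_hidden_text(page))  # ['sex sex sex cheap flights']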

Reconciliation

In the early 2000s, search engines and SEO firms attempted to establish an unofficial truce. There are several tiers of SEO firms, and the most reputable companies employ content-based optimizations which meet with the search engines' (reluctant) approval. These techniques include improvements to site navigation and copywriting, designed to make websites more intelligible to search engine algorithms.

Search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences and seminars. In fact, with the advent of paid inclusion, search engines now have a vested interest in the health of the optimization community.

Paid inclusion

Paid inclusion is a fee-based model for submitting website listings to search engines. Historically, search engines have allowed webmasters, as well as SEOs and the general public, to freely submit sites for consideration. However, a pattern of abuse began to develop among less-reputable SEO firms, who flooded the engines with non-stop submissions of pages. Analysis of these submissions strained the search engines' capacity, necessitating the creation of artificial limits, including fees.

The fee structure is used by search engines as a filter against superfluous submissions, and also as a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be cataloged on a regular basis. Search engines still offer free submit forms, but make no promises as to the timeliness of the cataloging process through this channel.

Google has a particularly ethical way of handling paid placement. Its main results are uninfluenced by payments; paid "AdWords" appear as small, visually distinct text-only ads, so the user can tell which matches are the result of a payment. Google also uses various methods to prevent paid placement of truly irrelevant content.

Ethical and unethical SEO methods

To obtain maximum search engine visibility, it is essential to understand how the target audience searches for the information on a web site. When members of the target audience use a search engine to find products and services, they type a set of words or phrases into the search box. These words are commonly called targeted keywords or key phrases.

For the target audience to find a site on the search engines, its pages must contain the keyword phrases that the target audience is typing into their search queries.

When a search engine spider analyzes a web page, it determines keyword relevancy based on an algorithm, a formula that calculates how web pages are ranked. The most important text for a search engine is the most important text for the target audience: the text your target audience is going to read when they arrive at your web site.
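
To make the idea concrete, here is a toy term-frequency scorer in Python. The pages, query and scoring formula are invented for the example; real ranking algorithms are far more elaborate and are not published by the engines.

    # Toy sketch: score a page by how often the query terms appear in its text.
    import re
    from collections import Counter

    def relevance(page_text, query):
        """Share of the page's words that are query terms (simple term frequency)."""
        words = Counter(re.findall(r"[a-z']+", page_text.lower()))
        total = sum(words.values()) or 1
        return sum(words[term] for term in query.lower().split()) / total

    pages = {
        "travel-tips": "cheap flights and cheap hotels for every budget",
        "recipes": "quick dinner recipes for busy evenings",
    }
    query = "cheap flights"
    ranked = sorted(pages, key=lambda name: relevance(pages[name], query), reverse=True)
    print(ranked)  # ['travel-tips', 'recipes']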

At its worst, SEO becomes spamdexing, the promotion of irrelevant, chiefly commercial, pages by taking advantage of the search algorithms. Indeed, many search engine administrators say that any form of search engine optimization used to improve a website's page rank is spamdexing. However, over time a widespread consensus has developed in the industry as to what are and are not acceptable means of boosting one's search engine placement and resultant traffic.

Arguably, the most ethical method is to have worthwhile content on one's Web site, to which many other Web sites will voluntarily link. There are also few who would question the ethics of informing other relevant sites around the web of one's own content and asking for links, although as relevance diminishes this becomes a more dubious practice.

Equally, virtually no one would question the ethics of choosing the vocabulary of your site (and especially of your page titles) to emphasize words that you know are often searched for by people in your market. Again, the ethics of this become shadier if the words in question are not relevant.

It is certainly ethical (in fact it is highly recommended) to add a "site map" page to your site, linked either from the home page or from every page on your site. Such a page helps ensure that once a spider has found your site, it can traverse and index the entire site.
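
A site map page can be as simple as one page of plain links to every URL on the site. The following Python sketch generates such a page; the page list and output file name are invented for the example.

    # Minimal sketch: build a plain "site map" page listing every page on the site.
    site_pages = ["index.html", "products.html", "services.html", "contact.html"]

    links = "\n".join(f'  <li><a href="/{page}">{page}</a></li>' for page in site_pages)
    sitemap_html = f"<html><body>\n<h1>Site map</h1>\n<ul>\n{links}\n</ul>\n</body></html>"

    # Write the generated page so it can be linked from the home page.
    with open("sitemap.html", "w") as f:
        f.write(sitemap_html)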

Cloaking (any of several means to serve a different page to the search-engine spider than the one seen by human users) is one of the most controversial methods of search engine optimization. On the one hand, cloaking can be an illegitimate attempt to mislead search engines regarding the content of a particular Web site. On the other hand, it can be used to provide human users with more or less equivalent content that a search engine would not be able to process or parse. Another ethical use of cloaking is providing accessibility to Web sites for blind people and people with other disabilities. A good benchmark for whether a given act of cloaking is ethical is precisely whether it enhances accessibility.
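
Mechanically, cloaking usually amounts to checking the requesting User-Agent and returning a different page to a known spider than to a human visitor, as in this bare-bones Python sketch. The spider names and page contents are invented for the example.

    # Bare-bones sketch of the cloaking mechanism: pick a page variant by User-Agent.
    SPIDER_AGENTS = ("googlebot", "slurp", "msnbot")

    def select_page(user_agent):
        """Return the page variant to serve for this request."""
        if any(bot in user_agent.lower() for bot in SPIDER_AGENTS):
            return "<html><body>Keyword-rich text prepared for the crawler.</body></html>"
        return "<html><body><img src='banner.png'>Page meant for human visitors.</body></html>"

    print(select_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
    print(select_page("Mozilla/5.0 (Windows NT 10.0)"))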

 

This article is licensed under the GNU Free Documentation License, which means that you can copy and modify it as long as the entire work (including additions) remains under this license. See http://www.gnu.org/copyleft/fdl.html for details. It uses material from the Wikipedia article Search Engine Optimization.
