Monday, October 27, 2008

Legal precedents

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[47][48]

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's web site had been removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.
http://en.wikipedia.org/wiki/Search_Engine_Optimization

International markets

The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[42] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[43] As of 2006, Google held about 40% of the market in the United States, but Google had an 85-90% market share in Germany.[44] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[44]

In Russia the situation is reversed. Local search engine Yandex controls 50% of the paid advertising revenue, while Google has less than 9%.[45] In China, Baidu continues to lead in market share, although Google has been gaining share as of 2007.[46]

Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.
http://en.wikipedia.org/wiki/Search_Engine_Optimization

As a marketing strategy

Eye tracking studies have shown that searchers scan a search results page from top to bottom and left to right (for left to right languages), looking for a relevant result. Placement at or near the top of the rankings therefore increases the number of searchers who will visit a site.[35] However, more search engine referrals does not guarantee more sales. SEO is not necessarily an appropriate strategy for every website, and other Internet marketing strategies can be much more effective, depending on the site operator's goals.[36] A successful Internet marketing campaign may drive organic traffic to web pages, but it also may involve the use of paid advertising on search engines and other pages, building high quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to enable site owners to measure their successes, and improving a site's conversion rate.[37]

SEO may generate a return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[38] It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic.[39] The top-ranked SEO blog SEOmoz.org[40] has reported, "Search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites.
http://en.wikipedia.org/wiki/Search_Engine_Optimization

White hat versus black hat

SEO techniques can be classified into two broad categories: techniques that search engines recommend as part of good design, and those techniques that search engines do not approve of. The search engines attempt to minimize the effect of the latter, which includes spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[29] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.[30]

An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[31][17][18][19] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to divert the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[32] although the two are not identical.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[33] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.
http://en.wikipedia.org/wiki/Search_Engine_Optimization

Preventing indexing

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
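
To make this concrete, here is a minimal sketch in Python (using the standard library's urllib.robotparser module) of how a well-behaved crawler might check a site's robots.txt rules. The rules, paths, and "ExampleBot" user agent are invented for illustration:

    from urllib.robotparser import RobotFileParser

    # A hypothetical robots.txt that blocks internal search results and shopping carts.
    rules = [
        "User-agent: *",
        "Disallow: /search",
        "Disallow: /cart",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    # A compliant crawler skips disallowed paths and may fetch everything else.
    print(parser.can_fetch("ExampleBot", "/search?q=widgets"))   # False
    print(parser.can_fetch("ExampleBot", "/products/widget"))    # True
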
http://en.wikipedia.org/wiki/Search_Engine_Optimization

Getting indexed

The leading search engines, Google, Yahoo! and Microsoft, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click.[22] Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results.[23] Yahoo's paid inclusion program has drawn criticism from advertisers and competitors.[24] Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review.[25] Google offers Google Webmaster Tools, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that aren't discoverable by automatically following links.[26]
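
As an illustration of the Sitemap format mentioned above, the short Python sketch below builds a minimal sitemap.xml with the standard library; the URLs and dates are placeholders, not real pages:

    import xml.etree.ElementTree as ET

    # Hypothetical pages; a real site would list every URL it wants crawlers to find.
    pages = [
        ("http://www.example.com/", "2008-10-20"),
        ("http://www.example.com/products.html", "2008-10-25"),
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod

    # Write the file that would then be submitted through a tool such as Webmaster Tools.
    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)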

Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.
http://en.wikipedia.org/wiki/Search_Engine_Optimization

Webmasters and search engines

By 1997 search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.[12]

Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEOs. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web,[13] was created to discuss and minimize the damaging effects of aggressive web content providers.

SEO companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[14] Wired magazine reported that the same company sued blogger Aaron Wall for writing about the ban.[15] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[16]

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. In fact, with the advent of paid inclusion, some search engines now have a vested interest in the health of the optimization community. Major search engines provide information and guidelines to help with site optimization.[17][18][19] Google has a Sitemaps program[20] to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website. Google guidelines are a list of suggested practices Google has provided as guidance to webmasters. Yahoo! Site Explorer provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index and view link information.[21]

http://en.wikipedia.org/wiki/Search_Engine_Optimization

Search engine history

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all a webmaster needed to do was submit a page, or URL, to the various engines, which would send a spider to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[1] The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page, such as the words it contains, where these are located, and any weight for specific words, as well as all links the page contains, which are then placed into a scheduler for crawling at a later date.

Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the earliest known use of the phrase search engine optimization was a spam message posted on Usenet on July 26, 1997.[2]

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provided a guide to each page's content. But using meta data to index pages was found to be less than reliable, because the webmaster's choice of keywords in the meta tag was not necessarily an accurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags caused pages to rank for irrelevant searches.[3] Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.[4]

By relying so much on factors exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, allowing those results to be false would push users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.

While graduate students at Stanford University, Larry Page and Sergey Brin developed "backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[5] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random surfer.
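
The following short Python sketch illustrates the random-surfer idea with a power-iteration loop over a tiny, invented three-page web. It is a simplified illustration only (it ignores pages with no outgoing links) and is not Google's actual implementation:

    # d is the damping factor: the probability the surfer follows a link
    # rather than jumping to a random page.
    def pagerank(links, d=0.85, iterations=50):
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {}
            for p in pages:
                incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
                new_rank[p] = (1 - d) / len(pages) + d * incoming
            rank = new_rank
        return rank

    # Hypothetical web: pages A and C link to B, and B links back to A.
    print(pagerank({"A": ["B"], "B": ["A"], "C": ["B"]}))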


Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[6] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaining PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[7] In recent years major search engines have begun to rely more heavily on off-web factors such as the age, sex, location, and search history of people conducting searches in order to further refine results.

By 2007, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals.[8] The three leading search engines, Google, Yahoo and Microsoft's Live Search, do not disclose the algorithms they use to rank pages. Notable SEOs, such as Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen, have studied different approaches to search engine optimization, and have published their opinions in online forums and blogs.[9][10] SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.[11]

Search engine optimization

Search engine optimization (SEO) is the process of improving the volume and quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results. Usually, the earlier a site is presented in the search results, or the higher it "ranks," the more searchers will visit that site. SEO can also target different kinds of search, including image search, local search, and industry-specific vertical search engines.

As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website primarily involves editing its content and HTML coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Sometimes a site's structure (the relationships between its content) must be altered too. Because of this, it is, from a client's perspective, usually better to incorporate search engine optimization when a website is being developed than to try to apply it retroactively.

The acronym "SEO" can also refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site, SEO tactics may be incorporated into web site development and design. The term "search engine friendly" may be used to describe web site designs, menus, content management systems and shopping carts that are easy to optimize.

Another class of techniques, known as black hat SEO or Spamdexing, use methods such as link farms and keyword stuffing that degrade both the relevance of search results and the user-experience of search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.

http://en.wikipedia.org/wiki/Search_Engine_Optimization

Real user monitoring

Real user monitoring (RUM) is a passive web monitoring technology that records all user interaction with a website. Monitoring actual user interaction with a website is important to website operators to determine whether users are being served quickly and error-free and, if not, which part of a business process is failing. Software as a Service (SaaS) and Application Service Provider (ASP) companies use RUM to monitor and manage the service quality delivered to their clients. Real user monitoring data is used to determine the actual service-level quality delivered to end-users and to detect errors or slowdowns on web sites. The data may also be used to determine whether changes that are promulgated to sites have the desired effect or cause errors.

Organizations also use RUM to test website changes prior to deployment by monitoring for errors or slowdowns in the pre-deployment phase. They may also use it to test changes to production websites, or to anticipate behavioural changes in a website. For example, a website may add an area where users would be likely to congregate before moving forward in a group (test takers logging into a website over twenty minutes and then simultaneously beginning a test, for example); this is called rendezvous in test environments. Changes to websites such as these can be tested with RUM.

Real user monitoring is typically "passive monitoring," i.e. the RUM device collects web traffic without having any effect on the operation of the site. In some limited cases it also uses JavaScript injected into a page to provide feedback from the browser.

Passive monitoring can be very helpful in troubleshooting performance problems once they have occurred. Passive monitoring differs from synthetic monitoring in that it relies on actual inbound and outbound web traffic to take measurements.

Retrieved from "http://en.wikipedia.org/wiki/Real_user_monitoring"

Subject matter

One category of rating sites, such as Hot or Not or HotFlation, is devoted to rating contributors' physical attractiveness. Other looks-based rating sites include RateMyFace (an early site, launched in the Summer of 1999) and NameMyVote, which asks users to guess a person's political party based on their looks. Some sites are devoted to rating the appearance of pets (e.g. kittenwar.com, petsinclothes.com, and meormypet.com). Another class allows users to rate short video or music clips. One variant, a "Darwinian poetry" site, allows users to compare two samples of entirely computer-generated poetry using a Condorcet method. Successful poems "mate" to produce poems of ever-increasing appeal. Yet others are devoted to disliked men (DoucheBagAlert), bowel movements (ratemypoo.com), unsigned bands (RateMyBand), politics (HateMyTory.Com), nightclubs, business professionals, clothes, cars, and many other subjects.

When rating sites are dedicated to rating products, services, or businesses rather than to rating people, and are used for more serious or well thought-out ratings, they tend to be called review sites, although the distinction is not exact.

http://en.wikipedia.org/wiki/Rating_sites


Rating sites

Rating sites (less commonly, rate-me sites) are websites designed for users to vote on or rate people, content, or other things. Rating sites are typically organized around attributes such as physical appearance, body parts, voice, personality, etc. They may also be devoted to the subjects' occupational ability, for example teachers, professors, lawyers, doctors, etc.
http://en.wikipedia.org/wiki/Rating_sites

Planning and creating an intranet

Most organizations devote considerable resources to the planning and implementation of their intranet, as it is of strategic importance to the organization's success. The planning would include topics such as:

The purpose and goals of the intranet
Persons or departments responsible for implementation and management
Implementation schedules and phase-out of existing systems
Defining and implementing security of the intranet
How to keep it within legal boundaries and other constraints
Level of interactivity (e.g. wikis, on-line forms) desired
Whether the input of new data and updating of existing data is to be centrally controlled or devolved
These are in addition to the hardware and software decisions (like content management systems), participation issues (like good taste, harassment, confidentiality), and features to be supported.

The actual implementation would include steps such as:

User involvement to identify users' information needs.
Setting up web server(s) with the appropriate hardware and software.
Setting up web server access using a TCP/IP network.
Installing required user applications on computers.
Creation of document framework for the content to be hosted.
User involvement in testing and promoting use of the intranet.
http://en.wikipedia.org/wiki/Intranet

Benefits of intranets

Workforce productivity: Intranets can help users to locate and view information faster and use applications relevant to their roles and responsibilities. With the help of a web browser interface, users can access data held in any database the organization wants to make available, anytime and - subject to security provisions - from any workstation within the company, increasing employees' ability to perform their jobs faster, more accurately, and with confidence that they have the right information. It also helps to improve the services provided to the users.
Time: With intranets, organizations can make more information available to employees on a "pull" basis (i.e., employees can link to relevant information at a time which suits them) rather than being deluged indiscriminately by emails.
Communication: Intranets can serve as powerful tools for communication within an organization, vertically and horizontally. From a communications standpoint, intranets are useful to communicate strategic initiatives that have a global reach throughout the organization. The type of information that can easily be conveyed is the purpose of the initiative and what the initiative is aiming to achieve, who is driving the initiative, results achieved to date, and who to speak to for more information. By providing this information on the intranet, staff have the opportunity to keep up-to-date with the strategic focus of the organization.
Web publishing allows 'cumbersome' corporate knowledge to be maintained and easily accessed throughout the company using hypermedia and Web technologies. Examples include employee manuals, benefits documents, company policies, business standards, newsfeeds, and even training, which can be accessed using common Internet standards (Acrobat files, Flash files, CGI applications). Because each business unit can update the online copy of a document, the most recent version is always available to employees using the intranet.
Business operations and management: Intranets are also being used as a platform for developing and deploying applications to support business operations and decisions across the internetworked enterprise.
Cost-effective: Users can view information and data via a web browser rather than maintaining physical documents such as procedure manuals, internal phone lists, and requisition forms.
Promote common corporate culture: Every user is viewing the same information within the Intranet.
Enhance Collaboration: With information easily accessible by all authorised users, teamwork is enabled.
Cross-platform Capability: Standards-compliant web browsers are available for Windows, Mac, and UNIX.
http://en.wikipedia.org/wiki/Intranet

Intranet

An intranet is a private computer network that uses Internet protocols and network connectivity to securely share any part of an organization's information or operational systems with its employees. Sometimes the term refers only to the organization's internal website, but it is often a more extensive part of the organization's computer infrastructure, and private websites are an important component and focal point of internal communication and collaboration.


An intranet is built from the same concepts and technologies used for the Internet, such as clients and servers running on the Internet Protocol Suite (TCP/IP). Any of the well known Internet protocols may be found in an intranet, such as HTTP (web services), SMTP (e-mail), and FTP (file transfer). There is often an attempt to employ Internet technologies to provide modern interfaces to legacy information systems hosting corporate data.

An intranet can be understood as a private version of the Internet, or as a private extension of the Internet confined to an organization. The term first appeared in print on April 19, 1995, in Digital News & Review in an article authored by technical editor Stephen Lawton.[1]

Intranets differ from extranets in that the former are generally restricted to employees of the organization while extranets may also be accessed by customers, suppliers, or other approved parties.[2] Extranets extend a private network onto the Internet with special provisions for access, authorization and authentication (see also AAA protocol).

An organization's intranet does not necessarily have to provide access to the Internet. When such access is provided it is usually through a network gateway with a firewall, shielding the intranet from unauthorized external access. The gateway often also implements user authentication, encryption of messages, and often virtual private network (VPN) connectivity for off-site employees to access company information, computing resources and internal communications.

Increasingly, intranets are being used to deliver tools and applications, e.g., collaboration (to facilitate working in groups and teleconferencing) or sophisticated corporate directories, sales and Customer relationship management tools, project management etc., to advance productivity.

Intranets are also being used as corporate culture-change platforms. For example, large numbers of employees discussing key issues in an intranet forum application could lead to new ideas in management, productivity, quality, and other corporate issues.

In large intranets, website traffic is often similar to public website traffic and can be better understood by using web metrics software to track overall activity. User surveys also improve intranet website effectiveness.

Intranet user-experience, editorial, and technology teams work together to produce in-house sites. Most commonly, intranets are managed by the communications, HR or CIO departments of large organizations, or some combination of these.

Because of the scope and variety of content and the number of system interfaces, intranets of many organizations are much more complex than their respective public websites. Intranets and their use are growing rapidly. According to the Intranet design annual 2007 from Nielsen Norman Group, the number of pages on participants' intranets averaged 200,000 over the years 2001 to 2003 and has grown to an average of 6 million pages over 2005–2007.

http://en.wikipedia.org/wiki/Intranet

Google guidelines

The Google webmaster guidelines are a list of suggested practices Google has provided as guidance to webmasters. Websites that do not follow some of the guidelines may be removed from the Google index. A website experiencing problems being indexed or ranked well can find direction in the guidelines; websites that are not following all of the guidelines may experience a lower ranking in Google's search engine results or complete removal from the Google index. There are currently thirty-one guidelines, split into four categories:

Quality: There are five "basic principles" and eight "specific guidelines" in this category. These guidelines are directed toward deceptive behavior and manipulation attempts that may lessen the quality of the Google search engine results. Violations of the quality guidelines are the most common reason for a website being removed from Google's index.
Technical: There are five guidelines in this category. These guidelines cover specific issues that may inhibit a web page from being seen by Googlebot, which is Google's search engine crawler.
Design and content: There are nine guidelines in this category. These guidelines give practical information to webmasters concerning the way their site is built and represent the most common unintentional mistakes that webmasters make.
When your site is ready: There are five guidelines in this category. These guidelines provide specific direction for a webmaster who has created a new site and are also relevant for older sites which are not yet in the Google index.
http://en.wikipedia.org/wiki/Google_guidelines

Advantages

Exchange large volumes of data using Electronic Data Interchange (EDI)
Share product catalogs exclusively with wholesalers or those "in the trade"
Collaborate with other companies on joint development efforts
Jointly develop and use training programs with other companies
Provide or access services provided by one company to a group of other companies, such as an online banking application managed by one company on behalf of affiliated banks
Share news of common interest exclusively
http://en.wikipedia.org/wiki/Extranet

Industry uses

During the late 1990s and early 2000s, several industries started to use the term "extranet" to describe central repositories of shared data made accessible via the web only to authorized members of particular work groups.

For example, in the construction industry, project teams could log in to and access a 'project extranet' to share drawings and documents, make comments, issue requests for information, etc. In 2003 in the United Kingdom, several of the leading vendors formed the Network of Construction Collaboration Technology Providers, or NCCTP, to promote the technologies and to establish data exchange standards between the different systems. The same types of construction-focused technologies have also been developed in the United States, Australia, Scandinavia, Germany and Belgium, among others. Some applications are offered on a Software as a Service (SaaS) basis by vendors functioning as Application Service Providers (ASPs).

Specially secured extranets are used to provide virtual data room services to companies in several sectors (including law and accountancy).

There are a variety of commercial extranet applications, some of which are for pure file management, and others which include broader collaboration and project management tools. A variety of open-source extranet applications and modules also exist, which can be integrated into other online collaborative applications such as content management systems.

http://en.wikipedia.org/wiki/Extranet

Extranet

An extranet is a private network that uses Internet protocols, network connectivity, and possibly the public telecommunication system to securely share part of an organization's information or operations with suppliers, vendors, partners, customers or other businesses. An extranet can be viewed as part of a company's intranet that is extended to users outside the company (e.g.: normally over the Internet). It has also been described as a "state of mind" in which the Internet is perceived as a way to do business with a preapproved set of other companies business-to-business (B2B), in isolation from all other Internet users. In contrast, business-to-consumer (B2C) involves known server(s) of one or more companies, communicating with previously unknown consumer users.

Briefly, an extranet can be understood as an intranet mapped onto the public Internet or some other transmission system not accessible to the general public, but managed by more than one company's administrator(s). For example, military networks of different security levels may map onto a common military radio transmission system that never connects to the Internet. Any private network mapped onto a public one is a virtual private network (VPN). In contrast, an intranet is a VPN under the control of a single company's administrator(s).

An argument has been made[citation needed] that "extranet" is just a buzzword for describing what institutions have been doing for decades, that is, interconnecting to each other to create private networks for sharing information. One of the differences that characterizes an extranet, however, is that its interconnections are over a shared network rather than through dedicated physical lines. With respect to Internet Protocol networks, RFC 4364 states: "If all the sites in a VPN are owned by the same enterprise, the VPN is a corporate intranet. If the various sites in a VPN are owned by different enterprises, the VPN is an extranet. A site can be in more than one VPN; e.g., in an intranet and several extranets. We regard both intranets and extranets as VPNs. In general, when we use the term VPN we will not be distinguishing between intranets and extranets."[1] Even if this argument is valid, the term "extranet" is still applied and can be used in place of the longer description above.

It is important to note that in the quote above from RFC 4364, the term "site" refers to a distinct networked environment. Two sites connected to each other across the public Internet backbone comprise a VPN. The term "site" does not mean "website." Further, "intranet" also refers to just the web-connected portions of a "site." Thus, a small company in a single building can have an "intranet," but to have a VPN, they would need to provide tunneled access to that network for geographically distributed employees.

Similarly, for smaller, geographically united organizations, "extranet" is a useful term to describe selective access to intranet systems granted to suppliers, customers, or other companies. Such access does not involve tunneling, but rather simply an authentication mechanism to a web server. In this sense, an "extranet" designates the "private part" of a website, where "registered users" can navigate, enabled by authentication mechanisms on a "login page".

An extranet requires security measures. These can include firewalls, server management, the issuance and use of digital certificates or similar means of user authentication, encryption of messages, and the use of virtual private networks (VPNs) that tunnel through the public network.

Many technical specifications describe methods of implementing extranets, but often do not explicitly define an extranet. RFC 3547 [2] presents requirements for remote access to extranets. RFC 2709 [3] discusses extranet implementation using IPSec and advanced network address translation (NAT).

http://en.wikipedia.org/wiki/Extranet

Sunday, October 26, 2008

Common features

Although web server programs differ in detail, they all share some basic common features.

HTTP: every web server program operates by accepting HTTP requests from the client and providing an HTTP response to the client. The HTTP response usually consists of an HTML document, but can also be a raw file, an image, or some other type of document (defined by MIME types). If an error is found in the client request or while trying to serve it, the web server has to send an error response, which may include a custom HTML or text message to better explain the problem to end users.
Logging: web servers usually also have the capability of logging detailed information about client requests and server responses to log files; this allows the webmaster to collect statistics by running log analyzers on the log files.
In practice, many web servers also implement the following features (a minimal example in Python follows this list):

Authentication: optional authorization request (user name and password) before allowing access to some or all kinds of resources.
Handling of static content (file content recorded in server's filesystem(s)) and dynamic content by supporting one or more related interfaces (SSI, CGI, SCGI, FastCGI, JSP, PHP, ASP, ASP.NET, Server API such as NSAPI, ISAPI, etc.).
HTTPS support (by SSL or TLS) to allow secure (encrypted) connections to the server on the standard port 443 instead of usual port 80.
Content compression (e.g. by gzip encoding) to reduce the size of the responses (to lower bandwidth usage, etc.).
Virtual hosting to serve many web sites using one IP address.
Large file support to be able to serve files whose size is greater than 2 GB on a 32-bit OS.
Bandwidth throttling to limit the speed of responses in order to not saturate the network and to be able to serve more clients.
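
As a point of reference for the request/response cycle described above, here is a minimal sketch using Python's standard http.server module. It accepts HTTP requests, serves files from the current directory, and logs each request; it implements none of the optional features listed above and is an illustration, not a production server:

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    # SimpleHTTPRequestHandler maps request paths to files in the current
    # directory, guesses MIME types, and returns 404 error responses for
    # missing files; each request is logged to the console.
    server = HTTPServer(("", 8000), SimpleHTTPRequestHandler)
    print("Serving HTTP on port 8000 ...")
    server.serve_forever()
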
http://en.wikipedia.org/wiki/Web_server

Web server

The term web server can mean one of two things:

A computer program that is responsible for accepting HTTP requests from web clients, which are known as web browsers, and serving them HTTP responses along with optional data contents, which usually are web pages such as HTML documents and linked objects (images, etc.).
A computer that runs a computer program as described above.
http://en.wikipedia.org/wiki/Web_server

Obtaining hosting

Web hosting is often provided as part of a general Internet access plan; there are many free and paid providers offering these services.

A customer needs to evaluate the requirements of the application to choose what kind of hosting to use. Such considerations include database server software, scripting software, and operating system. Most hosting providers provide Linux-based web hosting, which offers a wide range of different software. A typical configuration for a Linux server is the LAMP platform: Linux, Apache, MySQL, and PHP/Perl/Python. The web hosting client may want to have other services, such as email for their business domain, databases, or multi-media services for streaming media. A customer may also choose Windows as the hosting platform. The customer can still choose from PHP, Perl, and Python but may also use ASP.NET or Classic ASP.

Web hosting packages often include a Web Content Management System, so the end-user doesn't have to worry about the more technical aspects. These Web Content Management systems are great for the average user, but for those who want more control over their website design, this feature may not be adequate.

Most modern desktop operating systems (Windows, Linux, Mac OS X) are also capable of running web server software, and thus can be used to host basic websites.

One may also search the Internet to find active web hosting message boards and forums that may provide feedback on what type of web hosting company may suit his/her needs. However, some of these message boards and forums require not only registration but also a paid subscription to access the sections and sub-forums with such information.

http://en.wikipedia.org/wiki/Web_hosting_service

Types of hosting

Internet hosting services can run Web servers; see Internet hosting services.

Hosting services limited to the Web:

Free web hosting service: free, sometimes advertisement-supported web hosting, often limited when compared to paid hosting.
Shared web hosting service: one's Web site is placed on the same server as many other sites, ranging from a few to hundreds or thousands. Typically, all domains may share a common pool of server resources, such as RAM and the CPU. A shared website may be hosted with a reseller.
Reseller web hosting: allows clients to become web hosts themselves. Resellers could function, for individual domains, under any combination of these listed types of hosting, depending on who they are affiliated with as a provider. Resellers' accounts may vary tremendously in size: they may range from having their own virtual dedicated server to a colocated server.
Virtual Dedicated Server: dividing a server into virtual servers, where each user feels like they're on their own dedicated server, but they're actually sharing a server with many other users. The users may have root access to their own virtual space. This is also known as a virtual private server or VPS.
Dedicated hosting service: the user gets his or her own Web server and gains full control over it (root access for Linux/administrator access for Windows); however, the user typically does not own the server. Another type of Dedicated hosting is Self-Managed or Unmanaged. This is usually the least expensive for Dedicated plans. The user has full administrative access to the box, which means the client is responsible for the security and maintenance of his own dedicated box.
Managed hosting service: the user gets his or her own Web server but is not allowed full control over it (root access for Linux/administrator access for Windows); however, they are allowed to manage their data via FTP or other remote management tools. The user is disallowed full control so that the provider can guarantee quality of service by not allowing the user to modify the server or potentially create configuration problems. The user typically does not own the server. The server is leased to the client.
Colocation web hosting service: similar to the dedicated web hosting service, but the user owns the colo server; the hosting company provides the physical space that the server takes up and takes care of the server. This is the most powerful and expensive type of web hosting service. In most cases, the colocation provider may provide little to no support directly for their client's machine, providing only electrical power, Internet access, and storage facilities for the server. In most cases for colo, the client would have his own administrator visit the data center on site to do any hardware upgrades or changes.
Clustered hosting: having multiple servers hosting the same content for better resource utilization. Clustered Servers are a perfect solution for high-availability dedicated hosting, or creating a scalable web hosting solution.
Grid hosting : this form of distributed hosting is when a server cluster acts like a grid and is composed of multiple nodes.
Home server: usually a single machine placed in a private residence can be used to host one or more web sites from a usually consumer-grade broadband connection. These can be purpose-built machines or more commonly old PCs.
Some ISPs actively attempt to block home servers by disallowing incoming requests to TCP port 80 of the user's connection and by refusing to provide static IP addresses. A common way to attain a reliable DNS hostname is by creating an account with a dynamic DNS service. A dynamic DNS service will automatically change the IP address that a URL points to when the IP address changes.

http://en.wikipedia.org/wiki/Web_hosting_service

Hosting reliability and uptime

Hosting uptime refers to the percentage of time the host is accessible via the internet. Many providers state that they aim for a 99.9% uptime, but there may be server restarts and planned (or unplanned) maintenance in any hosting environment.

A common claim from the popular hosting providers is '99% or 99.9% server uptime' but this often refers only to a server being powered on and doesn't account for network downtime. Real downtime can potentially be larger than the percentage guaranteed by the provider. Many providers tie uptime and accessibility into their own service level agreement (SLA). SLAs sometimes include refunds or reduced costs if performance goals are not met.
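
To put these percentages in perspective: a year contains roughly 8,760 hours, so a 99.9% uptime guarantee still allows about 8.8 hours of downtime per year (0.001 × 8,760), while a 99% guarantee allows about 87.6 hours, or more than three and a half days.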

http://en.wikipedia.org/wiki/Web_hosting_service

Service scope

The scope of hosting services varies widely. The most basic is web page and small-scale file hosting, where files can be uploaded via File Transfer Protocol (FTP) or a Web interface. The files are usually delivered to the Web "as is" or with little processing. Many Internet service providers (ISPs) offer this service free to their subscribers. People can also obtain Web page hosting from other, alternative service providers. Personal web site hosting is typically free, advertisement-sponsored, or cheap. Business web site hosting often has a higher expense.

Single page hosting is generally sufficient only for personal web pages. A complex site calls for a more comprehensive package that provides database support and application development platforms (e.g. PHP, Java, Ruby on Rails, ColdFusion, and ASP.NET). These facilities allow the customers to write or install scripts for applications like forums and content management. For e-commerce, SSL is also highly recommended.

The host may also provide an interface or control panel for managing the Web server and installing scripts as well as other services like e-mail. Some hosts specialize in certain software or services (e.g. e-commerce). They are commonly used by larger companies to outsource network infrastructure to a hosting company. To find a web hosting company, searchable directories can be used. One must be extremely careful when searching for a new company because many of the people promoting service providers are actually affiliates and the reviews are biased.

http://en.wikipedia.org/wiki/Web_hosting_service

Web hosting service

A web hosting service is a type of Internet hosting service that allows individuals and organizations to provide their own website accessible via the World Wide Web. Web hosts are companies that provide space on a server they own for use by their clients as well as providing Internet connectivity, typically in a data center. Web hosts can also provide data center space and connectivity to the Internet for servers they do not own to be located in their data center, called colocation.

http://en.wikipedia.org/wiki/Web_hosting_service

Web Content Management History

Web content management systems began to be formally developed as commercial software products in 1995 by two startups: Sunnyvale, California-based Interwoven, with its flagship TeamSite product, and Austin, Texas-based Vignette, with its Vignette Content Management product. As the Internet grew, the importance of Web content management as a part of IT infrastructure grew with it, and other vendors in adjacent markets began to develop their own WCM solutions, including Documentum and FileNet, which had traditionally built document management software. Other WCM providers such as Stellent and RedDot Solutions also began to appear. By 2002, IT departments began seeking out a single vendor that could manage all of their unstructured content (documents, web pages, rich media, etc.), and WCM became a subset of a new supercategory, Enterprise Content Management (ECM), of which it remains a part today.

In the mid 2000s, the web content management market became even more fragmented as a plethora of new providers emerged to complement the traditional ECM vendors. These Web content management systems are typically broken down into several groups: Enterprise (Vignette, Interwoven, Documentum, Oracle and others), Mid-market (Ektron, PaperThin, Ingeniux, and others), Open source (Joomla, Drupal, Alfresco and others) and SaaS (Clickability, Crownpeak, Hot Banana and others).
http://en.wikipedia.org/wiki/Web_content_management_system

Types

There are three major types of WCMS: offline processing, online processing, and hybrid systems. These terms describe the deployment pattern for the WCMS in terms of when presentation templates are applied to render Web pages from structured content. Seth Gottlieb has used the terms 'baking', 'frying', and 'parbaking' to describe the three alternatives.

Offline processing
These systems pre-process all content, applying templates before publication to generate Web pages. Vignette CMS and Bricolage are examples of this type of system. Since pre-processing systems do not require a server to apply the templates at request time, they may also exist purely as design-time tools; Adobe Contribute is an example of this approach.


Online processing
These systems apply templates on-demand. HTML may be generated when a user visits the page, or pulled from a cache. Some of the better-known open source systems that produce pages on demand are Mambo, Joomla!, Drupal, WordPress, Zikula and Plone. Hosted CMSs are provided by such SaaS developers as Bravenet, UcoZ, and Freewebs. Most Web application frameworks perform template processing in this way, but they do not necessarily incorporate content management features. Wikis, e.g. MediaWiki and TWiki, generally follow an online model (with varying degrees of caching), but generally do not provide document workflow.

Hybrid systems
Some systems combine the offline and online approaches. Some systems write out executable code (e.g. JSP, PHP, Perl pages) rather than just static HTML[citation needed], so that the CMS itself does not need to be deployed on every Web server. Other hybrids, such as Blosxom, are capable of operating in either an online or offline mode.
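
A toy Python sketch of the difference between the two basic patterns, using an invented one-line template and article list (real systems are far more elaborate):

    TEMPLATE = "<html><body><h1>{title}</h1><p>{body}</p></body></html>"
    articles = {"about.html": {"title": "About", "body": "Who we are."}}

    # Offline ("baking"): apply the template at publish time and write static files.
    for filename, fields in articles.items():
        with open(filename, "w") as f:
            f.write(TEMPLATE.format(**fields))

    # Online ("frying"): apply the template only when a request for the page arrives.
    def render(filename):
        return TEMPLATE.format(**articles[filename])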

http://en.wikipedia.org/wiki/Web_content_management_system

Capabilities

A WCMS is a software system used to manage and control a large, dynamic collection of Web material (HTML documents and their associated images). A CMS facilitates document control, auditing, editing, and timeline management. A WCMS provides the following key features:

Automated templates
Create standard output templates (usually HTML and XML) that can be automatically applied to new and existing content, allowing the appearance of all of that content to be changed from one central place.
Easily editable content
Once content is separated from the visual presentation of a site, it usually becomes much easier and quicker to edit and manipulate. Most WCMS software includes WYSIWYG editing tools allowing non-technical individuals to create and edit content.
Scalable feature sets
Most WCMS software includes plug-ins or modules that can be easily installed to extend an existing site's functionality.
Web standards upgrades
Active WCMS software usually receives regular updates that include new feature sets and keep the system up to current web standards.
Workflow management
Workflow is the process of creating cycles of sequential and parallel tasks that must be accomplished in the CMS. For example, a content creator can submit a story, but it is not published until the copy editor cleans it up and the editor-in-chief approves it.
Document management
CMS software may provide a means of managing the life cycle of a document from initial creation time, through revisions, publication, archive, and document destruction.
Content virtualization
CMS software may provide a means of allowing each user to work within a virtual copy of the entire Web site, document set, and/or code base. This enables changes to multiple interdependent resources to be viewed and/or executed in-context prior to submission.
http://en.wikipedia.org/wiki/Web_content_management_system

Web content management system

A Web content management system (WCMS or Web CMS) is content management system (CMS) software, usually implemented as a Web application, for creating and managing HTML content. It is used to manage and control a large, dynamic collection of Web material (HTML documents and their associated images). A WCMS facilitates content creation, content control, editing, and many essential Web maintenance functions.

Usually the software provides authoring (and other) tools designed to allow users with little or no knowledge of programming languages or markup languages to create and manage content with relative ease.

Most systems use a database to store content, metadata, and/or artifacts that might be needed by the system. Content is frequently, but not universally, stored as XML, to facilitate reuse and enable flexible presentation options.[1][2]

A presentation layer displays the content to regular Web-site visitors based on a set of templates. The templates are sometimes XSLT files.[3]

Administration is typically done through browser-based interfaces, but some systems require the use of a fat client.

Unlike Web-site builders like Microsoft FrontPage or Adobe Dreamweaver, a WCMS allows non-technical users to make changes to an existing website with little or no training. A WCMS typically requires an experienced coder to set up and add features, but is primarily a Web-site maintenance tool for non-technical administrators.

http://en.wikipedia.org/wiki/Web_content_management_system

Automated different content

In internet marketing and geomarketing, automated different content refers to automating the delivery of different content based on the visitor's geographical location and other personal information.
http://en.wikipedia.org/wiki/Web_content

Different content by choice

A typical example for different content by choice in geo targeting is the FedEx website at FedEx.com where users have the choice to select their country location first and are then presented with different site or article content depending on their selection.
http://en.wikipedia.org/wiki/Web_content

Geo targeting of web content

Geo targeting of web content in internet marketing and geo marketing is the method of determining the geolocation (the physical location) of a website visitor with geolocation software and delivering different content to that visitor based on his or her location, such as country, region/state, city, metro code/zip code, organization, Internet Protocol (IP) address, ISP or other criteria.
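
A minimal sketch of the lookup step in Python, assuming the third-party MaxMind geoip2 library and a downloaded GeoLite2 country database; the file name, IP address, and page names are placeholders:

    import geoip2.database
    import geoip2.errors

    reader = geoip2.database.Reader("GeoLite2-Country.mmdb")
    try:
        # In practice this would be the visiting user's IP address.
        country = reader.country("203.0.113.42").country.iso_code  # e.g. "US", "DE"
    except geoip2.errors.AddressNotFoundError:
        country = None
    reader.close()

    # Serve different content depending on the detected country.
    content_by_country = {"US": "us_homepage.html", "DE": "de_homepage.html"}
    print(content_by_country.get(country, "default_homepage.html"))
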
http://en.wikipedia.org/wiki/Web_content

Saturday, October 25, 2008

Content management

Because websites are often complex, the term "content management" appeared in the late 1990s, identifying a method or in some cases a tool to organize all the diverse elements to be contained on a website.[3] Content management often means that within a business there is a range of people who have distinct roles to do with content management, such as content author, editor, publisher, and administrator. But it also means there may be a content management system through which the different roles are organized to operate the system and organize the information for a website.

Even though a business may organize itself to collect, contain and present that information online, the content needs to be organized in a manner that provides the reader (browser) with an overall "customer experience" that is easy to use, allows the site to be navigated with ease, and lets the website fulfill the role assigned to it by the business: to sell to customers, to market products and services, or to inform customers.
http://en.wikipedia.org/wiki/Web_content

Content is king

A current meme when organizing or building a website is the catchwords "Content is King" (although Andrew Odlyzko in "Content is Not King" argues otherwise). What is meant by the term "content" is written text in plain vanilla HTML or a variant that produces good clean text that can be indexed with ease by a search engine.

This argument is largely valid, not because people will find the text interesting and useful, or a good enough description to buy the product online, but because search engines can index text easily and, if the information is close to what a searcher is seeking, it can be delivered as a result to the seeker of information. Textual information is therefore "king" online because it helps the rather raw search tools operate, not because it is necessarily the most compelling format for people seeking information.
http://en.wikipedia.org/wiki/Web_content

A wider view of web content

While there are many millions of pages that are predominantly composed of HTML, or some variation, in general we view data, applications, e-services, images (graphics), audio and video files, personal Web pages, archived e-mail messages, and many more forms of file and data systems as belonging to websites and web pages.

While there are many hundreds of ways to deliver information on a website, there is a common body of search engine optimization knowledge that should be read as advice on how anything other than text ought to be delivered. Current search engines are text based and remain one of the most common ways that people using a browser locate sites of interest.
http://en.wikipedia.org/wiki/Web_content

HTML web content

Even though we may embed various protocols within web pages, the "web page" composed of "html" (or some variation) content is still the dominant way in which we share content. And while there are many web pages with localized proprietary structure (most usually, business websites), millions of websites abound that are structured according to a common core idea.

Blogs are a type of website that contains mainly web pages authored in html (although the blogger may be totally unaware that the web pages are composed using html due to the blogging tool that may be in use). Millions of people use blogs online; a blog is now the new "Home Page", that is, a place where a persona can reveal personal information, and/or build a concept as to who this persona is. Even though a blog may be written for other purposes, such as promoting a business, the core of a blog is the fact that it is written by a "person" and that person reveals information from her/his perspective.

Search engine sites are composed mainly of html content, but they also have a typically structured approach to revealing information. A Search Engine Results Page (SERP) displays a heading, usually the name of the Search Engine, and then a list of Websites and their addresses. What is listed are the results of a query, which may be defined as keywords. The results page lists webpages that are connected in some way with the keywords used in the query.

Discussion boards are sites composed of "textual" content organized by html or some variation that can be viewed in a web browser. The driving mechanism of a discussion board is the fact that users are registered and, once registered, can write posts. Often a discussion board is made up of posts asking some type of question, to which other users provide answers.

Ecommerce sites are largely composed of textual material and embedded with graphics displaying a picture of the item(s) for sale. However, very few such sites are composed page by page using some variant of HTML. Generally, webpages are composed as they are being served from a database to a customer using a web browser; the user simply sees the mainly textual document arriving as a webpage in the browser. Ecommerce sites are usually organized by software we identify as a "shopping cart".
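As a rough sketch of the idea described above (the product records and page template here are invented placeholders, not any particular shopping-cart product), a page can be assembled from stored records at the moment it is requested:

# Sketch: composing an HTML product page from a "database" record at request time.
# The PRODUCTS dict stands in for a real database table.
PRODUCTS = {
    "sku-100": {"name": "Blue Widget", "price": "9.99", "image": "/img/sku-100.jpg"},
    "sku-200": {"name": "Red Widget", "price": "12.50", "image": "/img/sku-200.jpg"},
}

PAGE_TEMPLATE = """<html><body>
<h1>{name}</h1>
<img src="{image}" alt="{name}">
<p>Price: ${price}</p>
<form action="/cart/add" method="post">
  <input type="hidden" name="sku" value="{sku}">
  <input type="submit" value="Add to cart">
</form>
</body></html>"""

def render_product_page(sku: str) -> str:
    """Build the page served to the browser from the stored record."""
    record = PRODUCTS[sku]
    return PAGE_TEMPLATE.format(sku=sku, **record)

print(render_product_page("sku-100"))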

http://en.wikipedia.org/wiki/Web_content

The page concept

Web content is dominated by the "page" concept. Having its beginnings in an academic setting dominated by type-written pages, the idea of the web was to link directly from one academic paper to another. This was a completely revolutionary idea in the late 1980s and early 1990s, when the best that could be done was to cite a reference in the midst of a type-written paper and name that reference either at the bottom of the page or on the last page of the academic paper.

When it became possible for any person to write and own a Mosaic page, the concept of a "Home Page" blurred the idea of a page.[1] Anyone could own a "Web page" or a "Home Page" which, in many cases, actually contained many physical pages despite being called "a page". People often cited their "Home Page" to provide credentials, links to anything the person supported, or any other individual content the person wanted to publish.

Even though "the web" may be the resource we commonly use to "get to" particular locations online, many different protocols [2] are invoked to access embedded information. When we are given an address, such as http://www.youtube.com, we expect to see a range of web pages, but in each page we have embedded tools to watch "video clips".
http://en.wikipedia.org/wiki/Web_content

Beginnings of web content

While the Internet began with a U.S. Government research project in the late 1950s, the web in its present form did not appear on the Internet until after Tim Berners-Lee and his colleagues at the European laboratory (CERN) proposed the concept of linking documents with hypertext. But it was not until Mosaic, the forerunner of the famous Netscape Navigator, appeared that the Internet became more than a file-serving system.

The use of hypertext, hyperlinks and a page-based model of sharing information, introduced with Mosaic and later Netscape, helped to define web content and the formation of websites. Today we largely categorize a website as being of a particular type according to the content it contains.
http://en.wikipedia.org/wiki/Web_content

Web content

Web content is the textual, visual or aural content that is encountered as part of the user experience on websites. It may include, among other things: text, images, sounds, videos and animations.

In "Information Architecture for the World Wide Web" (second edition, page 219), Lou Rosenfeld and Peter Morville write, "We define content broadly as 'the stuff in your Web site.' This may include documents, data, applications, e-services, images, audio and video files, personal Web pages, archived e-mail messages, and more. And we include future stuff as well as present stuff."
http://en.wikipedia.org/wiki/Web_content

Overselling

Overselling is a term used in the web hosting industry to describe a situation in which a company provides hosting plans that are unsustainable if every one of its customers uses the full extent of the services advertised. The term usually refers to web space and bandwidth transfer allowances.

This practice usually incurs little ill effect, since most customers do not use any significant portion of their allocated share. If a customer has a small, low-traffic site serving static HTML pages, few resources will be used. If a customer wishes to run a high-traffic, professional, or business website, however, an oversold hosting account can be detrimental. In these cases, a shared hosting provider that does not oversell, a virtual private server, or a dedicated server is the preferred option.
http://en.wikipedia.org/wiki/Overselling

Drop registrar

A drop registrar is a domain name registrar who registers expiring Internet domain names immediately after they expire and are deleted by the domain name registry. A drop registrar will typically use automated software to send up to 250 simultaneous domain name registration requests in an attempt to register the domain name first.[1] They usually work for a domain back-order service, and receive a percentage of the final auction price.

Note that a registrar is not considered a drop registrar if it also sells domains retail or through resellers. Individuals also try their hand at acquiring expiring domain names by paying for lists of domains that will expire in the very near future unless the current owner decides to renew the registration. Some sites, such as expiringlookup.com, provide these lists for free; the downside is that everyone else has access to the same list.
http://en.wikipedia.org/wiki/Drop_registrar

Open source templates

The open source design movement has seen a slow but steady rise in the community of open source designers. Some sites offer open source templates in addition to other content.

As of this writing there are over 4000 unique templates available for modification and use by anyone under various open source licenses.
http://en.wikipedia.org/wiki/Web_template

Reusability

Not all potential users of web templates have the willingness and ability to hire developers to design a system for their needs. Additionally, some may wish to use the web but have limited or no technical proficiency. For these reasons, a number of developers and vendors have released web templates specifically for reuse by non-technical people. Although web template reusability is also important for even highly-skilled and technically experienced developers, it is especially critical to those who rely on simplicity and "ready-made" web solutions.

Such "ready-made" web templates are sometimes free, and easily made by an individual domestically. However, specialized web templates are sometimes sold online. Although there are numerous commercial sites that offer web templates for a licensing fee, there are also free and "open-source" sources as well.
http://en.wikipedia.org/wiki/Web_template

Flexible presentation

One major rationale behind "effective separation" is the need for maximum flexibility in the code and resources dedicated to the presentation logic. Client demands, changing customer preferences and desire to present a "fresh face" for pre-existing content often result in the need to dramatically modify the public appearance of web content while disrupting the underlying infrastructure as little as possible.

The distinction between "presentation" (front end) and "business logic" (infrastructure) is usually an important one, because:

the presentation source code language may differ from other code assets;
the production process for the application may require the work to be done at separate times and locations;
different workers have different skill sets, and presentation skills do not always coincide with skills for coding business logic;
code assets are easier to maintain and more readable when disparate components are kept separate and loosely coupled;
http://en.wikipedia.org/wiki/Web_template

Effective separation

A common goal among experienced web developers is to develop and deploy applications that are flexible and easily maintainable. An important consideration in reaching this goal is the separation of business logic from presentation logic. Developers use web template systems (with varying degrees of success) to maintain this separation.

One difficulty in evaluating this separation is the lack of well-defined formalisms to measure when and how well it is actually met. There are, however, fairly standard heuristics that have been borrowed from the domain of software engineering. These include 'inheritance' (based on principles of object-oriented programming) and 'templating and generative programming' (consistent with the principles of MVC separation).[2] The precise difference between the various guidelines is subject to some debate, and some aspects of the different guidelines share a degree of similarity.
http://en.wikipedia.org/wiki/Web_template

Template Uses

Web templates can be used by any individual or organization to set up their website. Once a template is purchased or downloaded, the user will replace all generic information included in the web template with their own personal, organizational or product information. Templates can be used to:

Display personal information or daily activities as in a blog.
Sell products on-line.
Display information about a company or organization.
Display family history.
Display a gallery of photos.
Place music files such as mp3 on line for play through a web browser.
Place videos on-line for public viewing.
Set up a private login area on-line.



http://en.wikipedia.org/wiki/Web_template

Web template

A web template is a tool used to separate content from presentation in web design, and for mass-production of web documents. It is a basic component of a web template system.

Web templates can be used to set up any type of website. In its simplest sense, a web template operates similarly to a form letter for use in setting up a website.
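A minimal sketch of that "form letter" idea, using Python's standard string.Template (the field names below are invented for illustration): the layout stays fixed, and only the blanks change from document to document.

# A web template as a form letter: fixed layout, fill-in-the-blank content.
from string import Template

page = Template("""<html><body>
<h1>$site_name</h1>
<p>Welcome, $visitor_name. Today's featured item: $featured_item.</p>
</body></html>""")

# The same template produces many documents from different data.
print(page.substitute(site_name="Example Store",
                      visitor_name="Alice",
                      featured_item="Blue Widget"))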
http://en.wikipedia.org/wiki/Web_template

Linked data page

A Linked Data page is a Web page that explicitly describes one or more Things (Objects) via Hyperdata Links. Unlike traditional Web page Hypertext links, Hyperdata links expose Properties (Attributes or Relationships) and Property Values associated with the Things in a page. Linked Data pages can be created by hand or generated dynamically (via a Linked Data Server or User Agent).

You only need a single reference (a URI or IRI) to a Thing to experience the power of serendipitous data discovery offered by a Linked Data page.

http://en.wikipedia.org/wiki/Linked_data_page

Guestbook

A guestbook is a logging system that allows visitors of a website to leave a public comment. Traditionally, the term applied to the actual ledgers held, for that same purpose, at weddings, B&Bs and museums.

It is possible in some guestbooks for visitors to express their thoughts about the site or its subject. Generally, they do not require the poster to create a user account, as it is an informal method of dropping off a quick message. The purpose of a website guestbook is to display the kind of visitors the site gets, including the part of the world they reside in, and gain feedback from them. This allows the webmaster to assess and improve their site.

A guestbook is generally a script, which is usually remotely-hosted and written in a language such as Perl, PHP, Python or ASP. Many free guestbook hosts and scripts exist.
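A very small sketch of what such a script does, in Python (the file name and fields are invented for illustration; a real guestbook would also need input validation and spam protection):

# Minimal guestbook sketch: store entries in a JSON file, render them as HTML.
import html
import json
import os

GUESTBOOK_FILE = "guestbook.json"  # hypothetical storage location

def load_entries():
    if os.path.exists(GUESTBOOK_FILE):
        with open(GUESTBOOK_FILE, encoding="utf-8") as f:
            return json.load(f)
    return []

def add_entry(name: str, message: str) -> None:
    entries = load_entries()
    entries.append({"name": name, "message": message})
    with open(GUESTBOOK_FILE, "w", encoding="utf-8") as f:
        json.dump(entries, f)

def render() -> str:
    """Render all entries as an HTML list, escaping visitor-supplied text."""
    items = "".join(
        "<li><b>{}</b>: {}</li>".format(html.escape(e["name"]), html.escape(e["message"]))
        for e in load_entries()
    )
    return "<ul>{}</ul>".format(items)

add_entry("Alice", "Nice site!")
print(render())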

Often, e-mail addresses, the visitor's site's URLs and IP addresses are collected, and sometimes published. A guestbook is not intended to be a place for discussion. Due to this, a guestbook is different from a chat room (which is more or less realtime communication), or an Internet forum (which is intended to be a location for discussions), or a blog (which is intended for regular updates and more involved exchanges).

http://en.wikipedia.org/wiki/Guestbook

Domain name confusion

Intercapping is often used to clarify the meaning of a domain name. However, DNS is case-insensitive, and some names may be misinterpreted when converted to lowercase. For example: Who Represents, a database of artists and agents, chose whorepresents.com; a therapists' network thought therapistfinder.com looked good; other examples operating as of August 2007 include cummingfirst.com, the website of the Cumming First United Church in Cumming, GA, and powergenitalia.com, the website of an Italian power generator company. In such situations, the proper wording can be clarified by the use of hyphens. For instance, Experts Exchange, the programmers' site, for a long time used expertsexchange.com, but ultimately changed the name to experts-exchange.com.

Leo Stoller threatened to sue the owners of StealThisEmail.com on the basis that, when read as stealthisemail.com, it infringed on claimed trademark rights to the word "stealth". There is no word mark for "stealth" in the USPTO trademark database and Leo Stoller's trademarks on this term were canceled.
http://en.wikipedia.org/wiki/Domain_name

Branding with a domain name

Brands are greatly affected by the ability of the company to obtain the matching domain name. If a company builds a brand around a name to which it does not own the domain name, it can end up directing traffic to another domain owner's site. If it is a competitor, this would be a problem.

Today, developing a great brand through advertising is strictly confined by the ability to synchronize the brand with a domain name. Any confusion might result in a competitor gaining valuable internet traffic and potential customers.
http://en.wikipedia.org/wiki/Domain_name

Popular domain prefixes - "E" and "I"

In addition to the value placed on a domain's shortness, ease of spelling, commercial appeal, and organic capacity to generate natural traffic, today's domain names are being valued for their branding potential. The domain name iReport, for example, although not an organic or dictionary term on its own, is preferred as a highly brandable term, in that it has the popular prefix "i", which indicates that the "report" is online.

Prefixes and dashes between words were once considered second-rate, but now, due to brandability, a prefix is often preferred if the term is a commercial one. For example, eLoans markets itself with an "e" to indicate to its potential customers that a loan may be obtained online.

The two primary prefixes are "E", for electronic, and "I", for Internet. Both indicate that the word or phrase is accessible online. Because of that, in terms of branding, an "i" or "e" combined with a commercial term is highly desirable. In domain sales an "e" has typically been preferred, with "i" in slightly less demand. eBrooklyn sold for approximately $2,500, whereas it would once have been available to register at the price of a standard domain registration (which ranges from $8 to $30 US depending on the registrar). The use of prefixes in conjunction with dictionary or commercial terms is increasing rapidly, and for some predominantly internet-based, high-technology, or high-profile companies the prefix is now preferred.

One of the details that makes a domain with a prefix more valuable for a brand is the ability to promote the name without the use of ".com" in the promotion. If a domain owner had report.com, he would be forced to use the .com to indicate it was on the net at that address; however, a domain name with a one-letter prefix does not need to use the ".com".

Someone could promote "iReport" as a brand and, assuming it was a world-class brand, visitors would know they could find it at "iReport.com" without seeing the .com. However, if it were a .net, it would be wise to state iReport.net. This option to simply state the name of the company or entity is particularly valuable in that it is brief and clear in indicating that a report can either be made or found on the "i"nternet.

eLoans similarly does not have to state "eLoans.com". eLoans, in the minds of most is clearly an online entity offering electronic loan applications.

Another domain that avoids the use of ".com" in its promotion is "WebMD", as the word "web" used as a prefix suffices to indicate that the information is online and likely at a .com extension.

http://en.wikipedia.org/wiki/Domain_name

Resale of domain names

Various factors influence the perceived value or market value of a domain name. They include:

The natural or "organic" traffic that can be attributed to web surfers typing a domain name into their web browser rather than searching for the site through a search engine.
Branding opportunity: the ability to have a term recognized and easily recalled as a brand for a company or entity.
Resale value: the ability to spot trends and predict the value of a name based on its length (short is preferred), clarity, and commercial use. The word "loan" is far more valuable than the word "sunshine".

Generic domain names have sprung up in the last decade. Certain domains, especially those related to business, gambling, pornography, and other commercially lucrative fields of digital world trade, have become very much in demand among corporations and entrepreneurs due to their importance in attracting clients.

The most expensive public sale of an Internet domain name to date, according to DNJournal, is porn.com which was sold in 2007 for $9.5 million cash.[citation needed]

There are disputes about the high values claimed for domain names and the actual cash prices of many sales, such as Business.com. Another high-priced domain name, sex.com, was stolen from its rightful owner by means of a forged transfer instruction sent via fax. During the height of the dot-com era, the domain was earning millions of dollars per month in advertising revenue from the large influx of visitors that arrived daily. The sex.com sale may never have been final, as the domain is still with the previous owner; moreover, that sale was not just a domain but an income stream, a web site, a domain name with customers and advertisers, etc. Two long-running U.S. lawsuits resulted, one against the thief and one against the domain registrar VeriSign [1]. In one of the cases, Kremen v. Network Solutions, the court found in favor of the plaintiff, leading to an unprecedented ruling that classified domain names as property, granting them the same legal protections. In 1999, Microsoft traded the name Bob.com with internet entrepreneur Bob Kerstein for the name Windows2000.com, which was the name of their new operating system. [2]

One of the reasons for the value of domain names is that even without advertising or marketing, they attract clients seeking services and products who simply type in the generic name. This is known as Direct Navigation or Type-in Traffic. Furthermore, generic domain names such as movies.com (now owned by Disney) or Books.com (now owned by Barnes & Noble) are extremely easy for potential customers to remember, increasing the probability that they become repeat customers or regular clients. In the case of Movies.com, Disney has built a stand-alone portal featuring branded content. More and more large brands are beginning to employ a more comprehensive domain strategy featuring a portfolio of thousands of domains, rather than just one or two.

Although the current domain market is nowhere near as strong as it was during the dot-com heyday, it remains strong and is currently experiencing solid growth again. [3] Annually, tens of millions of dollars change hands in connection with the resale of domains. Large numbers of registered domain names lapse and are deleted each year; on average, more than 25,000 domain names drop (are deleted) every day.

It is important to remember that a domain (the name or address) must be valued separately from the website (the content and revenue) it is used for. The high prices have usually been paid for the revenue generated by the website at the domain's address (URL). The intrinsic value of a domain is the registration fee, and it is difficult to appraise a current market value for a domain: the fair market value can be anything from nearly nothing to millions of dollars. Factors involved may include previous sales data for similar domains; however, a single-letter difference can completely alter the value. The value of the domain is usually added to the current or expected revenue from the web content (advertising, sales, etc.). The price of a domain (name plus extension) should not be confused with that of a website (content plus revenue).

An estimate by an appraiser is always the sum of what they would like the domain to be worth and the effective, expected or desired revenue from the web content. Some people put value on the length of the SLD (the name) and others prefer descriptive capability, but the shorter an SLD is, the less descriptive it can be. Also, if shortness is crucial, then the TLD (the extension) should be short too; it is less realistic to get a domain like LL.travel or LL.mobi than one like travel.LL or mobi.LL. This illustrates the relativity of domain value estimation. It can safely be said that the revenue of the web content can be easily stated, but that the value of a domain (SLD.TLD, i.e., name.ext) is a matter of opinion and preference. In the end, however, any sale depends on the expectations of the domain seller and the domain buyer.

A webmaster creating a new web site either buys the domain name directly from a domain name registrar, or indirectly from a domain name registrar through a domainer. People who buy and sell domain names are known as domainers. People who sell value estimation services are known as appraisers.

http://en.wikipedia.org/wiki/Domain_name

Premium domain names

In the business of marketing domain names, "premium" domain names are often valuable, and have particular characteristics. For example, the names are short and memorable, or may contain words that are regularly searched on search engines, or keywords that help the name gain a higher ranking on search engines. They may contain generic words, so the word has more than one meaning, and they may contain common typos.
http://en.wikipedia.org/wiki/Domain_name

Unconventional domain names

Due to the rarity of one-word dot-com domain names, many unconventional domain names, domain hacks, have been gaining popularity. They make use of the top-level domain as an integral part of the Web site's title. Two popular domain hack Web sites are del.icio.us and blo.gs, which spell out "delicious" and "blogs", respectively.

Unconventional domain names are also used to create unconventional email addresses. Non-working examples that spell 'James' are j@m.es and j@mes.com, which use the domain names m.es (of Spain's .es) and mes.com, respectively.
http://en.wikipedia.org/wiki/Domain_name

Generic domain names

Within a particular TLD, parties are generally free to register an undelegated domain name on a first come, first served basis, resulting in Harris's lament that "all the good ones are taken". For generic or commonly used names, this may sometimes lead to the use of a domain name which is inaccurate or misleading. This problem can be seen with regard to the ownership or control of domain names for a generic product or service. By way of illustration, there has been tremendous growth in the number and size of literary festivals around the world in recent years. In the current context, a generic domain name such as literary.org is available to the first literary festival organization that is able to obtain the registration, even if the festival in question is very young or obscure. Some critics argue that there is greater amenity in reserving such domain names for the use of, for example, a regional or umbrella grouping of festivals. Related issues may also arise in relation to noncommercial domain names.
http://en.wikipedia.org/wiki/Domain_name

Abuses

As domain names became interesting to marketers because of their advertising and marketing potential, rather than just being used to label Internet resources in a technical fashion, they began to be used in manners that in many cases did not reflect the intended purpose of the label of their top-level domain. As originally planned, the structure of domain names followed a hierarchy in which the TLD indicated the type of organization (commercial, governmental, etc.), and addresses would be nested down to third, fourth, or further levels to express complex structures, where, for instance, branches, departments and subsidiaries of a parent organization would have addresses in subdomains of the parent domain. Also, hostnames were originally intended to correspond to actual physical machines on the network, generally with only one name per machine.

As the World Wide Web became popular, site operators frequently wished to have memorable addresses, regardless of whether they fit properly into the structure; thus, because the .com domain was the most popular and therefore most prestigious, even noncommercial sites began to obtain domains directly within that gTLD, and many sites desired second-level domain names in .com, even if they were already part of a larger entity where a subdomain would have been logical (e.g., abcnews.com instead of news.abc.com).

Shorter, and therefore more memorable, domain names are thought to have more appeal. As a convenience, methods were implemented to reduce the amount of typing required when entering a web site address into the location field of a web browser. A website found at ''http://www.example.org'' will often be advertised without the http://, since the HTTP protocol is implicitly assumed when referring to web sites. In many cases, web sites can also be reached by omitting the www prefix, as in this example; this feature is usually implemented in DNS by the website administrator. In the case of a .com, the website can sometimes be reached by just entering example (depending on browser versions and configuration settings, which vary in how they interpret incomplete addresses).

The popularity of domain names also led to uses which were regarded as abusive by established companies with trademark rights; this has become known as cybersquatting, in which a person registers a domain name that resembles a trademark in order to profit from visitors looking for that address. To combat this, various laws and policies were enacted to allow abusive registrations to be forcibly transferred, but these were sometimes themselves abused by overzealous companies committing reverse domain hijacking against domain users who had legitimate grounds to hold their names. Such legitimate uses could include the use of generic words that are contained within a trademark, but used in a particular context within the trademark, or their use in the context of fan or protest sites with free speech rights of their own.

As of 2008, the four major registrars have all sub-contracted their expiring domain lists to certain reseller and auctioneer partnerships, for the purpose of keeping the domain name at the original registrar and continuing to extract revenue from the renewal of premium registered names. Since this policy is not explicitly banned by ICANN, the practice has become more commonplace, and as a result complaints from individual registrants about losing their domains have tracked higher over the past two years [1].

Laws that specifically address domain name conflicts include the Anticybersquatting Consumer Protection Act in the United States and the Trademarks Act of 1999 in India. Alternatively, domain registrants are bound by contract under the UDRP to comply with mandatory arbitration proceedings should someone challenge their ownership of a domain name.

http://en.wikipedia.org/wiki/Domain_name

Official assignment

The Internet Corporation for Assigned Names and Numbers (ICANN) has overall responsibility for managing the DNS. It administers the root domain, delegating control over each TLD to a domain name registry. For ccTLDs, the domain registry is typically installed by the government of that country. ICANN has a consultation role in these domain registries but cannot regulate the terms and conditions of how domain names are delegated in each of the country-level domain registries. On the other hand, the generic top-level domains (gTLDs) are governed directly under ICANN, which means all terms and conditions are defined by ICANN with the cooperation of each gTLD registry.

Domain names are often seen in analogy to real estate in that (1) domain names are foundations on which a website (like a house or commercial building) can be built and (2) the highest "quality" domain names, like sought-after real estate, tend to carry significant value, usually due to their online brand-building potential, use in advertising, search engine optimization, and many other criteria.

A few companies have offered low-cost, below-cost or even cost-free domain registrations, with a variety of models adopted to recoup the costs to the provider. These usually require that domains be hosted on their website within a framework or portal that includes advertising wrapped around the domain holder's content, revenue from which allows the provider to recoup the costs. Domain registrations were free of charge when the DNS was new. A domain holder (often referred to as a domain owner) can give away or sell an unlimited number of subdomains under their domain name. For example, the owner of example.edu could provide subdomains such as foo.example.edu and foo.bar.example.edu to interested parties.

http://en.wikipedia.org/wiki/Domain_name

Second-level and lower level domains

Below the top-level domains in the domain name hierarchy are the second-level domain (SLD) names. These are the names directly to the left of .com, .net, and the other top-level domains. As an example, in the domain en.wikipedia.org, wikipedia is the second-level domain.

Next are third-level domains, which are written immediately to the left of a second-level domain. There can be fourth- and fifth-level domains, and so on, with virtually no limitation. An example of a working domain with four domain levels is www.sos.state.oh.us; the www preceding the domain is the host name of the World Wide Web server. Each level is separated by a dot, or period symbol: 'sos' is said to be a sub-domain of 'state.oh.us', 'state' a sub-domain of 'oh.us', and so on. In general, sub-domains are domains subordinate to their parent domain. An example of very deep levels of subdomain ordering is the IPv6 reverse-resolution DNS zones, e.g., 1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.ip6.arpa, which is the reverse DNS resolution domain for the IP address of a loopback interface, or the localhost name.
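Two small illustrations of the hierarchy described above, using only the Python standard library: splitting the example name into its levels, and generating the ip6.arpa reverse-resolution name for the loopback address.

import ipaddress

# Walk up the hierarchy of a multi-level domain name.
name = "www.sos.state.oh.us"
labels = name.split(".")
for i in range(len(labels)):
    print(".".join(labels[i:]))   # www.sos.state.oh.us, sos.state.oh.us, ..., us

# The very deep subdomain used for IPv6 reverse resolution of the loopback address.
print(ipaddress.ip_address("::1").reverse_pointer)   # ends in .ip6.arpa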

Second-level (or lower-level, depending on the established parent hierarchy) domain names are often created based on the name of a company (e.g., microsoft.com), product or service (e.g., gmail.com). Below these levels, the next domain name component has been used to designate a particular host server. Therefore, ftp.wikipedia.org might be an FTP server, www.wikipedia.org would be a World Wide Web server, and mail.wikipedia.org could be an email server, each intended to perform only the implied function. Modern technology allows multiple physical servers with either different (cf. load balancing) or even identical addresses (cf. anycast) to serve a single hostname or domain name, or multiple domain names to be served by a single computer. The latter is very popular in Web hosting service centers, where service providers host the websites of many organizations on just a few servers.
http://en.wikipedia.org/wiki/Domain_name

Examples

The following example illustrates the difference between a URL (Uniform Resource Locator) and a domain name:

URL: http://www.example.net/index.html
Domain name: www.example.net
Registered domain name: example.net
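The distinction can also be seen programmatically. The sketch below uses Python's standard urllib.parse; note that the "registered domain" extraction is a naive last-two-labels heuristic that does not handle multi-part suffixes such as .co.uk.

from urllib.parse import urlparse

url = "http://www.example.net/index.html"
parsed = urlparse(url)

print(parsed.geturl())    # the full URL
print(parsed.hostname)    # the domain name: www.example.net

# Naive "registered domain" guess: the last two labels.
# (Real code should consult the Public Suffix List instead.)
print(".".join(parsed.hostname.split(".")[-2:]))  # example.net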
As a general rule, the IP address and the server name are interchangeable. For most Internet services, the server will not have any way to know which was used. However, the explosion of interest in the Web means that there are far more Web sites than servers. To accommodate this, the hypertext transfer protocol (HTTP) specifies that the client tells the server which name is being used. This way, one server with one IP address can provide different sites for different domain names. This feature goes under the name virtual hosting and is commonly used by Web hosts.

For example, as referenced in RFC 2606 (Reserved Top Level DNS Names), the server at IP address 208.77.188.166 handles all of the following sites:

example.com
www.example.com
example.net
www.example.net
example.org
www.example.org
When a request is made, the data corresponding to the hostname requested is provided to the user.
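A rough sketch of the mechanism: the client names the site it wants in the HTTP Host header, so one IP address can serve several sites. The IP address below is simply the one cited in the paragraph above and may no longer be assigned to these sites.

# Sketch: two requests to the same IP address, distinguished only by the Host header.
import http.client

SERVER_IP = "208.77.188.166"  # the address cited above; may no longer be current

for site in ("www.example.com", "www.example.org"):
    conn = http.client.HTTPConnection(SERVER_IP, 80, timeout=10)
    conn.request("GET", "/", headers={"Host": site})
    response = conn.getresponse()
    print(site, response.status)  # the server picks the site from the Host header
    conn.close()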
http://en.wikipedia.org/wiki/Domain_name

Defined

By definition (RFC 1034), domain names are restricted to the ASCII letters a through z (case-insensitive), the digits 0 through 9, and the hyphen, with some other restrictions in terms of name length and position of hyphens. Since this does not allow the use of many characters commonly found in non-English languages, nor the multi-byte characters necessary for most Asian languages, the Internationalized Domain Name (IDN) system has been developed and is now in the testing stage, with a set of top-level domains established for this purpose.
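For example, Python's built-in "idna" codec shows how an internationalized name is carried over the ASCII-only DNS (the name below is just an illustration):

# An internationalized domain name is converted to an ASCII-compatible
# ("punycode") form before it is looked up in the DNS.
name = "bücher.example"
ascii_form = name.encode("idna")
print(ascii_form)                 # b'xn--bcher-kva.example'
print(ascii_form.decode("idna"))  # back to 'bücher.example'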

The underscore character is frequently used to ensure that a domain name is not recognized as a hostname, as with the use of SRV records, for example, although some older systems such as NetBIOS did allow it. To avoid confusion and for other reasons, domain names with underscores in them are sometimes used where hostnames are required.

Domain names are often referred to simply as domains and domain name registrants are frequently referred to as domain owners, although domain name registration with a registrar does not confer any legal ownership of the name, only an exclusive right of use.
http://en.wikipedia.org/wiki/Domain_name

Domain name

The most basic functionality of a domain name is to provide symbolic representations, i.e., recognizable names, to mostly numerically addressed Internet resources. This abstraction allows any resource (e.g., website) to be moved to a different physical location in the address topology of the network, globally or locally in an intranet, in effect changing the IP address. This translation from domain names to IP addresses (and vice versa) is accomplished with the global facilities of Domain Name System (DNS).
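That translation can be observed with the standard library's resolver calls; the results depend on the network and on which addresses the site's operator currently publishes.

import socket

# Forward lookup: domain name -> IP address.
print(socket.gethostbyname("example.com"))

# A name may map to several addresses (and one address may serve many names).
print(socket.getaddrinfo("example.com", 80, proto=socket.IPPROTO_TCP))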

By allowing the use of unique alphabetical addresses instead of numeric ones, domain names allow Internet users to more easily find and communicate with web sites and other IP-based communications services. The flexibility of the domain name system allows multiple IP addresses to be assigned to a single domain name, or multiple domain names to be served from a single IP address. This means that one server may have multiple roles (such as hosting multiple independent websites), or that one role can be spread among many servers. One IP address can also be assigned to several servers, as used in anycast networking.
http://en.wikipedia.org/wiki/Domain_name

Dead link

A dead link (also called a broken link or dangling link) is a link on the World Wide Web that points to a web page or server that is permanently unavailable. The most common result of a dead link is a 404 error, which indicates that the web server responded, but the specific page could not be found. The browser may also return a DNS error indicating that a web server could not be found at that domain name. A link might also be dead because of some form of blocking such as content filters or firewalls.
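A simple way to detect such links is to request each URL and examine the response. The sketch below uses only the standard library and treats any HTTP error or unreachable host as a dead link; the URL list is purely illustrative, and some servers may not honor HEAD requests.

# Minimal dead-link checker using the standard library.
import urllib.request
import urllib.error

def is_dead(url: str) -> bool:
    """Return True if the URL cannot be fetched (HTTP error, DNS failure, ...)."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=10):
            return False
    except (urllib.error.HTTPError, urllib.error.URLError, OSError):
        return True

for link in ("http://www.example.com/", "http://www.example.com/no-such-page"):
    print(link, "dead" if is_dead(link) else "alive")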

Another type of dead link is a URL that points to a site unrelated to the content sought. This can sometimes occur when a domain name is allowed to lapse, and is subsequently reregistered by another party. Domain names acquired in this manner are attractive to those who wish to take advantage of the stream of unsuspecting surfers that will inflate hit counters and PageRanking.

Link rot is the process by which links on a website gradually become irrelevant or broken over time as sites they link to disappear, change content, or redirect to new locations.

Links specially crafted to not resolve, as a type of meme, are known as Zangelding, which roughly translated from Dutch means tangle thing. A zangelding is basically a list of self referencing broken links.

Dead links, commonplace on the Internet, can also occur on the authoring side, when website content is assembled, copied, or deployed without properly verifying the targets, or is simply not kept up to date. Because broken links are very annoying to some users, generally disruptive to the user experience, and can live on for many years, sites containing them are regarded as unprofessional.
http://en.wikipedia.org/wiki/Dead_link

Saving a web page

While one is viewing a web page, a copy of it is saved locally; this is what is being viewed. Depending on the browser settings, this copy may be deleted at any time, or stored indefinitely, sometimes without the user realizing it. Most GUI browsers will contain all the options for saving a web page more permanently. These include, but are not limited to:

Saving the rendered text without formatting or images - Hyperlinks are not identified, but displayed as plain text
Saving the HTML file as it was served - Overall structure will be preserved, although some links may be broken
Saving the HTML file and changing relative links to absolute ones - Hyperlinks will be preserved
Saving the entire web page - All images will be saved, as well as links being changed to absolute
Saving the HTML file including all images, stylesheets and scripts into a single MHTML file. This is supported by Internet Explorer, Mozilla, Mozilla Firefox and Opera. Mozilla and Mozilla Firefox only support this if the MAF plugin has been installed. An MHTML file is based upon the MHTML standard.

Common web browsers, like Mozilla Firefox, Internet Explorer and Opera, give the option not only to print the currently viewed web page to a printer, but also to "print" it to a file which can be viewed or printed later. Some web pages are designed, for example by use of CSS, with printing in mind: hyperlinks, menus and other navigation items, which would be useless on paper, are handled accordingly. Space-wasting menus and navigational blocks may be absent from the printed version; other hyperlinks may be shown with their link destinations made explicit, either within the body of the page or listed at the end.
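Saving the served HTML can also be done outside the browser. A minimal sketch with the standard library follows; it captures only the HTML as the server delivered it, not images or stylesheets.

# Save a page's HTML exactly as the server delivered it.
import urllib.request

url = "http://www.example.com/"
with urllib.request.urlopen(url, timeout=10) as response:
    html_bytes = response.read()

with open("saved_page.html", "wb") as f:
    f.write(html_bytes)

print("saved", len(html_bytes), "bytes from", url)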
http://en.wikipedia.org/wiki/Web_page