Pages

Thursday, February 25, 2010

Website

A website is a collection of related web pages, images, videos, or other digital assets that are addressed relative to a common Uniform Resource Locator (URL), often consisting of only the domain name (or IP address) and the root path ('/'), in an Internet Protocol-based network. A website is hosted on at least one web server and is accessible via a network such as the Internet or a private local area network.

A web page is a document, typically written in plain text interspersed with formatting instructions of Hypertext Markup Language (HTML, XHTML). A web page may incorporate elements from other websites with suitable markup anchors.

Web pages are accessed and transported with the Hypertext Transfer Protocol (HTTP), which may optionally employ encryption (HTTP Secure, HTTPS) to provide security and privacy for the user of the web page content. The user's application, often a web browser, renders the page content according to its HTML markup instructions onto a display terminal.

All publicly accessible websites collectively constitute the World Wide Web.

The pages of a website can usually be accessed from a single Uniform Resource Locator called the homepage. The URLs of the pages organize them into a hierarchy, although the hyperlinks between them shape the reader's perceived site structure and guide the reader's navigation of the site.

Some websites require a subscription to access some or all of their content. Examples of subscription sites include many business sites, parts of many news sites, academic journal sites, gaming sites, message boards, web-based e-mail services, social networking websites, and sites providing real-time stock market data.

Overview

Organized by function, a website may be

- a personal website
- a commercial website
- a government website
- a non-profit organization website

It could be the work of an individual, a business or other organization, and is typically dedicated to some particular topic or purpose. Any website can contain a hyperlink to any other website, so the distinction between individual sites, as perceived by the user, may sometimes be blurred.

Websites are written in, or dynamically converted to, HTML (HyperText Markup Language) and are accessed using a software interface classified as a user agent. Web pages can be viewed or otherwise accessed from a range of computer-based and Internet-enabled devices of various sizes, including desktop computers, laptops, PDAs, and cell phones.

A website is hosted on a computer system known as a web server, also called an HTTP server. These terms can also refer to the software that runs on such systems and that retrieves and delivers the web pages in response to requests from the website's users. Apache is the most commonly used web server software (according to Netcraft statistics), and Microsoft's Internet Information Services (IIS) is also widely used.


Static website

A static website is one that has web pages stored on the server in the format that is sent to a client web browser. It is primarily coded in Hypertext Markup Language (HTML).

Simple forms of websites, such as a classic website, a five-page website, or a brochure website, are often static, because they present pre-defined, unchanging information to the user. This may include information about a company and its products and services via text, photos, animations, audio/video, and interactive menus and navigation.

This type of website usually displays the same information to all visitors. Similar to handing out a printed brochure to customers or clients, a static website will generally provide consistent, standard information for an extended period of time. Although the website owner may make updates periodically, it is a manual process to edit the text, photos and other content and may require basic website design skills and software.

In summary, visitors are not able to control what information they receive via a static website, and must instead settle for whatever content the website owner has decided to offer at that time.
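As an illustration, a static page is a single HTML file served to every visitor exactly as it is stored on the server (the file name and wording below are hypothetical):

```html
<!-- about.html: delivered byte-for-byte to every visitor -->
<html>
  <head>
    <title>Acme Widgets - About Us</title>
  </head>
  <body>
    <h1>About Acme Widgets</h1>
    <p>We have been making widgets since 1995.</p>
    <p>To change this text, the site owner must edit this file by hand.</p>
  </body>
</html>
```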

Static websites are edited using four broad categories of software:

- Text editors, such as Notepad or TextEdit, where content and HTML markup are manipulated directly within the editor program
- WYSIWYG offline editors, such as Microsoft FrontPage and Adobe Dreamweaver (previously Macromedia Dreamweaver), with which the site is edited using a graphical interface and the final HTML markup is generated automatically by the editor software
- WYSIWYG online editors, which create media-rich online presentations such as web pages, widgets, intros, and blogs directly in the browser
- Template-based editors, such as RapidWeaver and iWeb, which allow users to quickly create and upload web pages to a web server without detailed HTML knowledge: the user picks a suitable template from a palette and adds pictures and text to it in a desktop-publishing fashion, without directly manipulating HTML code

Dynamic website

A dynamic website is one that changes or customizes itself frequently and automatically, based on certain criteria.

Dynamic websites can have two types of dynamic activity: code and content. Dynamic code is invisible, or behind the scenes, while dynamic content is visible, or fully displayed.

Dynamic code

The first type is a web page with dynamic code. The code is constructed on the fly using a server-side programming language instead of plain, static HTML.

A website with dynamic code refers to its construction, or how it is built, and more specifically to the code used to create a single web page. A dynamic web page is generated on the fly by piecing together blocks of code, procedures, or routines. A dynamically generated web page might call various pieces of information from a database and put them together in a pre-defined format to present the reader with a coherent page. It interacts with users in a variety of ways, including by reading cookies to recognize users' previous history, by session variables and server-side variables, or by direct interaction (form elements, mouseovers, etc.). A site can display the current state of a dialogue between users, monitor a changing situation, or provide information personalized in some way to the requirements of the individual user.
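As a minimal sketch, assuming a hypothetical site with a personalized greeting (none of these function names come from a real framework), dynamic code pieces a page together from reusable blocks plus per-request data such as a cookie:

```python
# Sketch of dynamic code: the page is assembled on the fly from
# reusable blocks plus per-request data (here, a cookie value).
# All names are illustrative, not part of any real framework.

def header(title):
    return "<html><head><title>%s</title></head><body>" % title

def footer():
    return "</body></html>"

def build_page(cookies):
    # Personalize the greeting from the visitor's cookie, if present.
    user = cookies.get("username", "guest")
    body = "<h1>Welcome back, %s!</h1>" % user
    return header("My Dynamic Site") + body + footer()

print(build_page({"username": "alice"}))
```

The same template produces a different page for each visitor: a request without the cookie falls back to the "guest" greeting.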

Dynamic content

The second type is a website with dynamic content displayed in plain view. Variable content is displayed dynamically on the fly based on certain criteria, usually by retrieving content stored in a database.

A website with dynamic content refers to how its messages, text, images, and other information are displayed on the web page, and more specifically to how its content changes at any given moment. The web page content varies based on certain criteria, either pre-defined rules or variable user input. For example, a website with a database of news articles can use a pre-defined rule which tells it to display all news articles for today's date. This type of dynamic website will automatically show the most current news articles on any given date. Another example of dynamic content is when a retail website with a database of media products allows a user to enter a search request for the keyword Beatles. In response, the content of the web page will change from how it looked before and display a list of Beatles products such as CDs, DVDs, and books.
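The news-by-date example can be sketched with a small database query; the table and column names below are hypothetical:

```python
import sqlite3
from datetime import date

# Hypothetical schema: a table of news articles with a publication date.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (title TEXT, pub_date TEXT)")
conn.executemany("INSERT INTO articles VALUES (?, ?)", [
    ("Old story", "2009-01-01"),
    ("Fresh story", date.today().isoformat()),
])

# The pre-defined rule: show all articles dated today.
rows = conn.execute(
    "SELECT title FROM articles WHERE pub_date = ?",
    (date.today().isoformat(),),
).fetchall()

for (title,) in rows:
    print("<li>%s</li>" % title)  # rendered into the page on the fly
```

Because the rule keys on the current date, the same code automatically shows different articles tomorrow without anyone editing a page.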

Purpose of dynamic websites

The main purpose of a dynamic website is automation. A dynamic website can operate more effectively, be built more efficiently and is easier to maintain, update and expand. It is much simpler to build a template and a database than to build hundreds or thousands of individual, static HTML web pages.

More Info : http://en.wikipedia.org/wiki/Web_site

Thursday, February 18, 2010

Top SEO Mistakes

Targeting the wrong keywords

This is a mistake many people make and, what is worse, even experienced SEO experts make it. People choose keywords that, in their minds, are descriptive of their website, but average users may not search for them. For instance, if you have a relationship site, you might discover that "relationship guide" does not work for you, even though it has the "relationship" keyword, while "dating advice" works like a charm. Choosing the right keywords can make or break your SEO campaign. Even if you are very resourceful, you can't think of all the good keywords on your own, but a good keyword suggestion tool, for instance the Website Keyword Suggestion tool, will help you find keywords that are good for your site.

Ignoring the Title tag

Leaving the title tag empty is also very common. This is one of the most important places to have a keyword, because not only does it help you in optimization but the text in your title tag shows in the search results as your page title.
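For example, a keyword-bearing title takes a single line in the page's head (the site name and wording here are hypothetical):

```html
<head>
  <!-- The title text appears as the clickable headline in search results -->
  <title>Dating Advice for Busy Professionals | Example.com</title>
</head>
```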


A Flash website without an HTML alternative

Flash might be attractive, but not to search engines and users. If you really insist that your site be Flash-based and you want search engines to love it, provide an HTML version. Search engines don't like Flash sites for a reason: a spider can't read Flash content and therefore can't index it.

JavaScript Menus

Using JavaScript for navigation is not bad, as long as you understand that search engines do not read JavaScript and you build your web pages accordingly. So if you have JavaScript menus you can't do without, you should consider building a sitemap (or putting the links in a noscript tag) so that all your links will be crawlable.
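As a sketch (file names and URLs are hypothetical), the noscript fallback mentioned above duplicates the menu links in plain HTML so that a spider which ignores JavaScript can still follow them:

```html
<script type="text/javascript" src="fancy-menu.js"></script>
<noscript>
  <!-- Plain links for crawlers and users without JavaScript -->
  <a href="/products.html">Products</a>
  <a href="/about.html">About</a>
  <a href="/contact.html">Contact</a>
</noscript>
```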

Concentrating too much on meta tags

A lot of people seem to think SEO is about getting your meta keywords and description correct! In fact, meta tags are becoming (if not already) a thing of the past. You can create your meta keywords and descriptions, but don't expect to rank well because of this alone.

Using only Images for Headings

Many people think that an image looks better than text for headings and menus. Yes, an image can make your site look more distinctive, but in terms of SEO, images for headings and menus are a big mistake, because h1, h2, etc. tags and menu links are important SEO items.
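As a sketch (the heading text and file names are hypothetical), the fix is to use a real heading tag and, where an image is unavoidable, an alt attribute:

```html
<!-- Bad: the heading text is locked inside an image -->
<img src="heading.png">

<!-- Better: a real heading tag that spiders can read -->
<h1>Dating Advice</h1>

<!-- If an image must be used, at least supply alt text -->
<img src="heading.png" alt="Dating Advice">
```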

Ignoring URLs

Many people underestimate how important a good URL is. Dynamic page names are still very frequent, and having no keywords in the URL is more the rule than the exception. Yes, it is possible to rank high even without keywords in the URL, but all else being equal, if you have keywords in the URL (in the domain itself, or in file names that are part of the URL), this gives you an additional advantage over your competitors. Keywords in URLs are more important for MSN and Yahoo!, but even with Google their relative weight is high, so there is no excuse for having keywordless URLs.

Backlink spamming

It is a common delusion that more backlinks are ALWAYS better, and because of this webmasters resort to link farms, forum/newsgroup spam, etc., which ultimately can lead to getting their site banned. In fact, what you need are quality backlinks.

Lack of keywords in the content

Once you have chosen your keywords, modify your content and put the keywords wherever it makes sense. It is even better to make them bold or to highlight them.

How to Build Backlinks

It is beyond question that quality backlinks are crucial to SEO success. Rather, the question is how to get them. While with on-page content optimization it seems easier, because everything is up to you to do and decide, with backlinks it looks like you have to rely on others to work for your success. This is only partially true: while backlinks are links that start on another site and point to yours, you can discuss details such as the anchor text with the webmaster of the other site. Yes, it is not the same as administering your own site (i.e., you do not have total control over backlinks), but there are still many aspects that can be negotiated.

Getting Backlinks the Natural Way

The idea behind including backlinks as part of the PageRank algorithm is that if a page is good, people will start linking to it, and the more backlinks a page has, the better. In practice, however, it is not exactly like this, or at least you cannot always rely on the fact that your content is good and people will link to you. Yes, if your content is good and relevant, you can get a lot of quality backlinks, including from sites with a similar topic to yours (and these are the most valuable kind of backlinks, especially if the anchor text contains your keywords), but what you get without effort can be less than what you need to successfully promote your site. So you will have to resort to other ways of acquiring quality backlinks, as described next.

Ways to Build Backlinks

Even if plenty of backlinks come to your site the natural way, additional quality backlinks are always welcome, and the time you spend building them is not wasted. Among the acceptable ways of building quality backlinks are getting listed in directories and posting in forums, blogs, and article directories. The unacceptable ways include interlinking (linking from one site to another site owned by the same owner, or to a site that exists mainly to serve as a link farm), linking to spam sites or sites that host any kind of illegal content, purchasing links in bulk, linking to link farms, etc.

The first step in building backlinks is to find the places from which you can get quality backlinks. A valuable assistant in this process is the Backlink Builder tool. When you enter the keywords of your choice, the Backlink Builder tool gives you a list of sites where you can post an article, message, posting, or simply a backlink to your site. After you have the list of potential backlink partners, it is up to you to visit each of the sites and post your content with the backlink to your site in it.

You might wonder why sites such as those listed by the Backlink Builder tool provide such a precious asset as backlinks for free. The answer is simple: they need content for their site. When you post an article or submit a link to your site, you do not get paid. You provide them for free with something they need (content), and in return they provide you for free with something you need (quality backlinks). It is a fair trade, as long as the sites where you post your content or links are respected and you don't post fake links or content.

Getting Listed in Directories

If you are serious about your Web presence, getting listed in directories like DMOZ and Yahoo is a must – not only because this is a way to get some quality backlinks for free, but also because this way you are easily noticed by both search engines and potential visitors. Generally inclusion in search directories is free but the drawback is that sometimes you have to wait a couple of months before you get listed in the categories of your choice.

Forums and Article Directories

Generally, search engines index forums, so posting in forums and blogs is also a way to get quality backlinks with the anchor text you want. If the forum or blog is a respected one, a backlink from it is valuable. However, in some cases the forum or blog administrator can edit your post, or even delete it if it does not fit the forum or blog policy. Also, sometimes administrators do not allow links in posts unless they are relevant ones. In some rare cases (more an exception than a rule) the owner of a forum or blog may have banned search engines from indexing it, and in that case posting backlinks there is pointless.

While forum postings can be short and do not require much effort, submitting articles to directories can be more time-consuming, because articles are generally longer than posts and need careful thought while writing them. But it is also worthwhile, and it is not so difficult to do.

Content Exchange and Affiliate Programs

Content exchange and affiliate programs are similar to the previous methods of getting quality backlinks. For instance, you can offer interested sites your RSS feed for free. When the other site publishes your feed, you will get a backlink to your site and potentially a lot of visitors, who will come to your site for more details about the headlines and abstracts they read on the other site.

Affiliate programs are also good for getting more visitors (and buyers) and for building quality backlinks, but they tend to be an expensive way, because the affiliate commission is generally in the range of 10 to 30%. But if you have an affiliate program anyway, why not use it to get some more quality backlinks?

News Announcements and Press Releases

Although this is hardly an everyday way to build backlinks, it is an approach that gives good results if handled properly. There are many sites (for instance, here is a list of some of them) that publish news announcements and press releases for free or for a fee. A professionally written press release about an important event can bring you many visitors, and the backlink from a respected site to yours is a good boost to your SEO efforts. The tricky part is that you cannot issue a press release if there is nothing newsworthy. That is why news announcements and press releases are not a routine way to build backlinks.

Backlink Building Practices to Avoid

One practice to avoid is link exchange. There are many programs that offer to barter links. The principle is simple: you put a link to a site, and they put a backlink to your site. There are a couple of important things to consider with link exchange programs. First, watch the ratio between outbound and inbound links: if your outbound links far outnumber your inbound links, this is bad. Second (and more important) is the risk that your link exchange partners are link farms. If this is the case, you could even be banned from search engines, so it is too risky to indulge in link exchange programs.

Linking to suspicious places is something else you must avoid. It is true that search engines do not punish you for backlinks from such places, because you are presumed to have no control over what bad guys link to; but if you enter a link exchange program with so-called bad neighbors and you link to them, this can be disastrous to your SEO efforts. For more details about bad neighbors, check the Bad Neighborhood article. Also, beware of getting tons of links in a short period of time, because this looks artificial and suspicious.

Wednesday, February 17, 2010

Google search

Google Search is a web search engine owned by Google Inc. and is the most-used search engine on the Web. Google receives several hundred million queries each day through its various services. Google Search was originally developed by Larry Page and Sergey Brin in 1997.

Google Search provides more than 22 special features beyond the original word-search capability. These include synonyms, weather forecasts, time zones, stock quotes, maps, earthquake data, movie showtimes, airports, home listings, and sports scores (see below: Special features). There are special features for numbers, including prices, temperatures, money/unit conversions ("10.5 cm in inches"), calculations ("3*4+sqrt(6)-pi/2"), package tracking, patents, area codes, and rudimentary language translation of displayed pages.

A Google search-results page is ordered by a priority rank called a "PageRank" which is kept secret to prevent spammers from forcing their pages to the top. Google Search provides many options for customized search (see below: Search options), such as: exclusion ("-xx"), inclusion ("+xx"), alternatives ("xx OR yy"), and wildcard ("x * x").

The search engine

Google's algorithm uses a patented system called PageRank to help rank web pages that match a given search string. The PageRank algorithm computes a recursive score for web pages, based on the weighted sum of the PageRanks of the pages linking to them. PageRank derives from human-generated links and is thought to correlate well with human concepts of importance. The exact percentage of all web pages that Google indexes is not known, as it is very hard to calculate. Earlier keyword-based methods of ranking search results, used by many search engines that were once more popular than Google, would rank pages by how often the search terms occurred in the page, or by how strongly associated the search terms were within each resulting page. In addition to PageRank, Google also uses other, secret criteria for determining the ranking of pages in result lists, reported to number over 200 different indicators.
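The recursive score described above can be illustrated with a power iteration over a tiny link graph. This sketch follows the simplified formula from the original PageRank paper (damping factor d = 0.85), not Google's production algorithm:

```python
# Simplified PageRank by power iteration, following the published formula
#   PR(p) = (1 - d)/N + d * sum(PR(q) / outdegree(q) for q linking to p)
# This is an illustration only, not Google's production algorithm.

def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}  # start with a uniform score
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each page q linking to p passes on a share of its own score.
            incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * incoming
        pr = new
    return pr

# A tiny link graph: A links to B and C, B links to C, C links to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print(ranks)
```

On this three-page graph the scores converge so that C, which receives links from both A and B, outranks the other pages, illustrating how incoming links raise a page's score.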

Search results

Google not only indexes and caches web pages but also takes "snapshots" of other file types, which include PDF, Word documents, Excel spreadsheets, Flash SWF, plain text files, and so on. Except in the case of text and SWF files, the cached version is a conversion to (X)HTML, allowing those without the corresponding viewer application to read the file.

Users can customize the search engine by setting a default language, using the "SafeSearch" filtering technology, and setting the number of results shown on each page. Google has been criticized for placing long-term cookies on users' machines to store these preferences, a tactic which also enables it to track a user's search terms and retain the data for more than a year. For any query, up to the first 1000 results can be shown, with a maximum of 100 displayed per page.

Indexable data

Despite its immense index, there is also a considerable amount of data available in online databases which are accessible by means of queries but not by links. This so-called invisible or deep Web is minimally covered by Google and other search engines. The deep Web contains library catalogs, official legislative documents of governments, phone books, and other content which is dynamically prepared to respond to a query.

Privacy law in some countries forbids the showing of some links. For instance, in Switzerland any private person can force Google Inc. to delete a link that contains his or her name.

Google optimization

Since Google is the most popular search engine, many webmasters have become eager to influence their websites' Google rankings. An industry of consultants has arisen to help websites increase their rankings on Google and on other search engines. This field, called search engine optimization, attempts to discern patterns in search engine listings and then develop a methodology for improving rankings to draw more searchers to clients' sites.

Search engine optimization encompasses both "on page" factors (like body copy, title elements, H1 heading elements and image alt attribute values) and Off Page Optimization factors (like anchor text and PageRank). The general idea is to affect Google's relevance algorithm by incorporating the keywords being targeted in various places "on page", in particular the title element and the body copy (note: the higher up in the page, presumably the better its keyword prominence and thus the ranking). Too many occurrences of the keyword, however, cause the page to look suspect to Google's spam checking algorithms.

Google has published guidelines for website owners who would like to raise their rankings when using legitimate optimization consultants.

Search engine optimization (SEO)

Search engine optimization (SEO) is the process of improving the volume or quality of traffic to a web site from search engines via "natural" or un-paid ("organic" or "algorithmic") search results, as opposed to search engine marketing (SEM), which deals with paid inclusion. Typically, the earlier (or higher) a site appears in the search results list, the more visitors it will receive from the search engine. SEO may target different kinds of search, including image search, local search, video search, and industry-specific vertical search engines. This gives a web site a web presence.

As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website primarily involves editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines.

The acronym "SEO" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site, SEO tactics may be incorporated into web site development and design. The term "search engine friendly" may be used to describe web site designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.

Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms, keyword stuffing, and article spinning that degrade both the relevance of search results and the user experience of search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.