Tuesday, December 14, 2010

How To Register for Google Webmaster Tools

1) Register for a new Google account

2) Once you have a Google account, you will need to register your site in Google’s Webmaster Tools

3) Once you are logged into Webmaster Tools, enter your site into the Add Site box and click on “Add Site.”

4) Now you will need to verify that you have control of the site. Click on “Verify Site” to go to the verification page.

5) Choose your verification method. You will either upload a small file or add a meta tag to the code of your home page.

To upload a verification file:

- Choose “Upload an HTML file” from the dropdown box.
- Open Notepad (or your preferred text editing program) and save a blank file with the file name provided by Google.
- Upload this file into the root of your site. This will be the same place that your index.htm or index file will be saved.
- Once the file is uploaded, return to the verification screen and choose “Upload an HTML file” from the dropdown box. Click on the “Verify” button at the bottom of the page.
- Please note that the verification file must stay in place for Google to continue providing complete information in the Webmaster Tools.
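The file-upload steps above boil down to making the filename Google gives you reachable at the root of your site. A minimal sketch of where that URL ends up (the domain and filename here are made up; use the exact filename Google provides):

```python
from urllib.parse import urljoin

def verification_url(site_root, filename):
    """Build the URL where Google will look for the verification file:
    always at the root of the site, alongside index.htm."""
    return urljoin(site_root, "/" + filename)

# Hypothetical token -- Google generates a site-specific filename.
print(verification_url("https://www.example.com/blog/", "google1234abcd.html"))
# -> https://www.example.com/google1234abcd.html
```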

To add a meta tag:

- Choose “Add a meta tag” from the dropdown box.
- Copy the meta tag provided and paste it into the head section of your homepage. Please note that if you are using a template on your site, this meta tag can be placed on all of the pages without causing any problems.
- Once the verification meta tag is in place on the site, return to the verification screen and choose “Add a meta tag” from the dropdown box.
- Click on the “Verify” button at the bottom of the page.
- Please note that the verification meta tag must stay in place for Google to continue providing complete information in Webmaster Tools.
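To make the meta tag method concrete, here is a sketch of what the tag looks like in a page head and a rough check that it is present. The HTML and the token value are made up; Google issues a site-specific token:

```python
import re

# Hypothetical homepage HTML -- the real tag carries your own token.
HOMEPAGE = """<html><head>
<title>My Site</title>
<meta name="google-site-verification" content="abc123DEF456" />
</head><body>...</body></html>"""

def has_verification_tag(html, token):
    """True if the google-site-verification meta tag with this token is present."""
    pattern = r'<meta\s+name="google-site-verification"\s+content="%s"' % re.escape(token)
    return re.search(pattern, html) is not None

print(has_verification_tag(HOMEPAGE, "abc123DEF456"))  # True
```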

Sunday, November 21, 2010

Cloaking

Cloaking is a black hat search engine optimization (SEO) technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP address or the User-Agent HTTP header of the user requesting the page. When a visitor is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page. The purpose of cloaking is to deceive search engines so they display the page when it would not otherwise be displayed.
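The server-side decision described above usually reduces to a user-agent check. A simplified sketch of that (deceptive, guideline-violating) logic, shown purely to illustrate how the switch works; the bot signatures and page strings are illustrative:

```python
# Substrings that commonly appear in crawler user-agent headers.
BOT_SIGNATURES = ("googlebot", "slurp", "bingbot", "msnbot")

def is_search_engine_spider(user_agent):
    """Crude spider check based on the User-Agent HTTP header."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def page_for(user_agent):
    # The cloaker serves a keyword-stuffed page to spiders and the
    # real page to everyone else -- exactly what gets sites delisted.
    if is_search_engine_spider(user_agent):
        return "keyword-stuffed page for spiders"
    return "normal page for human visitors"
```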

As of 2006, better methods of accessibility, including progressive enhancement, are available, so cloaking is no longer considered necessary by proponents of those methods. Cloaking is often used as a spamdexing technique to try to trick search engines into giving the relevant site a higher ranking; it can also be used to trick search engine users into visiting a site based on the search engine description, when the site turns out to have substantially different, or even pornographic, content. For this reason, major search engines consider deceptive cloaking to be a violation of their guidelines and delist sites when deceptive cloaking is reported.

Cloaking is a form of the doorway page technique.

A similar technique is also used on the Open Directory Project web directory. It differs in several ways from search engine cloaking:

* It is intended to fool human editors, rather than computer search engine spiders.
* The decision to cloak or not is often based upon the HTTP referrer, the user agent or the visitor's IP address; more advanced techniques can also be based on analysis of the client's behaviour after a few page requests: the raw quantity of, the ordering of, and the latency between subsequent HTTP requests sent to a website's pages, plus the presence of a request for the robots.txt file, are some of the parameters in which search engine spiders differ markedly from natural user behaviour. The referrer gives the URL of the page on which a user clicked a link to get to the page. Some cloakers will give the fake page to anyone who comes from a web directory website, since directory editors will usually examine sites by clicking on links that appear on a directory web page. Other cloakers give the fake page to everyone except those coming from a major search engine; this makes it harder to detect cloaking, while not costing them many visitors, since most people find websites by using a search engine.
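The referrer-based variant aimed at directory editors can be sketched the same way. The directory hostnames and page strings below are illustrative only:

```python
from urllib.parse import urlparse

# Hosts a cloaker might treat as "directory editor is visiting".
DIRECTORY_HOSTS = {"dmoz.org", "www.dmoz.org"}  # Open Directory Project

def page_for_referrer(referrer):
    """Serve the 'clean' fake page to visitors arriving from a web
    directory (likely an editor reviewing the listing), and the real
    page to everyone else."""
    host = urlparse(referrer).netloc.lower() if referrer else ""
    if host in DIRECTORY_HOSTS:
        return "fake page shown to directory editors"
    return "real page"
```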


Friday, October 22, 2010

Spam and Illegitimate SEO Techniques

Google doesn’t appreciate being fooled, and once it discovers websites using inappropriate optimization techniques it may punish them by reducing the website’s ranking or even removing the website from its search results altogether. If you’re not sure whether what you’re doing is acceptable SEO practice, keep in mind the one golden rule: if it’s good for your users, it’s good for Google. Incorporating elements meant for Google’s bot alone usually leads to fishy results. The following is a list of ILLEGITIMATE SEO practices. Here’s what you SHOULD NOT be doing:

Using Redirects to Manipulate Google Page Rank

An illegitimate redirect is one that occurs automatically when you approach a certain URL. As you click on the link to that site, the page URL (address) will appear for a short while and then automatically redirect you to the main site. This technique is used to increase the number of times the website appears in search results, as it will appear under different domains.

Google’s crawlers will see a different page than the users, fooling the robot into giving a false page rank.

Not all redirects are considered spam; there are several redirect types that Google accepts and acknowledges, which you can read about in other blog posts here, or through Google’s Webmaster Guidelines.
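The "many domains funnelling into one site" pattern can be made concrete by following a redirect chain to see where a URL finally lands. This sketch simulates the chain with a dict instead of live HTTP requests; the domains are made up:

```python
def resolve(url, redirects, max_hops=10):
    """Follow a chain of redirects until reaching a URL that doesn't
    redirect, or give up after max_hops (a loop guard)."""
    hops = []
    while url in redirects and len(hops) < max_hops:
        hops.append(url)
        url = redirects[url]
    return url, hops

# Hypothetical doorway domains all funnelling into one main site.
REDIRECTS = {
    "http://cheap-widgets.example": "http://widgets-main.example",
    "http://best-widgets.example": "http://widgets-main.example",
}

final, hops = resolve("http://cheap-widgets.example", REDIRECTS)
print(final)  # http://widgets-main.example
```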

100% Frames

We’ve already discussed iframes here in this blog. Just like the redirect, and the golden principle of SEO, if what your users see is different than what the Google bot sees, it’s a problem. A 100% frame page is a page covered completely by a frame that consists of different content from the rest of the website. End result – Google sees one thing on which it bases its page ranking and the user sees something entirely different.

Just like the redirect, this enables spammers to index the same site over and over again under different domains. While the different domains may have different content and get ranked as a result of that content, the end user will find himself viewing the same main site.

Hiding Texts and Links

If a text is visible to search engines only it is considered spam. So what does Google consider hidden text?

* Any text written in the same color (or close to) as the web page background.
* Any text situated in an area of the page that has been defined as hidden or invisible using CSS.
* Extremely small fonts that are not legible to users.
* Any text that is being hidden behind an image.

While you might find this useful, particularly if you don’t want to overload your web design with text, it may very well backfire on you. It may get you kicked out of the ranking game altogether, not to mention that pressing Ctrl-A in the browser may reveal your text anyway.
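The patterns in the list above can be spotted mechanically, at least the inline-CSS ones. A rough sketch that flags some of these tricks in a chunk of HTML (a real detector would also need to resolve external stylesheets and compare text and background colours):

```python
import re

# Inline-style patterns corresponding to the hiding tricks listed above.
HIDDEN_PATTERNS = [
    r'style="[^"]*display\s*:\s*none',       # CSS-hidden block
    r'style="[^"]*visibility\s*:\s*hidden',  # CSS-invisible block
    r'style="[^"]*font-size\s*:\s*[01]px',   # unreadably small text
]

def looks_like_hidden_text(html):
    """Very rough check for inline-CSS text hiding. Misses cases like
    white-on-white text defined in an external CSS file."""
    return any(re.search(p, html, re.IGNORECASE) for p in HIDDEN_PATTERNS)

print(looks_like_hidden_text('<div style="display:none">buy cheap widgets</div>'))  # True
```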

Other Illegitimate Practices to Avoid

Spamming the keywords – using the same keyword over and over without any real content involved.

Cloaking – this is a technological ruse. As you enter the website, the website runs a check on your status. If you are identified as a crawler, you will end up seeing a different page than you would have reached as a regular user.

Doorway Pages – these are pages created solely to optimize for a specific word. The chosen keyword is repeated over and over again on this page, suggesting high relevance to search engines. This doorway page will either include a link to the main homepage or it will include an automatic redirect to the homepage. Either way, this doorway is considered unethical SEO practice.

Excessive Linking between Websites – an exaggerated amount of links between two sites. What’s considered exaggerated? Good question. There isn’t a specific number of links and it depends greatly on the balance of the rest of the content. There is higher risk of getting caught when the two sites use the same IP. In general, triangle linking is much better for SEO purposes. This means that if your website is site A, and you sent a link to site B, site B will link to site C and site C will send the link right back to you – site A. Another unethical practice involves a bombardment of links on a single page or website. Link farms are a particularly deplorable practice.

Selling Links for PR

Lately, websites that sell links to other websites (meaning, website X pays website Y to include a link to it) have been losing ground fast. This is done in order to increase page rank and is also considered deceptive.

So, How Will You Be Discovered?

Search engines use three different methods to discover offending websites. The first is technological. Search engine bots are programmed to uncover some of the more obvious deception techniques. When the crawler runs into such a case it raises a ‘red flag’. This can lead to a PR0 (zero PageRank) penalty in Google. Usually these penalties are only temporary, but in certain cases they may become permanent.

Google and the other search engines also encourage users to report unethical website promotion techniques. You can report other websites through a special web page dedicated to this subject. This is Google’s Spam Report page. You need to sign in to use it though.

Forums are another method of discovering SEO scams. Apparently Google employees read webmaster forums and if they run into something suspicious… they do something about it.

Friday, September 17, 2010

Thursday, September 9, 2010

Google Doodle mystery explained as new 'Instant' search results unveiled

Google today announced new 'Instant' search results, introduced for users around the world.

Google searches will now predict what users are looking for from the very first letter they type.

Google's homepage will then switch instantly to a drop-down menu of results that continually updates as the user adds more letters to their search term.
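The prediction step described above can be pictured as ranking known queries by popularity among those matching the typed prefix. A toy sketch of that idea (the query log and counts are invented, and Google's real system is vastly more sophisticated):

```python
# Toy query log: query -> popularity count (made-up numbers).
QUERY_LOG = {
    "weather": 900,
    "weather london": 400,
    "webmaster tools": 300,
    "wikipedia": 1200,
}

def predict(prefix, log=QUERY_LOG, n=3):
    """Return up to n most popular known queries starting with prefix --
    roughly the idea behind updating results as the user types."""
    matches = [q for q in log if q.startswith(prefix)]
    return sorted(matches, key=lambda q: -log[q])[:n]

print(predict("w"))   # ['wikipedia', 'weather', 'weather london']
print(predict("we"))  # ['weather', 'weather london', 'webmaster tools']
```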


Instant: Google searches will now predict what users are looking for from the very first letter typed


New feature: Google Vice President of Search Product and User Experience Marissa Mayer speaks during the launch of Google Instant

A post on the official Google Blog on Wednesday declared: 'Instant takes what you have typed already, predicts the most likely completion and streams results in real-time for those predictions - yielding a smarter and faster search that is interactive, predictive and powerful.

'The user benefits of Google Instant are many — but the primary one is time saved. Our testing has shown that Google Instant saves the average searcher two to five seconds per search.

'That may not seem like a lot at first, but it adds up. With Google Instant, we estimate that we'll save our users 11 hours with each passing second!'

Earlier, there had been numerous suggestions over what the new interactive Google homepage, used for the past two days, meant.

On Wednesday the homepage showed the Google logo in pale grey, with no sign of the interactive balloons which appeared without explanation yesterday.

However when a word was typed into the search engine the letters that make up the word Google one by one reverted to their classic colours.

Tuesday's 'Google Doodle' was a collection of coloured balls that moved away when you tried to pass your mouse over them.

Then, if left undisturbed, they 'settled' and reformed the Google logo.

Google regularly updates the way its name is displayed on the homepage to reflect historic dates, famous birthdays and other world events.

Normally, when a user clicks on the Doodle, the reason for the design is displayed as if you'd done a Google search for it.


Faster: Google predicts the new instant searches will save users valuable time

But unlike most Google Doodles, this week's balloon design did not appear to be referring to anything in particular - sparking wild speculation on the internet.

Some commentators correctly predicted the design was a teaser for a big announcement that Google was due to make.

Others pointed out that the design's release almost coincides with what would be Google's 12th birthday.

And others said it was simply an exercise in HTML5 - the latest version of the standard programming code for displaying content on the web.


Google's homepage yesterday was initially grey but gradually recaptured its colour


The Google logo begins to refill in its original colours as you type into the search box


Tuesday's Google Doodle was an interactive balloon design that did not appear to refer to anything in particular

Wednesday, September 8, 2010

Tuesday, August 17, 2010

Trackback submitter

Trackback Submitter is one of the most popular link building tools used by spammers and practitioners of black hat SEO. Developed by an unknown spammer from Europe in September 2006, Trackback Submitter became very popular because of its ability to bypass the comment spam protection used on popular blogging systems like WordPress.

Overview

Trackback Submitter is based on the PHP programming language and works on Apache-powered servers which have cURL and Zend Optimizer installed. The script uses undisclosed methods to find thousands of blogs which have comment posting enabled. Once the software finds a related blog, it generates random comments automatically (if not pre-defined by the user) and submits these comments, including one or more backlinks to the spammer's websites. The tool then determines whether the comment was accepted successfully and, if so, the blog URL is added to a database for later use.
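For context, the trackback ping that tools like this automate is, per the TrackBack specification, just a form-encoded HTTP POST, and the target blog answers with a small XML document whose error code tells the sender whether the ping was accepted. A sketch of both halves (this illustrates the open protocol being abused, not the tool's own undisclosed code):

```python
import re
from urllib.parse import urlencode

def build_trackback_payload(title, excerpt, url, blog_name):
    """A trackback ping body: four form-encoded fields POSTed to the
    target post's trackback URL (per the TrackBack specification)."""
    return urlencode({"title": title, "excerpt": excerpt,
                      "url": url, "blog_name": blog_name})

def ping_succeeded(response_xml):
    """<error>0</error> in the XML reply means the ping was accepted --
    this is how a submitter knows which blogs to keep in its database."""
    m = re.search(r"<error>\s*(\d+)\s*</error>", response_xml)
    return m is not None and m.group(1) == "0"

print(ping_succeeded("<response><error>0</error></response>"))  # True
```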

Effectiveness

Once Trackback Submitter was covered on Search Engine Journal and other popular SEO blogs, its popularity increased dramatically, and some blog owners were forced to disable the trackback feature or moderate each comment to prevent the software from bypassing spam protection plugins like Spam Karma.

Friday, August 6, 2010

15 Link Building Methods, Get The Most Out Of Each Of Them

“Content is king”, that’s what everyone says. And it’s true. However, content is useless if it doesn’t get read. When you’re a beginner and not yet well known, the best thing you can do is to combine great content with good link building methods. I’ve compiled a list of 15 different methods of link building, each with a rating in three departments: difficulty to use, time consumed and quality of the links it generates. My advice would be to spend your link building efforts on methods that give high quality backlinks. Yes, they might be much harder to obtain, but take my advice: between 100 backlinks in forum signatures and 1 backlink from a high profile blog, take the 1 backlink.

I’ve also attached some advice on improving your chances of getting good value from your link building efforts for each of the methods. Please feel free to contribute advice in the comments if you have anything to add.

1) Forum Posts

  • Difficulty: Low – Read the thread, think of something to say and write it
  • Time Consumed: Low – Seriously, how long does it take to write a reply in a forum
  • Quality: Low – Most forums are seriously handicapped when it comes to SEO: a lot of similar threads centered around the same core keywords, long and ugly URLs, very few if any links pointing to threads (most point just to the main page), and duplicate content issues.

Improve your chances:

Post in threads that make it to the front page of Digg or other social media sites.

Look for posts with high linkability value. For example, a thread on DigitalPoint about how a guy got banned has extremely low chances of getting links. A thread on how a guy made $5000 in a month using a new technique has better odds of getting some link love.

Post in sticky threads. They’re just 2 links away from the main page all the time and should get some good link juice.

2) Social Bookmarking

  • Difficulty: Low – not much to it. Just enter the URL, title, description and some tags.
  • Time Consumed: Low – the first time is more time consuming as you create your accounts on the websites. After that it takes less than a minute per social bookmarking site to save a link.
  • Quality: Low – a lot of them don’t give link juice. But even if they do, the tags linked from the front page are the most popular, so plenty of links are entered there all the time. Your link will be slipping to the back pages and moving position all the time, from the first page of that tag to the 5th, 10th, 20th page and so on.

Improve your chances:

Create your own tag. For example, instead of submitting all your links to the link building tag, submit them to the “link building links” tag. On less used sites, your tag could end up being linked from the front page all the time, if you save enough links in that tag. Your links would be just 1 link away from the front page this way.

Add as many tags as possible to every link you submit, as long as they are relevant to the subject of the article.

3) Social Media

  • Difficulty: High – building up your profile, becoming a top user, getting friends, writing good linkbait. None of these come easy.
  • Time Consumed: High – writing good linkbait is time consuming. Not everyone has the luxury of posting funny or cute photos on their blog.
  • Quality: High – if you get to the front page you get a good number of links usually, depending on your subject. Also, the page where your link is listed can become a PR4-5 on Digg.

Improve your chances:

Build up a good profile on a niche social media site that is suited to your blog’s subject. Less traffic, but more likely to subscribe or be interested in the rest of your articles.

Network with other bloggers, and once in a while, ask them to vote for your story, if indeed it’s front-page worthy. A story with a number of initial votes and a few comments is much more likely to be voted for by other users of that social media site.

Work your ass off on your linkbait.


4) Guest Posts

  • Difficulty: Medium – writing a post for a blog with the same subject as yours should be easy for you.
  • Time Consumed: Medium – it needs to be good quality. You’re trying to get some of their readers to subscribe to your blog. Don’t waste this opportunity by writing a low key article.
  • Quality: High – if you pick the blog right, and you write a good article, it should bring you new subscribers, a link from a blog in the same niche as you and maybe links from other bloggers in the same niche.

Improve your chances:

Write linkbait in your guest posts. Some say that your best content should stay on your blog. I disagree. If you have 30 subscribers and you write a guest post for someone with 10,000 subscribers, it can bring you a couple of hundred new subscribers if you play your cards right.

Prepare your blog for the incoming visitors. Before your guest post goes up, make sure that at least your last 2-3 articles are of great quality. Even better, make them part of a series, and if you know what the new visitors are interested in, then they should be much more inclined to subscribe in order to follow that series.

5) Interviews

  • Difficulty: Medium – approach a few bloggers in your niche that have a fair number of subscribers. Try to get interviews with people from new companies that generate a lot of hype. DealDotCom and BlogRush have a lot of people interested in them these days. Did anyone try to get an interview with some juicy details from people working for these two?
  • Time Consumed: Medium – study the subject, see what people are saying about them, what kind of questions they have. Make a list of questions and do a good quality interview. Be unique, don’t ask the same questions they’ve been asked before 100 times (study the previous interviews they gave).
  • Quality: High – Links from high profile blogs in your niche and maybe links from their readers.

Improve your chances:

Try to secure interviews with bloggers/people that others have talked about lately and might be curious about. If you can bring some extra details or another point of view on the subject, it can be good linkbait.

Don’t waste the opportunity. Think of what people might be interested in knowing about that guy, try to get some good tips from him, see what others failed to ask him before you. Don’t ask just generic questions.

6) Linkbait

  • Difficulty: High – it might come easy to SEOmoz or Aaron Wall, but for most people it will take a few tries before they get it right. When you’re small and not a lot of people follow your blog, it’s not that easy to get the linkbait out there. Make it appealing to others and work hard on promoting it.
  • Time Consumed: High – again, as a small blogger you have to put a lot of effort and time in your linkbait. Then comes the promotion part.
  • Quality: High – a lot of links from blogs writing on the same subject as you. Links from old and high authority domains if you get picked up by the media.

Improve your chances:

Think outside the box. Writing yet another list of link building methods doesn’t do much good anymore. Put a twist on it. Like this article does :P

Let other people know about it. Don’t contact high profile bloggers with each article you write. Once or twice a month, if you write a high quality article, you can send them a mail if they’re writing about the same thing. Contact Daniel from Daily Blog Tips or Kevin from Blogging Tips if you got a blogging tip (both great guys), or contact John Chow if you got a money making article.

7) Linking Out

  • Difficulty: Medium – the difficult part is not adding a link to others in your articles, but actually making it part of a very good article when linking to high profile bloggers.
  • Time Consumed: Low – find a way to link out to other bloggers with every good article you write. If you’re doing research for an article, link to those that served as inspiration.
  • Quality: High – again, if they like your content and write about it, links from high profile bloggers in your niche.

Improve your chances:

Link to articles or about pages instead of the index. If they have to approve the trackback then they will probably come and see if your blog is trackback worthy. They usually have plenty of links to the index page and deep links are always good.

If you see an article that the blogger put a lot of effort into but that doesn’t get much reaction from his readers, link to it and recommend it if it’s good. He will probably appreciate the attention on an article like that more than if you link, together with 100 other people, to a more popular one.

8) Link Exchanges

  • Difficulty: Medium – finding people that are willing to do link exchanges in the same niche as you might be difficult for some, especially if you want good links.
  • Time Consumed: Medium – it takes time to write all those emails, even if you have a template for it and just change the name. It doesn’t hurt to put a little effort in creating that email.
  • Quality: Medium – they are links from the same niche, but they’re sitewide blogroll links.

Improve your chances:

Link to them before you send the email. Let them know that you’ve already put them in your blogroll and if they like your blog they can do the same. Don’t be upset if they don’t want to exchange links. I’m not too crazy after blogroll link exchanges for example. They can become quite a long list of links on each of your pages, diluting the amount of link juice that you can give.

Don’t do link exchanges with everyone. Pick some bloggers with authority in your niche that you know will be there in the long run. Network a bit before you ask for something like that, and know when to ask. For example, you can network with Darren Rowse all you want; I still don’t think he’ll exchange links with you.

Offer some value with that link exchange. Gain some authority before you start sending emails to people asking for link exchanges.

9) Directory Submission

  • Difficulty: Low – completing forms is not exactly rocket science
  • Time Consumed: Low – takes a few minutes for each submission
  • Quality: Low – pages and pages full of links. If they’re general directories then you’ve got links from cars to how to make soap sites. Before supplemental results were removed I used to look at directories and almost all their pages were marked as supplemental. Not much value there.

Improve your chances:

Some directories still provide some value. Especially those that Google thinks that they’re actually taking care of who they let in, and they’re not in it just for the money. Niche directories might also bring some value.

10) Free Templates or Themes

  • Difficulty: High – you actually have to know how to make one and have a good eye for design.
  • Time Consumed: High – it can take anywhere from a few hours to one week. Depends how much experience you have and how good the theme is.
  • Quality: Low/Medium – you do get some good links from articles announcing your theme, but most are footer links from bloggers with lower authority. The likes of John Chow, Shoemoney, Darren Rowse have custom themes. Authority bloggers that don’t have custom themes usually don’t change what they have for another free theme. They either go custom at some point, or they stay with what works for them right now. So your theme and your footer links will largely come from new blogs, that might take some time to get authority. Still, a very good theme can get a huge number of backlinks and it’s not that unusual to see PR5-6 blogs that got their PR from themes they released.

Improve your chances:

Study successful themes, make yours Adsense ready, SEO them. Talk with friends and other bloggers and see what they like to see in a theme, ask for feedback as you develop it.

Do it the Nate Whitehill way. One custom theme for John Chow and one redesign for Shoemoney got him $13,000 worth of orders. If he had made a free theme with similar layout as the one used by John Chow, he would’ve gotten a huge number of backlinks from those that try to be like John.

11) Create a WordPress Plugin

  • Difficulty: High – again, you have to know how to code, more so than with themes.
  • Time Consumed: High – from a few hours to days, depends on how complex it is
  • Quality: Medium/High – if you make something that people need, it can bring you a ton of good quality links

Improve your chances:

See what other bloggers need. They might need an Adsense plugin, a DoFollow plugin, an SEO plugin or simply a Buy Me a Beer plugin. If you manage to do something new that most people would embrace, that plugin page will get linked to quite a lot.

Promote it. Plugin directories, blogs that announce new plugins and themes, blogs that are giving blogging tips. Contact those people and tell them what you’ve created.

12) Hold a Contest

  • Difficulty: Low – make the announcement, promote the contest, give away the prizes
  • Time Consumed: Low/Medium – it really depends on how successful it is. 200 entries would be quite a handful, 5 of them not so much.
  • Quality: Depends – if you give $4000 worth of prizes away, then you might get the big bloggers as well. If you’re giving $5-$20 or just a few links and you’re not high PR, then the quality of links might be lower.

Improve your chances:

Let people know what value they’re getting if you’re not giving away money. Put banners up in your blog if you’re giving away free advertising space (like I did in my contest – check it out). Don’t ask too much as a condition to enter the contest. Unless you’re giving away hundreds of dollars worth of prizes, I wouldn’t ask for a full review or post dedicated just to your blog.

13) Create mini-blogs and link to your main blog

  • Difficulty: Low – writing shorter articles on the same topic as your main blog should be easy
  • Time Consumed: High – you do have to write a number of articles and maybe do some link building for it
  • Quality: Low – links from blogs in the same niche, but without much authority. The time spent writing articles for mini-blogs can be spent better if you use it to write pillar content for the main blog.

Improve your chances:

If you do decide to make mini blogs to support the main one, at least don’t use your hosting account because it would be the same IP class. Use Blogspot, Wordpress.com and other free blogging hosts. Create a mini blog on each one and write at least a few articles with links to your own blog in them.

One way of doing it is for linkbait that doesn’t belong on your main blog. Maybe you have an idea for a funny piece that wouldn’t be appreciated by your readers, but is still somehow related to your subject. Use a mini-blog and try to get it on Digg to gather some links.

14) Buying links

  • Difficulty: Low – not hard, you just need money and to know what you need
  • Time Consumed: Medium – contacting other sites/bloggers, looking for good pages
  • Quality: High – if you can afford to pay for them

Improve your chances:

Don’t go the Text Link Ads route. Everyone can see the blogs that are selling links there.

Instead, use Google and search for articles centered around your keywords. Look at the backlinks and see who linked to each article, what PR it has, and how far it is from the main page. Then contact the blogger/site owner and offer him money to turn that keyword in his article into a link to your blog. It might get expensive, but if you can manage to find articles about your keywords that were linked to a lot, then getting a link there would be much better than getting a site-wide blogroll link.

15) Paid Reviews

  • Difficulty: Low – others write about you, not much effort there
  • Time Consumed: Medium – do it right. Study the blogger and his past paid reviews to see how he writes them.
  • Quality: High – articles dedicated just to your blogs, with your chosen keywords in them

Improve your chances:

Don’t just pick a blog and ask for a paid review. Study other paid reviews done by the blogger; see what he usually didn’t like and what he did. If that blogger doesn’t like Adsense and says it in every post (Tyler :P ), then take the ads out before you order that review. The blogger’s obligation is to tell his readers his real opinion; your duty is to get as much bang for the buck as possible. I’ve been a subscriber of Tyler’s blog for quite some time now, and it never ceases to amaze me that people still pay him for reviews without taking the Adsense out first. Adapt your blog to what the particular blogger you’re paying likes to see, at least when it comes to ads or minor elements of design.

Also, make sure you’ve got some good posts up when he comes and after he gives his review. Again, make sure you get the most out of your money if you want to pay for a review.

Wednesday, August 4, 2010

List of search engines


This is a list of Wikipedia articles about search engines, including web search engines, selection-based search engines, metasearch engines, desktop search tools, and web portals and vertical market websites that have a search facility for online databases.

By content/topic

General

Ask.com (known as Ask Jeeves in the UK)
Baidu (Chinese, Japanese)
Bing (formerly MSN Search and Live Search)
Cuil
Duck Duck Go
Kosmix
Sogou (Chinese)
Yodao (Chinese)
Yandex (Russian)
Yebol

Geographically limited scope

Accoona, China/US
Alleba, Philippines
Ansearch, Australia/US/UK/NZ
Daum, Korea
Goo, Japan
Guruji.com, India
Leit.is, Iceland
Maktoob, Arab World
Onkosh, Arab World
Miner.hu, Hungary
Najdi.si, Slovenia
Naver, Korea
Rambler, Russia
Rediff, India
SAPO, Portugal/Angola/Cabo Verde/Mozambique
Search.ch, Switzerland
Sesam, Norway, Sweden
Seznam, Czech Republic
Walla!, Israel
Yandex, Russia
ZipLocal, Canada/US

Accountancy

IFACnet

Business

GlobalSpec
Nexis (Lexis Nexis)
Thomasnet (United States)
GenieKnows (United States and Canada)

Enterprise

AskMeNow: S3 - Semantic Search Solution
Concept Searching Limited: concept search products
Dieselpoint: Search & Navigation
dtSearch: dtSearch Engine (SDK), dtSearch Web
Endeca: Information Access Platform
Exalead: exalead one:enterprise
Expert System S.p.A.: Cogito
Fast Search & Transfer: Enterprise Search Platform (ESP), RetrievalWare (formerly Convera)
Funnelback: Funnelback Search
IBM: OmniFind Enterprise Edition
ISYS Search Software: ISYS:web, ISYS:sdk
Jumper 2.0: Universal search powered by Enterprise bookmarking
Microsoft: SharePoint Search Services
Northern Light
Open Text: Hummingbird Search Server, Livelink Search
Oracle Corporation: Secure Enterprise Search 10g
SAP: TREX
TeraText: TeraText Suite
Vivisimo: Vivisimo Clustering Engine
X1 Technologies: X1 Enterprise Search
ZyLAB Technologies: ZyIMAGE Information Access Platform

Mobile/Handheld

Taptu: taptu mobile/social search

Job

Main article: Job search engine
Bixee.com (India)
CareerBuilder.com (USA)
Craigslist (by city)
Dice.com (USA)
Eluta.ca (Canada)
Incruit (Korea)
Monster.com (USA), (India)
Naukri.com (India)
Yahoo! HotJobs (country-specific subdomains, international)

Legal

WestLaw
Lexis (Lexis Nexis)
Quicklaw
Manupatra

Medical

Bioinformatic Harvester
Entrez (includes Pubmed)
EB-eye EMBL-EBI's Search engine
GenieKnows
GoPubMed (knowledge-based: GO - GeneOntology and MeSH - Medical Subject Headings)
Healia
Searchmedica
WebMD
PubGene
Nextbio (Life Science Search Engine)
VADLO (Life Sciences Search Engine)

News

Daylife
MagPortal
Newslookup
Nexis (Lexis Nexis)
Topix.net

People

PeekYou
InfoSpace
Spock
Spokeo
Wink
ZoomInfo

Real property

Rightmove

Television


Video Games

Wazap (Japan)

By information type

Search engines dedicated to a specific kind of information

Forum


Blog

Multimedia

Source code


BitTorrent

These search engines work across the BitTorrent protocol.


Email


Maps


Price

Google Product Search (formerly Froogle)
Kelkoo
MySimon
PriceGrabber
PriceRunner
PriceSCAN
Shopping.com
ShopWiki
Shopzilla (also operates Bizrate)
TheFind.com
Wishabi

Question and answer

Human answers

Uclue
Yahoo! Answers
Stack Overflow
DeeperWeb

Automatic answers

AskMeNow
BrainBoost
True Knowledge
Wolfram Alpha

Natural language

Bing (Semantic ability is powered by Powerset)
BrainBoost
hakia
Lexxe
Powerset

By model

Open source search engines

DataparkSearch
Egothor
Grub
Ht://dig
Isearch
Lucene
Lemur Toolkit & Indri Search Engine
mnoGoSearch
Namazu
Nutch
OpenFTS
Sciencenet (for scientific knowledge, based on YaCy technology)
Sphinx
SWISH-E

Terrier Search Engine

Wikia Search
Xapian
YaCy
Zettair

Semantic browsing engines

Evri
Hakia
Yebol

Social search engines

ChaCha Search
Delver
EarthFrisk.org
Eurekster
Mahalo.com
OneRiot
Rollyo
Sproose
Trexy
Wikia search
Wink (provides web search by analyzing user contributions such as bookmarks and feedback)

Metasearch engines

Brainboost
ChunkIt!
Clusty
Dogpile
Excite
Harvester42
HotBot
Info.com
Ixquick
Kayak
LeapFish
Mamma
Metacrawler
MetaLib
Mobissimo
Myriad Search
SideStep
Turbo10
WebCrawler
DeeperWeb

Visual search engines

ChunkIt!
Grokker

Methods of website linking

This article pertains to methods of hyperlinking between different websites, often used in regard to search engine optimization (SEO). Many linking techniques and the special terminology around them are described below.

Reciprocal link

A reciprocal link is a mutual link between two objects, commonly between two websites, to ensure mutual traffic. For example, suppose Alice and Bob each have websites. If Bob's website links to Alice's website, and Alice's website links to Bob's website, the websites are reciprocally linked. Website owners often submit their sites to reciprocal link exchange directories in order to achieve higher rankings in the search engines. Reciprocal linking between websites is an important part of the search engine optimization process because Google uses link popularity algorithms (based on the number of links that lead to a particular page and the anchor text of those links) to rank websites for relevancy.

Resource Linking

Resource Links are a category of links, which can be either one-way or two-way, usually referenced as "Resources" or "Information" in navbars, but sometimes, especially in the early, less compartmentalized years of the Web, simply called "links". Basically, they are hyperlinks to a website or a specific webpage containing content believed to be beneficial, useful and relevant to visitors of the site establishing the link.

In recent years, resource links have grown in importance because most major search engines have made it plain that -- in Google's words -- "quantity, quality, and relevance of links count towards your rating."

The engines' insistence on resource links being relevant and beneficial developed because many of the methods described elsewhere in this article -- free-for-all linking, link doping, incestuous linking, overlinking, multi-way linking -- and similar schemes were employed solely to "spam" search-engines, i.e. to "fool" the engines' algorithms into awarding the sites employing these unethical devices undeservedly high page ranks and/or return positions.

Despite cautioning site developers (again quoting from Google) to avoid "'free-for-all' links, link popularity schemes, or submitting your site to thousands of search engines (because) these are typically useless exercises that don't affect your ranking in the results of the major search engines -- at least, not in a way you would likely consider to be positive," most major engines have deployed technology designed to "red flag" and potentially penalize sites employing such practices.

Forum signature linking

Forum signature linking is a technique used to build backlinks to a website. It is the process of participating in forum communities that allow outbound hyperlinks in their members' signatures. This can be a fast method of building up inbound links to a website; it can also produce some targeted traffic if the website is relevant to the forum topic. Note, however, that links in forums using the nofollow attribute carry no actual search engine optimization value.

Blog comments

Leaving a comment on a blog can result in a relevant do-follow link to the individual's website. Most of the time, however, leaving a comment on a blog turns into a no-follow link, which is almost useless in the eyes of search engines, such as Google and Yahoo! Search. On the other hand, most blog comments get clicked on by the readers of the blog if the comment is well-thought-out and pertains to the discussion of the other commenters and the post on the blog.
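For anyone curious whether a given blog's comment links actually pass value, the nofollow check is easy to automate. Below is a minimal Python sketch using only the standard library's html.parser; the class name and the sample HTML snippet are invented for illustration. It lists each outbound link and whether it carries rel="nofollow":

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collects outbound links and whether each carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []  # (href, is_nofollow) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            href = attrs.get("href")
            if href:
                rel = (attrs.get("rel") or "").lower().split()
                self.links.append((href, "nofollow" in rel))

# Hypothetical comment markup: one nofollow link, one followed link.
html = ('<p>Nice post! <a href="http://example.com/a" rel="nofollow">my site</a> '
        '<a href="http://example.com/b">a followed link</a></p>')
checker = NofollowChecker()
checker.feed(html)
for href, nofollow in checker.links:
    print(href, "nofollow" if nofollow else "dofollow")
```

Running something like this against a saved copy of a post's comment section shows at a glance whether comments there are do-follow before you spend time writing one.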

Monday, July 26, 2010

Do Follow Blogs edu Auto Approve List

http://blogs.lynn.edu/afreshlook/2009/09/02/president-ross-and-the-new-blog-squad/

http://cyberlaw.stanford.edu/node/6242

http://bbnews.blog.usf.edu/2009/08/24/blackboard-external-urls-and-internet-explorer-8

http://www.iq.harvard.edu/blog/sss/archives/2009/09/grimmer_on_quan.shtml

http://blog.scad.edu/eco/2009/05/01/sustainability-council-screens-savannah-earth-day-festival-visitors/

http://studentsenate.rpi.edu/blog/show/16 – Comment directly in body, use ‘

http://iris.ebs.edu/accessdb/www/logistikblog.nsf/d6plinks/holm-vereinsgr%FCndung

http://news21.jomc.unc.edu/index.php/powering-a-nation-blog/eating-corn-eating-oil.html

http://blog.brookdalecc.edu/article.php?story=20081005224044757

http://www.stanford.edu/group/ccr/blog/2008/09/ocean_views_1.html

http://shauna.blog.usf.edu/2009/08/16/hello-from-scott

http://www.sca.ucla.edu/blog/2009/2/25/monkey-business.html – Comment directly in body, use ‘

http://www.darden.virginia.edu/html/blog-JimClawson.aspx?id=15836&blogid=294

http://bcnm.berkeley.edu/blog/2009/06/new-media-and-the-crisis-in-iran/

http://interactiondesign.sva.edu/blog/entry/the_human_race_jill_nussbaums_story_from_the_front/

http://library.duke.edu/blogs/scholcomm/2009/08/22/a-model-copyright-law

http://blog.fvsu.edu/2009/08/be-careful-when-social-networking/

http://www.csdhead.cs.cmu.edu/blog/2009/07/30/at-last-useful-social-networking

http://diva.sfsu.edu/blog/04-20-2009/using-content-outside-of-diva

http://www.sft.edu/blog/2009/07/15/so-much-for-type-casting/

http://blog.axehandle.org/2009/01/writing-about-black-panthers.html

http://connect.rhodes.edu/blog/tyler/

http://apps.career.colostate.edu/blog/archive/2009/09/09/telluride-film-festival-inspiration.aspx#feedback

http://oregonstate.edu/admissions/blog/2009/05/14/honorary-degree-recepient-jack-yoshihara-passes-away/

http://microsys.unity.ncsu.edu/blog/index.php?blog=7&title=meeting_with_dennis_6_29_2007&page=1&more=1&c=1&tb=1&pb=1&disp=single#c3363

http://valis.cs.uiuc.edu/blog/?p=2694

http://blog.library.villanova.edu/blog/2009/06/12/robin-bowles-appointed-life-and-health-sciences-librarian/

http://blogs.tamu.edu/jmpackard/2009/07/16/technical-writing-help/

http://www.gspm.gwu.edu/545_GSPM-Dean-Talks-Bi-Partisanship

http://blog.shimer.edu/shimer/2009/04/being-so-far-from-home-i-tend-to-miss-my-lil-oregon-a-lot-but-even-more-than-my-state-of-birth-i-miss-my-family-a-lot-ive.html

http://id.ome.ksu.edu/blog/2009/aug/12/seemingly-simple-assignment-digital-storytelling/

http://hcil.cs.umd.edu/localphp/hcil/vast/index.php/blog/comments/questions_about_the_challenge/

http://blog.shimer.edu/shimer/2009/09/a-bittersweet-adventure.html

http://www.marymountpv.edu/news-events/intentional-conversation/blog

http://blog.luthersem.edu/library/2009/02/kindle2-and-the-doom-of-libraries.html

http://scripts.mit.edu/~inventeams/blog/?p=29

Tuesday, June 29, 2010

Advantages and Disadvantages of Link Building

Advantages of Link Building:

Increases link popularity – Link popularity is the total number of websites that link to your website, so the total number of inbound links to your website will increase. Inbound links are links from other websites to your website.

Increases page rank – Your website's PageRank will increase. If you get low-PR inbound links, however, there is a possibility that your website's PR will be reduced, so getting high-PR, relevant inbound links is very important. These inbound links are also a very important factor that search engines take into account when ranking websites, so be careful when acquiring them.

Easy to get links in the future – As your PageRank increases, it becomes much easier to get links in the future, even one-way links. As mentioned above, PR is a very important factor, and webmasters check a site's PR before linking to it. A high PR will satisfy them, and they will come and ask you for a link from your website to theirs.

Increases search engine ranking – Your website's rank in the search engines will increase, so your website can reach the top positions in the search results pages. This depends on link popularity and PageRank, along with many factors beyond the link building process.

Increases website traffic – Traffic to your website will increase, so you will get more relevant visitors to your website.

Increases sales – As more relevant visitors visit your website, there is a greater chance that they will buy your products or services, so your sales will increase.

Popularizes business – As many websites, including authority websites, link to your website, you can establish yourself in users' minds as a branded website/company.

Cost-effective advertising – This is very cost-effective advertising for your business, as you are advertising on other websites for free.

Disadvantages of Link Building:

Chances of reduced sales – With the reciprocal links method, you encourage your visitors to visit other relevant websites/companies. This can reduce your sales, as visitors navigate away to those other sites.

Time-consuming – This is a time-consuming process, because getting a sufficient number of inbound links to increase your website's PR can take many months or even years.

Webmaster is necessary – A webmaster is needed to manage the link building process, because it requires frequent follow-ups that busy website owners rarely have time for.

Low-PR inbound links reduce website PR – If you unknowingly get low-PR inbound links, they can decrease your website's PR, so be very careful when acquiring inbound links.

Friday, May 14, 2010

Next Google PageRank update schedule

I am sure everyone reading this page is aware of PageRank (PR) and its importance. Google PageRank adds value to a website's position in the search engine results pages (SERPs). There are many other factors, too, like backlinks, website age, and popularity, which are important for SERP position.

PageRank has a range from 0 to 10. A site with PR 0 is the least important, and a site with PageRank 10 is the most important in the eyes of Google.


Google PageRank Update Schedule for the year 2010

Last PageRank (PR) Update 31 December 2009

First 2010 PageRank (PR) Update 31 March – 3 April 2010

Second 2010 PageRank (PR) Update 30 June – 7 July 2010

Third 2010 PageRank (PR) Update 30 September – 7 October 2010

Fourth 2010 PageRank (PR) Update 31 December – 2 January 2011


The above dates are assumptions based on previous PR updates. Google generally updates website and blog PageRank 4 times a year; these are the major PR updates. Google also does minor PageRank updates for every site each month.

Monday, April 26, 2010

High Page Rank Social Bookmarking List

Sunday, April 18, 2010

Hyperlink

In computing, a hyperlink (or link) is a reference to a document that the reader can directly follow, or that is followed automatically. The reference points to a whole document or to a specific element within a document. Hypertext is text with hyperlinks. Such text is usually viewed with a computer. A software system for viewing and creating hypertext is a hypertext system. To hyperlink (or simply to link) is to create a hyperlink. A user following hyperlinks is said to navigate or browse the hypertext.

A hyperlink has an anchor, which is a location within a document from which the hyperlink can be followed; that document is known as its source document. The target of a hyperlink is the document, or location within a document, that the hyperlink leads to. The user can follow the link when its anchor is shown by activating it in some way (often, by touching it or clicking on it with a pointing device). Following has the effect of displaying its target, often with its context.

In some hypertext, hyperlinks can be bidirectional: they can be followed in two directions, so both points act as anchors and as targets. More complex arrangements exist, such as many-to-many links.

The most common example of hypertext today is the World Wide Web: webpages contain hyperlinks to webpages. For example, in an online reference work such as Wikipedia, many words and terms in the text are hyperlinked to definitions of those terms. Hyperlinks are often used to implement reference mechanisms that predate the computer, such as tables of contents, footnotes, bibliographies, indexes and glossaries.

The effect of following a hyperlink may vary with the hypertext system and sometimes on the link itself; for instance, on the World Wide Web, most hyperlinks cause the target document to replace the document being displayed, but some are marked to cause the target document to open in a new window. Another possibility is transclusion, in which the link target is a document fragment that replaces the link anchor within the source document. Hyperlinks are not only followed by people browsing a document; they may also be followed automatically by programs. A program that traverses the hypertext, following each hyperlink and gathering all the retrieved documents, is known as a Web crawler or spider.
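The anchor/target terminology can be made concrete with a small script. This is an illustrative sketch using Python's standard html.parser and urllib.parse (the class name and sample page are hypothetical); it records each link's anchor text and resolves its target against the source document's URL:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class AnchorExtractor(HTMLParser):
    """Records each hyperlink's anchor text and its absolute target URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []            # (anchor_text, absolute_target) pairs
        self._current_href = None  # set while inside an <a> element
        self._text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                # Resolve relative targets against the source document's URL.
                self._current_href = urljoin(self.base_url, href)
                self._text_parts = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.links.append(("".join(self._text_parts).strip(), self._current_href))
            self._current_href = None

page = '<p>See the <a href="/glossary#anchor">glossary entry</a> for details.</p>'
extractor = AnchorExtractor("http://example.com/docs/page.html")
extractor.feed(page)
print(extractor.links)  # [('glossary entry', 'http://example.com/glossary#anchor')]
```

Here the anchor is the text "glossary entry" in the source document, and the target is the resolved URL, matching the source/target distinction described above.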

Saturday, April 17, 2010

Keyword stuffing

Keyword stuffing is considered to be an unethical search engine optimization (SEO) technique. Keyword stuffing occurs when a web page is loaded with keywords in the meta tags or in content. The repetition of words in meta tags may explain why many search engines no longer use these tags.

Keyword stuffing had been used in the past to obtain maximum search engine ranking and visibility for particular phrases. This method is completely outdated and adds no value to rankings today. In particular, Google no longer gives good rankings to pages employing this technique.

Hiding text from the visitor is done in many different ways. Text colored to blend with the background, CSS "Z" positioning to place text "behind" an image — and therefore out of view of the visitor — and CSS absolute positioning to have the text positioned far from the page center are all common techniques. By 2005, many invisible text techniques were easily detected by major search engines.

"Noscript" tags are another way to place hidden content within a page. While they are a valid optimization method for displaying an alternative representation of scripted content, they may be abused, since search engines may index content that is invisible to most visitors.

Sometimes inserted text includes words that are frequently searched (such as "sex"), even if those terms bear little connection to the content of a page, in order to attract traffic to advert-driven pages.

In the past, keyword stuffing was considered to be either a white hat or a black hat tactic, depending on the context of the technique, and the opinion of the person judging it. While a great deal of keyword stuffing was employed to aid in spamdexing, which is of little benefit to the user, keyword stuffing in certain circumstances was not intended to skew results in a deceptive manner. Whether the term carries a pejorative or neutral connotation is dependent on whether the practice is used to pollute the results with pages of little relevance, or to direct traffic to a page of relevance that would have otherwise been de-emphasized due to the search engine's inability to interpret and understand related ideas. This is no longer the case. Search engines now employ themed, related keyword techniques to interpret the intent of the content on a page.

As for keyword stuffing, the largest search engines say they recommend keyword research and use -- in keeping with the quality content you have to offer the web -- to help visitors find your valuable material. To prevent keyword stuffing, use keywords wisely with respect to SEO: keywords should be reasonable and necessary, supporting proper placement and your targeted effort to achieve search results. Placing such words in the areas of HTML provided for them is perfectly allowed and reasonable. Google discusses keyword stuffing as randomly repeated keywords.
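One crude way to see why stuffed copy stands out is to measure keyword density, the fraction of words on a page that are a given keyword. The sketch below is a simplified illustration of that idea, not any search engine's actual method, and the sample sentences are invented:

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` equal to `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = "cheap widgets cheap widgets buy cheap widgets cheap cheap widgets"
natural = "we sell a range of widgets at prices to suit every budget"
print(round(keyword_density(stuffed, "cheap"), 2))   # 0.5
print(round(keyword_density(natural, "cheap"), 2))   # 0.0
```

Real engines go far beyond raw density, using themed, related-keyword techniques to interpret intent, but an abnormally high density is still an obvious red flag.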

More info : http://en.wikipedia.org/wiki/Keyword_stuffing

Thursday, April 15, 2010

Free Press Release Sites List

24-7PressRelease.com – Free release distribution with ad-support

1888PressRelease.com – Free distribution; paid services give you better placement and permanent archiving.

ClickPress.com – Distributes to sites like Google News and Topix.net; Gold level will also get you to sites like LexisNexis.

EcommWire.com – Focuses on ecommerce and requires that you include an image, 3 keywords and links.

Express-Press-Release.com – Free distribution company with offices in 12 states.

Free-Press-Release.com – Easy press release distribution for free, more features for paid accounts.

Free-Press-Release-Center.info – Distributes your release, offers a web page with one keyword link to your site. Pro upgrade will give you three links, permanent archiving and more.

I-Newswire.com – Allows for free distribution to sites and search engines; premium membership differs only slightly, adding graphics.

NewswireToday.com – All the usual free distribution tools, premium service includes logo, product picture and more.

PR.com – Not only will they distribute your press releases, but you can also set up a full company profile.

PR9.net – Ad supported press distribution site.

PR-Inside.com – European-based free press release distribution site.

PRBuzz.com – Completely free distribution to search engines, news sites, and blogs.

PRCompass.com – Distribute your press release with a free or paid version; others can vote it up, Digg-style.

PRUrgent.com – Not only distributes your release, but attempts to teach you how to write one, and even offers downloadable samples for you to work with.

Press-Base.com – Submit your release for free and get on their front page and the category of your choice.

PressAbout.com – A free press release service formatted as a blog.

PressMethod.com – Free press release distribution no matter what, but extra services based on the size of your contribution.

PRLeap.com – Free distribution to search engines, newswires, and RSS feeds. Fee based bumps get you better placement.

PRLog.org – Free distribution to Google News and other search engines.

TheOpenPress.com – Gives free distribution for plain formatted releases, fees for HTML-coded releases.

Monday, April 12, 2010

Web crawler

A Web crawler is a computer program that browses the World Wide Web in a methodical, automated manner. Other terms for Web crawlers are ants, automatic indexers, bots, worms, Web spiders, Web robots, or—especially in the FOAF community—Web scutters.

This process is called Web crawling or spidering. Many sites, in particular search engines, use spidering as a means of providing up-to-date data. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine that will index the downloaded pages to provide fast searches. Crawlers can also be used for automating maintenance tasks on a Web site, such as checking links or validating HTML code. Also, crawlers can be used to gather specific types of information from Web pages, such as harvesting e-mail addresses (usually for spam).

A Web crawler is one type of bot, or software agent. In general, it starts with a list of URLs to visit, called the seeds. As the crawler visits these URLs, it identifies all the hyperlinks in the page and adds them to the list of URLs to visit, called the crawl frontier. URLs from the frontier are recursively visited according to a set of policies.
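The seed/frontier process described above can be sketched in a few lines of Python. This is a toy illustration: the `toy_web` dictionary and `fetch_links` callback stand in for downloading pages and extracting their hyperlinks, and a production crawler would add politeness delays, robots.txt handling, and the per-site policies mentioned above:

```python
from collections import deque

def crawl(seeds, fetch_links, max_pages=100):
    """Breadth-first crawl: visit URLs from the frontier, collect new links.

    `fetch_links(url)` is a stand-in for fetching a page and extracting its
    hyperlinks; returns the set of URLs that were actually visited.
    """
    frontier = deque(seeds)   # the crawl frontier: URLs waiting to be visited
    visited = set()
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        for link in fetch_links(url):
            if link not in visited:
                frontier.append(link)
    return visited

# Toy "web": each page maps to the links it contains.
toy_web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a", "d"],
    "d": [],
}
print(sorted(crawl(["a"], lambda u: toy_web.get(u, []))))  # ['a', 'b', 'c', 'd']
```

Starting from seed "a", the frontier grows as new hyperlinks are discovered, and the visited set prevents re-crawling pages already seen.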

More info : http://en.wikipedia.org/wiki/Web_crawler

Wednesday, April 7, 2010

PageRank

PageRank is a link analysis algorithm, named after Larry Page, used by the Google Internet search engine that assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is also called the PageRank of E and denoted by PR(E).

The name "PageRank" is a trademark of Google, and the PageRank process has been patented (U.S. Patent 6,285,999). However, the patent is assigned to Stanford University and not to Google. Google has exclusive license rights on the patent from Stanford University. The university received 1.8 million shares of Google in exchange for use of the patent; the shares were sold in 2005 for $336 million.


(Figure caption) Mathematical PageRanks (out of 100) for a simple network (PageRanks reported by Google are rescaled logarithmically). Page C has a higher PageRank than Page E, even though it has fewer links to it; the link it has is of much higher value. A web surfer who chooses a random link on every page (but with 15% likelihood jumps to a random page on the whole web) will be on Page E 8.1% of the time. (The 15% likelihood of jumping to an arbitrary page corresponds to a damping factor of 85%.) Without damping, all web surfers would eventually end up on Pages A, B, or C, and all other pages would have PageRank zero. Page A is assumed to link to all pages in the web, because it has no outgoing links.
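The computation itself fits in a short power-iteration sketch. The three-page graph below is a made-up example, and the dangling-node handling follows the convention described above of treating a page with no outgoing links as linking to every page:

```python
def pagerank(links, damping=0.85, iterations=100):
    """Power-iteration PageRank.

    `links` maps each page to the pages it links to; every linked page is
    assumed to also appear as a key. Dangling pages (no outgoing links) are
    treated as linking to every page.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Each page keeps the random-jump share, then receives link shares.
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            targets = outs if outs else pages  # dangling node -> link to all
            share = damping * rank[p] / len(targets)
            for t in targets:
                new[t] += share
        rank = new
    return rank

# Tiny graph: A and B both link to C; C links back to A.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print({p: round(r, 3) for p, r in sorted(ranks.items())})
```

With a damping factor of 0.85 (the 15% random-jump probability), page C ends up with the highest rank here because both A and B link to it, while B, which nothing links to, keeps only the random-jump share.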

History

PageRank was developed at Stanford University by Larry Page (hence the name Page-Rank) and later Sergey Brin as part of a research project about a new kind of search engine. The first paper about the project, describing PageRank and the initial prototype of the Google search engine, was published in 1998: shortly after, Page and Brin founded Google Inc., the company behind the Google search engine. While just one of many factors which determine the ranking of Google search results, PageRank continues to provide the basis for all of Google's web search tools.

PageRank was influenced by citation analysis, developed by Eugene Garfield in the 1950s at the University of Pennsylvania, and by Hyper Search, developed by Massimo Marchiori at the University of Padua (Google's founders cite Garfield's and Marchiori's works in their original paper). In the same year PageRank was introduced (1998), Jon Kleinberg published his important work on HITS.

More info http://en.wikipedia.org/wiki/PageRank

Monday, April 5, 2010

Web Directories List: Free Directory Submission

Friday, March 5, 2010

List of Social Bookmarking Sites

AllMyFavorites – Share favorites/bookmarks with your friends. Search the favorites/bookmarks of others. Access your favorites/bookmarks from any computer with an internet browser.

Backflip – Automatically organizes personal bookmarks into a searchable hierarchical directory.

Blinklist – List of the best sites on the web

Blogmarks – BlogMarks is a collaborative link management project based on sharing and key-word tagging.

Blummy – blummy, a handy tool that puts your favorite services at your fingertips.

BuddyMarks – The Online Bookmark Manager. The Web’s best online personal, group and social bookmarks manager.

BookmarkTracker – free web-based bookmark management, social bookmarks, tracking, sharing, synchronizing and RSS services

Chipmark – Chipmark is an on-line bookmark manager that allows you to access your bookmarks from any computer. Chipmark fits seamlessly into your browser so you don’t even have to change your bookmarking habits.

del.icio.us – Keep, share, and discover the best of the Web using Delicious, the world’s leading social bookmarking service.

De.lirio.us – cut the web down to size with this social bookmarking tool.

Diigo – Diigo is a powerful research tool and a knowledge-sharing community

Dogear – dogear is meant to be used as an in-page bookmarking tool. If you often read blogs, articles, online books, or e-texts of whatever kind, dogear can help you keep track of your position within a text. It also keeps a history of everything you’ve read.

Favoritoo – Free web-based service to manage your bookmarks (favorites). Autologin, sharing, searching are some cool features offered by Favoritoo.

Feedmarker – Free bookmarker and newsreader with tagging

Foxmarks – Foxmarks is now Xmarks. Please visit Xmarks.com for the latest and greatest version of our free bookmark sync and backup add-on.

FreeLink – Keep your link pages private or share them with your friends and colleagues by sending them a URL (no login required). Take advantage of the Internet knowledge that you and your friends have collectively. Create a network of expertise by linking to each other’s pages.

GiveALink – Share your bookmarks with the community and help others navigate the Web

iKeepBookmarks – iKeepBookmarks.com allows you to upload, and keep, your bookmarks on the web for free. You can access them at any time, from any computer… anywhere!

Jack of All Links – Jack of All Links is built off of the websites that you send us. If you find a funny video, let us know. If you found a cute picture of a cat, let us know. We want links!

LinkaGoGo – Need instant access to your favorite bookmarks? from any browser? With our unique dynamic bookmark toolbars you will have your favorite sites always at hand.

Linkatopia – Would you like to share your bookmarks and favorite web sites with your friends but keep them hidden from everyone else? Linkatopia is a free utility for keeping your favorites online!

Linkroll – BOOKMARK, COMMENT, ORGANIZE, SEARCH. IT’S SIMPLE AND IT WORKS

Ma.gnolia (now called Gnolia) – Welcome to the re-launched Gnolia, an online community built around link saving and sharing. To keep Gnolia sustainable, membership is by invitation only, with former members given priority for now.

Mister Wong – Mister Wong is a leading social bookmarking service with portals in 6 languages and over 7 million monthly users globally

Mobilicio.us – Access del.icio.us bookmarks from your mobile device.

MyBookmarks – MyBookmarks is a free Internet service that allows you to keep your browser bookmarks and favorites online so you can access them from anywhere.

MyHq – Easily manage your bookmarks in a banner ad FREE environment. Import/Export, create public pages (password protected if you wish), share bookmarks. All for FREE.

Mylinkvault – MyLinkVault is a free online favorites manager. Other favorites managers can be so clumsy to use – trying to rearrange your favorites can be slow and frustrating.

MyPip – Make your own personal favorites page. Manage your favorite bookmarks and have them available everywhere within a mouse click.

My Stuff (from Ask) – Import your bookmarks or favorites

Netvouz – Netvouz is a social bookmarking service that allows you to save your favorite links online and access them from any computer, wherever you are.

OnlyWire – OnlyWire syndicates your content and articles to the web’s top social networking sites with a single button click.

Oyax – Oyax is a social bookmark manager. It allows you to add web sites to your personal collection of links, categorize those sites with tags and share your collection not only with your own browsers and machine, but also with other people.

Simpy – Simpy is a social bookmarking service that lets you save, tag, search and share your bookmarks, notes, groups and more.

SiteBar – SiteBar is a solution for people who use multiple browsers or computers and want to have their bookmarks available from anywhere without the need to synchronize them or take them along.

SiteJot – It allows you to store all your bookmarks/favorites in one online location, making them easy to access and manage from anywhere. Your bookmarks (organized by category) are displayed on a simple, well laid out page. SiteJot will even integrate with your web browser, allowing you to bookmark any site you are currently visiting with a click of your mouse.

Snipit – Intelligent bookmark management and information sharing

Socializer – The Socializer allows you to easily submit a link to several social bookmarking systems. Instead of having a link to each social bookmarking website, you have a single link to all of them!

StartAid – StartAid is perfect for saving all your Bookmarks and Favorites Online. You can quickly access your Bookmarks and Favorites from any computer and best of all, you will never lose a site again.

Stumble Upon – Discover the best of the web in less time.

Sync2it – SyncIT is the original FREE bookmark synchronizer.

Turboclip – yet another social bookmarking site.

Twine – You’re into a lot of things. Keep track of them with Twine.

WireFan – Save Your Favourite Sites and Quickly Access them from Anywhere.

Zurpy – Save your bookmarks, text clippings, images, files, and news feeds in one place. Easily tag and find what you’ve saved. Access your stuff from any computer anywhere. It’s free and extremely easy to use.