Everybody uses Google, and it’s by far the best and most popular search engine on the Internet. Have you ever asked yourself how Google decides which site ranks first and which site appears on the last page of the search results? Of course, it uses your keywords to determine which site is the most relevant to your query, but how does Google decide between two sites with the exact same title? You can push it even further: how would Google decide which site to rank first between two sites with the same content and the same title?
Google’s ranking algorithm is called PageRank and was developed by Google’s founders, Larry Page and Sergey Brin, at Stanford University. It is the core of Google’s ranking, even if a lot of other aspects influence the results. Here’s how Google defines its PageRank algorithm:

PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at considerably more than the sheer volume of votes, or links a page receives; for example, it also analyzes the page that casts the vote. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.” Using these and other factors, Google provides its views on pages’ relative importance.
Of course, important pages mean nothing to you if they don’t match your query. So, Google combines PageRank with sophisticated text-matching techniques to find pages that are both important and relevant to your search. Google goes far beyond the number of times a term appears on a page and examines dozens of aspects of the page’s content (and the content of the pages linking to it) to determine if it’s a good match for your query.

So there you go: the more backlinks a website has, the better its chances of being highly ranked. The quality of the websites pointing to it also has a lot of influence. I think it’s probably the best technique so far, since your search returns the sites that seem to be the most popular for the query you entered. This is where it gets tricky, because more popular doesn’t necessarily mean better quality. Let’s say you build a new website about technology with a lot of quality content, built to the highest standards of the industry. Now you write a new post about the MacBook Air, hoping to get some traffic when people search for MacBook Air on Google. Even if your website is the best available for that keyword, it might be returned on page 15 or 20 of Google’s search results. Why? Because you don’t have any links pointing to your website, and therefore a poor PageRank. It might also be almost impossible for you to reach the first page of Google’s results, because you could be competing with websites that have millions of backlinks.

That’s where Google’s ranking gets tricky. Most users don’t go any further than the first 30 results, and sometimes you’d be surprised by the quality of the links you find on pages 7 or 8. Those websites might be a little less popular, but that doesn’t mean they don’t offer quality content. It’s hard work in today’s Internet industry to get a good rank on Google, because you are fighting people and companies that want to be first on Google and have full-time employees whose job is to optimize rankings. There are many techniques for improving your rankings other than backlinks, known collectively as SEO (Search Engine Optimization), but they are out of the scope of this article.

So there it is, that’s how Google works and how it decides which site should be ranked first for specific keywords. It has a lot of advantages but at the same time, smaller websites with quality content struggle to get decent traffic. That’s the downside.
The World Wide Web Consortium (W3C) is the main international standards organization for the World Wide Web (abbreviated WWW or W3). Founded and headed by Tim Berners-Lee,[2] the consortium is made up of member organizations which maintain full-time staff for the purpose of working together in the development of standards for the World Wide Web. As of 8 September 2009, the World Wide Web Consortium (W3C) has 356 members.[1]
W3C also engages in education and outreach, develops software and serves as an open forum for discussion about the Web.

Standards
W3C/IETF Standards (over Internet protocol suite):


The query processor has several parts, including the user interface (search box), the “engine” that evaluates queries and matches them to relevant documents, and the results formatter.

PageRank is Google’s system for ranking web pages. A page with a higher PageRank is deemed more important and is more likely to be listed above a page with a lower PageRank.

Google considers over a hundred factors in computing PageRank and determining which documents are most relevant to a query, including the popularity of the page, the position and size of the search terms within the page, and the proximity of the search terms to one another on the page. A patent application discusses other factors that Google considers when ranking a page. See SEOmoz.org’s report for an interpretation of the concepts and the practical applications contained in Google’s patent application.

Google also applies machine-learning techniques to improve its performance automatically by learning relationships and associations within the stored data. For example, the spelling-correcting system uses such techniques to figure out likely alternative spellings. Google closely guards the formulas it uses to calculate relevance; they’re tweaked to improve quality and performance, and to outwit the latest devious techniques used by spammers.
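Google’s actual spell-correction models are proprietary, but the basic idea of suggesting a likely alternative spelling can be sketched with Python’s standard-library `difflib`, which ranks known words by string similarity. The tiny vocabulary here is an invented stand-in for the enormous query logs Google learns from:

```python
import difflib

# A made-up stand-in vocabulary; a real system would learn likely
# spellings from huge amounts of stored query data instead.
vocabulary = ["pagerank", "google", "algorithm", "search", "engine"]

def suggest(word, vocab=vocabulary):
    """Return the closest-matching known word, or None if nothing is close."""
    matches = difflib.get_close_matches(word, vocab, n=1)
    return matches[0] if matches else None

print(suggest("algoritm"))  # prints "algorithm"
```

This is, of course, a much simpler technique than whatever Google actually runs, but it shows how a system can propose a plausible correction without any hand-written spelling rules.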

Indexing the full text of the web allows Google to go beyond simply matching single search terms. Google gives more priority to pages that have search terms near each other and in the same order as the query. Google can also match multi-word phrases and sentences. Since Google indexes HTML code in addition to the text on the page, users can restrict searches on the basis of where query words appear, e.g., in the title, in the URL, in the body, and in links to the page, options offered by Google’s Advanced Search Form and Using Search Operators (Advanced Operators).
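Google’s real scoring formulas are secret, but the proximity-and-order idea can be illustrated with a toy scorer. The weight of 10 below is an arbitrary assumption: pages containing the query words adjacent and in query order get a large bonus on top of plain term counts.

```python
def proximity_score(query, page_text):
    """Toy relevance score: count individual term hits, plus a heavy
    bonus whenever the query words appear adjacent and in query order."""
    terms = query.lower().split()
    words = page_text.lower().split()
    term_hits = sum(words.count(t) for t in terms)
    phrase_hits = sum(
        1 for i in range(len(words) - len(terms) + 1)
        if words[i:i + len(terms)] == terms
    )
    return term_hits + 10 * phrase_hits  # 10 is an arbitrary weight

# Both pages contain both query words, but only the first has them
# adjacent and in order, so it scores far higher.
a = proximity_score("macbook air", "my macbook air review")
b = proximity_score("macbook air", "air travel with a macbook")
```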

Let’s see how Google processes a query.

Search engine optimization refers to the tasks carried out to maintain a quality ranking and indexation of a website across the major search engines. It is a two-part process. First, you make sure the major search engines know which keywords you would like your pages to rank for; this is done with "on-page" search engine optimization. Second, the search engines need to know how popular your page is. To measure this, they look at all the other websites they have records for, add up how many of them link to you, and give each of these links a notional value. In short, Google, Yahoo, and Bing run what is essentially a massive popularity contest, and the business with the largest volume of targeted, high-quality links will rank first. This is called "off-page" search engine optimization.

Did you know you could post your blog posts to Twitter automatically, without having to open a single page on Twitter? I just registered for this amazing facility at Twitterfeed and, believe me, I don’t have to post a tweet for every single new topic that appears in my company blog.

  1. Go to Twitterfeed.com and register, or log in directly if you have an OpenID (your Blogger or WordPress blog URL). Enter your Twitter user credentials.
  2. Create a new Twitterfeed. For Blogger, set the feed URL to "http://YourBlogName.blogspot.com/atom.xml?redirect=false". This makes posting faster, since it is the raw feed. For WordPress and other blogging systems, likewise use the raw feed, not the FeedBurner feed.
  3. Set the other options and update the feed. Whenever you post a new entry to your blog, it will be posted to Twitter automatically.

    RSS (Rich Site Summary) is a format for delivering regularly changing web content. Many news-related sites, weblogs and other online publishers syndicate their content as an RSS Feed to whoever wants it.




    Why RSS? Benefits and Reasons for using RSS
    RSS solves a problem for people who regularly use the web. It allows you to easily stay informed by retrieving the latest content from the sites you are interested in. You save time by not needing to visit each site individually. You ensure your privacy, by not needing to join each site's email newsletter. The number of sites offering RSS feeds is growing rapidly and includes big names like Yahoo News.
    What do I need to do to read an RSS Feed? RSS Feed Readers and News Aggregators
    Feed Reader or News Aggregator software allow you to grab the RSS feeds from various sites and display them for you to read and use.
    A variety of RSS readers are available for different platforms. Some popular feed readers include Amphetadesk (Windows, Linux, Mac), FeedReader (Windows), and NewsGator (Windows; integrates with Outlook). There are also a number of web-based feed readers available. My Yahoo, Bloglines, and Google Reader are popular web-based feed readers.




    Once you have your Feed Reader, it is a matter of finding sites that syndicate content and adding their RSS feed to the list of feeds your Feed Reader checks. Many sites display a small icon with the acronyms RSS, XML, or RDF to let you know a feed is available.
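Under the hood, a feed reader simply fetches an XML document and pulls the items out of it. Here is a minimal sketch using only Python’s standard library; the sample feed below is invented for illustration:

```python
import xml.etree.ElementTree as ET

# An invented RSS 2.0 document, standing in for a real feed fetched
# over HTTP from a site that syndicates its content.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item>
      <title>First post</title>
      <link>http://example.com/first</link>
    </item>
    <item>
      <title>Second post</title>
      <link>http://example.com/second</link>
    </item>
  </channel>
</rss>"""

def read_items(rss_text):
    """Return (title, link) pairs for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(rss_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in read_items(SAMPLE_RSS):
    print(title, "->", link)
```

A real reader adds the missing pieces around this core: downloading each subscribed feed on a schedule, remembering which items you have already seen, and rendering the results.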
    PageRank is a link analysis algorithm, named after Larry Page, used by the Google search engine. It assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references.
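As a rough sketch of how such a weighting can be computed, here is a toy power-iteration version of the algorithm in Python. The three-page link graph and the 0.85 damping factor (the value used in the original PageRank paper) are illustrative assumptions, not Google’s actual data:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. `links` maps each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                # A dangling page spreads its rank over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each page splits its "vote" among the pages it links to.
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Page C is linked to by both A and B, so it ends up most "important";
# A is next because it receives the full vote of the important page C.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(web)
```

Note how the weighting captures the "votes cast by important pages weigh more" idea from the quote earlier: a single link from a high-ranked page can be worth more than many links from obscure ones.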

    Page Rank, also known as "Google juice," is the rank that Google assigns to individual web pages within your site. Page Rank (PR) is based on a logarithmic 0-10 scale that considers the content and transparency of your site, as judged by Google's search bots, as well as the number and quality of inbound and outbound links to the site.