
Wednesday, 18 July 2012

Firefox 14 Now Encrypts Google Searches, But Search Terms Still Will “Leak” Out


Firefox 14 officially launched today, which means all Google searches are encrypted by default. However, due to a Google loophole, the encryption will not prevent the things you search for from “leaking” out to Google’s advertisers, nor from potentially showing up as search suggestions or in data reported to web sites through Google Webmaster Central. The Firefox team said of the change:

We automatically make your Google searches secure in Firefox to protect your data from potentially prying eyes, like network administrators when you use public or shared WiFi networks.

This is true. The “secure” version of Google search that Firefox will be using — called Google SSL search — does prevent anyone from “eavesdropping” on what you’re searching for. However, Google SSL search will still tell advertisers what you searched for if you click on their ads. If Firefox were trying to make searching fully secure, it would also block what’s called “referrer” information from being passed along, in addition to using Google SSL Search. Technically, this shouldn’t be a problem, but Firefox has apparently decided against doing it. Our previous story explains more:

Firefox To Use Google Secure Search By Default; Expect More “Not Provided” Keywords To Follow
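The mechanics of that “leak” are easy to demonstrate. Here is a minimal Python sketch, with an invented example URL, of what any destination site can recover from a classic unencrypted Google referrer:

```python
from urllib.parse import urlparse, parse_qs

def search_terms_from_referrer(referrer):
    """Extract the search query a visitor typed, if the referrer
    is a non-SSL Google search result URL carrying a q= parameter."""
    parsed = urlparse(referrer)
    query = parse_qs(parsed.query)
    # SSL search strips the query string, so q is simply absent.
    return query.get("q", [None])[0]

# A pre-SSL-search referrer exposes the exact search terms:
print(search_terms_from_referrer(
    "http://www.google.com/search?q=sensitive+medical+condition"))
```

This is why blocking referrers matters as much as encrypting the connection: the search terms travel in the URL itself.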

As for Google, it could also prevent referrer information from being passed along to advertisers, if it wanted. However, it made a deliberate choice to keep providing this information. The choice continues to be confusing. When Google made the change last October to block referrer information for non-advertisers, the argument was that this was intended to protect privacy, that search terms themselves were potentially sensitive and revealing information. However, those same potentially sensitive terms are provided to advertisers, plus they may be revealed within things like Google Autocomplete or in data reported to publishers through Google Webmaster Central. The articles below explain more about these issues:

Google Puts A Price On Privacy
2011: The Year Google & Bing Took Away From SEOs & Publishers
Google’s Results Get More Personal With “Search Plus Your World”
Google “Search Plus Your World” To Launch Beyond US? Likely, As Secure Search Set To Expand

For those seeking full privacy, consider some of the search options listed below:

Scroogle’s Gone? Here’s Who Still Offers Private Searching

Postscript: My Debate With Firefox

I’ve been having a bit of a back-and-forth with Asa Dotzler, the director at Mozilla who oversees Firefox, who accuses me both of not understanding how Google SSL Search works and of misrepresenting what Mozilla has said about how it will provide privacy within Firefox. Actually, I’ve come to think that Mozilla doesn’t understand how Google SSL Search works and has itself been misrepresenting how privacy protection will work — and not work — within Firefox.
SSL Search Blocks Two Types Of Leakage, Not One

Here’s the comment at The Verge where Dotzler tells me I don’t understand what’s happening:

Danny, you misunderstand what SSL search is trying to accomplish. We’ve made the connection between the user and Google secure from snooping. That’s what SSL does and that’s why we’ve implemented it. Google can do what ever it wants with the data once it gets it, but the bad guys sniffing your wi-fi connection cannot get at your information.

Given that I’ve been writing about Google SSL Search in-depth (see those links above) since Google launched it last October, yeah, I have a pretty good idea of what it is and what Google was trying to accomplish with it. My reply at The Verge:

I’ve not misunderstood what SSL search is trying to accomplish. In fact, I probably understand it better than you do. Otherwise, I wouldn’t be having to explain the next part. SSL Search was rolled out because Google said that search term data was too sensitive to be leaked out, either through eavesdropping on a connection (what encryption prevents) or by passing along those terms in referrer data to publishers. SSL Search blocked BOTH of those things, because Google itself felt they were co-equal issues. SSL Search, however, specifically did not block passing referrer data to Google’s advertisers. Sensitive search terms data was apparently not so sensitive for Google’s advertisers to have access to. When Firefox makes use of SSL Search, you’re still allowing all those advertisers to see the search data that supposedly is too sensitive to leak out to non-advertisers. If you really wanted to make SSL Search as secure as Google could have — and should have — made it, then Firefox would stop passing referrers. Alternatively, you could use the completely separate Google Encrypted Search. That would prevent referrer leakage except in the extremely rare case where someone left Google for another secure site. The site would still see the referrer, but at least the data would remain encrypted. I’m pretty sure that by using SSL Search, the referrer data is being passed along without encryption, potentially opening up the ad clicks from Google to eavesdropping.

If you want to understand more about this, the referrers, the difference between Google SSL Search and Google Encrypted Search and how it all plays out with Firefox, I’ll refer you back to reading this previous post from me: Firefox To Use Google Secure Search By Default; Expect More “Not Provided” Keywords To Follow.

Friday, 13 July 2012

Google Analytics Introduces Content Experiments!

Yesterday, Google Analytics announced Content Experiments, which brings goal measurement, testing and optimization together in one place. Content Experiments allows you to test how well different versions of your web pages work with a random sample of your visitors.

Google has integrated Content Experiments into the Google Analytics tool, with reporting much improved over Website Optimizer. Google will say goodbye to the standalone Google Website Optimizer in August 2012. According to the Google Analytics Blog,

"The last day you’ll be able to access Google Website Optimizer, and any reports for current or past experiments, will be August 1, 2012."

What you can do with Content Experiments:

Compare how different web pages perform with your website visitors
Define the percentage of your visitors included in the experiment
Choose what type of goal you’d like to test
Get updates by email about how your experiment is doing (not currently available)
Content Experiments makes it easy to test how your web pages perform with your visitors. After an experiment, you can see which page gets the most conversions. It also provides advanced reports covering page metrics, goal conversions, how many visitors engaged with your experiments, and more.
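The “percentage of visitors” control above can be pictured as deterministic bucketing. This is only an illustrative sketch with invented visitor IDs, not how Google Analytics actually assigns visitors:

```python
import hashlib

def in_experiment(visitor_id, percent):
    """Deterministically place `percent`% of visitors into the
    experiment by hashing the visitor ID into a 0-99 bucket."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent

# The same visitor always lands in the same group across visits:
assignments = [in_experiment("visitor-42", 50) for _ in range(3)]
```

Hash-based assignment means a returning visitor keeps seeing the same page variant, which keeps the experiment’s results consistent.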

Google starts Jelly Bean roll out with Galaxy Nexus



Internet search giant Google has begun rolling out the latest version of its Android operating system to users' smartphones.

Owners of the HSPA+ version of Samsung's Galaxy Nexus, available in the UK, most of Europe and the U.S., are the first to receive the update.

Google claims that the Jelly Bean software offers an improved search experience.
According to the BBC, the update, also known as Android 4.1, poses a direct challenge to Siri, the iPhone's voice-recognition app.

Google said it has improved Voice Search so that it can display answers to spoken questions from sources including Wikipedia.

According to the report, it has also introduced Google Now, which is designed to offer information without the user having to actively trigger a query.

"Google Now tells you today's weather before you start your day, how much traffic to expect before you leave for work, or your favourite team's score as they're playing," said the firm in an update to the Nexus page on its Google+ social network.

According to the report, both newly introduced features are potentially in breach of an integrated search patent filed by Apple.

Thursday, 12 July 2012

Learn How You Can Become a Master in Web Design

The design of a website often determines its success or failure. With an attractive and interesting web design, you can grab the attention of your visitors. It is very important to have a simple, appealing and user-friendly web design to encourage visitors to revisit the site and also talk about it with their friends. This can help you build a good reputation among customers. You cannot hold visitors on your website if you do not have impressive designs.

Here are some suggestions that can help you in designing good websites.

1. Select Appropriate Font:

Select a font style that looks professional and pleasing. Don't go for Comic Sans or other fancy-looking fonts, which are not available on every computer. Remember, if your visitor does not have those fonts on their system, the page may fall back to a default font and leave a bad impression.

2. Edit your website:

It is very important to edit your site before putting it on the server for everyone to visit. Check the whole site at least twice before publishing it. Thoroughly go through the content of the site, because poor-quality content looks unprofessional, and you may lose visitors because of an unprofessional appearance.

3. Keep your links up-to-date and functioning

A good web design is one that does not display error messages and is easy to navigate. Check every link on your site daily to make sure they are all working smoothly.
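Checking every link by hand gets tedious, and the first step toward automating it is collecting the links from each page. A minimal sketch with Python's standard library (checking each collected URL's HTTP status is then one urllib.request call per link):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather the href of every anchor tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

collector = LinkCollector()
collector.feed('<p><a href="/about.html">About</a> '
               '<a href="http://example.com/">Home</a></p>')
print(collector.links)
```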

4. Work with multiple platforms

Learn the various technologies used in web design, such as PHP, Java and MySQL, which can help you work on a variety of websites. If you want to become a leading web designer, it is necessary to learn different web design languages and server environments. You have to become a multi-talented designer if you want to work on different websites.

5. Take guidance from professionals

Take help from experts when learning tools like HTML, Photoshop or Dreamweaver. Learn every aspect of web design before you start working like a professional. This will prepare you for the new challenges you may face while working on various websites.

Google Docs: A part of Google Drive


After the release of Google Drive, Google's cloud-based storage service, many users who were already using Google Docs are wondering: how is Google Drive different from Google Docs?

The simplest answer that comes to mind is that Google Docs is now just a part of Google Drive.

Google Docs refers to the editors for Google documents, spreadsheets, presentations, drawings, and forms. These are online documents that live in the cloud and provide real-time collaboration features.

A major change you will see is that the DOCUMENTS link on your Google Apps navigation bar becomes DRIVE.

With the newly launched Google Drive, one can access all files, including both Google Docs and local files, from a web browser or any device where Google Drive is installed.

With an initial 5GB of free storage, one can store all file types, including documents, presentations, music, photos, and videos. One can even open many files in the browser, including PDFs, Microsoft Office files, high-definition videos, and many image file types, with no need for a corresponding program to be installed on your computer.

Google Drive allows you to store files and sync them between computers, both on the web and offline. It’s basically just like Dropbox but also allows having the Google Docs aspect with it.

The best feature of this cloud-based storage is that one can share and edit files anytime and anywhere. The data stored there is also secure, so if there is a problem with the device, the files remain safe.

Google Drive provides many ways to view, find, and sort files. It includes powerful search options—even the ability to search for text in pictures—to quickly find what one is looking for.

Offline access to your Google Documents and Spreadsheets is also possible, but without an internet connection one can only view them; editing is not possible while offline. For a non-Google document or spreadsheet, one can both view and edit the files from the Google Drive folder without an internet connection, and any changes made to a synced file while offline will sync to all devices when reconnected to the internet.

Users are curious to try Google's newly released Google Drive. The service is expected to arrive on all domains very soon so that everyone can take full advantage of its features.

Google Webmaster Tools Update: Organize your YouTube presence through Associates

Google has announced a new feature in Webmaster Tools that enables "associates", i.e. trusted users, to act on behalf of your site in other Google products. These associates cannot view website data or take any action in Webmaster Tools the way the site owner can, but through this new feature they can perform specific functions in other Google products.

To put this more simply, a website that is verified in Webmaster Tools and has accounts on social networks like Twitter and YouTube can now associate those accounts with Webmaster Tools. Through this feature, visitors can be confident that those accounts are actually associated with the website.

Presently, members of the YouTube Partner Program are able to associate their YouTube channels with Webmaster Tools, making the YouTube channel the official channel for the website.

Wednesday, 11 July 2012

Why SEO Is Necessary For Improving Your Business Online

In recent years, e-commerce and online business have become a major part of the industry. People are becoming more aware of the products and businesses available online and more comfortable buying things online. From a business owner's point of view, it is very important to have a robust online presence so that users can find you and look at the products listed on your website. Your website is a showcase or showroom for all the products or services you want to sell, and it is very important to make it impressive and interactive so that visiting users can understand the products that fit their requirements. However, it is also very important to attract users to your website in the first place. There might be hundreds of thousands of websites built around the same idea or selling the same product you want to sell, and some of them may have been operating long before you launched your site.

One more important fact is that people might not remember your web address, or might not be aware of your site's URL at all. So how can you get more users to your website? Users rarely type a website address into the browser; they think of keywords and search for what they need using a search engine like Google or Yahoo. For example, if a user needs to buy a laptop, he or she will simply type the keyword "buy laptop" into a search engine and visit a website from the search results. It is also notable that most users do not go beyond the first page of results. So how in the world can you make sure your website appears on the first page of results for a particular keyword on a particular search engine?

SEO is the answer. Search Engine Optimization, as the name suggests, involves performing activities on and for your website to make sure it gets the required exposure online and attracts users. It involves many activities to make your website popular, and without an effective SEO program you may never reach the required audience. No matter how good your product is, no matter how wonderful your business services are, you can never reach customers without a good marketing strategy. SEO is a form of promotion for your website that creates awareness of the products or services you are offering online and helps generate traffic to your website, which can then be converted into sales.

Monday, 9 July 2012

GOOGLE PANDA & PENGUIN UPDATE: DOES THIS RESTRICT YOUR COMPANY?


Since Google introduced changes to its algorithm, most companies have been searching for ways to recover from these updates, which have become a matter of great concern for people in the SEO industry.

As Google is the king of all search engines, any update to its algorithm majorly affects all the websites ranked in it.

Most SEO companies are facing the consequences of these updates. SEO service providers that use ethical practices to rank websites in the top positions do not suffer from such updates. Shivaami Corporation is one such SEO company in India that has not been hurt by the consequences of these updates.

Some SEO companies have shut down since the Penguin update went live, as they were not able to recover from the negative impact of these updates and deliver results, and hence lost their clients' confidence. They were no longer able to rank their clients' websites in the top positions.
THE PANDA UPDATE:

The Panda update, which went live in April 2011, targeted websites that were of low quality. The aim was to reduce the rankings of low-quality sites and improve the rankings of quality sites that offer visitors original, high-quality information. Under the Panda update, Google gives little weight to low-quality websites or websites with copied content. The motive behind this update was to give searchers accurate search results instead of manipulated results from sites with poor content.
THE PENGUIN UPDATE:

The Penguin update, which went live in April 2012, targeted websites that were over-optimized and used aggressive SEO techniques to achieve top rankings. Aggressive link building, unnatural link patterns and low-quality backlinks are no longer tolerated. One of the major effects is on backlinks that use too much exact-match anchor text: Google penalizes domains that have too many backlinks using the same anchor text.
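As a toy illustration of what "too many back links using the same anchor text" means (this is not Google's actual method, and the anchor texts below are invented), one can compute the share of a link profile taken by its single most common anchor:

```python
from collections import Counter

def top_anchor_share(anchors):
    """Fraction of all backlinks using the single most common
    anchor text; a very high share suggests an unnatural profile."""
    counts = Counter(anchors)
    most_common = counts.most_common(1)[0][1]
    return most_common / len(anchors)

# A natural profile mixes brand names, URLs and generic phrases:
natural = ["Acme Corp", "acme.example.com", "click here",
           "this article", "Acme Corp"]
# An aggressive profile repeats one money keyword over and over:
aggressive = ["buy cheap laptops"] * 9 + ["Acme Corp"]

print(top_anchor_share(natural))     # most common anchor: 2 of 5
print(top_anchor_share(aggressive))  # most common anchor: 9 of 10
```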

Many users are supportive of the update, as it aims to make the internet a better place for all users, but some are also of the view that it may lower the rankings of some high-quality websites.

These updates were an attempt by Google to remove the high amount of spam on the web.

If you are an SEO service provider, it is a must to know about these updates and their effects, and to start adopting ethical practices, if you are not already, to stay safe from the effects of these updates.

Facebook Launches a New App for Group Chats and Messaging

In a bid to expand its mobile offerings, social media giant Facebook has introduced a new mobile app called Messenger. It is a standalone app, separate from Facebook's regular mobile app, and is available for iPhone and Android phones.


It appears to be built on the group messaging app Beluga, which Facebook acquired recently. According to officials at Facebook, Messenger is a separate app that lets you send and receive messages faster, with a single tap. It supports notifications and text alerts so the user can receive and view messages instantly. Apart from Facebook friends, the app can also be used to reach your phone contacts, and a group chat feature is included.

Things to Keep in Mind while Designing and Developing a Search Engine Friendly Website

Hey friends, I am here again with a refreshing new topic. This time, I would like to share the secrets of an attractive website that can hold on to visitors as well as web spiders. So, let's leave the introduction behind and start discussing the three main features that are a must for a website.
The very first one is readable text. Suppose you entered a keyword in the search bar of a search engine and clicked on the first result on the results page, but what you saw was confusing and unreadable due to dark hues, a busy background design and tiny text. Just reading this example probably gives you frown lines. Hence, it's advisable not to use wild backgrounds with hard-to-read text, as it will confuse your readers and give the site an amateur look. Use a sensible font size, font style and readable font color.
Another important feature is the navigation of the website. According to reports, clear and consistent navigation is one of the vital factors in holding visitors. Cluttered navigation cannot hold a visitor for more than a minute. So, try to make the hierarchy of your site logical and uniform, and include a site map or index page for any site that has more than 10 pages.
The third and most important feature is the use of code, the programming behind the site. You must be aware that code is the lifeline of a website: excellent coding produces a compelling, eye-catching site. But have you ever wondered whether the code you develop is search engine friendly or not? Very often, it is not.
Most developers don't even bother to adjust their code for search engines once the coding is done. However, code should be search engine friendly. Web developers make various mistakes here, and the most common is placing page titles and descriptions in shared include files, so that every page ends up with the same title and description; this should be avoided, since each page needs a unique title and description. Also, use a forward slash '/' at the end of every URL.
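The include-file mistake described above can be caught mechanically. Here is a hypothetical sketch that flags pages sharing a title (the page paths and titles are invented for illustration):

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Map each title to the pages using it and return only
    titles shared by more than one page."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = {
    "/index.html":   "Acme Widgets",
    "/about.html":   "Acme Widgets",   # title copied from an include file
    "/contact.html": "Contact Acme",
}
print(find_duplicate_titles(pages))
```

Running a check like this across a site quickly shows which pages inherited the same title from a shared template.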

SEO Experts Driving Traffic to your Website

When you search for an SEO company in Toronto, it's important to do your research with care. It's very easy to claim to be an SEO expert, but not all of them have the skills, the experience and the ability to work on your web site, and sometimes it can be difficult to make a decision. Many independent reviews provide information on the more honest SEO companies, and if you know how to find reliable, independent reviews, you can learn a lot from them.
Just by browsing, you can gather consistent results about the length of support, ranking improvements and other related statistics and data that you cannot get from the official sites of these Toronto SEO companies. From the reviews, you can also get a glimpse of how these so-called SEO experts handle the specifics of SEO service in Toronto, particularly their SEO tactics.
Why not simply trust the official sites of these SEO companies instead? While it is true that the official sites of most companies highlight testimonials and brief client accounts, these so-called reviews are by no means free of bias. Remember, many companies that call themselves SEO experts make bold claims, and featuring favorable reviews is one of many strategies used to back those claims up.
It is better to investigate an SEO company before you hire them.
The most reliable source of information is a person who has already used the services: perhaps a friend, an expert or just an acquaintance who has previously used a Toronto SEO company. These people will have first-hand information on SEO companies and the types of services they offer. By listening to what they have to say, you can begin to build your list of SEO companies, or begin to cross off the names that should not be on it. Ideally, it should be a list of SEO companies that work in your area.

Yahoo and Facebook enter agreement over ad sales, patent rights

Yahoo and Facebook reached an agreement Friday to expand the partnership between the tech giants.
AllThingsD first reported the deal, which comes as the final settlement in the patent infringement lawsuits filed by each of the companies.
Yahoo and Facebook will launch joint advertising sales efforts and begin cross-licensing key patents. Yahoo’s Board of Directors agreed to the deal Friday morning, according to the report.
In contrast with Facebook’s recent $550 million payment to Microsoft, no money will change hands in the Yahoo deal.
Yahoo interim CEO Ross Levinsohn and Facebook COO Sheryl Sandberg largely negotiated the deal. Contention between the two companies was centered on the lawsuits spearheaded by recently ousted Yahoo CEO Scott Thompson, and the new negotiations started almost as soon as Thompson’s firing was announced last month.
Many top level Yahoo executives had expressed concern over the attack on Facebook, so Levinsohn jumped at the opportunity to establish a more positive relationship between the companies.

Friday, 6 July 2012

Google Nexus 7 tablet review: Solid, but not revolutionary

The Google Nexus 7 tablet resets expectations of what an inexpensive tablet can and should be. Starting at $199, the Nexus 7 clearly guns for Amazon’s same-priced but lesser-quality Kindle Fire, which runs Amazon’s limited flavor of Android. Make no mistake: Of today’s 7-inch Android tablets, the Nexus 7 is the one to beat, and it is handily one of the best-executed Android tablets of any size you can buy. In some ways, that’s not saying much; for as much as it does well—it has a tremendous 10-plus-hour battery life, and it produces reasonably clear text and accurate colors—the Nexus 7 stumbles by leaving out an expansion slot. You need to step up to the 16GB $249 version for the Nexus 7 to make a sensible purchase, and even then you’ll be settling for something short of the ideal tablet.
The lack of a memory card slot hobbles Google’s shiny new tablet before you can even get moved in and set up. Android has always held a big advantage over Apple’s iOS in its ability to expand on-board storage via a memory card; in fact, this is something that every tablet competing with the Nexus 7 except the Amazon Kindle Fire (and Apple’s iPad, natch) has. The Kindle Fire has taken lots of flack for providing a baseline model with only 8GB of storage and no room to grow.

It’s not clear why Google opted to leave out the card slot. Cutting it may be as much about Google’s live-in-the-cloud philosophy and services as it is a cost-cutting measure adopted by Google and Nexus 7 manufacturer Asus in order to meet an aggressive price. If Google’s emphasis on cloud services is indeed behind this choice—and likely that’s the case, given that Google bills the Nexus 7 as being “Made for Google Play”—that frankly makes Google’s despotism no better than Apple’s decision to keep users in its walled garden or Amazon’s decision to force us to use its cloud services with the Kindle Fire. Amazon, too, tried to spin its minimal on-board storage by saying that you could store media in, and stream content from, its cloud services. That approach is not rooted in consumers’ real-world usage patterns, and it doesn’t account for the vagaries of Wi-Fi availability and bandwidth. Consumers crave offline storage; we’re still away from wireless connections often enough for local storage to matter. No one wants to have to keep managing their content on and off the tablet just to work around a space limitation.
Given that we’re seven months on from when Amazon’s first-generation Kindle Fire was introduced, I’m surprised and disappointed that Google didn’t push the default memory on the Nexus 7 to 16GB in the $199 model. Now that would have gotten our collective attention—and rightly so. With the Nexus 7 you’re going to be downloading movies and television shows in high-definition, using apps optimized for high-definition displays, and loading up your high-resolution images for use in the gallery; and with all that activity, 8GB just won’t go very far. That amount of storage, with only 5.62GB of user-accessible space when you start the tablet for the first time, is too parsimonious to make the Nexus 7 a tablet I can recommend whole-heartedly. That’s unfortunate, because the Nexus 7 actually gets a lot right—far more than most competing Android machines.

Google to Shut Down iGoogle

Google is cleaning house again. This time the company is shutting down five services.
Google has a long history of unceremoniously killing off its less-used services, having previously axed once-high-profile efforts like Wave, Buzz, Knol and Gears, among others.
The most notable Google service on the chopping block this time is iGoogle, the company’s customizable homepage. Similar to Netvibes, MyYahoo or the now defunct PageFlakes, iGoogle was a dashboard for the web, allowing users to embed gadgets like weather, email and news.
When iGoogle first launched in 2005 it was something of a me-too effort, duplicating features found in other services, but adding numerous Google-centric gadgets. Eventually iGoogle’s gadget selection grew to encompass everything from feed readers to web-based games.
Citing the growth of mobile and web apps that “put personalized, real-time information at your fingertips,” Google says “the need for iGoogle has eroded over time.”
Fans of iGoogle don’t need to panic just yet; Google doesn’t plan to completely shut the service down until November 1, 2013. Presumably Google sees Google+ as a replacement. Other alternatives include Netvibes and PageFlakes, which both offer similar widget-based dashboard home pages. [Update: PageFlakes ceased operation in January 2012. Other possible replacements for iGoogle include UStart and ProtoPage.]
The other four services on Google’s spring cleaning shortlist include a Symbian search app, Google Talk Chatback (an embeddable Google Talk widget), Google Video, which long ago stopped taking new uploads, and Google Mini, part of Google’s enterprise search service.

Just How Interested Is the World in SEO?

How big is the SEO industry on the Internet? Really big, according to a new infographic by Spanish-based SEO service provider BlueCaribu.
In fact, the infographic reveals that 3.5 people look up the term "SEO" on Google each second and 9.1 million web users dig up information on the topic each month.
People are getting their information about search engine optimization (SEO) in a variety of ways. There are 863 million websites worldwide that mention the term "SEO," and 164,000 YouTube videos are indexed with the topic.
Los Angeles may be the number one U.S. city for searches on SEO, but India is the top country with an interest in SEO strategy, followed by Pakistan and the Philippines. The U.S. is ranked fourth, while Canada snagged the fifth spot.
Other fun facts: March is the most popular month to search for the term, and Thursday is the most popular day of the week to do so.
How often do you search for SEO on the Internet? What are some of your favorite resources? Let us know in the comments below.
This story was originally published on Mashable.

Thursday, 14 June 2012

PageRank


Mathematical PageRanks for a simple network, expressed as percentages. (Google uses a logarithmic scale.) Page C has a higher PageRank than Page E, even though there are fewer links to C; the one link to C comes from an important page and hence is of high value. If web surfers who start on a random page have an 85% likelihood of choosing a random link from the page they are currently visiting, and a 15% likelihood of jumping to a page chosen at random from the entire web, they will reach Page E 8.1% of the time. (The 15% likelihood of jumping to an arbitrary page corresponds to a damping factor of 85%.) Without damping, all web surfers would eventually end up on Pages A, B, or C, and all other pages would have PageRank zero. In the presence of damping, Page A effectively links to all pages in the web, even though it has no outgoing links of its own.
PageRank is a link analysis algorithm, named after Larry Page[1] and used by the Google Internet search engine, that assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element E is referred to as the PageRank of E and denoted by PR(E).
The name "PageRank" is a trademark of Google, and the PageRank process has been patented (U.S. Patent 6,285,999). However, the patent is assigned to Stanford University and not to Google. Google has exclusive license rights on the patent from Stanford University. The university received 1.8 million shares of Google in exchange for use of the patent; the shares were sold in 2005 for $336 million.[2][3]


Description

Cartoon illustrating basic principle of PageRank
A PageRank results from a mathematical algorithm based on the webgraph, created by all World Wide Web pages as nodes and hyperlinks as edges, taking into consideration authority hubs such as cnn.com or usa.gov. The rank value indicates the importance of a particular page. A hyperlink to a page counts as a vote of support. The PageRank of a page is defined recursively and depends on the number and PageRank metric of all pages that link to it ("incoming links"). A page that is linked to by many pages with high PageRank receives a high rank itself. If there are no links to a web page, there is no support for that page.
Numerous academic papers concerning PageRank have been published since Page and Brin's original paper.[4] In practice, the PageRank concept has proven to be vulnerable to manipulation, and extensive research has been devoted to identifying falsely inflated PageRank and ways to ignore links from documents with falsely inflated PageRank.
Other link-based ranking algorithms for Web pages include the HITS algorithm invented by Jon Kleinberg (used by Teoma and now Ask.com), the IBM CLEVER project, and the TrustRank algorithm.
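The recursive definition above can be computed by power iteration over a toy link graph. The sketch below is an illustration of the basic algorithm only, not Google's actual implementation; the graph, damping factor, and iteration count are all assumptions for the example.

```python
# A minimal PageRank sketch using power iteration over a toy graph.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        # Every page gets the "random jump" share first.
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # A page's rank is split evenly across its outgoing links.
                share = rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += damping * share
            else:
                # A dangling page effectively links to the whole web,
                # as the figure caption above notes for Page A.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(graph)
```

With this graph, C ends up ranked highest even though D links to it only once, because C's incoming links come from pages that are themselves well linked, which mirrors the "one link from an important page" point in the caption.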

History

PageRank was developed at Stanford University by Larry Page (hence the name Page-Rank[5]) and Sergey Brin as part of a research project about a new kind of search engine.[6] Sergey Brin had the idea that information on the web could be ordered in a hierarchy by "link popularity": a page is ranked higher as there are more links to it.[7] The original paper was co-authored by Rajeev Motwani and Terry Winograd. The first paper about the project, describing PageRank and the initial prototype of the Google search engine, was published in 1998;[4] shortly after, Page and Brin founded Google Inc., the company behind the Google search engine. While just one of many factors that determine the ranking of Google search results, PageRank continues to provide the basis for all of Google's web search tools.[8]
PageRank was influenced by citation analysis, developed by Eugene Garfield in the 1950s at the University of Pennsylvania, and by Hyper Search, developed by Massimo Marchiori at the University of Padua. In the same year PageRank was introduced (1998), Jon Kleinberg published his important work on HITS. Google's founders cite Garfield, Marchiori, and Kleinberg in their original paper.[4]
A small search engine called "RankDex" from IDD Information Services, designed by Robin Li, had been exploring a similar strategy for site-scoring and page ranking since 1996.[9] The technology in RankDex was patented by 1999[10] and used later when Li founded Baidu in China.[11][12] Li's work is referenced by some of Larry Page's U.S. patents for his Google search methods.[13]

Keyword Research Tool Review: Google AdWords

Keywords are important. I’ve heard from several potential clients recently who have talked to other “professionals” who told them keywords are no longer important for SEO, and that good content is all you really need.
Here’s the thing. Keywords are language. They’ll always be important because words are how we communicate with each other.
So until someone invents the microchip that can be implanted in your brain and read your thoughts without using words, you’ll need to think about language in marketing. And I can only hope that those potential clients were told this by one seriously misguided vendor.
With the ubiquitous need for keywords, there are a lot of tools out there that claim to do keyword research better. So I present to you, a series on Free Keyword Tools!
Let’s start with the Google Keyword Tool. This is what we all use, and like it or not, it’s the best free tool available. It’s tied directly into AdWords, and it uses search frequency from Google (although that’s debatable, see below). If you don’t sign in with an AdWords account, you’ll have to enter the captcha every time you want to do a search.
Lately, though, I’ve been seeing the captcha show up even when I’m logged in, so you may not be completely free of this annoyance. By the way, it’s free to set up an AdWords account, and you don’t have to have anything live in it to be able to use the keyword tool.

Cool Features Of The Google Keyword Tool

  • The ability to switch from broad to exact or phrase match volumes. This can really help identify which keywords you want to set on which match types, and for SEO, it can help to hint at which keyword phrase has the overall highest volume in a niche.
  • The tool has lots of options for information about the keywords, which is somewhat hidden in the “columns” drop down menu – shown below. These are primarily designed for paid search, but they have some value in regular research as well.
Column options in the Google Keyword Tool
  • Global vs. Local Monthly search volumes. The difference here is that one shows you only the local area you set up (defaulted to United States) and the other shows you volume for the entire world.
  • Be cautious about using Global when you really mean Local; otherwise you’ll get keywords like “handy” instead of “mobile phone” showing up, as they have a lot of volume, even though only Germans call it a “handy”. Adjust your local settings here:
Options in the Google Keyword Tool
  • Approximate CPC and Competition are both useful for seeing what the competition is like for these keywords in the paid arena. It logically follows that those keywords would also be competitive organically or in any other context, and it can be a good way (along with volume) to make relative decisions.
  • Local Search Trends is also pretty cool, as it takes data from Google Insights and displays it directly in the keyword tool. The bars that are shown represent the previous 12 months’ search trend. Here’s an example for “summer olympics 2012” which clearly shows that interest in this topic has been increasing, just as we would expect.
Trends for "summer olympics 2012"
  • Search Share and Extracted from Webpage seem like they’d be an SEO’s dream, but I’ve been unable to get this feature to work properly, even with an active AdWords account. In theory, these two data points would show you (respectively) when your site showed up in organic search for a given keyword and what webpages on your site already match this keyword. We can continue to dream, but I think it’s likely this feature will only work when you have campaigns actively running in AdWords – so you can decide whether or not to bid on a keyword in paid.
  • The ad group ideas (beta) is really pretty cool. While I would never advocate using it to set up your ad groups, it can be very useful for categorizing keywords into niches. The fact that it saves your “ideas” and that you can check all keywords in a “group” at the same time means I have to use those complicated Excel formulas a bit less often.
  • Include/Exclude terms allows you to really narrow things down. An example from real life – I was searching for “estate planning” terms, and didn’t want anything associated with real estate. By adding the word “real” into the exclude terms box, I got a nicely filtered list of just what I needed.
  • The category drop down can also be useful, particularly when you’re doing work for a niche within a larger context. For example, I was doing some work for an attorney who helps with adoptions. By narrowing the category to “Law and Government”, I was able to take out all those keywords like “teenage pregnancy adoption” that aren’t relevant enough to the attorney trying to sell services.
  • The “only show ideas closely related to my search terms” box is unchecked by default, but you may want to check it if you are working with some really general search terms. Basically, what this box does is require that the keywords the tool returns have at least one of your keywords in it. If you leave this box unchecked, you may get keywords like “car quote” when you search “car insurance”. If you leave the box checked, you won’t get keywords like “insure car”, so think carefully about whether you want to check this or not.
  • Finally, the “Locations and Languages” feature under “Advanced Options and Filters” is a must for anyone doing international research. Maybe someday they’ll allow us to filter by region or state of the US too. That would be helpful.
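The include/exclude and "closely related" behaviors described above can be reproduced in plain Python, which is handy for re-filtering an exported keyword list offline. This is an illustrative sketch of the filtering logic only; the function name and word-level matching rule are my own assumptions, not anything the tool exposes.

```python
# Hypothetical re-creation of the tool's include/exclude filtering:
# keep phrases that contain every "include" word and no "exclude" word.
def filter_keywords(keywords, include=None, exclude=None):
    include = [w.lower() for w in (include or [])]
    exclude = [w.lower() for w in (exclude or [])]
    result = []
    for phrase in keywords:
        words = phrase.lower().split()
        if all(w in words for w in include) and not any(w in words for w in exclude):
            result.append(phrase)
    return result

# The estate-planning example from the list above: exclude "real"
# to drop the real-estate phrases.
candidates = ["estate planning", "real estate agent", "estate planning attorney"]
print(filter_keywords(candidates, include=["estate"], exclude=["real"]))
# → ['estate planning', 'estate planning attorney']
```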

What Is Social Media Marketing

Social media marketing refers to the process of gaining traffic or attention through social media sites.
Social media itself is a catch-all term for sites that may provide radically different social actions. For instance, Twitter is a social site designed to let people share short messages or “updates” with others. Facebook, in contrast, is a full-blown social networking site that allows for sharing updates, photos, joining events and a variety of other activities.
Why would a search marketer — or a site about search engines — care about social media? The two are very closely related.
Social media often feeds into the discovery of new content such as news stories, and “discovery” is a search activity. Social media can also help build links that in turn support SEO efforts. Many people also perform searches at social media sites to find social media content. The articles from Search Engine Land below give some more background on all of this:
Advice At Search Engine Land
Here at Search Engine Land, we provide social media marketing information and news in a variety of ways:
How To: Social Media Marketing is our section that is devoted to practical tips and tactics about social media marketing. Also see the related How To: Twitter section.
Let’s Get Social is Search Engine Land’s column that covers different social media marketing topics every week.
Social Media Marketing Library Archives: This area of Search Engine Land provides a collection of all stories we’ve written on the topic of social media. Also check out the related categories of linkbait and URL shorteners.
In addition to covering social media marketing generally, Search Engine Land also has areas specifically about particular major social media sites and social search sites:

Link Building & Ranking In Search Engines

Links were the first major “Off The Page” ranking factor used by search engines. No, Google wasn’t the first search engine to count links as “votes,” but it was the first search engine to massively depend on link analysis as a way to improve relevancy.
Today, links remain the most important external signal that can help a web site rise in the rankings. But some links are more equal than others….

Lq: Link Quality

If you were sick, which would you trust more: the advice of five doctors, or that of fifty strangers who offered their opinions as you walked down the street?
Unless you’ve had a really bad experience with doctors, you’re probably going to trust the doctor advice more. Even though you’re getting fewer opinions, you’re getting those opinions from experts. They’re quality opinions.
In the same way, search engines do count all the links pointing at web sites (except those blocked using nofollow or other methods). But they don’t count them all equally. They give more weight to the links that are considered to be of better quality.
What’s a quality link? It’s one of those “you’ll know it when you see it” types of things, in many cases. But a link from any large, respectable site is going to be higher on the quality scale than a link you might get from commenting on a blog.
To learn more about link quality and how Google in particular examines links, see this tutorial from us:
These articles from us provide some additional tips on getting quality links:
Also be sure to check out our Link Week column, which provides information about link building every week.

Lt: Link Text / Anchor Text

Amazon has millions of links pointing at it. Yet, it doesn’t rank for “cars.” It does rank for “books.” Why? Many of those links pointing at Amazon say the word “books” within the links. Relatively few will say “cars,” since Amazon doesn’t sell cars.
The words within a link — the link text or “anchor text” — are seen by search engines as a way that one web site is describing another. It’s as if someone’s pointing at you in real life and saying “books,” declaring you to be an expert on that topic.
Often, you can’t control the words people use to link to you. But if you have the opportunity to influence this, you should seek to. It’s a powerful ranking factor.
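To make the Amazon example concrete, anchor text is simply the visible text between a link's opening and closing tags, and a crawler can record it with an ordinary HTML parser. The sketch below uses Python's standard-library `html.parser` purely as an illustration of how the "one site describing another" signal is collected; it is not how any search engine's crawler actually works.

```python
# Illustrative only: collecting (href, anchor text) pairs from a page,
# the raw material search engines treat as descriptions of the target.
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.href = None
        self.text = []
        self.anchors = []  # list of (href, anchor text) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.href = dict(attrs).get("href")
            self.text = []

    def handle_data(self, data):
        if self.in_link:
            self.text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.anchors.append((self.href, "".join(self.text).strip()))
            self.in_link = False

parser = AnchorCollector()
parser.feed('Buy <a href="https://www.amazon.com/">books</a> online.')
print(parser.anchors)  # [('https://www.amazon.com/', 'books')]
```

A page linked this way accumulates the word "books" as a description, which is exactly why Amazon ranks for "books" and not "cars".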
To learn more about anchor text, see our tutorial below:

Ln: Number Of Links

While you want link quality over sheer number of links, plenty of sites have found that getting many links can add up.
In particular, viral linkbaiting campaigns can be effective and something even search engine representatives have suggested.
But in your quest for links, don’t fire up automated software and begin blog spamming. That’s a bad thing, in many ways, as we’ll explore later in this guide.

What Is SEO / Search Engine Optimization?

SEO stands for “search engine optimization.” It is the process of getting traffic from the “free,” “organic,” “editorial” or “natural” listings on search engines. All major search engines such as Google, Yahoo and Bing have such results, where web pages and other content such as videos or local listings are shown and ranked based on what the search engine considers most relevant to users. Payment isn’t involved, as it is with paid search ads.

Google Lied About Manual Changes

The Lie was told here by Udi Manber, and repeated by Matt Cutts. And I quote:

“At Google we do not manually change results. For example, if we find for a particular query that result No. 4 should be result No. 1, we do not have the capability to manually change it. We made that decision not to put that capability in the algorithm—we have to go and actually change the algorithm.”

Contrast that with the story from 2 days ago from the official Google Blog:

“We created about 100 “synthetic queries”—queries that you would never expect a user to type, such as [hiybbprqag]. As a one-time experiment, for each synthetic query we inserted as Google’s top result a unique (real) webpage which had nothing to do with the query. Below is an example:”

There is no way to reconcile those two statements. If Google sees someone at number 4 that they want at number 1, they can remove the number 4 result with their manual spam filter, and then manually insert it to number 1.

This is not nitpicking: they have the capability, and they have used it. They have used it for more than legal removals or spam filtering; at the very least, they used it for their recent PR stunt.

This is on par with “Read my lips: no new taxes.”

They lied. They were caught.

Do not trust Matt Cutts.

Do not trust Udi Manber.

DO NOT TRUST GOOGLE.

They are bald-faced liars.

They can and do manually change their search results. They can and do manually put whoever they want at number one: regardless of what they have said in the past.

Tuesday, 12 June 2012

Matt Cutts Talks Google Penguin, Negative SEO, Disavowing Links, Bounce Rate & More

What is Google looking for in a high quality website, worthy of top rankings? Well according to Matt Cutts, head of Google’s web spam team, you must first use as many keywords as possible; the optimal keyword density is actually 77 percent. Definitely link to porn sites. Annoying users: that’s a plus. You do get a boost for running AdSense, he revealed. Oh, and all those other search engines - Bing, Blekko, DuckDuckGo - they’re a bunch of hackers doing illegal things. You heard it here, folks.

If you bought any of that, I have some icebergs I’d like to sell you. That video was actually a mash-up spoof Google put out, one that was played at the beginning of You & A with Matt Cutts at SMX Advanced 2012.

All jokes aside, Cutts got into some great topics and dispelled some modern-day SEO myths in his session. Here are the highlights.

Is Penguin a Penalty?

No, neither Penguin nor Panda are manual penalties, Cutts said. He explained that Penguin was designed to tackle “the stuff in the middle” between fantastic, high-quality content and spam. Panda was all about spam, but the need for Penguin arose from this middle ground.

“It does demote web results, but it’s an algorithmic change, not a penalty. It’s yet another signal among over 200 signals we look at,” he said.

A penalty is a manual action taken against a site and you will “pretty much always” be notified in Webmaster Tools if it’s a penalty affecting your site.

Will a Reconsideration Request Help You Recover From Penguin?

No. “People who think it should rank higher after Penguin can let us know and we can look at it, and in a couple of instances, it actually helped us make a couple of tweaks to the algorithm.” You should submit a reconsideration request if you receive a warning.
Negative SEO - Will Google Add an Option to Disavow Links?

They sure seem to be thinking about it. People have been asking about negative SEO for a long time, Cutts said. He noted that Google has changed their documentation over time to reflect that negative SEO is not impossible, but it is difficult. Google is “talking about” enabling webmasters to disavow links, possibly within a few months.
Did Google Send WMT Notifications About Penguin?

Google is trying to be more transparent by sending out more warnings, he said. Only a single-digit percent of those 700,000 unnatural link warnings that went out around the time of Penguin were actually Penguin-related. The majority were for obvious black-hat tactics.

Is Google Trying to Make a Point About Buying Links?

Yes, they are. According to Cutts, “People don’t realize, when you buy links, you might think you’re very careful, that you have no footprints, but you may be getting into business with someone who’s not as careful. People need to realize as we build new tools, it becomes a higher-risk endeavor.”

Is SEO Going to Get More Difficult?

Yes. He notes that it’s become more challenging over the past five to seven years and SEOs should expect that trend to continue and even increase.

Does That Mean Google Hates SEOs?

Of course not. Though Cutts did hand out a spanking for SEOs who buy or sell links: “There are people who continue to sell links, although they don’t do any good, and that’s part of how SEO has a bad reputation.”

Later, he said he would consider giving link building for non-profits a try to better understand what SEOs are facing. When asked about the war on SEOs, he said, “There’s no war on SEOs!” and that it’s just a war on spam.

Google, Yahoo, MSN Unite On Support For Nofollow Attribute For Links

In the first cooperative move for nearly ten years, the major search engines have unveiled a new indexing command for web authors that they all recognize, one that they hope will help reduce the link and comment spam that plagues many web sites, especially those run by bloggers.

The new "nofollow" attribute that can be associated with links was originated as an idea by Google several weeks ago and pitched past MSN and Yahoo, as well as major blogging vendors, gaining support.