Usman Farooq is an SEO specialist from Pakistan. With his diverse SEO skills, he aims to make a difference.

Thursday, June 24, 2010

New Edition Of A Great Book

New Edition Of A Great Book: "A few years ago, Brian Clifton was working at Google in London, leading our team in Europe. Since then, he's left to focus on growing his own Google Analytics Certified Partner, GA Experts from Omega Digital Media, and has written a fantastic book called Advanced Web Metrics With Google Analytics, which has just been released in a new edition. According to Brian, here's what's changed in the new edition:
'Since the first edition was published in 2008, a lot has changed - both for Google Analytics and the web as a whole. Remember, two years ago hardly anyone had heard of Twitter. In that time Google Analytics has integrated with AdSense and Feedburner, and launched event tracking, advanced segments, Intelligence alerts, motion charts, custom reporting, custom variables and the Data Export API. The new edition covers all of these in detail from a practitioner's point of view, with as many real-world examples as I could muster.'

It's very well written and readable, with plenty of screenshots - a great resource for all things Google Analytics. Ways to get the book: order from Amazon or Barnes & Noble, order directly from Wiley (the publisher), or buy the PDF ebook.




Posted by Jeff Gillis, Google Analytics Team

Use Case: Twiddy & Company

Use Case: Twiddy & Company: "We're excited to hear from users who are able to attribute some incredible growth to Google Analytics, Google Website Optimizer, and many of the other tools we offer. Today, we're taking a quick look at Twiddy & Company, which uses Google Analytics on a daily basis to optimize its website. They are one of our best examples of using marketing tools from Google to generate skyrocketing growth. If you run a business, we think you'll enjoy this story and be inspired - it's a blueprint for how a successful SMB that relies on its website can use Google Analytics. Make sure you read through to the metaphor they use for bounce rate - we love it!


Twiddy & Company was also recently featured in a CNN Small Business article, where they shared their success using Kampyle, which uses the Google Analytics API to analyze web analytics and user feedback together.


Meet Doug Twiddy

Doug Twiddy started selling real estate in 1978 in the sleepy village of Duck, North Carolina. After he sold a few oceanfront lots, the owners built homes and asked, "Can you rent out my home when I'm not using it?" Today, Twiddy & Company manages 860 vacation rental homes on the Outer Banks of North Carolina, ranging from a 23-bedroom oceanfront home on 20 acres to a 3-bedroom soundfront.


We sat and talked with Doug at length, and here's what's important to him, in his own words.

Favorite Reports
“Before Google Analytics, we only knew half of the working equation. Now that e-commerce tracking is installed, we can see the complete formula, and it illuminates the true end result. Before, we were following indicative numbers; now we can follow the most fundamental and necessary ingredient in all of business. Top Content is also especially useful for highlighting the exact exposure an individual home receives. This feature has created an all-out addiction among home owners. Now their first question about performance is how many visitors their individual home has received.

Bounce rate is also a must-have for us. It’s the online equivalent of the human senses. We know a higher bounce rate means that something on that page doesn’t smell, look, or taste good.”


How Analytics has changed their approach to analyzing the website

“Google Analytics is our compass in terms of allocating our resources. It allows an evolution of marketing. The more successful ideas draw more time and capital. Even the non-productive ideas yield educational lessons. In many instances, we learn more from a quick failure than a slow success.”

How they tested changes on the website

“We’ve recently started testing with Google Website Optimizer. How did we ever survive without this? Our old testing setup was an elementary A/B test, but Google Website Optimizer engaged the hyperdrive. David Booth at Webshare helped us get started, and the results quickly produced the laughter of humility. The variables are now part of our secret sauce. Not only did it make testing easier, it made it exponentially more successful.”

How Google Analytics has changed their company

“Google Analytics gives Twiddy the tools to outperform the market; the metrics for successful marketing. One of the unforeseen benefits is the hospitality of the phone calls. By examining what visitors are looking for online, Twiddy is able to produce more relevant content that reduces the redundancy of questions for the reservationists. The reservationists can now focus on the more personal side of the vacation experience, and guests can fulfill their desire to research the choices and arrive at a very intelligent decision.

Google Analytics has had a very tangible impact on the success of the company. It’s become ingrained into the daily routine and crucial to the marketing strategy. General Patton had the 3rd Army, Twiddy & Company has Google Analytics.”




We congratulate Twiddy & Company on their success. If you find yourself in the Outer Banks of North Carolina, be sure to stop by their offices and say hello.


Posted by Ashish Vij, Google Analytics Team

The Power of Multiple Custom Variables, part 2

At first glance, the dashboard numbers confirm that the Spanish-speaking customers who visited the wedding registry and looked at suits are more likely to buy than the English-speaking customers who visited the same store sections! This is invaluable feedback to report back to your marketing team.

Concluding Remarks


We can easily create additional segments and reports (or use Secondary Dimensions & Pivoting, or extract the data via the GA API) to gain a much deeper understanding of user behavior on the site, and act on these findings.
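To make the API route concrete, here is a minimal sketch of pulling custom-variable data out of the (legacy) Data Export API over plain HTTP. The profile ID and auth token are placeholders, and the exact dimension and metric names should be checked against the API reference - treat this as an illustration, not the official client-library approach.

```python
# Rough sketch: querying the legacy Google Analytics Data Export API for
# custom-variable data. Profile ID and token are placeholders; verify the
# dimension/metric names against the API documentation before relying on this.
import urllib.parse
import urllib.request

PROFILE_ID = "ga:12345678"             # hypothetical profile (table) ID
AUTH_TOKEN = "YOUR_CLIENTLOGIN_TOKEN"  # obtained separately (ClientLogin/OAuth)

params = urllib.parse.urlencode({
    "ids": PROFILE_ID,
    "dimensions": "ga:customVarValue1,ga:customVarValue2",  # e.g. language, store section
    "metrics": "ga:visits,ga:transactions",
    "start-date": "2010-05-01",
    "end-date": "2010-05-31",
    "sort": "-ga:transactions",
})
url = "https://www.google.com/analytics/feeds/data?" + params

req = urllib.request.Request(url, headers={
    "Authorization": "GoogleLogin auth=" + AUTH_TOKEN,
    "GData-Version": "2",
})
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode("utf-8"))  # Atom XML feed; parse the rows as needed
```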



What if you don't have an ecommerce site? No worries - the same concepts explained in the above example are totally applicable to other types of sites. Lead generation sites, content sites and others can definitely benefit from this powerful MCV feature. Just think of the visitor segments and user actions that are important to you, and then apply a similar implementation approach.



Now go out, segment, analyze and truly understand your users!
Related Posts

Google Analytics - Custom Variables


Deep Dive Analysis in Google Analytics: Secondary Dimensions and Pivoting




Posted by Jeff Gillis, Google Analytics Team

Building A Business With The API

Building A Business With The API: "Do you like web analytics data? Do you like number crunching in Excel? Get ready to drool once you click on the images below to look at them up close. But don't jump ahead just yet! A little background on a great story...

When we released the Google Analytics Data Export API, we were excited to see what developers would build - but it's even more exhilarating to see developers profiting from all their hard work. One developer, Mikael Thuneberg, has succeeded in doing just that by starting AutomateAnalytics.com, a new consulting business built around our API.

Mikael started working with the API in June 2009, developing a set of VBA functions to import data into Excel (VBA is Excel's built-in scripting language). His free solution has many benefits:

It does not require installing plug-ins


Reports are simple to share with others


The functions can be used just like any of Excel's built-in functions like SUM or COUNT



Check out the getting started guide to learn how to use this free tool to generate the jaw-dropping reports below (click images for larger versions):


Because the solution is free and easy to use, Mikael quickly got requests from companies to build custom reports. As he explains, “All of these customers have excellent skills for analyzing data, but have asked for help in automating time-consuming manual work, like data retrieval and building custom visualizations.”


One of these came from Sanoma Games, the online gaming unit of Sanoma Group, one of the largest media companies in Europe. Sanoma owns dozens of popular sites, and so it was taking huge amounts of time for their analytics team to keep track of KPIs, let alone gather data for in-depth analysis.


Mikael built an Excel tool for them that fetches and processes the data they need in a matter of seconds. Now Sanoma's analysts can spend their time analyzing and taking action, instead of manually copying the data from one place to another.

Mikael eventually got many requests for customized reports from leading Internet companies, which led him to create AutomateAnalytics. As Mikael says, “I’ve always wanted to run a business. What I thought would be a fun project turned into an amazing business opportunity. The Google Analytics API really helped me realize this goal.”


We’re really impressed with what Mikael has done and thrilled to share his story!


Posted by Nick Mihailovski, The Google Analytics API Team

Tuesday, June 22, 2010

Happy Father’s Day 2010 Logos From Google & Others

Happy Father’s Day 2010 Logos From Google & Others: "Today is Father’s Day in many countries and we wanted to wish you a Happy Father’s Day. Here is a collection of the various logos from the search industry celebrating the day.
Google:

Yahoo Animated:

Yahoo Static:

Baidu:

Sogou:

DogPile:

Bing:

Ask.com:

Cre8asite Forums:

Search Engine Roundtable:




Google Is Top Shopping Site, TheFind Passes Yahoo As The New Number Two

Google Is Top Shopping Site, TheFind Passes Yahoo As The New Number Two: "According to comScore data, Google Product Search has become the top comparison shopping engine online. Number two is now TheFind, which moved in front of Yahoo Shopping in terms of average daily users:

Source: comScore
TheFind has been quietly moving up the ladder over the past couple of years and now has surpassed all longer-established competitors except [...]




Integrating Feedburner & Google Analytics

Integrating Feedburner & Google Analytics: "In mid-November, Google Analytics and Feedburner announced a very interesting integration. While this is excellent news for Google Analytics and Feedburner users, it must be dealt with properly when it comes to SEO, especially on websites that are heavily based on RSS feeds.
Google Analytics users rejoice!
This integration is very interesting in that it allows a much [...]




Bing Paparazzi Photos

Bing Paparazzi Photos: "Opening remarks by Yusuf Mehdi:

Part II of the Bing Entertainment event
set up as a discussion session about the future of entertainment in search with Hollywood heavy hitters:

Joseph Gordon-Levitt & Kathryn Bigelow share their thoughts:




Linkscape Index Update: New Partnerships & API Data

Linkscape Index Update: New Partnerships & API Data: "

Posted by randfish

Many of our keen members observed that late last week, Linkscape's index updated (this is actually our 27th index update since starting the project in 2008). This means new link data in Open Site Explorer and Linkscape Classic, as well as new metric data via the mozbar and in our API.


Index 27 Statistics


For those who are interested, you can follow the Linkscape index update calendar on our API Wiki (as you can see, this update was about a week early).


Although we've now crawled many hundreds of billions of pages since launch, we only serve our uber-freshest index. Historical data is something we want to do soon - more on that later. This latest index's stats feature:



  • Pages - 40,152,060,523

  • Subdomains - 284,336,725

  • Root Domains - 91,539,345

  • Links - 420,049,105,986

  • % of Nofollowed Links - 2.02%

  • % of Nofollows on Internal Links - 58.7%

  • % of Nofollows on External Links - 41.3%

  • % of Pages w/ Rel Canonical - 4.3%


These numbers continue the trend we've been seeing for some time: internal nofollow usage is declining slightly, while rel canonical is down a bit in this index but up substantially over the start of the year (this likely has more to do with our crawl selection than with sites actually removing canonical URL tags).


Comparing Metrics from Index to Index


One of the biggest requests we get is the ability to track historical information about your metrics from Linkscape. We know this is really important to everyone and we want to make it happen soon, but we have some technical and practical challenges to overcome. The biggest is that what we crawl changes substantially with each index, both because of our improvements in what to crawl (and what to ignore) and because of the web's massive changes each month (60%+ of the pages we fetched 6 months ago no longer exist!).


For now, the best advice I can give is to measure yourself against competitors and colleagues rather than against your own metrics from last month or last year. If you're improving against the competition, chances are good that your overall footprint is increasing at a higher rate than theirs. You might even "lose" links in a raw count from the index but actually have improved, simply because a few hundred spam/scraper websites weren't crawled this time around, because we've done better canonicalization of URLs than last round, or because your link rotated out of the top of a popular RSS feed that many sites were reproducing.


OpenSiteExplorer Comparison Report

Measuring against other sites in your niche is a great way to compare from index to index


If you've got more questions about comparisons and index modifications over time, feel free to ask in the comments and we'll try to dive in. For those who are interested, our current thinking around providing historical tracking is to give multiple number sets like - # of links from mR 3+ pages, # of links from mR 1-3 pages, etc. to help show how many "important" links you're gaining/losing - these fluctuate much less from index to index and may be better benchmarking tools.


Integration with Conductor's Searchlight Software


SEOmoz is proud to be powering Conductor's new Searchlight software. I got to take a demo of their toolset 2 weeks ago (anyone can request one here) and was very impressed. See for yourself with a few exclusive screenshots I've wrangled up:


Searchlight Screenshot 1/4


Searchlight Screenshot 2/4


Searchlight Screenshot 3/4


Searchlight Screenshot 4/4


Conductor's Seth Besmertnik at the Searchlight Launch Event


And at the bottom of the series is Seth Besmertnik, Conductor's CEO, during the launch event (note the unbuttoned top button of his shirt with the tie; this indicates Seth is a professional, but he's still a startup guy at heart). Searchlight already has some impressive customers including Monster.com, Care.com, Siemens, Travelocity, Progressive and more. I think many in the SEO field will agree that moving further into software is a smart move for the Conductor team, and the toolset certainly looks promising.


Conductor's also releasing some cool free research data on seasonality (request form here). Couldn't resist sharing a screenshot below of the sample Excel workbook they developed:


Keyword Seasonality Excel Workbook from Conductor


mmm... prepopulated


SEOmoz's Linkscape index currently powers the link data section of Searchlight via our API and we're looking forward to helping many other providers of search software in the future. We're also integrated with Hubspot's Grader.com and EightFoldLogic's (formerly Enquisite) Linker, so if you're seeking to build an app and need link data, you can sign up for free API access and get in touch if/when you need more data.
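For anyone curious what calling the free API looks like, here's a rough Python sketch. The endpoint path, the Cols parameter and the AccessID/Expires/Signature signing scheme are recalled from the API documentation of this era and may differ from what the API wiki specifies, so treat every name and value below as an assumption to verify rather than the definitive integration.

```python
# Hedged sketch of fetching link metrics for a URL from the Linkscape (SEOmoz)
# API. Endpoint, Cols flags and signing scheme are assumptions - check the API
# wiki before using.
import base64
import hashlib
import hmac
import time
import urllib.parse
import urllib.request

ACCESS_ID = "member-xxxxxxxx"   # issued when you sign up for API access
SECRET_KEY = "your-secret-key"  # keep this private

target = "www.seomoz.org"
expires = int(time.time()) + 300  # signature valid for a few minutes

# Signature = base64(HMAC-SHA1(AccessID + "\n" + Expires, SecretKey))
to_sign = f"{ACCESS_ID}\n{expires}".encode("utf-8")
signature = base64.b64encode(
    hmac.new(SECRET_KEY.encode("utf-8"), to_sign, hashlib.sha1).digest()
).decode("utf-8")

params = urllib.parse.urlencode({
    "Cols": "4",  # bit flags selecting which metrics to return (placeholder value)
    "AccessID": ACCESS_ID,
    "Expires": expires,
    "Signature": signature,
})
url = (
    "http://lsapi.seomoz.com/linkscape/url-metrics/"
    + urllib.parse.quote_plus(target)
    + "?" + params
)
with urllib.request.urlopen(url) as resp:
    print(resp.read().decode("utf-8"))  # JSON object of metrics for the URL
```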


The Link Juice App for iPhone


We're also very excited about the popular and growing iPhone app - LinkJuice. They've just recently updated the software with a few recommendations straight from Danny Dover and me!


LinkJuice App screenshots (1/2 and 2/2)


The LinkJuice folks have promised an Android version is on its way soon, and since that's my phone of choice, I can't wait!


If you've got an app, software piece or website that's powered by Linkscape, please do drop us a line so we can include it. I've been excited to see folks using it for research - like Sean's recent YOUmoz post on PageRank correlations - as well as in many less public research works.


Oh, and if you somehow missed the announcement, go check out the new Beginner's Guide to SEO! It's totally free and Danny's done a great job with it.



Choosing the Right Keyphrases - Especially for the Smaller Sites!

Choosing the Right Keyphrases - Especially for the Smaller Sites!: "

Posted by Sam Crocker

Hey there folks! Today's post is a hands-on walkthrough of some of the decision making used when choosing the keyphrases to target. Producing a list of the most important terms in an industry is nice, but actually choosing the right keyphrases is essential. The post was largely created in response to a question submitted by Kien in the comments of my last post.

What to Expect

This post should  provide you with real-life examples of the keyword decision making process and help you make sense of the output from the revamped Keyword Difficulty Tool. If you're already a hardcore keyword research (and keyword difficulty) guru, this is more of a refresher, but should provide valuable insights to journeymen and perhaps a bit more transparency into how to choose the right keyphrases for your site (plus a bit of a tour for those of you who haven't used the Keyword Difficulty Tool for a while).

 

While you may get more value from this on smaller sites, and those that are newly launched, large sites with specific keyword targets may benefit, too.

Two Different Camps

There are really two separate and distinct camps on keyphrase research: those who always go for the highest-volume search terms that are at least moderately relevant to a page, and those who also give consideration to the competitive landscape for a given term. Regardless of which camp you fall into, this tool can be immensely useful. Even if you always go after the highest-volume term no matter what, the Keyword Difficulty Tool could and should still be used to track your link building efforts against the competition and to get some understanding of if/when you might be able to rank for the term in question. There's no point in setting goals (even lofty ones) without having some idea of how to reach them.

Image: Ebaums World

However, if you are in the camp that likes quick results and has a bit more time to give to the keyphrase research process, I strongly recommend going the extra mile to identify realistic targets for the short term and keeping those lofty ambitions in the back of your mind, to be addressed as the site grows, gains new links and hopefully achieves higher authority and trust metrics.

But what if my site is brand new?

As a general rule, newer sites should probably aim for the less-to-moderately competitive types of keyphrases for the most part. The one exception to this rule would be exact match domain names.

If you can afford to look at these metrics in this light, you can scale up the competitiveness of your terms as you improve the quality and competitiveness of your website.

Click Through Rates

Image from: Rand's Post on Multiple vs. Singular Keywords

Thinking about it logically (with the above graphic in mind): if you could rank first for a term with 30,000 global monthly searches or somewhere on the fourth or fifth page for a similar term with 300,000 global monthly searches which would you choose? The optimist might choose the second option, but the truly intelligent will pick the first for now and aim for the second option down the road. It doesn't take superior math skills to figure out that a small fraction of 1.2% of clicks from the 300,000 searches will not amount to anywhere near 42.1% of 30,000 searches.
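To make the arithmetic explicit (using the 42.1% and 1.2% click-through figures from the chart referenced above), a quick back-of-the-envelope check:

```python
# Back-of-the-envelope comparison of the two options above.
smaller_term_first_position = 0.421 * 30_000   # ~12,630 clicks per month at #1
bigger_term_low_first_page = 0.012 * 300_000   # 3,600 clicks per month at a 1.2% CTR,
                                               # and a page-4/5 ranking captures only a fraction of that
print(smaller_term_first_position, bigger_term_low_first_page)
```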

I'm not saying it's always better to play it safe; I'm just saying that realistic goals will help you achieve more in the short term, and the opportunity to re-evaluate those goals will help you in the long term.

Does this keyword require a new page? How many terms can I target on a page?

This depends on a couple of factors. It depends, again, on the competitiveness of the terms but also largely depends on the strength of the overall site or page on which the keyword is being targeted.

As covered in Rand's post on this issue in March, a highly competitive term deserves "single page targeting". This is true in most instances and is particularly good advice for smaller, newer sites. The way this works is somewhat backwards, but experience suggests that stronger sites can target multiple terms on a page and still rank for a larger number of long-tail keyphrases. This may seem a bit unfair, but it is what anecdotal evidence has shown me.

The long and short of it? If your site is big, unless the keyphrase is highly competitive it can probably be targeted on a page targeting other similar terms as well. However, if the phrase is extremely competitive it deserves its own page.

In the case of smaller, newer sites there are forces working against the best approach. On the one hand, smaller sites will have fewer pages indexed and will not have a great deal of authority to rely on and spread throughout the site. At the same time, this means they cannot drop the term "breast augmentation" on the same inner page as "breast enhancement" and expect to rank for both terms.

How do I know if my Site is Strong Enough to Rank for that Term?

The Difficulty Tool pulls in some nice metrics from the rankings that allow you to see a fair bit of information about the other sites ranking for a particular search term or phrase. It won't always be a simple case of "my page/domain is stronger than theirs, thus I will rank". There will always be other factors: is the keyphrase an exact match for the Top Level Domain [TLD]? Are the other sites targeting multiple terms on the page or just the one? How many inbound links does the other page have? How relevant is this specific keyphrase to your term?

I think you get the point here: it won't always be a simple fix, but let's look at an example to try to get a clearer idea of how to work through this.


Let's look at an example from a recent client project I did for a plastic surgeon (honest). In doing research for terms around plastic surgery for a brand new website, I tried to get my head around some of the inner nuances of the terminology and procedures to better understand search behaviour. As you can see from the above research, the broad match search volume for breast augmentation is considerably larger than the others [insert corny joke here]; however, the local search volume is quite comparable for the top two terms and they are about equally competitive.

This particular area of research can be quite complicated because you also have to look at the intent of the searcher and weigh that against the product offered. The term "boob job", whilst funny, is probably not likely to lead to serious searchers who are considering elective surgery, so that, coupled with the lower search volume, means we can probably get rid of that one for now (though it might be worth bearing in mind for future link bait).

As an aside: by doing a bit of further research into the types of pages that rank for the two terms and looking into search behaviours, it actually seems that people searching for "breast implants" are also not likely to be the highest converting traffic, and there are certain social stigmas associated with the various terminology. So, in this case we can probably rule out breast implants (in terms of the main target of the top level page) because it is not likely to lead to highly converting traffic. This does not, however, mean that we won't want to target the term at all, just that it may move down the priority list a touch.

So, the very first step has helped us eliminate two of the terms for the time being - we're making some progress! The next step is to compare the terms that seem as though they might both be realistic targets for the page but are also relatively similar in terms of competitiveness. Although the scores assigned by the Keyword Difficulty Tool can be very helpful when comparing a term that is ranked as a "10" versus one that is ranked as a "95", these "difficulty scores" do not provide enough information on their own when comparing two similarly competitive terms. Thankfully, the tool gives us a lot more data to work with.

As you can see above, although "breast augmentation" seems to be a slightly "less competitive" term based on the difficulty metric, there is a clear outlier within this chart (which we can go ahead and guess is going to be Wikipedia, without even having the rest of the data) that looks like it will be extremely difficult to outrank, even if the top spot seems slightly weaker than for "breast enhancement."

 

Similarly, the overall landscape for "breast enhancement" actually seems a bit more realistic as a target for a new site. Thus, in this case (based solely on the likelihood of ranking) we would actually choose to target "enhancement" rather than "augmentation," and try to work our way up to the more difficult term by building links to the site as a whole, and to this page specifically, before shifting our approach on term targeting. But before we make this a final decision, let's have a slightly closer look at what the competition really looks like.

 

As you can see, we were right in assuming that the 5th spot in the "augmentation" comparison has been taken by Wikipedia, though not by a page directly targeted at "breast augmentation" (hence why it's probably riding as low as it is in the rankings).

Meanwhile, setting aside the Wikipedia page, it looks like the top spot for "augmentation" is actually held by a rather weak site that happens to have great anchor text in the domain. This is a perfect indicator of just how much benefit a strong exact match domain can bring, but unfortunately this sort of tactic won't help much when you're trying to land the client who wants a "tummy tuck" instead.

 

So, what do we do?

In an ideal world, the client would be a great big site with loads of authority and without much sense. They will have been targeting "boob job" and have 302'ed all of their old links so we can make some quick changes and win. In this case, we go after all of the terms, do a bit of linkbuilding and we'll probably turn out just fine.

Meanwhile, in the real world, we would recommend going for "augmentation" as our targeted term, for a few reasons.

Labeling/Usability Fail

First, it is probably the "best" of the keyphrases in question. It targets the right kind of customer/searcher (we know this based on existing data and background insights on behaviour) and it has the highest search volume at the broader level.

Second, this keyphrase actually makes the most sense for the page we set out to build. As a top-level page it gives us the opportunity to (over time) target some of these other terms on the page (with the exception, maybe, of "boob job"). Augmentation is the most generic term and will allow us to discuss "implants, reduction, enhancement, etc."

Third, after having looked more closely at the sites/pages currently ranking for these terms it actually seems like it will be easier to rank for this particular keyphrase (please note that Wikipedia is not even directly targeting this keyphrase in this case).

 

"But what about all the other keyphrases? I don't want to waste them!"

This is where the post comes full circle. If you're building the small/new site, the most sensible option (in the short term) is to create a page that is optimised for as many of these terms as can be justified and for which you have research. As we mentioned earlier on, you can't just go after every single keyphrase in the industry on individual pages from the get-go, because they won't all be indexed.

Try to use some common sense: create the augmentation page high up within the information architecture, construct pages for "reduction", "implants" and "enhancements", and forget about "boob job" for now. This term may get some traffic, but if it doesn't fit with the theme/tone of the site, then save it for the linkbait and build strong links to these inner pages now.

This technique creates much more work, but with a brand new site this is to be expected. Try to structure things in a manner that lets you get rid of some of the smaller pages targeting extremely similar terms without impacting usability. This is essential and will make your life much easier in the future.

 

"What if my client is a massive site with great links?"
We should all be so lucky. This is obviously a different ball game. But if you are fortunate enough to have a Domain Authority that is considerably higher than your competitors' for the keyphrase(s) in question, you aren't going to struggle too much, and you only need one page to rank for a number of terms.

Image via: 3 Meeses

If this is your starting point, I would advise creating one hub/landing page for all "augmentation" related terms. If your site is strong enough the Wikipedia example quite clearly illustrates that some of these other pages may be superfluous.

There's no need to jam all these keyphrases into the title tag either. If there are enough inbound links and the site is trusted enough, you can probably just go for the highest search volume terms, so long as the term is related to the service offered (never forget usability!). If you are in this position, kick back, relax and just wait for the little guys to catch up!

Sam is based in London as a lead SEO at Distilled. He hopes you've enjoyed this post and is looking forward to your comments, questions and concerns!


Sunday, June 20, 2010

Bing vs. Google: Prominence of Ranking Elements

Bing vs. Google: Prominence of Ranking Elements: "

Posted by randfish

This past week during the SMX Advanced conference in Seattle, I presented some correlation data alongside Janet Driscoll-Miller, Sasi Parthasarathy of Bing & Matt Cutts of Google. Matt in particular was quite vocal in expressing a desire to see additional data points from our research, primarily around the prominence/visibility of particular elements in the results. This post is intended to help make that available.

2 Tweets from Matt Cutts

I must say that I don't agree with Matt on the importance of the raw visibility/counts over the ranking correlations. My feeling is that SEOs in these spaces are more interested in answering the question - "what features predict a result will rank higher vs. lower on page 1?" - rather than the more straightforward - "does this feature appear more frequently on page 1 at Google or Bing?" However, I certainly agree that both are relevant and interesting.

If you're trying to wrap your head around how to understand this prominence/visibility data vs. our earlier data on the correlation with rankings, here's how we'd best describe it:

  • Correlation w/ rankings data helps to answer the question, "when this feature appears in results on the first page of Google/Bing, who ranks it higher and by what amount?" Those correlation numbers were derived by looking at the likelihood that a result would rank above another when it contained the target attribute.
  • Visibility/prominence of an element helps to answer the question, "is this element more likely to appear on the first page of Google's/Bing's results?" This simply looks at the number of times we saw a result (or multiple results) ranking on page 1 containing the target attribute.

We're looking at the latter one in this post, but before we dive in, there are a few critical items to understand:

  • This isn't correlation data and there are no standard error or deviation numbers here. It's simply how many times we saw the element in the results we gathered, divided by the total number of results (SERPs or URLs depending on the chart) to get a percentage - see the short sketch after this list.
  • This data is from page 1 of results for 11,351 search queries, gathered from Google's AdWords categories. This means the terms and phrases vary somewhat in search quantity (from sub-100 searches per month to tens or hundreds of thousands) but generally have a commercial focus and intent. They generally don't include brand names, long-tail phrases or vanity-name searches. Overall, we picked them because they're precisely the kinds of queries most SEOs care about when they're doing competitive SEO for their companies and clients. We also ignored the second result in a SERP from the same domain to avoid the effects of indented results (which was important for our earlier statistics, but not those in this post).
  • The results were collected the week of May 31st and thus include post-"Mayday" update SERPs and likely results from after the "Caffeine" launch as well (though Google did not announce exactly when that rollout occurred - it may not have much bearing, as Caffeine is supposedly an infrastructure change rather than an algorithmic one).
  • Each feature contains two pie charts, one showing the percentage of SERPs that contained at least 1 URL with this feature and another showing the percentage of total URLs in all results (102,296 for Google and 109,966 for Bing - note that some SERPs fluctuate in the quantity of standard web results they show on page 1). These are labeled as "(feature) in SERPs" and "(feature) in URLs," respectively.
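As a concrete illustration of how those two percentages relate, here's a minimal sketch; the data structure and the has_feature test are hypothetical placeholders, not the actual collection code used for the study.

```python
# Minimal sketch of computing the two percentages in each pair of charts.
# serps maps each query to its list of page-1 URLs; has_feature(url) is
# whatever test you care about (exact-match domain, keyword in title, etc.).
def visibility(serps, has_feature):
    total_serps = len(serps)
    total_urls = sum(len(urls) for urls in serps.values())
    serps_with_feature = sum(
        1 for urls in serps.values() if any(has_feature(u) for u in urls)
    )
    urls_with_feature = sum(
        has_feature(u) for urls in serps.values() for u in urls
    )
    return (
        serps_with_feature / total_serps,  # "(feature) in SERPs"
        urls_with_feature / total_urls,    # "(feature) in URLs"
    )
```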

In gathering this data, we did not optimize for sharing it in this fashion. In fact, Ben & I both feel that if we wanted to do it this way, we should gather the first 3-5 pages of results, not just the 1st page. That way, one could compare the counts on page 1 with the counts on page 2. However, since we've got the data and Matt, Sasi and several other folks expressed interest, we're sharing anyway. Hopefully in the future we can do more on this front.

Let's dive in!


Exact Match Domains

These are domains that precisely matched the keywords in the query - e.g. for the query "dog collars" only a domain that matched *.dogcollars.* would be included.
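A rough sketch of roughly what such a test could look like in code (a simplification for illustration, not the classification code actually used in the study: it just takes the label before the TLD, so multi-part TLDs like .co.uk would need proper public-suffix handling).

```python
# Illustrative only: checks whether the registered domain label exactly
# matches the query with spaces removed, regardless of TLD.
import re

def is_exact_match_domain(url: str, query: str) -> bool:
    host = re.sub(r"^https?://", "", url).split("/")[0].lower()
    labels = host.split(".")
    if len(labels) < 2:
        return False
    domain_label = labels[-2]  # e.g. "dogcollars" in www.dogcollars.com
    return domain_label == query.lower().replace(" ", "")

print(is_exact_match_domain("http://www.dogcollars.com/large", "dog collars"))  # True
print(is_exact_match_domain("http://mydogcollar.com/", "dog collars"))          # False
```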

Exact Match Domains in SERPs 

Exact Match Domains in URLs

You can see that Bing has slightly more exact match domains appearing in at least one result of the SERPs we collected and in the overall count of results (all the URLs from all the SERPs).

Exact Match .com Domains

Similar to exact match domains, exact match .com domains had to contain the exact query in the domain name and have a .com TLD extension.

Exact Match .com Domains in the SERPs

Exact Match .com Domains in URLs

Again, Bing showed a slight preference for displaying results from these sites in the SERPs and URLs we observed.

Exact Match .net Domains

As above, but replace ".com" with ".net."

Exact Match .net Domains in the SERPs

Exact Match .nets in URLs

The numbers are much closer for total URLs with a .net exact match, but Bing is showing a preference in the SERPs count.

Exact Match .org Domains

In the .org TLDs, we start to see a bit of what we observed in the ranking correlation data:

Exact Match .orgs in the SERPs

Exact Match .orgs in URLs

This is the first exact match domain TLD where Google actually had more SERPs containing a result of this type. Bing, however, had very slightly more URLs with this feature.

Exact Hyphenated Match Domains

One of Matt Cutts' complaints centered around how Google vs. Bing handled exact hyphenated match domains. When we observed them in ranking correlations, it appeared that, when Google listed them, they would rank them higher than Bing did when they appeared on that first page of results. However...

Exact Hyphenated Match Domains in the SERPs

Exact Hyphenated Match Domains in URLs

As I called out in the presentation and the prior post, Bing has quite a few more SERPs where exact match domains appear and somewhat more URLs, too. This is another data point that should make us all think carefully about the fallacy of presuming correlation = causation. Bing might have a preference for exact hyphenated match domains, but the ranking correlations suggest to me there's more going on here - maybe something to do with anchor text or where those types of sites tend to get links or something else we haven't considered?

It's critical to keep in mind that we're just looking at individual factors here - not trying to explain why they exist or correlate (at least, not in the data).

Results that Include All Keywords in the Domain Name

Here we looked for domains that contained the keyword query in the domain, even if the match wasn't exact. For example, mydogcollar.com would now match for the phrase "dog collar."
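The looser test can be sketched the same way; here every query word just has to appear somewhere in the domain name (again a simplification for illustration, not the classification code actually used).

```python
# Illustrative only: checks whether every word in the query appears somewhere
# in the domain name, ignoring dots and hyphens.
def contains_all_keywords(domain: str, query: str) -> bool:
    name = domain.lower().replace("-", "").replace(".", "")
    return all(word in name for word in query.lower().split())

print(contains_all_keywords("mydogcollar.com", "dog collar"))     # True, as in the example above
print(contains_all_keywords("www.dogcollars.com", "dog collar"))  # True as well
```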

All Keywords in the Domain Name in the SERPs

All Keywords in the Domain Name in URLs

Again, it's Bing that shows a higher number of these types of domains in their results.

Results that Include All Keywords in the Subdomain Name

We've previously shown some data suggesting that subdomains might have some ranking influence, but not as much as root domains (this was done using our rank modeling / machine learning process). Here's some raw data on the number of times we observed keyword matching subdomains:

Contains all Keywords in the Subdomain in SERPs

Contains all Keywords in the Subdomain in URLs

Perhaps not surprisingly, Bing again is showing more of these results in their SERPs and individual URLs.

.com Domains

For this feature and all the TLDs below, we're just looking at any URL that has the domain extension.

.com Domains in the SERPs

.com Domains in URLs

It looks like Bing has very slightly more .coms in their results vs. Google.

.org Domains

Let's see what happens for .org domains, recalling Google's apparent preference for them in the ranking correlations.

.org Domains in the SERPs

.org Domains in URLs

Oddly, Bing again seems to have more .org pages in the SERPs and URLs.

.net Domains

URLs with .net probably won't surprise you much:

.net Domains in the SERPs

.net Domains in URLs

Yet again, Bing is showing a small number more than their Googly competitors.

.edu Domains

Recall how, in the correlation data, the numbers were small(ish) but negatively correlated? Let's see what the number of results shows: 

.edu Domains in the SERPs

.edu Domains in URLs

True to the stereotype, Google is slightly ahead on number of .edu domains in the SERPs & URLs.

.gov Domains

Given the previous charts, this one likely won't surprise you:

.gov Domains in the SERPs

.gov Domains in URLs

Google has more .edus and more .govs, too.

Keywords in the Title Element

Not surprisingly, nearly every set of SERPs had at least one result where the title tag contained the keywords:

Keywords in Titles in the SERPs

Keywords in Titles in URLs

Bing shows more results whose title tags match the keywords. One thing worth mentioning is that we didn't observe the titles the engines chose to show, but rather the page titles from the results themselves. Hence, if a result was showing a DMOZ title or a brand title (which Google will sometimes insert), we ignored those and just used the title element on the page itself.

Keywords in the URL

This one actually surprised me, if only because there were even fewer results with keywords in the URL than in the title! 

Keywords in the URL in the SERPs

Keywords in the URL in URLs

Bing again has more results with keyword-matching URLs, though remember that some of that is probably from keyword matching domains, too.

Keywords in the H1

The ranking correlations suggested that the H1 tag isn't much of a differentiator, yet lots of people still swear by them:

Keywords in the H1 in the SERPs

Keywords in the H1 in URLs

The results bear out that this is a much less frequent feature than keyword-matching URLs or titles for pages ranking on page 1. Bing seems to show more of them than Google, though.

Keywords in the Alt Attribute

Alt attributes looked interesting last fall when we collected ranking information and once again proved worth a look in the correlation data from SMX Advanced. Let's see what the raw counts show:

Keywords in the Alt Attribute in the SERPs

Keywords in the Alt Attribute in URLs

Bing is showing slightly more of these, but if the positive correlation means something, these numbers certainly suggest there's lots of opportunity left for good alt attribute practices.

Homepages

Who lists homepages vs. deep pages in the results more?

Homepages in the SERPs

Homepages in URLs

My word! It's Google by a good margin. Bing's showing of internal pages actually surprises me a bit, though perhaps that's an old stereotype I need to abolish.

And with that, we're done!


One important point to note is that I've not included data on link results, as these would be hard to interpret and likely not useful. Every page of results had pages with links to them, and nearly every individual ranking URL also had links (a good sign for Linkscape's index, but not super valuable as a data point). There were a few other data pieces like this that wouldn't make sense here (keyword prominence in the body tag, word tokens in the body tag, domain name length, etc.) and they have thus been excluded.

I've done less analysis on these results in general, as I think the data is a bit less ideal for the purpose, but it's still interesting and hopefully, illustrative of general prominence. I look forward to seeing your interpretations and discussion!

p.s. If you email Ben at SEOmoz dot org, he will send you a TSV containing, for each query, the metrics for each result that we used in these posts. You can also find the raw results in a public Google spreadsheet doc here. Feel free to play around and let us know if you see anything else cool and interesting.

