Tags Archives

You are currently viewing all posts tagged with Traffic.

Posted by randfish

 If you use Google Analytics, you’ve undoubtedly seen a report like this:

Google Analytics Pie Chart

The problem is, there’s no breakdown of "social media" in this view of traffic sources, and with the dramatic rise of social media marketing, marketers need an easy way to segment and "see" this traffic separately from the rest of their referrers. We know it’s mixed in with "referring sites" and "direct traffic" but luckily, there’s a way to extract that data in just a few simple steps.

Step 1: Create a Custom Segment

Custom segments are the way to go for separating traffic into filterable buckets for deeper analysis. GA makes this fairly painless:

Step 1

From any of the "Traffic Sources" sections, just click "Advanced Segments" in the upper right-hand corner and then the link to "Create a new advanced segment."

Step 2: Add Social Sources

This is the most crucial part, and requires that you have a full list of the sites/words to include. I don’t recommend using just the domain names or URLs of the most popular social sites, but instead, some clever "catch-all" words using the "source" condition, as shown below:

Step 2

Make sure to continue adding "OR" statements, not "AND" statements – the latter would require that both conditions are met, vs. any one of the "ORs". Here’s the list of words I used, though you can certainly feel free to add to it:

  • twitter
  • tweet
  • facebook
  • linkedin
  • youtube
  • reddit
  • digg
  • delicious
  • stumbleupon
  • ycombinator
  • flickr
  • myspace
  • hootsuite
  • popurls
  • wikipedia

Depending on your niche, it might be valuable to run through your top 2-500 referring domains looking for any obvious matches. You could also refer to Wikipedia’s list of popular social sites.
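If you ever want to replicate the segment’s OR logic outside GA (say, when post-processing an exported report), the catch-all matching can be sketched in a few lines of Python. This is just an illustration of how "source contains X OR source contains Y…" behaves – the function name is mine, not anything GA provides:

```python
import re

# The catch-all terms from the list above; GA's "source contains"
# conditions joined by OR behave like one big alternation.
SOCIAL_TERMS = [
    "twitter", "tweet", "facebook", "linkedin", "youtube", "reddit",
    "digg", "delicious", "stumbleupon", "ycombinator", "flickr",
    "myspace", "hootsuite", "popurls", "wikipedia",
]

SOCIAL_RE = re.compile("|".join(map(re.escape, SOCIAL_TERMS)))

def is_social(source: str) -> bool:
    """True if a visit's source matches any catch-all term."""
    return SOCIAL_RE.search(source.lower()) is not None
```

Note how a substring term like "tweet" is what makes this a catch-all: it matches Twitter clients and URL shorteners built around tweeting, not just twitter.com itself.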

Step 3: Test & Name Your Segment

To create a fully functional segment, you’ll want to test the logic you’ve created to be sure it returns results. Before you do that, though, GA requires naming your segment (I used "social media"):

Step 3

Once it’s complete and working properly, click "save segment." You’ll be returned to the prior screen with the segment ready to rumble.

Step 4: Filter Traffic by "Social Media"

Your new segment is ready to be applied. You can now filter social media exclusively or see it in comparison to other traffic sources on any report in GA. Just use the advanced segments drop-down and choose "social media" under the custom segments list like so:

Of course, just having data is useless unless there’s some action you can take from it. Segmenting social traffic is useful for reporting, particularly to gauge value (if you have action tracking on commercial activities set up in GA, for example) and see growth/impact over time. But, there’s more you can learn than just raw traffic and conversion numbers.

Here are some examples of reports I ran, along with the value/intelligence extracted from the data:

It can be tough to "see" the social sites among your other referring domains, but once they’re broken out, combing through and finding the sites where your efforts are working is vastly simpler. If you then compare this against traffic "opportunity" from these sites (using a combination of traffic data and gut check), you’ll be able to find which sites have the greatest chance to improve. For SEOmoz, Facebook, LinkedIn, Reddit and Wikipedia stand out to me as places where we likely have more opportunity than we’re currently capturing.

This next chart compares search vs. social traffic over time:

If I’m looking to evaluate progress and make comparisons, this view is fairly useful. I can tell if my share of social media is growing or shrinking and how it compares to overall traffic and search individually. I’m only looking at a short timeframe here, but over the course of weeks or months, I can quickly gauge whether my efforts in social are paying off with traffic and whether they’re improving my performance in search engines (through new links, citations, etc). When someone asks if social helps search, showing these two segments over time can be persuasive.

Next, I’m reviewing the level of engagement of social media visitors:

At first, I can compare this against other segments (like "search" or "direct") as a measure of comparative value. But, I also want to compare this over time, particularly if I’m making tweaks to my site to encourage greater engagement and click-through to see if those efforts are successful.

Just because I’m curious, I’ll check out some browser stats:

 

Admittedly, this isn’t especially actionable, but it is fascinating to see the browser "savvy" of social users: dominated by Firefox and Chrome, with very little Internet Explorer use. If I’m trying to see what cutting-edge users are shifting towards, this is where to look. I suspect Rockmelt will soon be joining the list. (BTW – I love that 5 people came with the user-agent "Googlebot" – awesome.)

Last, let’s peek at the pages social visitors see:

These are all potential opportunities to create more customized landing experiences based on the referrer path, and the report can also give me insight about what content I need to continue producing if I want to draw in more social traffic. 


If social media marketing is a focus of your organization, segmenting that traffic in reporting is critical to determining the value of your efforts and improving. So get into GA, segment, and start seeing your traffic for what it really is. 



SEOmoz Daily SEO Blog

Posted by Sam Crocker

Today we’re going to break down a number of different tools and resources for getting insights into competitors’ traffic data. We have looked at a handful of tools and will break them down one by one as to their strengths and weaknesses, as well as the validity and usefulness of the data provided. Ultimately we just wanted to share some other information sources out there that you can add to SEOmoz’s list of tools that are great for competitor analysis (my personal favourites being the Linkscape Visualization and Comparison tool as well as the Competitive Link Research tool).

However, I have had a number of clients asking me for a better view of overall market size and of what kind of traffic their competitors are getting. Even though this has, unfortunately, sometimes meant crushing a few dreams about who a genuine/realistic competitor is or should be (i.e. NOT Mashable if you are a new site), it can be tricky to find meaningful predictive data even when you know who your competitors are.

The Failed Experiment

Initially I wanted this to be an experiment testing out a number of services and running them against Analytics data to compare like for like and find which sites provided the most accurate information. I compiled Analytics data from 25 websites with hopes of comparing the real numbers (from Analytics) against the predictions of the other tools to try to find which was the most reliable across a number of sites from different sectors with a range of monthly traffic from ~1,000 monthly visitors to over 48,000,000 monthly visitors.

The idea was to report on data across a number of these platforms for average monthly visits, total yearly visits, geographical visits and so forth. Unfortunately, there were many fewer tools that provided this data than I initially anticipated, and it quickly became clear that we weren’t going to be able to compare apples to apples and that there is no substitute for internal data… but by combining some of the tools below, you can get a good idea of what sort of traffic your competitors’ sites are getting.

So, the experiment was a bit of a failure, but I learned more than my fair share about the tools so let’s have a look at which tools are available and which tasks/comparisons they can be used for. I’m hanging on to all the data I collected and at a future date (if I ever hear back from some of the data sources) I will post a follow-up/re-do of the experiment.

The Tool Belt

It is worth pointing out that a number of these sites suggest they can provide better data if you claim the site(s) in question. I cannot testify to the accuracy of this claim, as we did not look into it (and could not feasibly claim the data for all 25 sites). Also, all comments are based on the free versions of the tools, as we did not have paid access to any of them.

Alexa

Strengths:
Alexa is good for comparing different sites’ traffic and for monitoring general traffic trends. It can be quite useful for comparing one site to a competitor site (up to 5 sites at a time).
The index is massive and contains some data about all of the 25 sites we tested.

Weaknesses:
Not so great for smaller sites. As you can see below, you won’t get any of the traffic charts for sites ranked outside the top 100,000 (which means that if Alexa thinks you are getting fewer than about 10,000 visits per month, you’re unlikely to glean any great information).

Accuracy is a serious concern. This does taint the usefulness of the tool in general.

The numbers reported are not helpful for predicting traffic on their own.

Accuracy:

We want to keep this all anonymous but let’s just say one site that we know gets 10-20,000 visits per month had an Alexa rank that was more than 5 times better than a site that we know gets 75,000+ visitors per month. And this was not just a one-off event.

I would have to seriously question the reliability of this tool. It didn’t seem to be too bad at predicting the trends for a single site, but the charts are extremely difficult to make any real use of. The information on bounce rate seems fairly accurate (give or take a few percent), but the trends for bounce rate seemed much less so (e.g. the ups and downs did not correspond with similar peaks and valleys in Analytics).

Perhaps most interestingly, it seems to be skewed in favour of sites within the search marketing space. Sites in the search marketing space that we looked at regularly outranked sites receiving more than 10 times as much traffic on a monthly basis.

How to best use Alexa?
The tool is interesting for comparing similar sites or sites within an industry. I would like to recommend it, but based on my experience and this particular data set, I would be very cautious about using it to make any meaningful suggestions or estimates about traffic. It is a great concept for a site but does not seem to be particularly accurate.

The most accurate data seemed to be the visitors-by-country breakdown (the order was fairly accurate, and the percentages we looked at were not too far off). To the extent that this data would be useful to have for your competitors, this is one good use of Alexa.

The insights for audience demographics could also potentially be extremely valuable, though accuracy will always be a question.

Cost:
Free. Options for site audits for 9
 

Compete

Strengths:
Useful interface.
Speaks the right language (unique visitors, visits, etc.).
Ability to compare multiple sites.
Data is easy to understand and well presented.

Weaknesses:
Accuracy
Somewhat limited number of sites – many sites that it classifies as "low sample sites"
Cost of "Pro" option  

Accuracy:
Again, accuracy is a serious concern here. The data was off in some cases by as much as 2,000% for monthly visits. The accuracy seemed to be a bit better for the peaks in traffic and some of the general trends we looked at, but was certainly not reliable enough for us to suggest reporting competitor traffic based upon this information.  

How to best use Compete?
It should come as no surprise that Compete is best used for comparing competitors. The scale of the data is way off, but some of the trends seemed to be fairly reliable. I wouldn’t advise reporting any raw numbers from this data (as they do not seem close at all – often off by a factor of 2 or more), but the trends could be meaningfully used to look into seasonal patterns between competitors. The demographic information (again, we cannot comment on its accuracy) would also be quite interesting, but it requires registering your site.

I can’t very well recommend the PRO services, as I was not able to gain access and was unwilling to pay the cost just for the blog post. I would be extremely interested in looking further into some of the referral data and the keywords data, but this is not available as part of the standard free toolset.

Cost: Free. The PRO membership is 9 per month.

ComScore

Unfortunately we struggled with ComScore. We were unable to get a login or sneak a peek at any of the data. Thus, obviously, we cannot comment on the validity of the data, only on some of the offerings.

Strengths:
N/A

Weaknesses:
N/A

Accuracy:
N/A

Best use of ComScore:
ComScore offers a number of reports and insights into markets, including reports on local market size as well as information about valuable/important keywords in an industry. It would be very interesting to find out where this data comes from and how good it is, but we were not able to do so in time for publication.

Cost: N/A
Costs are not listed on the site; ComScore suggests contacting them directly.

 

Google Ad Planner

Strengths:
"Sites also visited" data is good
"Keywords searched for" data can be quite valuable
Audience interests data is interesting
 

Weaknesses:
Accuracy
Lack of data for small sites

Accuracy:
The accuracy was really mixed. For many of the sites, Ad Planner provided much better data than some of the others; however, it was still off by miles for some sites – by as much as 1,000%. Again, the data in general tended to be better than many of the others, but given the occasional "big miss," I would not be comfortable using this data to make traffic predictions for a client.

How to best use Google Ad Planner:
The data about other sites visited, as well as keywords searched for (with affinity), could be extremely valuable, as could some of the other metrics reported and the audience-interests data. However, the traffic data is not particularly meaningful and is not to be relied upon.

Pro-tip: The data tends to be better when site owners have granted Analytics permission to publish their data. I know we all love open and friendly, but this isn’t the sort of thing you necessarily want to make easier for your competitors to find.

Cost: Free.

 

Google Insights

Strengths:
Trends around Keyphrases and keyphrase groups
Regional information
Trusted source

Weaknesses:
Difficult to read the data
No hard and fast numbers about traffic
Hard to compare entire sites to one another

Accuracy:
You can bet that the accuracy of this data is going to be pretty good, given that the data provider has access to more data than anyone else on the internet. However, the numbers are normalised and designed more for keyphrases, search terms and trends than for traffic data, which means the search volume will not correspond perfectly with the traffic to a site.

How to best use Google Insights:
Google Insights could be quite helpful for finding the most valuable pockets of keyphrases and keyphrase groups. This could be particularly valuable when looking at a competitor site and trying to figure out which of their keyphrases are driving the most traffic. Further to that point, it could help you see which of the keyphrases within a keyphrase group might be the most valuable.

Cost: Free.
 

Google Trends for Websites

Strengths:
Good for illustrating magnitudes of difference between sites
Allows comparison of multiple websites
Includes regional information

Weaknesses:
Not good for comparing sites fairly similar in size
Accuracy imperfect
No numbers

Accuracy:
The data seems to be more accurate when only comparing traffic from search; it does not seem to do as well at picking the winning recipient of overall traffic. Given that these trends are Google Trends, this is reasonable, and it still paints a fair landscape for an SEO’s needs.

When comparing websites with drastically different traffic numbers the rough visual estimation appears to correspond quite well with the observed analytics data as well.

It’s a shame there are no actual numbers for the data, but that would just be too easy.
 

How to best use Google Trends for Websites?
Trends is great for broad information gathering. It gives some insight into similar searches when comparing sites, and in general it is unlikely that you will find better comparative data out there without direct access to your competitor’s analytics account. However, Trends does not provide numbers and thus can only be used to venture a guess at what sorts of numbers competitors are pulling in.

When two sites are relatively similar in size Google Trends does not always pick the winner in terms of monthly traffic correctly. For example, one of the sites we tested received around 7.5m monthly visits whilst another received around 8m and Google ranked the 7.5m website higher. However, it is worth noting that the 7.5m visitor site received considerably more volume from search than did the 8m visitor site so from an SEO standpoint this data is probably quite accurate.

 

HitWise

Unfortunately we were not able to get data from HitWise in time. The HitWise team was very helpful, responsive and agreeable and we will share this data once we have gotten our hands on it. However, we had not received the data back on the websites in the study in time for publication.

Strengths:
N/A

Weaknesses:
N/A

Accuracy:
N/A

Best use of HitWise:
HitWise, like comScore, works on a reporting basis: you speak to them about the types of market reports you would like, or you can create custom reports. Whilst we obviously cannot comment on the accuracy of the data, the services offered look to be better tailored to an SEO’s needs than the reports offered by comScore. However, generally speaking, HitWise will not work with agencies, which will be a bit of a bummer for some of you.

Cost: Free-5+ per report
The range in cost seems to be fairly large. Whether the data warrants the pricing structure cannot really be judged without looking at the data, though they do make some data freely available through their website.

Quantcast

Strengths:
Traffic Numbers that are easy to follow
Design and display of information
Demographic information (when available)
Media Planner Tool

Weaknesses:
Unreliable
Lacks data for small-medium trafficked sites
Accuracy
Inability to compare sites

Accuracy:
Definitely the biggest shortcoming of the Quantcast data is accuracy. As with some of the other sources, the traffic data is estimated and is nowhere near accurate for the sites for which Quantcast had any data. Data was off by as much as 10 times the actual analytics figures for some of the sites. Again, I cannot say that I would recommend sharing any of this data with a client as an accurate predictor of a competitor’s traffic.

Best Use of Quantcast:
Although the traffic data is not particularly reliable, some of the other tools the site has to offer seem quite interesting and worth further investigation. The demographics information is particularly interesting because it provides a reference for how the data compares against the internet average. This sort of data could be particularly valuable for analysing a market by compiling data across multiple sites.

Cost: Free.
 

SEMrush

Strengths:
Data includes sites of all sizes
List of keyphrases and rankings for those terms
Most accurate numbered data of all the tools we looked at

Weaknesses:
Data imperfect
Pay to get full data lists
Data only for Google traffic

Accuracy:
The data was not perfectly accurate, though generally speaking SEMrush did not miss the mark for any of the sites we tested the way a number of the other tools did. This is obviously not to say that the data is infallible or that there won’t be issues with some sites, but it was surprisingly accurate. As with some of the Google data, the information reported covers only Google search traffic, but this is our main area of focus, and the numbers were quite accurate when drilling down into that specific segment within Analytics.

Best Use of SEMrush:
Although imperfect, this tool came the closest to providing accurate data that I would, at least with a word of warning, be willing to share with a client about potential expectations or about where their competitors may be, traffic-wise. Most importantly, the add-on options and the ability to see the keyword lists and how the competitor ranks for those terms are extraordinarily appealing to me.

Cost: Free-9 per month

Conclusion

I hope that the findings from all this research will be valuable to you. At the end of the day it is an incomplete study, and I look forward to following up on it when I have another big chunk of time and if/when I get access to comScore, HitWise, Compete PRO and SEMrush PRO. For the time being, I would rely most heavily on SEMrush for predicting traffic and estimating how well a competitor is doing. Still, all of these tools add something to the ever-growing toolbelt, even if it may be for a purpose other than the one I hoped they would serve – we all know I love to misuse tools, and I’m sure I will come up with some creative ways to use these insights.

 

Thanks a lot, and I look forward to any feedback you might have in the comments below; feel free to contact me on Twitter as well.




A few weeks ago, Brent Payne made a post about “whitehat cloaking” and changing your content based on the referring website. He asked for some feedback on Twitter, causing some follow-up discussions. I had a few people asking for examples of how to do this. In this two-part post, first we’ll look at some theory about why you would want to do it, under what circumstances, and how to do it without angering the Google Gods. In tomorrow’s post, “How to Conditionally Change Your Content,” I’ll give you some ideas about how to implement this.

Let’s talk about the high-level strategy items first. Why would you want to serve different content to different users?

  • Social media traffic is advertising averse, so show them fewer ads and more social oriented content
  • Search traffic can be goal/purchase oriented, so try to serve them content designed to help you do that
  • Direct traffic can get the full brand treatment designed to build subscribers, regular visitors, or a sense of community

To use a cooking metaphor, I’m not serving each of these people a different meal, but I’m varying the seasoning to suit my guests’ individual tastes. Let’s get past the superficial. What are some things you could do differently for, say, social media traffic? Under most circumstances, social media visitors don’t click AdSense, banner ads, and that sort of thing. For social media traffic, your best outcome will be getting them to link to your page, vote/retweet it, or visit other pages. What you want to think about is how you can change your content to help you meet those goals.
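The three buckets above boil down to classifying each visit by its referrer and picking a page variant accordingly. Here is a minimal sketch of that idea in Python; the host-word lists are illustrative rather than exhaustive, and the template names are hypothetical, not part of any real framework:

```python
from urllib.parse import urlparse

# Illustrative referrer buckets (not exhaustive lists)
SOCIAL_WORDS = ("twitter", "facebook", "linkedin", "reddit", "digg",
                "stumbleupon", "youtube")
SEARCH_WORDS = ("google.", "bing.", "yahoo.", "ask.")

def classify_referrer(referrer: str) -> str:
    """Bucket a visit as 'social', 'search', or 'direct' by referrer."""
    if not referrer:
        return "direct"          # no referrer: type-in / bookmark traffic
    host = urlparse(referrer).netloc.lower()
    if any(w in host for w in SOCIAL_WORDS):
        return "social"
    if any(w in host for w in SEARCH_WORDS):
        return "search"
    return "direct"              # everything else gets the default page

def template_for(referrer: str) -> str:
    """Pick a page variant per the strategy above (names hypothetical)."""
    return {
        "social": "page_social.html",   # big share buttons, fewer ads
        "search": "page_search.html",   # goal-oriented, squeeze-style
        "direct": "page_full.html",     # full brand treatment
    }[classify_referrer(referrer)]
```

In practice, an unrecognised referrer (a random blog link, say) falls through to the "direct" bucket, which is a reasonable default since that variant carries the full site experience.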

With Google’s announcement that site speed is a ranking factor, many savvy webmasters opted out of third-party buttons and began to use smaller, lightweight, on-site graphics. While this helps with site speed, it doesn’t help with social engagement. If you want more social interaction, show bigger buttons up top, especially the third-party buttons with live vote/tweet counts. I would remove as much advertising as you can and replace it with graphics or sections featuring other social content. If you use tags to isolate your social content, it is easy to pull out with a DB query. How about showing your most popular or most-emailed pages? Rather than showing a social media audience 25 pages of your top-25 list, consolidate all of the content onto one page.

What about search traffic? How can you change the content to better suit their needs? In some cases you may want to remove content like the sidebar, making your pages more like single-page squeeze pages. Of course, this will depend on the page content – say, a product page. You may want to be more aggressive with advertising placement if you run an AdSense or affiliate website. You could also vary the advertising a bit. I’ve spoken before about using tags to target your advertising, but why not use search query terms as well? If someone came to your website searching for [cheap mexico vacations], normally you would just serve them ads about cruises, hotels, or vacation packages to Mexico. However, if you trapped for search queries containing the word [cheap], you might also want to mix in some value-based vacation advertising.
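Trapping for a query word like [cheap] just means parsing the search terms out of the referrer URL. A small sketch of the idea, assuming the engine passes the query in a `q` parameter (true for Google and Bing); the ad-group names are made up for illustration:

```python
from urllib.parse import urlparse, parse_qs

# Price-sensitive words worth trapping for (illustrative list)
VALUE_WORDS = {"cheap", "discount", "budget", "deal"}

def query_terms(referrer: str) -> set:
    """Extract the search words from a search-engine referrer URL,
    assuming the query lives in a 'q' parameter."""
    q = parse_qs(urlparse(referrer).query).get("q", [""])[0]
    return set(q.lower().split())

def ad_group(referrer: str) -> str:
    """Mix in value-oriented ads when the searcher used a price word
    (ad-group names are hypothetical)."""
    if query_terms(referrer) & VALUE_WORDS:
        return "value-vacations"
    return "standard-vacations"
```

So a visitor arriving from a [cheap mexico vacations] search gets the value-oriented mix, while a [mexico cruises] searcher gets the standard ads.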

While there are some advantages to doing this, there are some pitfalls as well. This type of behavior makes for a more complicated website to maintain and run, so make sure you have the resources for the long haul. Second, you have to be concerned about the search engines and about giving the appearance of cloaking with ill intent. The more dramatically the main content differs from one version to another, the more likely it is to upset a search engine. For example, if you serve a 1,400-word article to direct traffic, a 700-word trimmed-down version to search traffic, and a 400-word version to social traffic, you are taking some risks. I would make sure that search engine bots get a version that is very close to, if not identical to, the version that users coming from a search engine will get.

In the next post I’ll walk you through some of the steps on How to Conditionally Change Your Content from a programming perspective.
Creative Commons License photo credit: Rob Hughes

This post originally came from Michael Gray who is an SEO Consultant. Be sure not to miss the Thesis WordPress Theme review.

Michael Gray – Graywolf’s SEO Blog

Posted by randfish

Yesterday, Perfect Market, a company that "helps publishers create value from their online content with little effort and no risk" released a study that’s been getting quite a bit of attention. The study analyzes the relative traffic value per visit of several types of content, coming to the conclusion that "while the Lindsay Lohan sentencing and other celebrity coverage drove significant online traffic for major news publishers, articles about unemployment benefits, the Gulf oil spill, mortgage rates and other serious topics were the top-earning news topics based on advertising revenue per page view."

Coverage included the New York Times’ Traffic Bait Doesn’t Bring Ad Clicks, Columbia Journalism Review’s Celebs are Loud, but Hard News Pays, Nieman Journalism Lab’s Public Interest News Can Be More Valuable to Publishers than Traffic Bait and Search Engine Land’s Hard News Pays More than Chasing Search Trends.

I’m worried for a few reasons:

  1. What’s the branding value of those stories? Do they drive up awareness of the publications that authored them? Do they increase return visits?
  2. What other actions do those visitors take? Are they more likely to subscribe to an RSS feed? To share those stories on social networks? To get email notifications?
  3. Do these stories drive links that then help other, lower link-earning content rank well in search engines? The goal of linkbait, after all, is often to drive branding, links and sharing rather than being directly monetizable. Plenty of consultants on viral content creation even recommend removing ads to drive up sharing and linking activities.

Granted, from a personal perspective, I love the idea that writing about celebrity gossip and other "soft news" isn’t profitable and therefore might be less prevalent in the future. It’s purely opinion, but I suspect that many share my sentiment that the United States’ major media outlets are far too focused on shallow reporting of topics (like those mentioned in the Perfect Market analysis) that deserve far less attention than, say, understanding what caused the mortgage crisis, who’s spending money on elections and why, the success other nations have had in dealing with crime, poverty, drugs, multiculturalism, etc.

However, anytime a skin-deep, single-metric analysis like this makes its way into major publications, it has an effect on content publication that’s not necessarily positive. If executives, editors and journalists start using singular metrics rather than deep analyses of data to make decisions, their publications will suffer and their content and marketing budgets will be misallocated.

If Perfect Market (or another source) could show:

  • The value of the links brought in from those stories
  • The branding impact of the visits generated
  • The value of sharing activities from those visits

I’d be far more inclined to agree with the conclusions the press is reporting.

If you can’t fully/accurately analyze the true lifetime value to your publication of so-called "bait" (and I don’t just mean celebrity-obsessed soft news, but a broader group of creative, traffic-driving pieces), that’s OK. Just don’t presume a single metric like "ad click value" combined with "page views" will give you the whole story. The web is all about providing data, and you’re cheapening your own value when you cut corners to this extent.

BTW – I don’t mean to cast all the blame on Perfect Market; they did some reasonable data analysis and shared the findings. I wish the study had included a few more caveats, but their job is promoting their work. I’m more concerned with how the media treated the story – reporting, exaggerating and not bothering to dig deeper. Just look at the opening lines of the NYTimes piece:

Sure, articles about Lindsay Lohan’s repeat trips to rehabilitation and Brett Favre’s purported sexual peccadilloes generate loads of reader traffic, but do they actually make decent money for the Web sites that publish them? According to a new analysis, no.

That’s not what the analysis showed. It showed one metric and its impact, but it didn’t explore the overall value of the page views, visits and CLTV (customer lifetime value) of the stories it examined. Let’s hope the publishers do a more thorough job, and that we, as content creators & marketers, think carefully about how to value the content we create and the traffic we attract.



SEOmoz Daily SEO Blog

Posted by richardbaxterseo

If you think about it, search engines are more or less constantly driving us SEO people to keep our technical SEO strategy in a state of constant refinement. This “evolution” of the marketing environment we thrive in is a great thing: it challenges us to come up with new ways to improve our traffic-generating capabilities, user experience and the overall agility of our websites.

Here are a few ideas (6, to be exact) based on issues I’ve encountered in QA or on our recent client work that I hope will provide a little food for thought the next time you’re planning SEO enhancements to your site.

1)      Leverage UGC (review content) beyond the product page

UGC is brilliant, particularly on product, content-thin or affiliate sites. Making it easy for users to leave a review is powerful stuff, but are you making the most of your precious user generated comments? Consider this scenario: so many users leave product reviews on your pages that you decide to limit the number of reviews visible for each product. That surplus doesn’t have to go to waste – you could cherry-pick some of that UGC for category listings pages, adding to the uniqueness of those otherwise content-thin pages too.

[Image: the power of UGC]
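A minimal Python sketch of the cherry-picking idea. The review store and its field names are entirely hypothetical – adapt the selection logic to however your reviews are actually stored:

```python
# Sketch: re-use the best product reviews on category listing pages.
# The review data structure and field names here are hypothetical.

def top_reviews_for_category(reviews, category, limit=3):
    """Pick the highest-rated (then longest) reviews in a category,
    for display on the otherwise content-thin category page."""
    in_category = [r for r in reviews if r["category"] == category]
    # Prefer high ratings, then longer (more unique) text.
    in_category.sort(key=lambda r: (r["rating"], len(r["text"])), reverse=True)
    return in_category[:limit]

reviews = [
    {"category": "curtains", "rating": 5, "text": "Lovely heavy fabric, hangs beautifully."},
    {"category": "curtains", "rating": 3, "text": "OK, but the colour faded."},
    {"category": "blinds", "rating": 4, "text": "Easy to fit."},
]
print([r["rating"] for r in top_reviews_for_category(reviews, "curtains", limit=2)])
# → [5, 3]
```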

2)      Use “other users found this document for”

I know Tom loves this trick, and rightly so. You can turn insightful recent-search data into valuable on-page uniqueness. Internal search and external referrals are great sources, but how about extending the process to make it easy for users to evaluate, extend, tag or remove terms they feel are irrelevant to the page?

This simple example shows how users of a forum site may have found that thread. I think there’s a whole lot more you can do with this trick, but it’s a start:

[Image: users found this page for these keywords]
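One way to collect those “found this page for” terms is to mine your referrer logs. A rough Python sketch, assuming the 2010-era behaviour where search engines still exposed the query in the referrer URL:

```python
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer):
    """Pull the search phrase out of a search-engine referrer URL.
    Google exposes the query in the 'q' parameter; 'p' was Yahoo's."""
    params = parse_qs(urlparse(referrer).query)
    terms = params.get("q") or params.get("p")
    return terms[0].strip().lower() if terms else None

print(keyword_from_referrer("http://www.google.co.uk/search?q=curtain+fabric&hl=en"))
# → curtain fabric
```

Aggregate these per URL and you have the raw data for a “users found this page for” block.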

3)      Consider delivering search engine friendly URLs in your internal site search results

I know how “out there” this might initially sound, but why settle for search engine unfriendly URLs on your internal site search pages? I have seen lots of examples of links being awarded to unfriendly internal site search URLs. Why do we spend so much time carefully crafting our external URLs, only to completely forget our internal search URLs? A little extra development work to apply a meaningful pattern to your search result page URLs today could lead to the construction of an entirely new content type down the line.

Look at how folks are linking to these search query pages, and note the first example (where instead of a URL rewrite, this site is using breadcrumbs to make their ranking page URL appear more friendly):

[Image: interesting search results pages with links]
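The rewrite itself can be as simple as slugifying the query into the path. A hedged Python sketch (the `/search/<slug>/` pattern is just an example, not a recommendation for any particular platform):

```python
import re

def friendly_search_url(query):
    """Rewrite an internal search query into a crawlable, keyword-rich
    path instead of the usual /search?q=... form."""
    slug = re.sub(r"[^a-z0-9]+", "-", query.lower()).strip("-")
    return "/search/%s/" % slug

print(friendly_search_url("Red Curtain Fabric"))  # → /search/red-curtain-fabric/
```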

4)      Microformats are really gaining traction – be creative with them

What we’ve found with Microformats is that webmasters tend to apply the markup to the web pages hosting the content, but that’s where they stop. Imagine you have a website that sells tickets. Do you add hCalendar to your event page and walk away? No! You can nest other Microformats such as hProduct and hReview, and syndicate your formatted data to other internal pages, snippets on your homepage and category pages. Any mention of an event, a link to a product or a review snippet should use the appropriate mark-up, consistently across your website.

5)      Work hard to resolve errors and improve site speed

Think about how Google have placed site performance at the top of their agenda. I genuinely believe that a site riddled with performance issues and errors is tolerated less by search engines today than ever before. Websites with platform issues can raise serious problems for SEO, users, conversion and repeat visits. Fortunately, there are plenty of tools (including SEOmoz Pro, IIS Toolkit, Pingdom Tools and Webmaster Tools from Bing and Google) to help you identify and tackle these issues head on. Go and set aside some performance maintenance time if you haven’t done so for a while.

6)      Watch your homepage title in Google’s SERPs

Google can be pretty aggressive when it comes to choosing the most appropriate text to appear in your title snippets. Sometimes, you might disagree with Google’s choice! Our tests so far indicate that the NOODP meta tag (used to stop Google displaying the DMOZ description in your SERPs) can prevent Google from rewriting your title, even if you have no DMOZ listing.

From this:

[Image: Without ODP]

To this:

[Image: better title display in SERPs]

That “penny drop” moment when a new technical SEO strategy idea presents itself has to be my favourite part of SEO work. I’m glad that technical strategy has to evolve as search engines develop. I really can’t see a time in the near future when that will change.

If you’d like to hear more tips, I’ll be speaking at next week’s A4Uexpo in London on exactly this topic. If you’re there, be sure to drop by and say hello. My buddy Dave Naylor will be introducing me (I have no idea what he’s going to say) and hopefully there’s going to be some time to do a preview of the session over on his blog soon.




Posted by Kate Morris

As a consultant, I work with many In-House SEO teams on strategy and other issues that arise throughout the course of the year. One trend we are seeing is that these In-House teams are having a hard time coming up with accurate traffic-centered goals. Traffic is the base for many metrics, so being able to semi-accurately predict that number for the coming year is important for every business. 

I can hear you all now, "Well there is the Google Keyword Tool … use that." Typically, that is my answer too, but there have been major questions about the accuracy of Google’s keyword tools and others available to webmasters, marketers, and search engine optimization teams.

(If you will comment with your favorite keyword tool other than those I mention, I’ll happily test and add it here!)

The Google Keyword Tools (yes, plural)

There was a shift recently with the Google Keyword Tool. The Legacy/API version is showing different numbers than the newest Beta interface. David Whitehouse and Richard Baxter both noticed this shift as well and did a few tests on accuracy. The jury is still out as to which version is more accurate, the legacy or the new keyword tool. Like Mr. Whitehouse, I believe the newer tool is the updated one, but that does not make it more accurate. 

To be clear, when I speak of the Legacy, API, and Beta tools, I do mean different versions of the Google Keyword Tool. First, from what I can see using the SEOmoz Keyword Difficulty tool, the Google API pulls from the Legacy tool, so they are one and the same. The Legacy tool is the prior interface for the current Beta version of the Keyword Tool. We had previously assumed that these pulled the same numbers, but my research and that of others proves otherwise.

But wait! *infomercial voice* There is more!

There is also the Search-based Keyword Tool that aids AdWords advertisers in choosing relevant keywords based on search behavior and a specified website. This tool is explained by Google here and gives more in-depth information on account organization and cost.  

But even this tool is not on par with the other two when it comes to impressions. A random query in the Search-based tool returned a suggestion for the keyword "maragogi." The Search-Based tool says there should be 12,000 monthly searches. The Legacy tool returns 110 Local Exact match searches, 33,100 Global Exact match, and 201,000 Global Broad match. The new tool returns information only for a global setting (all countries, all languages). That returns 74,000 searches broad and phrase match, and 12,100 for exact match. It seems like the Search-based tool is more like the exact global match in this one instance. But what is a business supposed to do with all of these numbers?!?!?

(hint: always use exact match)

Back to Strategy

If these tools are possibly inaccurate, how do our clients go about setting their yearly strategy goals?

Put simply, in search, you never want to rely on one set of results or one ranking report. Data over time and from many sources is best. But given the lack of reliable tools out there, and with Google bringing in at least 65% of organic traffic for most sites, how do you get the best numbers? 

Impressions

First, you need to start out by figuring out how many impressions a keyword or set of keywords can bring in on average for a specific month. If you are in a cyclical industry, this will have to be done per month of the calendar year. 

1. Pull from both Google Tools and other Keyword Tools

Below is a look at some information I pulled using the tools mentioned for the key phrase "curtain fabric."

The idea here is that if you take into account all of the numbers out there, you might see a trend that you can use for estimating future traffic. If there is no trend, then a median of the numbers can be used as your metric. A few other tools that you might look into include WordTracker and KeywordSpy. You can see that the numbers are all over the place, but looking at these figures, I’d guess that the keyword might bring in around 6,500 impressions a month in the UK. 

The downside is that WordTracker and KeywordSpy don’t allow you to look at exact match information versus broad match. When performing keyword research, you always want to look at the local (targeted to your country) exact match information. Too many people pull keyword information using broad match and get inflated numbers from all the phrases related to that key phrase. 
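If you do fall back on a median across tools, the arithmetic is trivial. The figures below are purely hypothetical stand-ins for the per-tool estimates discussed above:

```python
from statistics import median

# Hypothetical monthly-search estimates for one phrase from several tools;
# real numbers will differ (and disagree with each other).
estimates = {
    "google_legacy": 5400,
    "google_beta": 6400,
    "wordtracker": 8100,
    "keywordspy": 6600,
}
print(median(estimates.values()))  # → 6500.0
```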

2. Run a PPC campaign if possible.

The absolute best way to get accurate numbers about traffic over time is to run a PPC campaign. I pulled some numbers from a few campaigns (for our clients’ sake we have masked a number of the actual key phrases) in an attempt to see if the new keyword tool is accurate against actual traffic in August. The keywords pulled were all exact match in the campaign, and the information pulled from the keyword tool was Local Exact and set to the country that the campaign was targeting. 

As you can see, some of these are higher and some lower. What I found is that there is really no definitive answer as to whether the Google Keyword Tool is accurate. Take a look at the results for the example I used before, curtain fabric. The campaign saw 11,389 impressions, much higher than the new keyword tool and lower than some other keyword tools. This is why a well-run PPC campaign is important if you want to get a more accurate look at impression numbers. 

Please note that I didn’t get a chance to ensure that these accounts were all showing at all times during the month, but they were all accurately geo-targeted and all showed on the top of the first page on average. 

Finding Traffic Based on Rank

After getting a good idea of the number of impressions, you then need to take into account where you are showing for that keyword on average organically (aka your rank). While we cannot know specific click-through numbers for every search done on the web, there have been some studies on how much of those impressions the top organic result gets, the second, and so on. The one I use most often is from Chitika. Using the percentages below and the impression numbers, you should be able to get a good idea of the visitors you can expect per month organically for a specific key phrase.


So using the "curtain fabric" example, assuming that the site I am working on has maintained an average ranking over the last few months of #3 organically, I could expect about 1300 visits from Google for the keyword in a month (11.42% of 11,389 impressions).
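The back-of-envelope calculation looks like this in code. Only the 11.42% figure for position 3 comes from the example above; the other percentages are illustrative placeholders, not the actual Chitika numbers:

```python
# Click-through share by organic position. Only the 11.42% for position 3
# is taken from the text; the rest are illustrative placeholders.
CTR_BY_RANK = {1: 0.3428, 2: 0.1712, 3: 0.1142, 4: 0.0762, 5: 0.0611}

def estimated_visits(impressions, rank):
    """Estimated monthly organic visits = impressions x CTR for that rank."""
    return int(impressions * CTR_BY_RANK[rank])

print(estimated_visits(11389, 3))  # → 1300
```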

Past Metrics

Once you get everything figured out, keep in mind that your past metrics are another good way of seeing how close you are to getting the traffic about right. Assuming that no major changes have occurred (like a gap in metrics data in the last year), a look back is the most accurate way to understand traffic flow and trending on your site. Pull the unique visitors for every month of the last year and do some analysis on percent increase month over month. This can be done at any level in most analytics programs – overall traffic trends all the way down to the keyword level. 
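The month-over-month analysis is a one-liner once you have the monthly uniques exported. A sketch with hypothetical visitor counts:

```python
# Month-over-month percent change from a run of monthly unique-visitor
# counts (hypothetical numbers).
uniques = [12000, 12600, 13900, 13100, 14800, 16000]

mom_growth = [round((cur - prev) / prev * 100, 1)
              for prev, cur in zip(uniques, uniques[1:])]
print(mom_growth)  # → [5.0, 10.3, -5.8, 13.0, 8.1]
```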

A look at overall traffic per month in Google Analytics for organic searches from Google:

A look at traffic for a specific keyword over the last year per month from Google organic:

Educated Guesses

In the end, though, predictions are just that: educated guesses. Pulling data from all available sources and using your own historical data can assist in making an educated prediction for the next year. Keep in mind, though, that things never stay the same. Google Instant just proved that with one of the biggest changes we have seen in a while. 




Marketing generally has 2 core strategies in terms of customers: finding new customers & keeping your current/old customers happy. The best businesses tend to keep the interest of their customers for months and years by consistently improving their products and services to deliver more value. The other sort of business tends to be hard-close / hype driven & always promoting a new product / software / scheme. It is never a complete system being sold, but some “insider secret” shortcut that unearths millions automatically while you sleep – perpetually. ;)

One of the problems with false scarcity hype launches is that it attracts the type of customers who can’t succeed. The people who are receptive to that sort of marketing want to be sold a dream, they are not the type of people who want to put the time and effort in to become successful. They are at stage 2 in this video: “my life sucks” … so sell me a story that will instantly make everything better without requiring any change from me at all. ;)

Another one of the problems with the hype launch business model is that it requires you to keep repeating the sales process like a traveling salesman. Each day you need to think up a new scheme or angle to sell a new set of crap from, and you have to hope that the web has a short enough memory that the scammy angles used to pitch past hyped up product launches don’t come to bite you in the ass.

I don’t mind when the get-rich-quick marketers work their core market, as there is a group of weak minded individuals who are addicted to buying that stuff. But I always get pissed off when someone claims that your field is trash or a scam (as an angle to sell something else), and then later starts trying to paint themselves as an expert in your field.

Here is a video snippet of Ryan Deiss proclaiming his ignorance of the SEO field & how he got ripped off thrice because he knew so little he couldn’t tell a bad service provider from a good one.

“If you want to get free traffic you have to get good at the cut-throat game of SEO (which I for one am not). … SEO for most of us isn’t the right answer.” – Ryan Deiss

And his latest info-product (in perhaps a series of dozens of them?) is called Perpetual Traffic Formula. In the squeeze page he highlights that it offers you the opportunity to… “Discovering a crack in Google algorithm so big it simply can’t be patched. Being able repeat the process for similar results in UNLIMITED niches.”

Anyhow, the Droid has a pretty good review of how awful his sites are doing in terms of “perpetual traffic.”

If you want to buy from a person who *always* has another new product with a secret short cut to sell, Ryan is THE guy. If you want to learn how to evaluate the quality of the products being sold, you’ll love this.

SEO Book.com – Learn. Rank. Dominate.