Posted by Sam Crocker
Today we’re going to break down a number of different tools and resources for getting insights into competitors’ traffic data. We have looked at a handful of tools and will break them down one by one, covering their strengths and weaknesses as well as the validity and usefulness of the data they provide. Ultimately we just wanted to share some other information sources out there that you can add to SEOmoz’s list of tools that are great for competitor analysis (my personal favourites being the Linkscape Visualization and Comparison tool as well as the Competitive Link Research tool).
However, I have had a number of clients asking me for a better view of overall market size and what kind of traffic their competitors are getting. Unfortunately this has at times meant crushing a few dreams about who a genuine/realistic competitor is or should be (i.e. NOT Mashable if you are a new site), and even when you know who your competitors are, it can be tricky to find meaningful predictive data.
Initially I wanted this to be an experiment testing out a number of services and running them against Analytics data to compare like for like and find which sites provided the most accurate information. I compiled Analytics data from 25 websites with hopes of comparing the real numbers (from Analytics) against the predictions of the other tools to try to find which was the most reliable across a number of sites from different sectors with a range of monthly traffic from ~1,000 monthly visitors to over 48,000,000 monthly visitors.
The idea was to report on data across a number of these platforms for average monthly visits, total yearly visits, geographical visits and so forth. Unfortunately, far fewer tools provided this data than I initially anticipated, and it quickly became clear that we weren’t going to be able to compare apples to apples: there is no substitute for internal data. Through a combination of the tools below, however, you can get a good idea of what sort of traffic your competitors’ sites are getting.
So, the experiment was a bit of a failure, but I learned more than my fair share about the tools so let’s have a look at which tools are available and which tasks/comparisons they can be used for. I’m hanging on to all the data I collected and at a future date (if I ever hear back from some of the data sources) I will post a follow-up/re-do of the experiment.
It is worth pointing out that a number of these sites suggest they can provide better data if you claim the site(s) in question. I cannot testify to the accuracy of this, as we did not look into it (and could not feasibly claim the data for all 25 sites). Note also that all comments are based on the free versions of the tools; we did not have paid access to any of them.
Alexa is good for comparing different sites’ traffic and for monitoring general traffic trends. It can be quite useful for comparing one site to a competitor’s (up to 5 sites at a time).
The index is massive and contains some data about all of the 25 sites we tested.
Not so great for the smaller sites. As you can see below, you won’t get any of the traffic charts for sites ranked outside of the top 100,000 (which means if Alexa thinks you are getting fewer than 10,000 visits per month, you’re unlikely to glean any great information).
Accuracy is a serious concern, and this taints the usefulness of the tool in general.
The numbers reported are not helpful for predicting traffic on their own.
We want to keep this all anonymous but let’s just say one site that we know gets 10-20,000 visits per month had an Alexa rank that was more than 5 times better than a site that we know gets 75,000+ visitors per month. And this was not just a one-off event.
I would have to seriously question the reliability of this tool. It didn’t seem to be too bad at predicting the trends for a single site, but the charts are extremely difficult to make any real use of. The information on bounce rate seems fairly accurate (give or take a few percent), but the trends for bounce rate seemed much less accurate (e.g. the ups and downs did not correspond with similar peaks and valleys in Analytics).
Perhaps most interestingly, it seems to be skewed in favour of sites within the search marketing space. Sites in the search marketing space that we looked at regularly outranked sites receiving more than 10 times as much traffic on a monthly basis.
How to best use Alexa?
The tool is interesting for comparing similar sites or sites within an industry. I would like to recommend the tool but based upon my experience and this particular data set I would have to say I would be very cautious about using this to make any meaningful suggestions or estimates on traffic data. It is a great concept for a site but does not seem to have been particularly accurate.
The most accurate data seemed to be the visitors-by-country data (the order was fairly accurate and the percentages we looked at were not too far off). To the extent that this data would be useful to have for your competitors, this is one good use of Alexa.
The insights for audience demographics could also potentially be extremely valuable, though accuracy will always be a question.
Free. Options for site audits for 9
Speaks the right language (unique visitors, visits, etc.).
Ability to compare multiple sites
Data is easy to understand and well presented.
Somewhat limited number of sites – many sites that it classifies as "low sample sites"
Cost of "Pro" option
Again, accuracy is a serious concern here. The data was off in some cases by as much as 2,000% for monthly visits. The accuracy seemed to be a bit better for the peaks in traffic and some of the general trends we looked at, but was certainly not reliable enough for us to suggest reporting competitor traffic based upon this information.
How to best use Compete?
It should come as no surprise that Compete is best used for comparing competitors. The scale of the data is way off, but some of the trends seemed to be fairly reliable. I wouldn’t advise reporting any numbers from this data (they do not seem close/reliable at all – often off by a factor of 2 or more), but the trends could be meaningfully used to look into seasonal patterns between competitors. The demographic information (again, we cannot comment on its accuracy) would also be quite interesting, but it requires registering your site.
I can’t very well recommend the PRO services as I was not able to gain access and was unwilling to pay the cost just for this blog post. I would be extremely interested in looking further into some of the referral data and the keywords data, but this is not available as part of the standard free toolset.
Cost: Free. The PRO membership is 9 per month.
Unfortunately we struggled with ComScore. We were unable to get a login or sneak a peek at any of the data. Thus we obviously cannot comment on the validity of the data, only on some of the offerings.
Best use of ComScore:
ComScore offers a number of reports and insights into markets, including reports on local market size as well as information about valuable/important keywords in an industry. It would be very interesting to find out where this data comes from and how good it is, but we were not able to do so in time to publish.
Costs are not listed on the site; ComScore suggests contacting them directly.
“Sites also visited” data is good
Keywords searched for can be quite valuable
Audience interests data interesting
Lack of data for small sites
The accuracy was really mixed. For many of the sites AdPlanner provided much better data than some of the others; however, it was still off by miles for some sites – by as much as 1000%. Again, the data in general tended to be better than many of the others, but given the occasional "big miss" I would not be comfortable using this data to make traffic predictions for a client.
How to best use Google Ad Planner:
The data about other sites visited and keywords searched for (with affinity) could be extremely valuable, as could some of the other metrics reported on, such as audience interests. However, the traffic data is not particularly meaningful and is not to be relied upon.
Pro-tip: the data tends to be better when site owners have granted Analytics permission to publish their data. I know we all love open and friendly, but this isn’t the sort of thing you necessarily want to make easier for your competitors to find.
Trends around Keyphrases and keyphrase groups
Difficult to read the data
No hard and fast numbers about traffic
Hard to compare entire sites to one another
You can bet that the accuracy of this data is going to be pretty good, given that the data provider has access to more data than anyone else on the internet. However, the numbers are normalised and designed more for keyphrases, search terms and trends than for traffic data, which means that search volume will not correspond perfectly with the traffic to a site.
How to best use Google Insights:
Google Insights could be quite helpful for finding the most valuable pockets of keyphrases and keyphrase groups. This could be particularly valuable when looking at a competitor site and trying to figure out which of their keyphrases are driving the most traffic. Further to that point, it could help you see which of the keyphrases within a keyphrase group might be the most valuable.
Good for illustrating magnitudes of difference between sites
Allows comparison of multiple websites
Includes regional information
Not good for comparing sites fairly similar in size
The data seems to be more accurate when only trying to compare traffic from search; it does not do as well at picking the winner in overall traffic. Given that these are Google Trends, this is reasonable, and it still paints a fair landscape for an SEO’s needs.
When comparing websites with drastically different traffic numbers the rough visual estimation appears to correspond quite well with the observed analytics data as well.
It’s a shame there are no actual numbers for the data, but that would just be too easy.
How to best use Google Trends for Websites?
Trends is great for broad information gathering. It gives some insight into similar searches when comparing sites, and in general it is unlikely that you will find better comparative data out there without direct access to your competitor’s analytics account. However, Trends does not provide numbers, so it can only be used to venture a guess at what sort of numbers competitors are pulling in.
When two sites are relatively similar in size Google Trends does not always pick the winner in terms of monthly traffic correctly. For example, one of the sites we tested received around 7.5m monthly visits whilst another received around 8m and Google ranked the 7.5m website higher. However, it is worth noting that the 7.5m visitor site received considerably more volume from search than did the 8m visitor site so from an SEO standpoint this data is probably quite accurate.
Unfortunately we were not able to get data from HitWise in time. The HitWise team was very helpful, responsive and agreeable and we will share this data once we have gotten our hands on it. However, we had not received the data back on the websites in the study in time for publication.
Best use of HitWise:
HitWise, similarly to comScore, works on a reporting basis insofar as you speak to them about the types of market reports you would like, or you can create custom reports. Whilst we obviously cannot comment on the accuracy of the data, the services offered look to be better tailored to an SEO’s needs than the reports offered by comScore. However, generally speaking HitWise will not work with agencies, which will be a bit of a bummer for some of you.
Cost: Free-5+ per report
The range in cost seems to be fairly large. Whether the data warrants the pricing structure cannot really be judged without looking at the data, though they do make some data freely available through their website.
Traffic Numbers that are easy to follow
Design and display of information
Demographic information (when available)
Media Planner Tool
Lacks data for small-medium trafficked sites
Inability to compare sites
Definitely the biggest shortcoming of the Quantcast data is accuracy. As with some of the other sources, the traffic data is estimated and was nowhere near accurate for the sites for which Quantcast had any data. Data was off by as much as 10 times the actual analytics data for some of the sites. Again, I cannot say that I would recommend sharing any of the data with a client as an accurate predictor of a competitor’s traffic.
Best Use of Quantcast:
Although the traffic data is not particularly reliable, some of the other tools the site has to offer seem quite interesting and worth further investigation. The demographics information is particularly interesting because it provides a reference for how the data compares against the internet average. This sort of data could be particularly valuable for analysing a market by compiling data across multiple sites.
Data includes sites of all sizes
List of keyphrases and rankings for those terms
Most accurate numbered data of all tools looked at
Pay to get full data lists
Data only for Google traffic
The data was not perfectly accurate, though generally speaking SEMrush did not miss the mark for any of the sites we tested the way a number of the other tools did. This is obviously not to say that this data is infallible or that there won’t be some issues with some sites, but the data was surprisingly accurate. As with some of the Google data, the information reported covers only Google search traffic, but this is our main area of focus, and it was quite accurate when drilling down into that specific segment of traffic within analytics.
Best Use of SEMrush:
Although imperfect, this tool came the closest to providing accurate data that I would, at least with a word of warning, be willing to share with a client about potential expectations or about where their competitors may be, traffic-wise. Most importantly, the add-on options and the ability to see the keyword lists and how the competitor ranks for these terms are extraordinarily appealing to me.
Cost: Free-9 per month
I hope that the findings from all this research will be valuable to you. At the end of the day it is an incomplete study and I look forward to following up on it when I have another big chunk of time and if/when I get access to comScore, HitWise, Compete PRO and SEMrush Pro. For the time being I would rely most heavily on SEMrush for predicting traffic and estimating how well a competitor is doing, but all of these tools add something to the ever growing toolbelt even if it may be for a purpose other than that which I was hoping they would achieve for me – we all know I love to misuse tools and I’m sure I will come up with some creative ways to use these insights.
Thanks a lot and look forward to any feedback you might have in the comments below or feel free to contact me on Twitter.
A few weeks ago, Brent Payne made a post about “whitehat cloaking” and changing your content based on referring website. He asked for some feedback on Twitter, causing some follow up discussions. I had a few people asking for examples about how to do this. In this two part post, first we’ll look at some theory about why would you want to do it, under what circumstances, and how to do it without angering the Google Gods. In tomorrow’s post “How to Conditionally Change Your Content,” I’ll give you some ideas about how to implement this.
Let’s talk about the high-level strategy items first. Why would you want to serve different content to different users?
To use a cooking metaphor, I’m not serving each of these people a different meal, but I’m varying the seasoning to suit my guests’ individual tastes. Let’s get past the superficial. What are some things you could do differently for, say, social media traffic? Under most circumstances, social media visitors don’t click AdSense, banner ads, and that sort of thing. For social media traffic, your best outcome is getting them to link to your page, vote/retweet it, or visit other pages. What you want to think about is how you can change your content to help you meet those goals.
With Google’s announcement that site speed is a ranking factor, many savvy webmasters opted out of third-party buttons and began to use smaller, lightweight, on-site graphics. While this helps with site speed, it doesn’t help with social engagement. If you want more social interaction, show bigger buttons up top, especially the third-party buttons with active vote/tweet counts. I would remove as much advertising as you can and replace it with graphics or sections featuring other social content. If you use tags to isolate your social content, it is easy to pull out with a DB query. How about showing your most popular or most-emailed pages? Rather than showing a social media audience 25 pages of your top-25 list, consolidate all of the content onto one page.
What about search traffic? How can you change the content to better suit their needs? In some cases you may want to remove content like the sidebar, making your pages more like single-page squeeze pages. Of course this will depend on the page content – say, a product page. You may want to be more aggressive with advertising placement if you run an AdSense or affiliate website. You could also vary the advertising a bit. I’ve spoken before about using tags to target your advertising, but why not use search query terms as well? If someone came to your website searching for [cheap mexico vacations], normally you would just serve them ads about cruises, hotels, or vacation packages to Mexico. However, if you trapped for search queries containing the word [cheap], you might also want to mix in some value-based vacation advertising.
While there are advantages to doing this, there are pitfalls as well. This type of behavior makes for a more complicated website to maintain and run, so make sure you have the resources for the long haul. Secondly, you have to be concerned about the search engines and giving the appearance of cloaking with ill intent. The more dramatically the main content differs from one version to another, the more likely it is to upset a search engine. For example, if you serve a 1,400-word article to direct traffic, a 700-word trimmed-down version to search traffic, and a 400-word version to social traffic, you are taking some risks. I would make sure that search engine bots get a version that is very close, if not identical, to the version that users coming from a search engine will get.
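To make that last point concrete, here is a minimal sketch of referrer-based classification in Python. The host lists, function names and URL scheme are my own illustrative assumptions, not code from the original post; the key property is that known search engine bots land in the same bucket as search visitors, so crawlers never see a version that searchers can’t get.

```python
from urllib.parse import urlparse, parse_qs

# Illustrative host/user-agent lists -- extend these for your own traffic mix.
SEARCH_HOSTS = ("google.", "bing.", "yahoo.")
SOCIAL_HOSTS = ("twitter.com", "facebook.com", "digg.com", "reddit.com")
BOT_TOKENS = ("googlebot", "bingbot", "slurp")

def classify_visitor(referrer: str, user_agent: str) -> str:
    """Bucket a request as 'search', 'social' or 'direct'."""
    # Crucially, search engine bots get the *same* bucket as search
    # visitors, so the bot sees what searchers see (no cloaking risk).
    ua = user_agent.lower()
    if any(token in ua for token in BOT_TOKENS):
        return "search"
    host = urlparse(referrer).netloc.lower()
    if any(h in host for h in SEARCH_HOSTS):
        return "search"
    if any(h in host for h in SOCIAL_HOSTS):
        return "social"
    return "direct"

def search_query(referrer: str) -> str:
    """Pull the search phrase out of a search referrer's q= parameter."""
    params = parse_qs(urlparse(referrer).query)
    return params.get("q", [""])[0]
```

With the query string in hand, trapping for a word like [cheap] and mixing in value-based ads becomes a simple substring check on `search_query()`.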
Posted by randfish
Yesterday, Perfect Market, a company that "helps publishers create value from their online content with little effort and no risk," released a study that’s been getting quite a bit of attention. The study analyzes the relative traffic value per visit of several types of content, coming to the conclusion that "while the Lindsay Lohan sentencing and other celebrity coverage drove significant online traffic for major news publishers, articles about unemployment benefits, the Gulf oil spill, mortgage rates and other serious topics were the top-earning news topics based on advertising revenue per page view."
Coverage included the New York Times’ Traffic Bait Doesn’t Bring Ad Clicks, Columbia Journalism Review’s Celebs are Loud, but Hard News Pays, Nieman Journalism Lab’s Public Interest News Can Be More Valuable to Publishers than Traffic Bait and Search Engine Land’s Hard News Pays More than Chasing Search Trends.
I’m worried for a few reasons:
Granted, from a personal perspective, I love the idea that writing about celebrity gossip and other "soft news" isn’t profitable and therefore might be less prevalent in the future. It’s purely opinion, but I suspect that many share my sentiment that the United States’ major media outlets are far too focused on shallow reporting of topics (like those mentioned in the Perfect Market analysis) that deserve far less attention than, say, understanding what caused the mortgage crisis, who’s spending money on elections and why, the success other nations have had in dealing with crime, poverty, drugs, multiculturalism, etc.
However, anytime a skin-deep, single-metric analysis like this makes its way into major publications, it has an effect on content publication that’s not necessarily positive. If executives, editors and journalists start using singular metrics rather than deep analyses of data to make decisions, their publications will suffer and their content and marketing budgets will be misallocated.
If Perfect Market (or another source) could show the fuller picture – the page views, visits and lifetime value these stories drive – I’d be far more inclined to agree with the conclusions the press is reporting.
If you can’t fully/accurately analyze the true lifetime value to your publication of so-called "bait" (and I don’t just mean celebrity-obsessed soft news, but a broader group of creative, traffic-driving pieces), that’s OK. Just don’t presume a single metric like "ad click value" combined with "page views" will give you the whole story. The web is all about providing data, and you’re cheapening your own value when you cut corners to this extent.
BTW – I don’t mean to cast all the blame on Perfect Market – they did some reasonable data analysis and shared the findings. I wish it had included a few more caveats, but their job is promoting their work. I’m more concerned with how the media treated the story – reporting, exaggerating and not bothering to dig deeper. Just look at the opening lines of the NYTimes piece:
Sure, articles about Lindsay Lohan’s repeat trips to rehabilitation and Brett Favre’s purported sexual peccadilloes generate loads of reader traffic, but do they actually make decent money for the Web sites that publish them? According to a new analysis, no.
That’s not what the analysis showed. It showed one metric and its impact, but it didn’t explore the overall value of the page views, visits and CLTV (Customer Lifetime Value) of the stories it examined. Let’s hope the publishers do a more thorough job and that we, as content creators & marketers, think carefully about how to value the content we create and the traffic we attract.
Posted by richardbaxterseo
If you think about it, search engines are more or less constantly driving us SEO people to keep our technical SEO strategy in a state of refinement. This “evolution” of the marketing environment we thrive in is a great thing: it challenges us to come up with new ways to improve our traffic-generating capabilities, our user experience and the overall agility of our websites.
Here are a few ideas (6, to be exact) based on issues I’ve encountered in QA or on our recent client work that I hope will provide a little food for thought the next time you’re planning SEO enhancements to your site.
1) Leverage UGC (review content) beyond the product page
UGC is brilliant, particularly on product, content-thin or affiliate sites. Making it easy for users to leave a review is powerful stuff, but are you making the most of your precious user-generated comments? Consider this scenario: so many users leave product reviews on your pages that you decide to limit the number of reviews visible for each product. You could cherry-pick some of that UGC for category listings pages, adding to the uniqueness of those otherwise content-thin pages too.
2) Use “other users found this document for”
I know Tom loves this trick, and rightly so. You can turn insightful recent-searches data into valuable on-page uniqueness. Internal search and external referrals are great, but how about extending the process to make it easy for users to evaluate, extend, tag or remove terms they feel are irrelevant to the page?
This simple example shows how users of a forum site may have found that thread. I think there’s a whole lot more you can do with this trick, but it’s a start:
3) Consider delivering search engine friendly URLs in your internal site search results
I know how “out there” this might initially sound, but why settle for search-engine-unfriendly URLs on your internal site search pages? I have seen lots of examples of links being awarded to unfriendly internal site search URLs. Why do we spend so much time carefully crafting our external URLs, only to completely forget our internal search URLs? A little extra development work to apply a meaningful pattern to your search result page URLs today could lead to the construction of an entirely new content type down the line.
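As a sketch of what “a meaningful pattern” might look like, here is one hedged approach in Python: slugify the query into a path segment that the server can map back to a real search. The `/search/` URL scheme and the function names are my own illustration, not a prescribed standard, and you would still need to decide which of these pages deserve to be indexed at all.

```python
import re

def friendly_search_url(query: str) -> str:
    # /search?q=Blue+Velvet+Curtains  ->  /search/blue-velvet-curtains/
    slug = re.sub(r"[^a-z0-9]+", "-", query.lower()).strip("-")
    return "/search/{}/".format(slug)

def query_from_url(path: str) -> str:
    # Reverse the mapping so the search backend still receives plain words.
    slug = path.strip("/").split("/", 1)[1]
    return slug.replace("-", " ")
```

A rewrite rule (or your framework’s router) would route `/search/<slug>/` to the same handler as `?q=`, keeping one canonical URL per query.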
Look at how folks are linking to these search query pages, and note the first example (where instead of a URL rewrite, this site is using breadcrumbs to make their ranking page URL appear more friendly):
4) Microformats are really gaining traction – be creative with them
What we’ve found with Microformats is that webmasters tend to apply the markup to web pages hosting the content, but that’s where they stop. Imagine you have a website that sells tickets. Do you add hCalendar to your event page and walk away? No! You can nest other Microformats such as hProduct and hReview, and syndicate your formatted data to other internal pages, snippets on your homepage and category pages. Any mention of an event, a link to a product or a review snippet should use the appropriate mark-up, consistently across your website.
5) Work hard to resolve errors and improve site speed
Think about how Google have placed site performance at the top of their agenda. I genuinely believe that a site riddled with performance issues and errors is tolerated less by search engines today than ever before. Websites with platform issues can raise serious problems for SEO, users, conversion and repeat visits. Fortunately, there are plenty of tools (including SEOmoz Pro, IIS Toolkit, Pingdom Tools and Webmaster Tools from Bing and Google) to help you identify and tackle these issues head on. Go and set aside some performance maintenance time if you haven’t done so for a while.
6) Watch your homepage title in Google’s SERPs
Google can be pretty aggressive when it comes to choosing the most appropriate text to appear in your title snippets. Sometimes you might disagree with Google’s choice! Our tests so far indicate that the NOODP meta tag (used to prevent Google from displaying the DMOZ description in your SERPs) can prevent Google from doing this, even if you have no DMOZ listing.
That “penny drop” moment when a new technical SEO strategy idea presents itself has to be my favourite part of SEO work. I’m glad that technical strategy has to evolve as search engines develop. I really can’t see a time in the near future when that will change.
If you’d like to hear more tips, I’ll be speaking at next week’s A4Uexpo in London on exactly this topic. If you’re there, be sure to drop by and say hello. My buddy Dave Naylor will be introducing me (I have no idea what he’s going to say) and hopefully there’s going to be some time to do a preview of the session over on his blog soon.
Posted by Kate Morris
As a consultant, I work with many in-house SEO teams on strategy and other issues that arise throughout the course of the year. One trend we are seeing is that these in-house teams are having a hard time coming up with accurate traffic-centered goals. Traffic is the base for many metrics measurements, so being able to semi-accurately predict that number in the coming year is important for every business.

I can hear you all now: "Well, there is the Google Keyword Tool … use that." Typically that is my answer too, but there have been major questions about the accuracy of Google’s keyword tools and the others available to webmasters, marketers, and search engine optimization teams.
(If you will comment with your favorite keyword tool other than those I mention, I’ll happily test and add it here!)
There was a shift recently with the Google Keyword Tool: the Legacy/API version is showing different numbers than the newest Beta interface. David Whitehouse and Richard Baxter both noticed this shift and did a few tests on accuracy. The jury is still out as to which version is more accurate. Like Mr. Whitehouse, I believe the newer tool is the updated one, but that does not make it more accurate.
To be clear, when I speak of the Legacy, API, and Beta tools, I mean different versions of the Google Keyword Tool. From what I can see using the SEOmoz Keyword Difficulty tool, the Google API pulls from the Legacy tool, so those two are one and the same. The Legacy tool is the prior interface to the current Beta version of the Keyword Tool. We had previously assumed that these pulled the same numbers, but my research and that of others proves otherwise.
But wait! *infomercial voice* There is more!
There is also the Search-based Keyword Tool, which aids AdWords advertisers in choosing relevant keywords based on search behavior and a specified website. This tool is explained by Google here and gives more in-depth information on account organization and cost.
But even this tool is not on par with the other two when it comes to impressions. A random query in the Search-based tool returned a suggestion for the keyword "maragogi." The Search-based tool says there should be 12,000 monthly searches. The Legacy tool returns 110 Local Exact match searches, 33,100 Global Exact match, and 201,000 Global Broad match. The new tool returns information only for a global setting (all countries, all languages): 74,000 searches for broad and phrase match, and 12,100 for exact match. It seems like the Search-based tool is closest to global exact match in this one instance. But what is a business supposed to do with all of these numbers?!
(hint: always use exact match)
If these tools are possibly inaccurate, how do our clients go about setting their yearly strategy goals?
Put simply, in search you never want to rely on one set of results or one ranking report. Data over time and from many sources is best. But with the lack of tools out there, and with Google bringing in at least 65% of organic traffic for most sites, how do you get the best numbers?
First, you need to start out by figuring out how many impressions a keyword or set of keywords can bring in on average for a specific month. If you are in a cyclical industry, this will have to be done per month of the calendar year.
Below is a look at some information I pulled using the tools mentioned for the key phrase "curtain fabric."
The idea here is that if you take into account all of the numbers out there, you might see a trend that you can use for estimating future traffic. If there is no trend, then a median of the numbers can be used as your metric. A few other tools that you might look into include Word Tracker and Keyword Spy. You can see that the numbers are all over the place, but looking at these figures, I’d guess that the keyword might bring in around 6,500 impressions a month in the UK.
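The "median of the numbers" fallback is a one-liner. The figures below are illustrative placeholders for several tools’ estimates, not the real "curtain fabric" data:

```python
from statistics import median

# Hypothetical monthly UK impression estimates for one key phrase,
# gathered from several keyword tools (numbers are illustrative only).
estimates = {
    "google_beta_exact": 5400,
    "google_legacy_exact": 6600,
    "keywordspy": 7200,
    "wordtracker": 8100,
}

monthly_estimate = median(estimates.values())  # midpoint across the tools
```

Using the median rather than the mean keeps one wildly inflated tool (they exist, as the accuracy sections above show) from dragging your metric off course.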
The downside is that WordTracker and KeywordSpy don’t allow you to look at exact match versus broad match information. When performing keyword research, you always want to look at the local (targeted to your country) exact match information. Too many people pull keyword information using broad match and get inflated numbers covering all phrases related to that key phrase.
The absolute best way to get accurate numbers about traffic over time is to run a PPC campaign. I pulled some numbers from a few campaigns (for our clients’ sake we have masked a number of the actual key phrases) in an attempt to see if the new keyword tool is accurate against actual traffic in August. The keywords pulled were all exact match in the campaign, and the information pulled from the keyword tool was Local Exact, set to the country that the campaign was targeting.
As you can see, some of these are higher and some lower. What I found is that there really is no definitive answer as to whether the Google Keyword Tool is accurate. Take a look at the results for the example I used before, curtain fabric: the campaign saw 11,389 impressions, much higher than the new keyword tool and lower than some other keyword tools. This is why a well-run PPC campaign is important if you want a more accurate look at impression numbers.
Please note that I didn’t get a chance to ensure that these accounts were all showing at all times during the month, but they were all accurately geo-targeted and all showed on the top of the first page on average.
After getting a good idea of the number of impressions, you then need to take into account where you show up organically for that keyword on average (i.e. your rank). While we cannot know specific click-through numbers for every search done on the web, there have been some studies on how much of those impressions the top organic result gets, the second, and so on. The one I use most often is from Chitika. Using the percentages below and the impression numbers, you should be able to get a good idea of the visitors you can expect per month organically for a specific key phrase.
So using the "curtain fabric" example, assuming that the site I am working on has maintained an average ranking of #3 organically over the last few months, I could expect about 1,300 visits from Google for the keyword in a month (11.42% of 11,389 impressions).
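As a sketch, the rank-to-visits arithmetic looks like this in Python. Only the 11.42% figure for position #3 comes from the example above; the other click-through rates are illustrative stand-ins for the Chitika data, not exact figures from that study:

```python
# Estimated click-through rate by organic position. Position 3 (11.42%)
# matches the worked example; the other values are rough placeholders
# for the Chitika study data.
ctr_by_rank = {1: 0.3435, 2: 0.1696, 3: 0.1142, 4: 0.0798, 5: 0.0611}

def expected_visits(impressions, rank):
    """Estimate monthly organic visits for a keyword at a given rank."""
    return round(impressions * ctr_by_rank[rank])

# "curtain fabric": 11,389 impressions at an average rank of #3
print(expected_visits(11389, 3))  # → 1301 (roughly 1,300, as above)
```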
Once you have everything figured out, keep in mind that your past metrics are another good way of checking whether your traffic estimates are about right. Assuming that no major changes have occurred (like a gap in your analytics data over the last year), a look back is the most accurate way to understand traffic flow and trends on your site. Pull the unique visitors for every month of the last year and do some analysis on the percent increase month over month. This can be done at any level in most analytics programs, from overall traffic trends all the way down to the keyword level.
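A minimal sketch of that month-over-month analysis, assuming you have exported a year of unique-visitor counts from your analytics program (the numbers below are made up):

```python
# Twelve months of unique visitors -- placeholder data; swap in your
# own analytics export.
monthly_uniques = [4200, 4450, 4900, 5100, 4800, 5300,
                   5900, 6100, 5800, 6400, 7000, 7600]

def mom_growth(series):
    """Return the percent change from each month to the next."""
    return [round((curr - prev) / prev * 100, 1)
            for prev, curr in zip(series, series[1:])]

print(mom_growth(monthly_uniques))
```

The same function works at any level of granularity: feed it overall organic visits, or the monthly counts for a single keyword, and you get the trend line to project forward.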
A look at overall traffic per month in Google Analytics for organic searches from Google:
A look at traffic for a specific keyword over the last year per month from Google organic:
In the end, though, predictions are just that: educated guesses. Pulling data from all available sources and using your own historical data can help you make an educated prediction for the next year. Keep in mind, though, that things never stay the same; Google Instant just proved that with one of the biggest changes we have seen in a while.
Marketing generally has two core strategies in terms of customers: finding new customers and keeping your current customers happy. The best businesses tend to keep the interest of their customers for months and years by consistently improving their products and services to deliver more value, whereas the other sort of business tends to be hard-close and hype driven, always promoting a new product, software, or scheme. It is never a complete system being sold, but some “insider secret” shortcut that unearths millions automatically while you sleep – perpetually.
One of the problems with false-scarcity hype launches is that they attract the type of customers who can’t succeed. The people who are receptive to that sort of marketing want to be sold a dream; they are not the type of people who want to put in the time and effort to become successful. They are at stage 2 in this video: “my life sucks” … so sell me a story that will instantly make everything better without requiring any change from me at all.
Another problem with the hype-launch business model is that it requires you to keep repeating the sales process like a traveling salesman. Each day you need to think up a new scheme or angle from which to sell a new pile of crap, and you have to hope the web has a short enough memory that the scammy angles used to pitch past hyped-up product launches don’t come back to bite you in the ass.
I don’t mind when the get-rich-quick marketers work their core market, as there is a group of weak-minded individuals who are addicted to buying that stuff. But I always get pissed off when someone claims that your field is trash or a scam (as an angle to sell something else) and then later starts trying to paint themselves as an expert in your field.
Here is a video snippet of Ryan Deiss proclaiming his ignorance of the SEO field and how he got ripped off three times because he knew so little he couldn’t tell a bad service provider from a good one.
“If you want to get free traffic you have to get good at the cut-throat game of SEO (which I for one am not). … SEO for most of us isn’t the right answer.” – Ryan Deiss
And his latest info-product (in perhaps a series of dozens of them?) is called Perpetual Traffic Formula. On the squeeze page he highlights that it offers you the opportunity of… “Discovering a crack in Google algorithm so big it simply can’t be patched. Being able repeat the process for similar results in UNLIMITED niches.”
Anyhow, the Droid has a pretty good review of how awful his sites are doing in terms of “perpetual traffic.”
If you want to buy from a person who *always* has another new product with a secret shortcut to sell, Ryan is THE guy. If you want to learn how to evaluate the quality of the products being sold, you’ll love this.