Posted by randfish
In early June of this year, SEOmoz released ranking correlation data about Google’s web results and how they mapped against specific metrics. This exciting work gave us valuable insight into Google’s ranking system, confirming many assumptions and opening up new lines of questioning. When Google announced their new Places Results at the end of October, we couldn’t help but want to learn more.
In November, we gathered data for 220 search queries – 20 US cities and 11 business "types" (different kinds of queries). This dataset is smaller than our web results set, and was intended as an initial data-gathering project before we dove deeper, but our findings proved surprisingly significant (from a statistical standpoint) and thus, we’re making the results and report publicly available.
As with our previous collection and analysis of this type of data, it’s important to keep a few things in mind:
With those out of the way, let’s dive into the dataset, which you can download a full version of here:
Interestingly, the results we gathered seem to indicate that the Google Places ranking algorithm doesn’t differ much across cities, but when business/query types are considered, there are indications that Google may indeed be changing how the rankings are calculated (an alternative explanation is that different business segments simply place dramatically different weights on the factors depending on their type).
For this round of correlation analysis, we contracted Dr. Matthew Peters (who holds a PhD in Applied Math from Univ. of WA) to create a report of his findings based on the data. In discussing the role that cities/query types played, he noted:
City is not a significant source of variation for any of the variables, suggesting that Google’s algorithm is the same for all cities. However, for 9 of the 24 variables we can reject the null hypothesis that business type is not a significant source of variation in the correlation coefficients at a = 0.05. This is highly unlikely to have occurred by chance. Unfortunately there is a caveat to this result. The results from ANOVA assume the residuals to be normally distributed, but in most cases the residuals are not normal as tested with a Shapiro-Wilk test.
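The statistical machinery Dr. Peters describes – a one-way ANOVA across groups, plus a Shapiro-Wilk normality check on the residuals – can be sketched in a few lines of Python. The numbers below are synthetic stand-ins, not the study’s data; the point is only to show the shape of the test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical correlation coefficients for one metric, grouped by
# business type (the actual study used 11 types across 20 cities).
groups = [rng.normal(loc=0.2 + 0.02 * i, scale=0.05, size=20) for i in range(11)]

# One-way ANOVA: can we reject the null hypothesis that business type
# is not a significant source of variation?
f_stat, p_anova = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")  # reject at a = 0.05 if p < 0.05

# The caveat: ANOVA assumes normally distributed residuals,
# which a Shapiro-Wilk test can check.
residuals = np.concatenate([g - g.mean() for g in groups])
w_stat, p_shapiro = stats.shapiro(residuals)
print(f"Shapiro-Wilk: W = {w_stat:.3f}, p = {p_shapiro:.4f}")
```

A low Shapiro-Wilk p-value means the normality assumption is violated, which is exactly the caveat flagged in the report.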
You can download his full report here.
Next, let’s look at some of the more interesting statistical findings Matt discovered. These are split into 4 unique sections, and we’re looking only at the correlations with Places results (though the data and report also include web results).
With the exception of PageRank, all data comes via SEOmoz’s Linkscape data API.
NOTE: In this data, mozRank and PageRank are not significantly different than zero.
All data comes via SEOmoz’s Linkscape data API.
NOTE: In this data, all of the metrics are significant.
All data comes directly from the results page URL or the Places page/listing. "Business keyword" refers to the type, such as "ice cream" or "hospital," while "city keyword" refers to the location, such as "Austin" or "Portland." The relatively large, negative correlation with the city keyword in URLs is an outlier (no other element we measured for local listings had a significant negative correlation). My personal guess is that nationwide sites trying to rank individual city-targeted pages don’t perform as well as local-only results in general, and this could cause that bias, but we don’t have evidence to prove that theory and other explanations are certainly possible.
NOTE: In this data, correlations for business keyword in the URL and city keyword in the title element were not significantly different than zero.
All data comes directly from Google Places’ page about the result.
NOTE: In this data, all of the metrics are significant.
Our hope is to do this experiment again with more data and possibly more metrics in the future. Your suggestions are, of course, very welcome.
As always, we invite you to download the report and raw data and give us any feedback or feel free to do your own analyses and come to your own conclusions. It could even be valuable to use this same process for results you (or your clients) care about and find the missing ingredients between you and the competition.
Some of you may have been hit by Google’s 20 October algorithm change.
And some of you wouldn’t have noticed any difference.
On 20 October, a number of sites got trashed. Rankings, and traffic, plummeted through the floor. The webmaster forums lit up. Aaron noticed it. I noticed it. Yet, other webmasters wondered what all the fuss was about.
As many of you know, there is not just one ranking algorithm. There are many algorithms. What affects one site may not affect another. Rather interestingly, Google’s John Mu dipped into this thread on Google’s support forum, offering these words of wisdom (hat tip: Barry):
It looks like the changes you’re seeing here may be from an algorithmic change. As part of our recent algorithmic changes (which the outside world sometimes refers to as the “May Day update” because it happened primarily in May), our algorithms are assessing the site differently. This is a ranking change, not any sort of manual spam penalty, and not due to any technical issues with regards to crawling or indexing your content. You can hear more about this change in Matt’s video.
Various parts of our algorithms can apply to sites at different times, depending on what our algorithms find. While we initially rolled out this change earlier this year, the web changes, sites change, and with that, our algorithms will continually adapt to the current state on the web, on those sites. While it might be confusing to see these changes at the same time as this issue, they really aren’t related, nor is this a general algorithm change (so if other sites have seen changes recently, it probably doesn’t apply to them as well).
Matt’s video, made four months ago, was talking about the algorithmic MayDay change. John Mu adds: “Various parts of our algorithms can apply to sites at different times.” In other words, whatever happened in May may not have affected your site in May, or June, or July, but might hit you many months later. This implies that your site may trip a threshold, and be judged quite differently than it was the day before.
This still doesn’t completely explain why so many sites were hit on the same day, but then Google don’t typically explain things in detail.
To complicate matters, there was an acknowledged indexing problem, affecting new content, particularly on blogs. Again, John appears to suggest this was a separate issue.
Now, all SEOs are used to algorithm changes. Nothing new. But this one has me genuinely perplexed, mainly because of the type of sites that got hit.
Time for some self-searching Q&A about one of my own sites:
Q: So, how many links did you buy?
Q: Are you selling links?
Q: Linking to “bad neighborhoods”?
A: Not that I’m aware of…
Q: Did you link-build in an aggressive manner?
A: No. I did no link building, whatsoever.
A: That’s not a question.
Q: So you just published content?
Q: And people linked to your site, of their own accord?
A: Yep. I guess they liked it.
Q: Was your content heavily SEO’d?
A: No. In fact, I gave writers specific instructions not to do anything resembling “SEO copywriting”. It ruins the flow for readers.
Q: All original content?
A: All original. Hand written. No machines involved anywhere.
Q: So this site conforms to Google’s Webmaster Guidelines?
A: I’d say it lies well within them. “Be useful to end users”, was the guiding principle.
Yet it got hit hard.
What’s also interesting is the nature of the sites that replaced it. I checked keyword after keyword, and found script-driven, aggressive black-hat, content-free sites in top positions. Not in all cases – there are certainly useful sites that deserve to be there, and deserve to appear above mine. Fair play. However, there were plenty of sites of – shall we say – dubious merit occupying high positions.
Now, I believe in publishing useful, unique content, and not paying too much attention to SEO, other than covering the basics. SEO is one strategy amongst many, and sites should, first and foremost, prove useful to people.
Clearly, no site is immune. You can stay within Google’s Webmaster guidelines, and get taken out. I knew that anyway, but when the sites that don’t follow the guidelines replace you…
….I’ll admit – it grates.
Presumably, Google rewards the sites it likes with high rankings, and if we see a lot of aggressive sites filling the top page, should we therefore assume that aggressive sites are what Google actually wants?
I’d like to think not.
Perhaps they are just trying to mess with our heads?
Or they messed up?
Or the changes are still bedding in?
Or they really do want it this way?
I’m still watching, and considering. Perhaps the site will just pop back up in due course. Or perhaps I need to go back to the drawing board. I’ll let you know how I get on.
If you’ve noticed something similar on your sites, chime in on the comments.
SEO Book.com – Learn. Rank. Dominate.
The latest video from Matt Cutts talks about the value of SEO to Google.
The questioner asks:
“Why does Google support SEO specialists with advice? Google’s business is to sell text ads…”
Matt explains that Google sees SEO helping, rather than hindering, their business model long term.
SEOs create – and encourage site owners to create – the very sites that Google’s technology demands, i.e. content accessible to an automated crawler, largely text based, and clearly marked up.
Sites that jibe well with Google’s technology lower Google’s costs and help make Google’s results more relevant in the eyes of the end user. The larger their index, the more chances Google has to answer the query. SEOs love creating crawlable content!
This means the end user keeps coming back, which in turn translates to Google’s bottom line.
It’s also a good idea to give webmasters something, else Google risks a more adversarial relationship, which again can cause Google problems.
So SEO is good for Google’s business – the “good” type of SEO, as defined by Google, of course.
Matt, as always, is giving the side of the story Google wants you to hear.
His position sounds reasonable, generous, and inclusive, and it is – in many respects. But make no mistake – Google aren’t there for webmasters. Google will do what is good for Google. If SEO was bad for Google, Google would not be reaching out to the SEO community, in much the same way they don’t reach out to, say, the malware writer community. They just stamp it out.
Matt is a master of public relations. Webmasters can learn a lot from Matt in terms of how to handle their own public relations challenges.
Here are a few pointers, based on Matt Cutts’ approach:
Matt doesn’t talk from on high. He doesn’t talk at his audience. He talks with them. He attends events where his audience congregate, and he encourages interaction and questions. This activity serves to build a personal relationship, which helps make his messages easier to convey and sell.
Look for ways in which you can go *to* your audience/customers. Where do they hang out? Address them on their own terms, and in their own environment. Regularly encourage questions, criticism and feedback. When it comes time to announce new products and services, your audience is likely to be more receptive than if your communications are anonymous and sporadic.
Ok, this might be all very well for Matt Cutts. Everyone pays attention to Google, because Google are important. However, no matter how big or small your audience, you still must find a way to relate to them.
These days, it’s not so much what people say, it’s often who is saying it. Modern media is driven by personalities. The content of the message is seldom good enough to stick, unless it is truly remarkable.
People listen to Matt in ways they don’t listen to an anonymous Google press release because of the personal relationship he has worked hard to establish. This works just as well for small businesses. In fact, this is one of the big advantages of a small business – the personal touch. Google is a big company, but they work hard to appear like a small one, at least in terms of their personal relations approach with webmasters.
Matt also gets out in front of issues. If there’s something going on in the web community relating to his area, he’s almost certainly quick to comment on it. By doing so, he can control and frame the conversation in terms that suit Google. If there are industry issues that relate to your work or company, use them as an opportunity to grab the spotlight. Try to become the media go-to person in your local community for issues by building relationships with media and news outlets.
PR consultants aren’t quite as necessary as they used to be. They aren’t redundant, but the most important lesson to learn from Matt Cutts is that PR is something you need to embody. It’s not a function you can simply slap on, or hire in when it suits, and still expect to be as effective. Make PR flow through all you do.
Matt’s greatest skill is not making it look like PR at all.
Posted by Dana Lookadoo
This post was originally in YOUmoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.
Day 1 of SEOmoz Pro Training was like being at a race track. The course careened from clicks to conversions and from search results to landing pages. The audience watched 9 speakers drive their search marketing race cars at speeds faster than fingers can type. Given the finger-breaking speeds, it was fortunate all SEO fans were well fueled – beginning with a healthy breakfast buffet, mid-morning energy bars, lunch (more all-you-can-eat) and a scrumptious mid-afternoon pit stop with fresh cookies and treats. After everyone was fed each time, it was off to the races.
Todd Friesen was in the sports booth serving as emcee, master of ceremonies, referee, judge and timekeeper. The event was like a well-oiled machine. Maybe that’s why they call Todd "Oilman."
When I said "yes" to attending the Mozinar on a Press Pass, I didn’t realize I was going to be covering a sporting event. GoodNewsCowboy asked me how I was going to recap and condense this "wild ride." I realized there was a lot of horsepower on-stage and that we were at the SEOmoz Training Raceway.
Mozinar fans experienced exhilaration and gleaned insights as we watched performance race car drivers present their seminar presentations. The following race highlights are condensed from 32 pages of notes. I strongly suggest you buy the Pro Seminar DVD when it’s produced so you can see under the hood for yourself.
From Clicks to Conversions with Local, Social, Analytics and SEO in Between
1st up: Rand Fishkin had pole position and drove a car with a most unusual name, "It’s a Mad, Mad, Mad, Mad SERP."
The results we are seeing in blended search results are even more unusual, starting with changes of the past 2 weeks. For those who attend SEO races regularly and are watching Google, this may be old news. For others, brace yourself. A branded search can have more than 2 results. Rand explained:
Changes to Image SEO was next, and guess what? Google has a new image search interface.
The image below results from searching for the artist "manet" and clicking on one of the images.
Rand covered 10 Tips for Image Rankings. (Since we are in race synopsis mode, we’ll speed through this.) One quick takeaway was the minimum image size:
Image Pixel Size – If you go smaller than 400×300 pixels your chances to show in image search are dramatically decreased.
You don’t have to remember any formulas: basic on-page SEO factors for image SEO include the page title and surrounding text.
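The 400×300 rule of thumb above is easy to encode as a pre-publication check. This is only a sketch of the heuristic from the talk, not a hard cutoff Google publishes:

```python
def likely_eligible_for_image_search(width: int, height: int,
                                     min_width: int = 400,
                                     min_height: int = 300) -> bool:
    """Heuristic from the session: images smaller than roughly 400x300
    pixels have dramatically reduced chances of showing in image search."""
    return width >= min_width and height >= min_height

print(likely_eligible_for_image_search(800, 600))  # True
print(likely_eligible_for_image_search(200, 150))  # False
```

You could run a check like this over your media library to flag thumbnails that are too small to be worth optimizing.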
It’s often easier to get into video SERPs than into the regular SERPs. There is lower competition than for ordinary results (most of the time), so take the opportunity. Follow this inclusion process to enter your video race for top ranking:
Step #1: Embed Video Content on Your Pages
Step #2: Create Thumbnail Images for Videos
Step #3: Build a Video XML Sitemap & Submit
Step #4: PROFIT $$$
See Google Webmaster Tools for Video to learn more.
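Step #3 above can be sketched in Python using the standard library’s ElementTree. The example.com URLs are placeholders you’d swap for your own pages, thumbnails, and video files; the namespace URIs come from Google’s video sitemap schema:

```python
import xml.etree.ElementTree as ET

# Namespaces from the sitemap protocol and Google's video sitemap schema.
SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
VIDEO = "http://www.google.com/schemas/sitemap-video/1.1"
ET.register_namespace("", SM)
ET.register_namespace("video", VIDEO)

urlset = ET.Element(f"{{{SM}}}urlset")
url = ET.SubElement(urlset, f"{{{SM}}}url")
# Placeholder URLs - substitute the page that embeds your video.
ET.SubElement(url, f"{{{SM}}}loc").text = "http://www.example.com/videos/some-video.html"

video = ET.SubElement(url, f"{{{VIDEO}}}video")
ET.SubElement(video, f"{{{VIDEO}}}thumbnail_loc").text = "http://www.example.com/thumbs/some-video.jpg"
ET.SubElement(video, f"{{{VIDEO}}}title").text = "Some Video"
ET.SubElement(video, f"{{{VIDEO}}}description").text = "A short description of the video."
ET.SubElement(video, f"{{{VIDEO}}}content_loc").text = "http://www.example.com/videos/some-video.flv"

xml = ET.tostring(urlset, encoding="unicode")
print(xml)
```

Save the output as an .xml file and submit it through Webmaster Tools to complete the step.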
Rand’s foot stayed pedal-to-the-metal as he showed how to produce Rich Snippets in the SERPs. Why is this important? This is where you get most of your clicks. His closing remarks were retweeted with fervor:
"If you can stay on top of this, you will have a big win. It demands full-time SEO."
2nd up: David Mihm was full-speed as he raced through "Ranking in Competitive Local Results." He explained:
Straight from Google’s mouth:
Local intent is 20% of total search volume (April 2010)
And who would imagine that local results could equal 100% of page 1? Try a search for "dentist chicago." (If it’s not 100%, it’s close.)
Google organic results are not, however, the dominant factor for local search. Neither are results from Yahoo! or Bing. Local search is now:
Understand that local requires a different mindset from traditional SEO, because the ecosystems vary:
"It is essential to have a holistic local search marketing strategy."
"Even if all your boss cares about is that friggin’ 7-pack!"
Resources to claim your listings:
"The Big Three" major data providers:
Citations – David recommended a new citation finder tool by Darren Shaw & Garrett French: Whitespark.ca Citation Finder
Find local SEO resources on GetListed.org.
3rd up to race: Dan Zarrella racing in the "Science of Twitter" car. Dan warned us he talked fast. Pro Seminar attendees listened attentively, but given the subject was Twitter … many tweeted insights into how one can get clicks and retweets.
Dan’s takeaways were delivered in 140 characters. Below are my top three faves:
Takeaway: Don’t talk about yourself so much.
Paraphrased: If you want more followers, stop talking about yourself!
Takeaway: Try to stay positive.
If people want to get bummed out, they can go watch the news. Even if you’re talking about the oil spill, stay hopeful.
Takeaway: If you want people to click your links, Tweet slower.
Don’t "go Oprah" on your Twitter account, moderate.
Improve your "retweetability" factor by including a combination of the following Top 20 Most Retweetable Words:
Timing for retweets:
Links posted on the weekend and at the end of the week have a higher click through rate.
Tip: Want to see how well a bit.ly link is doing on clicks and CTR? Append a "+" to the end of the bit.ly URL to view its public stats page.
Alright … one more Twitter insight before we close …
He had noted that women follow a lot more people and tend to tweet more. They are more social. (We already knew women talk and socialize more, but now Dan’s numbers confirm it.)
Dan covered a lot of geeky ground focused on the science and study of social media, use of FourSquare and more. I have 5+ pages of notes from Dan’s presentation alone, but I’m concerned this blog post will get too long to be readable.
Check out Dan’s set of social media tools.
4th up and last race of the morning was the "Presentation Off" between Will Critchlow and Rand Fishkin.
I’ll expand on that race in a follow-up post. Do you want to guess who won this year? Will went into the race with a 2-year winning streak.
Posted by jennita
Last week I covered SES San Francisco for SEOmoz. Every time I attend a conference, I try to go to sessions that will have information I can bring back to the community. Sometimes I look for sessions that aim to answer questions we see a lot in Q & A or that I notice popping up in comments on the blog. Either way, my focus is usually to find information that will be helpful to the community.
Now and then I get a little greedy though, and attend sessions that will benefit me in my job. Luckily I hit the sweet spot at SES and found a little of both. Rather than straight up regurgitate what speakers presented, I thought I’d take their insights and show some examples specific to SEOmoz.
At SES I was reminded about my problem with A.F. (analytics forgetfulness) and a few things that I personally should be doing to not only be better at my job, but to help the company and community. Marty Weintraub from aimClear was the one that initially got me thinking in the “Deep Dive Into Analytics” panel on the first day.
How often do we look at traffic sources and focus on which sites are sending traffic? Ok, always. But what about looking at the actual people from those sites who are sending traffic? Let’s take Twitter for example. When most people are tweeting, they’re usually either in an app or on the web looking from their own page, which shows up as “/” for most referrers.
But sometimes, people are viewing a specific person’s Twitter page and THEN click your link. In those instances, Google Analytics will show the actual Twitter user’s page as the referrer. This is a quick and easy way to find out WHO is sending you traffic. This person is also probably an influencer in your community. Finding the top referrers is the first step; next you’ll want to use Klout (or another service) to see what their actual reach is. This doesn’t only work for Twitter though – check out the example below that I found looking at delicious referrers.
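If you export the raw referrer URLs from your analytics tool, pulling out the individual Twitter profiles is straightforward. A minimal sketch with hypothetical referrer data – the idea is that a non-empty path on twitter.com is a specific user’s profile page, not the generic home page:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical referrer URLs as they might appear in an analytics export.
referrers = [
    "http://twitter.com/",             # generic home page, shows up as "/"
    "http://twitter.com/chrisbrogan",  # someone clicked from this profile
    "http://twitter.com/chrisbrogan",
    "http://twitter.com/randfish",
    "http://delicious.com/chrisbrogan",
]

profile_visits = Counter()
for ref in referrers:
    parsed = urlparse(ref)
    path = parsed.path.strip("/")
    # A non-empty path on twitter.com means the click came from a
    # specific user's page - that user is sending you traffic.
    if parsed.netloc.endswith("twitter.com") and path:
        profile_visits[path] += 1

print(profile_visits.most_common())  # [('chrisbrogan', 2), ('randfish', 1)]
```

The usernames that surface at the top of this count are the people worth looking up in Klout (or a similar service) to gauge their reach.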
This is a list of referrers from delicious.com. Let’s see what Chris Brogan, an influencer in the Social Media space bookmarked.
Aha! Makes perfect sense: he bookmarked the Facebook Marketing Guide. It didn’t send a TON of traffic, but just think of the possibilities if we actually contacted Chris and worked together with him.
These are people who are individually sending traffic to your page, you probably should think about how you can use that information. As the Community Manager for SEOmoz I know that I will use it to reach out to them. Perhaps retweet them or ask them to write a YOUmoz post. Every organization is different, and this is just one idea. But take the concept of finding the users sending you traffic and run with it!
My good friend Cindy Krum would probably strangle me for having forgotten all about mobile. This was another area Marty mentioned and I bet many people don’t focus on it. As an example, I thought I’d jump into our analytics and see how mobile users converted.
Yikes!! Before the recent update to our PRO landing page, we had just one PRO signup from a mobile device. That’s seriously pathetic. In the last month, we’ve had 7, which I’d imagine means the changes we made help mobile users sign up on our site. But it’s still ridiculously low!
I also thought about looking at what visits to the tools page looked like from mobile and non-mobile browsers. Ouch! This is our highest traffic page behind the home page. The iPhone, iPad and Android were the top 3 mobile devices (not surprisingly really). Perhaps we should make it a bit easier for these devices to access our site and tools.
That’s 482 uniques out of 61,102 – about 0.8%. Definitely something to work on.
That is an exact quote from Michael DeHaven, the SEO Product Manager at Bazaarvoice. Here at SEOmoz we most definitely understand the power of UGC for SEO (waves over at YOUmoz… hi!). But how can you use user generated content to help boost your traffic? Michael gave examples of how UGC helped several companies increase traffic by adding unique, relevant, keyword-rich content.
Check out this particular example for Swanson Health Products. The first image shows the product content. Sure, it has some unique content and some of the keywords they’re going for, but in general the content is fairly weak.
In the next image, you see all the great keywords that reviewers of the products have added all on their own. These aren’t SEOs creating content, but real people saying what they feel about the product. Hello! What a great way to increase content to your product pages.
Another example he gave was for OpenTable. Their initial implementation left the UGC uncrawlable. After they made a change that opened it up to search engines and the content was indexed, they saw a 17% lift in traffic. Just by allowing the ratings to be indexed. Whoa!
The last example that stuck out in my mind that he gave was that QVC started sending emails to people after they purchased a product asking for a review of the product. It seems like common sense to do something like this, but at the same time it’s absolute genius. I bet you can think of at least one way to get visitors to your site to add content. Whether that’s in a review, a comment, a suggestion, whatever! Ask them a question; people love to give their opinions.
The point is… as Michael said it best “UGC is content that rocks,” so don’t forget about it!
This was the focus of the keynote on the second day by BJ Fogg, Director of the Persuasive Technology Lab at Stanford University. Now, what does that mean exactly? The idea (and I hope I get this right) is to make it easy for people who are ready to do something, to do it.
For example, one reason that Twitter did so well in the beginning is that they allowed people to use text messages to send tweets. Obviously they still do, but now many people use various mobile apps when they’re on their phone. When Twitter first took off, though, people were used to reading short messages with a certain cutoff length, so tweeting was simple via text. People who were motivated to tell the world what they ate for breakfast had the ability to do it quickly and easily.
There are several ways we could employ this here on the SEOmoz site, and one way I thought we could do this is to make it easier to sign up for PRO when you want to use a PRO only tool. Check out the example below for our Keyword Difficulty tool.
Sure, you can click on "log in" and from that page you can sign up and create a free account, but there’s no way other than the "Go PRO" link at the top of the navigation to take someone to become a PRO member. If someone found their way to the Keyword Difficulty tool and is ready to use it, let’s motivate them to become a member. Or at the very least, check out a free version.
Ok, honestly, we know this happens on our site, and we’re currently in the works of improving a lot of it (plus watch for a wicked awesome new site design next week!). But think about your site and what you want people to do on it. Are you making it easy for them, or are you hindering them? BJ also discussed the idea that the "lightest touch works." Oftentimes the motivation exists on the user’s side; they just need to be facilitated through the action. Where can you make improvements on your site?
Also on the second day, I attended a great session “Search, PR and the Social Butterfly.” I loved that Lisa Buyer focused on ways to attract journalists to your information. She mentioned that 100% of journalists use Google as a tool when working on stories. Think about it. Your PR strategies (and we’re not talking the PageRank ones now) need to be online where the journalists are looking. So if they’re searching, you want to be there!
She talked about today’s PR being a mix of being optimized, publicized and socialized. That means making sure you’ve optimized your content for not only your customers but for the media as well. Make sure you’re using keywords, relevant titles and don’t forget to add social links to your press releases. Lisa had a few great tips I wanted to share on publicizing and socializing to get the information out there. Don’t just sit around waiting for it to come to you. Here are just a few ways to get your content out there:
One of the things that jumped out at me the most was their use of Klout to find the influencers. This is somewhat similar to my first point above, but what they did was look up every person that registered for PubCon in Klout to see their influence and reach among Twitter. They then reached out to those with high Klout, like this guy, and thanked them for signing up, or retweeted them, etc. By contacting the people who can motivate and influence your followers (see how I just tied all my points together there?) while on their mobile phone (ok I’m stretching it), you end up gaining more reach.
This is actually something we try to do here at SEOmoz every day, how can you motivate your influencers?
Speaking of conferences, we have just a few tickets left for the SEOmoz Seminar next week. Grab them before we’re completely sold out!
Posted by Paras Chopra
This post was originally in YOUmoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.
Conversion Rate Optimization (CRO) is the newest darling of Internet marketers; after all, what good is traffic if it doesn’t convert? Unfortunately (or fortunately, depending on how you look at it), unlike Pay Per Click (PPC) marketing, CRO isn’t a game of how much money you can throw at it. In fact, this field requires as much creativity as it requires monetary investment. That’s what makes conversion rate optimization a fair arena. Your well-funded, bigger competitors can of course beat you at generating more traffic, but they can’t beat you at the conversion rate game (unless you allow them to).
Every website has unique conversion goals, so the approach to conversion rate optimization is unique for every website. You should not expect to follow tips from a “best-practices” article and boost your conversion rates instantly. Chances are high that what worked for others may not work for you. So, the biggest step in increasing conversion rates is coming up with creative ideas and designs that can work.
Even though conversion rate optimization is a very custom process for every website, over the course of the last couple of years (and more than 1,000 split tests) I have observed a few general patterns which yielded great results. Different ideas for increasing conversion rate are worth discussing because they become a great source of input for coming up with your own ideas. In this article I will discuss such generic ideas for conversion rate optimization, detailed through different case studies. Let’s start by discussing what role design plays in increasing conversion rate.
Role of Design
From a conversion perspective, a website’s design is the most important of all the variables involved. The difference between a better-converting design and a worse-converting design usually boils down to not confusing the visitor about what he is expected to do on a page. Take a look at the examples below:
Basecamp homepage design: 14% increase in conversions
What made the newer design convert 14% more visitors? A clean design. The new design clearly guides a visitor towards the Plans and Pricing link, while the old design presented a whole lot of choices. Need more proof that having fewer choices on a page can increase conversion rates? Have a look at the case study below:
Gyminee homepage redesign: 20% increase in conversion rate
In addition to reducing the number of choices for the visitor, having a design that shows you as a professional and trustworthy company can also increase conversions. Take a look at the following case study, where the redesigned sales page has various trust elements (seal, money back guarantee, testimonials) and the design has various little tweaks (color scheme, buttons instead of links for download, layout, etc.) which made it look professional. Note that the sales (not just conversions) increased by 20% just by changing the design. No additional products, no additional traffic, pure conversion rate optimization:
AquaSoft sales page redesign: 20% increase in sales
There are more such case studies where design played a key role in optimizing conversion rates. Have a look at them below:
Role of Headline and Copy
When you receive an email, it’s the name of the sender and the subject line that influence your decision to open it right away or postpone it. Similarly, when a visitor arrives on your website, it’s the design/brand name AND the headline of the page that influence his decision to engage with your page. Visitors’ attention is the costliest commodity on the Internet, and your page’s headline is where it goes right after arriving.
Take a look at the case study below where 37Signals tested different kinds of headlines (and the winning one boosted conversion rate by 30%).
Highrise Headline test – 30% increase in conversions
The winning variation said “30-day Free Trial on All Accounts” and the worst performing variation said “Start a HighRise Account”. Note that the clear, no-nonsense headline won. If you think about it, a visitor on the Signup page obviously knows that he is signing up for a HighRise account. The winning headline clearly convinces the already interested visitor that there is nothing to lose, as they offer a 30-day free trial.
Another example of how much headlines matter: CityCliq, a startup in the local marketing industry, split tested the positioning of their product.
CityCliq headline test: 90% increase in conversions
Here are different headlines they tested:
The winning headline, “Create a webpage for your business”, tells the visitor exactly what CityCliq does, and no wonder it increased conversions by 90%. As they say, don’t make your visitors think.
Right after looking at the headline, if his interest is piqued, a visitor looks at the (text/video) copy on the page. That’s why combined optimization of headline and copy proves effective, as it did for SEOmoz:
Conversion Rate Experts’ “How we made $1 million for SEOmoz”
They tested a variety of headlines and copy elements on the landing page for the Pro subscription. In the end, they found that a headline that piqued interest and copy that laid out exactly what constitutes a Pro subscription won (no matter how long it turned out to be).
Other case studies where headline and copy mattered:
Role of Call-to-Action
So, you optimized your design, your headlines and your page copy. You got the visitor interested and motivated to try whatever you are offering. There is still one last hurdle before you can throw a success party for your CRO project. Yes, the call-to-action is the last hurdle for you to cross. Even though the call-to-action may be considered minutiae in CRO, the following case studies demonstrate that even simple A/B testing of a call-to-action can result in great improvements.
A highly motivated visitor will sniff out even the poorest of all call-to-action buttons. So, while optimizing this aspect of your page, remember that you are optimizing for the busy, semi-interested visitor. If he can’t locate how to try out whatever you are offering, he will hit the back button. (And in CRO, the back button is the greatest enemy of all.)
37Signal’s call to action – signups increased by 200%
The now-omnipresent “See Plans and Pricing” increased signups for HighRise by 200%. I have included this case study not to convince you to replace all your buttons with this text (it may not actually work for you). Rather, the point is to convince you that even small changes in a call-to-action can have a dramatic impact on conversion rates. And the best thing about calls-to-action is that they are so easy to test. It literally takes 5 minutes to get such a test up and running.
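Since a small sample can make a lift look bigger than it really is, it is worth checking that a measured difference is statistically significant before declaring a winner. As a minimal sketch (the visitor and conversion counts below are hypothetical, not from the case studies in this post), a two-proportion z-test needs nothing beyond the standard library:

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test (normal approximation) for an A/B test.

    conv_a / n_a: conversions and visitors for the control,
    conv_b / n_b: conversions and visitors for the variation.
    Returns (z, two_sided_p_value)."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal tail via erfc
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical test: 200/2000 conversions on the control,
# 260/2000 on the variation (a 30% relative lift).
z, p = ab_test_significance(200, 2000, 260, 2000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05, so the lift is significant
```

Most A/B testing tools run a check like this for you; the point is simply that “the variation converted better” only means something once the p-value is small.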
Another oft-repeated test is to see which color works best for a call-to-action (unsurprisingly, a bright color such as red usually works better, perhaps because it is eye-catching and draws the visitor’s attention). As an example, along with testing “Signup for free” vs. “Get Started Now”, Dmix also tested green vs. red buttons and found that the red button worked better.
Dmix case study – 72% increase in conversions
To repeat my earlier point: with calls-to-action, sometimes surprisingly trivial changes can produce significant results. Take a look at the following case study:
Soocial’s homepage – 28% increase in conversions
Notice that all they did was add “It’s free” alongside “Sign up now” to boost the conversion rate. This is definitely a trivial change, but why wouldn’t you test such trivial changes if they don’t take much effort and have the potential to fatten your bottom line?
Some other case studies where call-to-action helped increase conversion rate:
Role of You
The framework of optimizing design, headline, copy and call-to-action should provide you with a good plan to design your CRO program. What matters in increasing conversions is not making your visitor think about what you are offering and how to actually try that offering. Try to make everything obvious and simple, guiding your visitor from headline to copy to call-to-action like a smooth flowing river.
However, no matter how many case studies you read and what theory I propose here, in the end your conversion rate optimization program will turn out to be unique because your website is unique, your audience is unique and your goals are unique. The real key to increasing conversion rate is to keep experimenting and keep doing tests.
Author Bio: Paras Chopra is the founder of Visual Website Optimizer, the world’s easiest A/B split testing software. Thousands of companies and agencies have been able to increase sales and conversions by up to 90% within the first few days of using the tool (read published case studies). You can follow the company on Twitter @wingify.
If you have been in the SEO field for any serious length of time you have probably come across (and benefited from) some of Tedster’s work – either directly, or indirectly from others who have repackaged his contributions as their own. He is perhaps a bit modest, but there are few people in the industry as universally well respected as he is. I have been meaning to interview him for a while now, and he is going to be speaking at Pubcon South on April 14th in Dallas, so I figured now was as good a time as any.
How long have you been an SEO, and how did you get into the field?
I started building websites in 1995, before the word SEO had been invented. I came from a background in retail marketing, rather than technology or graphic design. So my orientation wasn’t just “have I built a good site?”, but also “are enough people finding my site?”
The best method for bringing in traffic seemed to be the search engines, so I began discussing this kind of marketing with other people I found who had the same focus. Ah, the good old days, right? We were so basic and innocently focused, you know?
If you could list a few key documents that helped you develop your understanding of search, which would be the most important ones?
Here are a few documents that acted as major watersheds for me:
Is PageRank still crucial? Or have other things replaced it in terms of importance?
What PageRank is measuring (or attempting to measure) is still very critical — both the quality and number of other web pages that link to the given page. We don’t need to worship those public PR numbers, but we definitely do need quality back-links (and quality internal linking) to rank well on competitive queries.
There appears to be something parallel to PR that is emerging from social media — some metric that uses the model of influencers or thought leaders. But even with that in the mix, ranking would still depend on links; they would just be modified a bit by “followers” and “friends”, since many social sites are cautious with do-follow links.
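As an illustration of the idea Tedster describes (the quality and number of pages linking to a given page), here is a toy sketch of PageRank as a power iteration over a small link graph. This is the textbook formulation only, not Google’s actual implementation, and the graph is made up:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank via power iteration.

    `links` maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        # Every page gets a small baseline from the "random surfer"
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy link graph: B and C both link to A, A links back to B.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["A"]})
print(max(ranks, key=ranks.get))  # A, since it has the most incoming links
```

In the toy graph, A receives links from both B and C, so it accumulates the highest score — the core intuition behind needing quality back-links (and quality internal linking) to rank well.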
Let’s play: I have got a penalty – SEO edition. Position 6, -30, 999, etc. Are these just bogus excuses from poor SEOs who have no business calling themselves SEOs, or are they legitimate filters & penalties?
If the page never ranked well, then yes – it could well be a bogus excuse by someone whose only claim to being an SEO is that they read an e-book and bought some rank tracking software. However, Google definitely has used very obvious numeric demotions for pages that used to rank at the top.
The original -30 penalty is an example that nailed even domain name “navigational” searches. It affected some sites that did very aggressive link and 301 redirect manipulation.
What was originally called the -950 (end of results) penalty, while never an exact number, most definitely sent some very well ranked pages down into the very deep pages. Those websites were often optimized by very solid SEO people, but then Google came along and decided that the methods were no longer OK.
In recent months, those exact number penalties seem to have slipped away, replaced by something a bit more “floating” and less transparent. My guess is that a negative percentage is applied in the final re-ranking, rather than a fixed number of positions being subtracted. Google’s patent for Phrase-based Indexing does mention both possible approaches.
But even using percentages rather than a fixed number, when a top-ranked page runs afoul of some spam prevention filter, it can still tank pretty far. We just can’t read the exact problem from the exact number of positions lost anymore.
Do you see Google as having many false positives when they whack websites?
Yes, unfortunately I do. From what I see, Google tends to build an algorithm or heuristic that gathers up all the URLs that seem to follow their “spam pattern du jour” — and then they all get whacked in one big sweep. Then the reconsideration requests and the forum or blog complaints start flying, and soon Google changes some factor in that filter. Voila! Some of the dolphins get released from the tuna net.
One very public case was discussed on Google Groups, where an innocent page lost its ranking because of a “too much white space” filter that misread the effect of an iframe!
Google’s John Mueller fixed the issue manually by placing a flag on that one site to trigger a human inspection if it ever got whacked in the future. I’d assume that the particular filter was tweaked soon after, although there was no official word.
How many false positives does it take to add up to “many”? I’d guess that collateral damage is a single digit percentage at most — probably well under 5% of all filtered pages, and possibly less than 1%. It still hurts in a big way when it hits YOUR meticulously clean website. And even a penalty that incorrectly nails one site out of 300 can still affect quite a lot over the entire web.
How often when rankings tank do you think it is due to an algorithmic issue, versus an editorial issue with search employees?
When there are lots of similar complaints at the same time, then it’s often a change in some algorithm factor. But if it’s just one site, and that site hasn’t done something radically new and different in recent times, then it’s more likely the ranking change came from a human editorial review.
Human editors are continually doing quality review on the high volume, big money search results. It can easily happen that something gets noticed that wasn’t seen before and that slipped through the machine part of the algorithm for a long time.
That said, it is scary how often sites DO make drastic errors and don’t realize it. You see things like:
Google did a big favor for honest webmasters with their “Fetch as googlebot” tool. Sometimes it’s the easiest way to catch what those hacker criminals are doing.
When does it make sense for an SEO to decide to grovel to Google for forgiveness, and when should they try to fix it themselves and wait out an algorithmic response?
If you know what you’ve been doing that tripped the penalty, fix it and submit the Reconsideration Request. If you don’t know, then work on it — and if you can’t find a danged thing wrong, try the Google Webmaster Forums first, then a Request. When income depends on it, I say “grovel”.
I don’t really consider it groveling, in fact. The Reconsideration Request is one way Google acknowledges that their ranking system can do bad things to good websites.
I’ve never seen a case where a request created a problem for the website involved. It may not do any good, but I’ve never seen it do harm. I even know of a case where the first response was essentially “your site will never rank again” — but later on, it still did. There’s always hope, unless your sites are really worthless spam.
Many SEOs theorize that sometimes Google has a bit of a 2-tier justice system where bigger sites get away with murder and smaller sites get the oppressive thumb. Do you agree with that? If no, please explain why you think it is an inaccurate view. If yes, do you see it as something Google will eventually address?
I’d say there is something like that going on — it comes mostly because Google’s primary focus is on the end user experience. Even-handed fairness to all websites is on the table, but it’s a secondary concern.
The end user often expects to see such and such an authority in the results, especially when it’s been there in the past. So Google itself looks broken to a lot of people if that site gets penalized. They are between a rock and a hard place now.
What may happen goes something like this: an A-list website gets penalized, but they can repair their spam tactics and get released from their penalty a lot faster than some less prominent website would. It does seem that some penalties get released only on a certain time frame, but you don’t see those time frames applied to an A-list.
This may even be an effect of some algorithm factor. If you watch the flow of data between the various Google IP addresses, you may see this: There are times when the domain roots from certain high value websites go missing and then come back. Several data center watchers I know feel that this is evidence for some kind of white-list.
If there is a white-list, then it requires a history of trust plus a strong business presence to get included. So it might also make sense that forgiveness can come quickly.
As a practical matter, for major sites there can easily be no one person who knows everything that is going on in all the business units who touch the website.
Someone down the org chart may hire an “SEO company” that pulls some funny business and Google may seem to turn a blind eye to it, because the site is so strong and so important to Google’s end user. They may also just ignore those spam signals rather than penalize them.
Large authority site content mills are all the rage in early 2010. Will they still be an effective business model in 2013?
It’s tough to see how this could be quickly and effectively reined in, at least not by algorithm. I assume that this kind of empty filler content is not very useful for visitors — it certainly isn’t for me. So I also assume it must be on Google’s radar.
I’d say there’s a certain parallel to the paid links war, and Google’s first skirmishes in that arena gave them a few black eyes. So I expect any move against the cheap content mills to be taken slowly, and mostly by human editorial review.
The problem here is that every provider of freelance content is NOT providing junk – though some are. As far as I know, there is no current semantic processing that can sort out the two.
Given that free forums have a fairly low barrier to entry, there are perhaps false alarms every day ringing in the next major update or some such. How do you know when a change is the real deal? Do you passively track a lot of data? And what makes you so good at taking a sea of tidbits and meshing them into a working theme?
I do watch a lot of data, although not nearly to the degree that I used to. Trying to reverse engineer the rankings is not as fruitful as it used to be — especially now that certain positions below the top three seem to be “audition spots” rather than actually earned rankings.
It helps to have a lot of private communications — both with other trusted SEOs and also with people who post on the forums. When I combine that kind of input with my study of the patents and other Google communications, usually patterns start to stand out.
When you say “audition spots” how does that differ from “actually earned rankings”? Should webmasters worry if their rankings bounce around a bit? How long does it typically take to stabilize? Are there any early signs of an audition going good or bad? Should webmasters try to adjust mid-stream, and if so, what precautions should they take?
At least in some verticals, Google seems to be using the bottom of page 1 to give promising pages a “trial” to see how they perform. The criteria for passing these trials or “auditions” are not very clear, but something about the page looks good to Google, and so they give it a shot.
So if a page suddenly pops to a first page ranking from somewhere deep, that’s certainly a good sign. But it doesn’t mean that the new ranking is stable. If a page has recently jumped way up, it may also go back down. I wouldn’t suggest doing anything drastic in such situations, and I wouldn’t overestimate that new ranking, either. It may only be shown to certain users and not others. As always, solid new backlinks can help – especially if they are coming from an area of the web that was previously not heard from in the backlink profile. But I wouldn’t play around with on-page or on-site factors at a time like that.
There’s also a situation where a page seems to have earned a lot of recent backlinks but there’s something about those links that smells a bit unnatural. In cases like that, I’ve seen the page get a page one position for just certain hours out of the day. But again, it’s the total backlink profile and its diversity that I think is in play. If you’ve done some recent “link building” but it’s all one type, or the anchor text is too obviously manipulated, then look around for some other kinds of places to attract some diversity in future backlinks.
On large & open forums lots of people tend to have vastly different experience sets, knowledge sets, and even perhaps motives. How important is your background knowledge of individuals in determining how to add their input into your working theme? Who are some of the people you trust the most in the search space?
I try never to be prejudiced by someone’s recent entry into the field. Sometimes a very new person makes a key observation, even if they can’t interpret it correctly.
There is a kind of “soft SEO” knowledge that is rampant today and it isn’t going to go away. It’s a mythology mill and it’s important not to base a business decision on SEO mythology. So, I trust hands on people more than manager types and front people for businesses. If you don’t walk the walk, then for me your talk is highly suspect.
I pay attention to how people use technical vocabulary — do they say URL when they mean domain name? Do they say tag when they mean element or attribute? Not that we don’t all use verbal shortcuts, but when a pattern of technical precision becomes clear, then I listen more closely.
I have long trusted people who do not have prominent “names” as well as some who do. But I also trust people more within their area of focus, and not necessarily when they offer opinions in some other area.
I hate to make a list, because I know someone is going to get left out accidentally. Let’s just say “the usual suspects.” But as an example, if Bruce Clay says he’s tested something and discovered “X”, you can be pretty sure that he’s not blowing sunshine.
Someone who doesn’t have huge name recognition, but who I appreciate very much is Dave Harry (thegypsy). That’s partly because he pays attention to Phrase-based Indexing and other information retrieval topics that I also watch. I used to feel like a lone explorer in those areas before I discovered Dave’s contributions.
What is the biggest thing about Google where you later found out you were a bit off, but were pretty certain you were right?
That’s easy! Using the rel=”nofollow” attribute for PR sculpting. Google made that method ineffective long before I stopped advocating it. I think I actually blushed when I read the comment from Matt Cutts that the change had been in place for over a year.
What is the biggest thing about Google where you were right on it, but people didn’t believe until months or years later?
The reality of the poorly named “minus 950” penalty. I didn’t name it, by the way. It just sort of evolved from the greater community, even though I kept trying for “EOR” or “End Of Results.”
At PubCon South I believe you are speaking on information architecture. How important is site structure to an effective SEO strategy? Do you see it gaining or losing importance going forward?
It is hugely important – both for search engines and for human visitors.
Information Architecture (IA) has also been one of the least well understood areas in website development. IA actually begins BEFORE the technical site structure is set up. Once you know the marketing purpose of the site, precisely and in granular detail, then IA is next.
IA involves taking all the planned content and putting it into buckets. There are many different ways to bucket any pile of content. Some approaches are built on rather personal idiosyncrasies, and other types can be more universally approachable. Even if you are planning a very elaborate, user tagged “faceted navigation” system, you still need to decide on a default set of content buckets.
That initial bucketing process then flows into deciding the main menu structure. Next you choose the menu labels; this is the stage where you fix the actual labels and fold in keyword research. But if a site is built on inflexible keyword targets from the start, then it can often be a confusing mess for a visitor to navigate.
As traffic data grows in importance for search ranking, I do see Information Architecture finally coming into its own. However, the value for the human visitor has always been clearly visible on the bottom line.
What are some of the biggest issues & errors you see people make when setting up their IA?
There are two big pitfalls I run into all the time:
How would you compare PubCon South against other conferences you have attended in the past?
PubCon South is a more intimate venue than, say Vegas. That means less distraction and more in-depth networking. Even though people do attend from all over the world, there is a strong regional attendance that also gives the conference a different flavor — one that I find a very healthy change of pace.
In addition, PubCon has introduced a new format — the Spotlight Session. One entire track is made completely of Spotlight Sessions with just one or two presenters, rather than an entire panel. These are much more interactive and allow us to really stretch out on key topics.
Thanks Tedster! If you want to see Tedster speak, he will be at Pubcon Dallas on the 14th, and if you want to learn about working with him, please check out Converseon. You can also read his latest musings on search and SEO in the Google forums on WebmasterWorld. A few months back Tedster also did an interview with Stuntdubl.
When the iPad was announced in January, it was the very first time Apple created something that I immediately wanted. I’ve never been an Apple fanboy; I don’t work on Macs for either my primary workstations or my laptop – I use them for testing and ensuring compatibility only. But the iPad was different. I immediately saw its purpose as a consumption device and its promise as a productivity tool.
The naysayers who tagged the iPad as “the iFail” seemed to focus on how it fails as a laptop or netbook, and how it lacks the features expected of those devices. While there are definitely limitations to what you can do, and how easily certain tasks can be accomplished, I gave Apple the benefit of the doubt, based on their track record, and knowing that this is a first generation device. As Warner Music Group CEO Edgar Bronfman noted, “No one’s got rich betting against Steve Jobs.”
Having never purchased a Kindle or netbook, I was looking forward to the 3G iPad enabling me to essentially ditch my laptop while traveling, and be a complete solution for any media (reading, music, movies) consumption needs. Most of my colleagues derided the “working exclusively from the iPad” notion as wishful thinking, and told me to be sure to pack the laptop as well. After using the iPad for over a month now, I’m looking forward to proving them wrong this week, as I thoroughly road test it during SMX Advanced. I’ve spent the last few weeks assembling the various tools (apps) that I’ll need, setting up accounts, passwords, etc. and can honestly say, with the benefit of the cloud, there’s nothing that I’d need to take care of while on the road, that I can’t do on the iPad.
Most of the apps that I use or need, relate to development – managing sites, updating sites, and keeping the trains running on time. To that end, these are the tools I employ on the iPad:
The lack of multitasking (for now) on the iPad means accomplishing some tasks can be a little tricky, or at best not intuitive. One such task involves taking an attachment from an email and getting it onto a server to use in a post, on a page, or otherwise link to. To accomplish this, you really just need one $0.99 app – GoodReader.