
Posted by randfish

In early June of this year, SEOmoz released ranking correlation data about Google’s web results and how they mapped against specific metrics. This exciting work gave us valuable insight into Google’s rankings system, confirming many assumptions and opening up new lines of questioning. When Google announced their new Places Results at the end of October, we couldn’t help but want to learn more.

In November, we gathered data for 220 search queries – 20 US cities and 11 business "types" (different kinds of queries). This dataset is smaller than the one behind our web results study and was intended as an initial data gathering project before we dove deeper, but our findings proved surprisingly significant (from a statistical standpoint) and thus we’re making the results and report publicly available.

As with our previous collection and analysis of this type of data, it’s important to keep a few things in mind:

  1. Correlation ≠ Causation – the findings here are merely indicative of what high ranking results are doing that lower ranking results aren’t (or, at least, are doing less of). It’s not necessarily the case that any of these factors are the cause of the higher rankings; they could merely be a side effect of pages that perform better. Nevertheless, it’s always interesting to know what higher ranking sites/pages are doing that their lower ranking peers aren’t.
  2. Statistical Significance – the report specifically highlights results that are more than two standard errors away from zero (a 98%+ chance of a non-zero correlation). Many of the factors we measured fall into this category, which is why we’re sharing despite the smaller dataset. In terms of the correlation numbers, remember that 0.00 is no correlation and 1.0 is perfect correlation. In our opinion, in algorithms like Google’s, where hundreds of factors are supposedly at play together, data in the 0.05-0.1 range is interesting and data in the 0.1-0.3 range is potentially worth more significant attention.
  3. Ranked Correlations – the correlations compare pages that ranked higher vs. those that ranked lower, and the datasets in the report and below report average correlations across the entire dataset (except where specified), with standard error as a metric for accuracy (see the sketch after this list).
  4. Common Sense is Essential – you’ll see some datapoints, just like in our web results set, suggesting that sites not following the commonly held "best practices" (like using the name of the queried city in your URL) rank better. We strongly urge readers to use this data as a guideline, but not a rule (for example, it could be that many results using the city name in the URL are national chains with multiple "city" pages, and thus aren’t as "local" in Google’s eyes as their peers).
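To make that methodology concrete, here is a minimal sketch of how a ranked-correlation study of this kind can be computed. Everything in it is hypothetical (query names, metric values, result counts); only the procedure – a Spearman correlation per query, then the mean and standard error across queries – follows the description in the list above.

```python
# A minimal sketch of the ranked-correlation methodology described above.
# The queries and metric values are hypothetical placeholder data.
from scipy import stats

# For each query: the measured metric for results ranked 1..n
# (index 0 = the top Places result).
queries = {
    "seattle restaurants": [140, 95, 80, 62, 30, 28, 12],
    "austin attorneys":    [77, 90, 41, 35, 20, 18, 9],
}

correlations = []
for metric_values in queries.values():
    ranks = list(range(1, len(metric_values) + 1))
    rho, _ = stats.spearmanr(ranks, metric_values)
    # Negate so a positive number means "higher metric, better ranking"
    # (rank 1 is the best position, so raw rho comes out negative).
    correlations.append(-rho)

n = len(correlations)
mean_corr = sum(correlations) / n
std_err = stats.tstd(correlations) / n ** 0.5  # standard error of the mean
print(f"mean correlation: {mean_corr:.2f} +/- {std_err:.2f}")
# Roughly speaking, a mean more than two standard errors away from zero
# gives ~98%+ confidence that the true correlation is non-zero.
```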

With those out of the way, let’s dive into the dataset, which you can download a full version of here:

  • The 20 cities included:
    • Indianapolis
    • Austin
    • Seattle
    • Portland
    • Baltimore
    • Boston
    • Memphis
    • Denver
    • Nashville
    • Milwaukee
    • Las Vegas
    • Louisville
    • Albuquerque
    • Tucson
    • Atlanta
    • Fresno
    • Sacramento
    • Omaha
    • Miami
    • Cleveland
  • The 11 Business Types / Queries included:
    • Restaurants
    • Car Wash
    • Attorneys
    • Yoga Studio
    • Book Stores
    • Parks
    • Ice Cream
    • Gyms
    • Dry Cleaners
    • Hospitals

Interestingly, the results we gathered seem to indicate that the Google Places ranking algorithm doesn’t differ much across cities, but when business/query types are considered, there are indications that Google may indeed be changing up how the rankings are calculated (an alternative explanation is that different business segments simply have dramatically different weights on the factors depending on their type).

For this round of correlation analysis, we contracted Dr. Matthew Peters (who holds a PhD in Applied Math from the University of Washington) to create a report of his findings based on the data. In discussing the role that cities/query types played, he noted:

City is not a significant source of variation for any of the variables, suggesting that Google’s algorithm is the same for all cities. However, for 9 of the 24 variables we can reject the null hypothesis that business type is not a significant source of variation in the correlation coefficients at a=0.05. This is highly unlikely to have occurred by chance. Unfortunately there is a caveat to this result. The results from ANOVA assume the residuals to be normally distributed, but in most cases the residuals are not normal as tested with a Shapiro-Wilk test.

You can download his full report here.
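For readers who want to replicate those two checks on their own data, here is a minimal sketch. The groups and numbers are hypothetical placeholders; only the procedure – a one-way ANOVA across business types, then a Shapiro-Wilk normality test on the residuals – comes from the passage quoted above.

```python
# A minimal sketch of the significance checks described in the report.
# Hypothetical data: per-query correlations of one factor, grouped by type.
from scipy import stats

groups = {
    "restaurants": [0.12, 0.18, 0.09, 0.21],
    "car wash":    [0.31, 0.25, 0.28, 0.35],
    "attorneys":   [0.05, 0.11, 0.02, 0.08],
}

# One-way ANOVA: is business type a significant source of variation?
f_stat, p_value = stats.f_oneway(*groups.values())
print(f"ANOVA: F={f_stat:.2f}, p={p_value:.4f}")  # reject H0 if p < 0.05

# The caveat from the report: ANOVA assumes normally distributed residuals.
residuals = [x - sum(g) / len(g) for g in groups.values() for x in g]
w_stat, p_normal = stats.shapiro(residuals)
print(f"Shapiro-Wilk: W={w_stat:.2f}, p={p_normal:.4f}")  # p < 0.05 -> not normal
```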

Next, let’s look at some of the more interesting statistical findings Matt discovered. These are split into 4 unique sections, and we’re looking only at the correlations with Places results (though the data and report also include web results).

Correlation with Page-Specific Link Popularity Factors

Google Places Correlations with Page-Specific Link Popularity Elements

With the exception of PageRank, all data comes via SEOmoz’s Linkscape data API.

NOTE: In this data, mozRank and PageRank are not significantly different than zero.

Domain-Wide Link Popularity Factors

Google Places Domain Link Factor Correlations

All data comes via SEOmoz’s Linkscape data API.

NOTE: In this data, all of the metrics are significant.

Keyword Usage Factors

Google Places Keyword Usage Correlations 

All data comes directly from the results page URL or the Places page/listing. Business keyword refers to the type, such as "ice cream" or "hospital," while city keyword refers to the location, such as "Austin" or "Portland." The relatively large, negative correlation with the city keyword in URLs is an outlier (no other element we measured for local listings had a significant negative correlation). My personal guess is that nationwide sites trying to rank individually on city-targeted pages don’t perform as well as local-only results in general, and this could cause that bias, but we don’t have evidence to prove that theory and other explanations are certainly possible.

NOTE: In this data, correlations for business keyword in the URL and city keyword in the title element were not significantly different than zero.

Places Listings, Ratings + Reviews Factors

Google Places Listings Correlations

All data comes directly from Google Places’ page about the result.

NOTE: In this data, all of the metrics are significant. 

Interesting Takeaways and Notes from this Research:

  • In Places results, domain-wide link popularity factors seem more important than page-specific ones. We’ve heard that links aren’t as important in local/places and the data certainly suggest that’s accurate (see the full report to compare correlations), but they may not be completely useless, particularly on the domain level.
  • Using the city and business type keyword in the page title and the listing name (when claiming/editing your business’s name in the results) may give a positive boost. Results using these keywords seem to frequently outrank their peers. For example: Portland Attorneys Places Results
     
  • More is almost always better when it comes to everything associated with your Places listing – more related maps, more reviews, more "about this place" results, etc. However, this metric doesn’t appear as powerful as we’d initially thought. It could be that the missing "consistency" metric is a big part of why the correlations here weren’t higher.
  • Several things we didn’t measure in this report are particularly interesting, and unfortunately we missed them. These include:
    • Proximity to centroid (just tough to gather for every result at scale)
    • Consistency of listings (supposedly a central piece of the Local rankings puzzle) in address, phone number, business name, type
    • Presence of specific listing sources (like those shown on GetListed.org for example)
  • This data isn’t far out of whack with the perception/opinions of Local SEOs, which we take to be a good sign, both for the data, and the SEOs surveyed :-)

Our hope is to do this experiment again with more data and possibly more metrics in the future. Your suggestions are, of course, very welcome.


As always, we invite you to download the report and raw data and give us any feedback or feel free to do your own analyses and come to your own conclusions. It could even be valuable to use this same process for results you (or your clients) care about and find the missing ingredients between you and the competition.

p.s. Special thanks to Paris Childress and Evgeni Yordanov for help in the data collection process.



SEOmoz Daily SEO Blog

Some of you may have been hit by Google’s 20 October algorithm change.

And some of you wouldn’t have noticed any difference.

On 20 October, a number of sites got trashed. Rankings, and traffic, plummeted through the floor. The webmaster forums lit up. Aaron noticed it. I noticed it. Yet, other webmasters wondered what all the fuss was about.

As many of you know, there is not just one ranking algorithm. There are many algorithms. What affects one site may not affect another. Rather interestingly, Google’s John Mu dipped into this thread on Google’s support forum, offering these words of wisdom (HatTip: Barry):

It looks like the changes you’re seeing here may be from an algorithmic change. As part of our recent algorithmic changes (which the outside world sometimes refers to as the “May Day update” because it happened primarily in May), our algorithms are assessing the site differently. This is a ranking change, not any sort of manual spam penalty, and not due to any technical issues with regards to crawling or indexing your content. You can hear more about this change in Matt’s video.

…and….

Various parts of our algorithms can apply to sites at different times, depending on what our algorithms find. While we initially rolled out this change earlier this year, the web changes, sites change, and with that, our algorithms will continually adapt to the current state on the web, on those sites. While it might be confusing to see these changes at the same time as this issue, they really aren’t related, nor is this a general algorithm change (so if other sites have seen changes recently, it probably doesn’t apply to them as well).

Matt’s video, made four months ago, was talking about the algorithmic MayDay change. John Mu adds: “Various parts of our algorithms can apply to sites at different times.” In other words, whatever happened in May may not affect your site in May, or June, or July, but might hit you many months later. This implies that your site may trip a threshold and be judged quite differently than it was the day before.

This still doesn’t completely explain why so many sites were hit on the same day, but then Google don’t typically explain things in detail.

To complicate matters, there was an acknowledged indexing problem, affecting new content, particularly on blogs. Again, John appears to suggest this was a separate issue.

Forget About Search Engines, Just Publish

Now, all SEOs are used to algorithm changes. Nothing new. But this one has me genuinely perplexed, mainly because of the type of sites that got hit.

Time for some self-searching Q&A about one of my own sites:

Q: So, how many links did you buy?
A: None.
Q: Are you selling links?
A: Nope.
Q: Linking to “bad neighborhoods”?
A: Not that I’m aware of…
Q: Did you link-build in an aggressive manner?
A: No. I did no link building, whatsoever.
Q: Huh?
A: That’s not a question.
Q: So you just published content?
A: Right.
Q: And people linked to your site, of their own accord?
A: Yep. I guess they liked it.
Q: Was your content heavily SEO’d?
A: No. In fact, I gave writers specific instructions not to do anything resembling “SEO copywriting”. It ruins the flow for readers.
Q: All original content?
A: All original. Hand written. No machines involved anywhere.
Q: So this site conforms to Google’s Webmaster Guidelines?
A: I’d say it lies well within them. “Be useful to end users”, was the guiding principle.

Yet it got hit hard.

What’s also interesting is the nature of the sites that replaced it. I checked keyword after keyword, and found script-driven, aggressive black-hat, content-free sites in top positions. Not in all cases – there are certainly useful sites that deserve to be there, and deserve to appear above mine. Fair play. However, there were plenty of sites of – shall we say – dubious merit – occupying high positions.

Curious.

Be Useful. Perhaps

Now, I believe in publishing useful, unique content, and not paying too much attention to SEO, other than covering the basics. SEO is one strategy amongst many, and sites should, first and foremost, prove useful to people.

Clearly, no site is immune. You can stay within Google’s Webmaster guidelines, and get taken out. I knew that anyway, but when the sites that don’t follow the guidelines replace you…

….I’ll admit – it grates.

Presumably, Google rewards the sites it likes with high rankings, and if we see a lot of aggressive sites filling the top page, should we therefore assume that aggressive sites are what Google actually wants?

I’d like to think not.

Perhaps they are just trying to mess with our heads?

Or they messed up?

Or the changes are still bedding in?

Or they really do want it this way?

I’m still watching, and considering. Perhaps the site will just pop back up in due course. Or perhaps I need to go back to the drawing board. I’ll let you know how I get on.

If you’ve noticed something similar on your sites, chime in on the comments.

SEO Book.com – Learn. Rank. Dominate.

The latest video from Matt Cutts talks about the value of SEO to Google.

The questioner asks:

“Why does Google support SEO specialists with advice? Google’s business is to sell text ads…”?

Matt explains that Google sees SEO helping, rather than hindering, their business model long term.

How?

SEOs create – and encourage site-owners to create – the very sites that Google’s technology demands, i.e. content accessible by an automated crawler, largely text based, and clearly marked up.

By building sites that jibe well with Google’s technology, webmasters lower Google’s costs and help make Google results more relevant in the eyes of the end user. The larger their index, the more chances Google has to answer the query. SEOs love creating crawlable content!

This means the end user keeps coming back, which in turn translates to Google’s bottom line.

It’s also a good idea to give webmasters something, else Google risks a more adversarial relationship, which again can cause Google problems.

So SEO is good for Google’s business – the “good” type of SEO, as defined by Google, of course.

Win-Win

Matt, as always, is giving the side of the story Google wants you to hear.

His position sounds reasonable, generous, and inclusive, and it is – in many respects. But make no mistake – Google aren’t there for webmasters. Google will do what is good for Google. If SEO was bad for Google, Google would not be reaching out to the SEO community, in much the same way they don’t reach out to, say, the malware writer community. They just stamp it out.

Matt is a master of public relations. Webmasters can learn a lot from Matt in terms of how to handle their own public relations challenges.

Here are a few pointers, based on Matt Cutts’ approach:

Public Relations Is Relations With The Public

Matt doesn’t talk from on high. He doesn’t talk at his audience. He talks with them. He attends events where his audience congregate, and he encourages interaction and questions. This activity serves to build a personal relationship, which helps make his messages easier to convey and sell.

Look for ways in which you can go *to* your audience/customers. Where do they hang out? Address them on their own terms, and in their own environment. Regularly encourage questions, criticism and feedback. When it comes time to announce new products and services, your audience is likely to be more receptive than if your communications are anonymous and sporadic.

Ok, this might be all very well for Matt Cutts. Everyone pays attention to Google, because Google are important. However, no matter how big or small your audience, you still must find a way to relate to them.

These days, it’s not so much what people say, it’s often who is saying it. Modern media is driven by personalities. The content of the message is seldom good enough to stick, unless it is truly remarkable.

People listen to Matt in ways they don’t listen to an anonymous Google press release because of the personal relationship he has worked hard to establish. This works just as well for small businesses. In fact, this is one of the big advantages of a small business – the personal touch. Google is a big company, but they work hard to appear like a small one, at least in terms of their personal relations approach with webmasters.

Matt also gets out in front of issues. If there’s something going on in the web community relating to his area, he’s almost certainly quick to comment on it. By doing so, he can control and frame the conversation in terms that suit Google. If there are industry issues that relate to your work or company, use them as an opportunity to grab the spotlight. Try to become the media go-to person in your local community for issues by building relationships with media and news outlets.

PR consultants aren’t quite as necessary as they used to be. They aren’t redundant, but the most important lesson to learn from Matt Cutts is that PR is something you need to embody. It’s not a function you can just slap on, or hire in when it suits, and still expect to be as effective. Make PR flow through all you do.

Matt’s greatest skill is not making it look like PR at all.

SEO Book.com – Learn. Rank. Dominate.

Posted by Dana Lookadoo

Day 1 of SEOmoz Pro Training was like being at a race track. The course careened from clicks to conversions and from search results to landing pages. The audience watched 9 speakers drive their search marketing race cars at speeds faster than fingers can type. Given the finger-breaking speeds, it was fortunate all SEO fans were well fueled – beginning with a healthy breakfast buffet, mid-morning energy bars, lunch (more all-you-can-eat) and a scrumptious mid-afternoon pit stop with fresh cookies and treats. After everyone was fed each time, it was off to the races.

Todd Friesen was in the sports booth serving as emcee, host of ceremonies, referee, judge and timekeeper. The event ran like a well-oiled machine. Maybe that’s why they call Todd "Oilman."

Will Critchlow, Todd Friesen, Rand Fishkin - SEOmoz Pro

When I said "yes" to attending the Mozinar on a Press Pass, I didn’t realize I was going to be covering a sporting event. GoodNewsCowboy asked me how I was going to recap and condense this "wild ride." I realized there was a lot of horsepower on-stage and that we were at the SEOmoz Training Raceway.

Mozinar was a wild ride

Mozinar fans experienced exhilaration and gleaned insights as we watched performance race car drivers deliver their seminar presentations. The following race highlights are condensed from 32 pages of notes. I strongly suggest you buy the Pro Seminar DVD when it’s produced so you can look under the hood for yourself.

From Clicks to Conversions with Local, Social, Analytics and SEO in Between

1st up: Rand Fishkin had pole position and drove a car with a most unusual name, "It’s a Mad, Mad, Mad, Mad SERP."

The results we are seeing in blended search results are even more unusual, starting with changes of the past 2 weeks. For those who attend SEO races regularly and are watching Google, this may be old news. For others, brace yourself. A branded search can have more than 2 results. Rand explained:

  • You have to be seen as a brand.
  • You have to have lots of links pointing to those pages with the brand name.
  • You need to have a high volume set of people searching for those terms, so off-site advertising and media buys can influence the SERPs.

Changes to Image SEO came next, and guess what? Google has a new image search interface.

  • Image results don’t always match the image SERP’s order, e.g. images for the artist "manet."
  • Understand, and be prepared. You will not always get the same position in the blended results, leading to frustration.
  • Image SEO value is reduced by the new overlay.

The image below is what results from clicking on one of the images for the artist "manet."

Image SEO Value Reduced by overlay

Tip: Write some JavaScript that breaks out of the picture overlay. Not only does the overlay produce the longest, ugliest URL, but "it’s just an invite to right click and steal this image."

Rand covered 10 Tips for Image Rankings. (Since we are in race synopsis mode, we’ll speed through this.) One quick takeaway was the minimum image size:

Image Pixel Size – if you go smaller than 400×300 pixels, your chances of showing in image search decrease dramatically.

You don’t have to remember any formulas: basic on-page factors for image SEO include the page title and surrounding text.

Video SERPs

It’s far easier to get into video SERPs than regular SERPs. There is lower competition than in ordinary results (most of the time), so take the opportunity. Follow this inclusion process to enter your video race for a top ranking:

Step #1: Embed Video Content on Your Pages
Step #2: Create Thumbnail Images for Videos
Step #3: Build a Video XML Sitemap & Submit
Step #4: PROFIT $$$

See Google Webmaster Tools for Video to learn more.
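To make Step #3 concrete, here is a minimal sketch that writes a bare-bones video XML sitemap. All URLs and text are placeholders, and the tag set follows Google’s sitemap-video schema; check the current Webmaster Tools documentation for required and optional fields before submitting.

```python
# A minimal sketch of Step #3: generating and saving a video XML sitemap.
# Every URL and title below is a hypothetical placeholder.
VIDEO_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.example.com/videos/some-video-page.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Example video title</video:title>
      <video:description>A short description of the video.</video:description>
      <video:content_loc>http://www.example.com/video123.flv</video:content_loc>
    </video:video>
  </url>
</urlset>
"""

with open("video_sitemap.xml", "w") as f:
    f.write(VIDEO_SITEMAP)
# Then submit video_sitemap.xml via Google Webmaster Tools (Step #3 above).
```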

Rand’s foot stayed pedal-to-the-metal as he showed how to produce Rich Snippets in the SERPs. Why is this important? This is where you get most of your clicks. His closing remarks were retweeted with fervor:

"If you can stay on top of this, you will have a big win. It demands full-time SEO."

2nd up: David Mihm was full-speed as he raced through "Ranking in Competitive Local Results." He explained:

Straight from Google’s mouth:
Local intent is 20% of total search volume (April 2010)

And who would imagine that local results could equal 100% of page 1? Try a search for "dentist chicago." (If it’s not 100%, it’s close.)

Google organic results are not, however, the dominant factor for local search. Neither are results from Yahoo! or Bing. Local search is now:

  • Craigslist
  • Twitter
  • FaceBook
  • Citysearch
  • Google Products
  • Mobile devices
  • Garmin GPS
  • Wikipedia
  • Virtual Augmented Reality

Understand that local requires a different mindset from traditional SEO, because the ecosystems vary:

Organic Search Ecosystem

Local Search Ecosystem

  • Traditional SEO is about optimizing websites.
  • Local SEO is about optimizing locations.

Takeaway:

"It is essential to have a holistic local search marketing strategy."

"Even if all your boss cares about is that friggin’ 7-pack!"

Resources to claim your listings:

"The Big Three" major data providers:

Citations – David recommended a new citation finder tool by Darren Shaw & Garrett French: Whitespark.ca Citation Finder

Find local SEO resources on GetListed.org.

3rd up to race: Dan Zarrella racing in the "Science of Twitter" car. Dan warned us he talked fast. Pro Seminar attendees listened attentively, but given the subject was Twitter … many tweeted insights into how one can get clicks and retweets.

 

Dan’s takeaways came in 140 characters. Below are my top three faves:

Takeaway: Don’t talk about yourself so much.

Paraphrased: If you want more followers, stop talking about yourself!

Takeaway: Try to stay positive.

People can go watch the news if they want to get bummed out. Even if you’re talking about the oil spill, stay hopeful.

Takeaway: If you want people to click your links, Tweet slower.

Don’t "go Oprah" on your Twitter account, moderate.

Improve your "retweetability" factor by including a combination of the following Top 20 Most Retweetable Words:

Top 20 Most Retweetable Words
Timing for retweets:

Links posted on the weekend and at the end of the week have a higher click through rate.

Tip: Want to see how well a bit.ly link is doing (its CTR)?

  1. Put the bit.ly link in the browser address bar.
  2. Type a plus sign after it.
  3. Hit enter to see how many times it’s been clicked through.

Retweeting is an elegant viral mechanism.

Alright … one more Twitter insight before we close …

He had noted that women follow a lot more people and tend to tweet more. They are more social. (We already knew women talk and socialize more, but now Dan’s numbers confirm it.)

Dan covered a lot of geeky ground focused on the science and study of social media, use of FourSquare and more. I have 5+ pages of notes from Dan’s presentation alone, but I’m concerned this blog post will get too long to be readable.

Check out Dan’s set of social media tools.

4th up and last race of the morning was the "Presentation Off" between Will Critchlow and Rand Fishkin.

I’ll expand on that race in a follow-up post. Do you want to guess who won this year? Will went into the race with a 2-year winning streak.



SEOmoz Daily SEO Blog

Posted by jennita

Last week I covered SES San Francisco for SEOmoz. Every time I attend a conference, I try to go to sessions that will have information I can bring back to the community. Sometimes I look for sessions that aim to answer questions we see a lot in Q & A or that I notice popping up in comments on the blog. Either way, my focus is usually to find information that will be helpful to the community.

Now and then I get a little greedy though, and attend sessions that will benefit me in my job. Luckily I hit the sweet spot at SES and found a little of both. Rather than straight up regurgitate what speakers presented, I thought I’d take their insights and show some examples specific to SEOmoz.

1. Who are the specific people sending you traffic?

At SES I was reminded about my problem with A.F. (analytics forgetfulness) and a few things that I personally should be doing not only to be better at my job, but to help the company and community. Marty Weintraub from aimClear was the one who initially got me thinking, in the “Deep Dive Into Analytics” panel on the first day.

How often do we look at traffic sources and focus on which sites are sending traffic? Ok, always. But what about looking at the actual people from those sites who are sending traffic? Let’s take Twitter for example. When most people are tweeting, they’re usually either in an app or on the web looking from their own page, which shows up as “/” for most referrers.

But sometimes people are viewing a specific person’s Twitter page and THEN click your link. In those instances, Google Analytics will show the actual Twitter user page as the referrer. This is a quick and easy way to find out WHO is sending you traffic – and that person is probably an influencer in your community. Finding the top referrers is the first step; next you’ll want to use Klout (or another service) to see what their actual reach is. This doesn’t only work for Twitter though; check out the example below that I found looking at delicious referrers.

This is a list of referrers from delicious.com. Let’s see what Chris Brogan, an influencer in the Social Media space bookmarked.

 

Aha! Makes perfect sense: he bookmarked the Facebook Marketing Guide. It didn’t send a TON of traffic, but just think of the possibilities if we actually contacted Chris and worked together.

These are people who are individually sending traffic to your page; you should probably think about how you can use that information. As the Community Manager for SEOmoz, I know that I will use it to reach out to them – perhaps retweet them or ask them to write a YOUmoz post. Every organization is different, and this is just one idea. But take the concept of finding the users sending you traffic and run with it! (A sketch of how you might pull those users out of your analytics data follows.)
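Here is a minimal sketch of that idea, assuming you can export a CSV of referrer paths and visit counts from your analytics package. The file name, column names, and referrer format are hypothetical; adjust them to whatever your export actually looks like.

```python
# A minimal sketch: mine analytics referrers for individual Twitter users.
# Assumes a CSV with columns "referrer" (like "twitter.com/username")
# and "visits"; adapt to your analytics package's real export format.
import csv
from collections import Counter

twitter_users = Counter()
with open("referrers.csv") as f:
    for row in csv.DictReader(f):
        parts = row["referrer"].strip("/").split("/")
        # twitter.com/<username> (rather than "/") means someone clicked
        # your link while viewing that specific person's page.
        if parts[0].endswith("twitter.com") and len(parts) == 2:
            twitter_users[parts[1]] += int(row["visits"])

# The top names are individual people sending you traffic -- candidates
# to look up in Klout (or another service) and reach out to.
for user, visits in twitter_users.most_common(10):
    print(user, visits)
```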

2. Don’t forget about mobile

My good friend Cindy Krum would probably strangle me for having forgotten all about mobile. This was another area Marty mentioned and I bet many people don’t focus on it. As an example, I thought I’d jump into our analytics and see how mobile users converted.

Yikes!! Before the recent update to our PRO landing page, we had just one PRO signup from a mobile device. That’s seriously pathetic. In the last month we’ve had 7, which I’d imagine means the changes we made help mobile users sign up on our site. But it’s still ridiculously low!

I also thought about looking at what visits to the tools page looked like from mobile and non-mobile browsers. Ouch! This is our highest traffic page behind the home page. The iPhone, iPad and Android were the top 3 mobile devices (not surprisingly really). Perhaps we should make it a bit easier for these devices to access our site and tools. :)

That’s 482 uniques out of 61,102. Definitely something to work on.

3.    “UGC is content that rocks”

That is an exact quote from Michael DeHaven, the SEO Product Manager at Bazaarvoice. Here at SEOmoz we most definitely understand the power of UGC for SEO (waves over at YOUmoz… hi!). But how can you use user generated content to help boost your traffic? Michael gave examples of how UGC helped several companies increase traffic by adding unique, relevant, keyword rich content.

Check out this particular example for Swanson Health Products. The first image shows the product content. Sure, it does have some unique content and some of the keywords they’re going for, but in general the content is fairly weak.

In the next image, you see all the great keywords that reviewers of the products have added all on their own. These aren’t SEOs creating content, but real people saying what they feel about the product. Hello! What a great way to increase content to your product pages.

Another example he gave was for OpenTable. Their initial implementation left the UGC uncrawlable. After they made a change that opened it up to search engines and got the content indexed, they saw a 17% lift in traffic. Just by allowing the ratings to be indexed. Whoa!

The last example he gave that stuck out in my mind was that QVC started sending emails to people after they purchased a product, asking for a review. It seems like common sense to do something like this, but at the same time it’s absolute genius. I bet you can think of at least one way to get visitors to your site to add content, whether that’s a review, a comment, a suggestion, whatever! Ask them a question; people love to give their opinions. :)

The point is… as Michael said it best “UGC is content that rocks,” so don’t forget about it!

4.    Put “Hot Triggers” in the path of motivated people

This was the focus of the keynote by BJ Fogg, the Director of the Persuasive Technology Lab at Stanford University, on the second day. Now, what does that mean exactly? The idea (and I hope I get this right) is to make it easy for people who are ready to do something, to do it.

For example, one reason that Twitter did so well in the beginning is that they allowed people to use text messages to send tweets. Obviously they still do, but now many people use various mobile apps when they’re on their phone. When Twitter first took off, though, people were used to reading short messages with a certain cutoff length, so tweeting was simple via text. People who were motivated to tell the world what they ate for breakfast had the ability to do it quickly and easily.

There are several ways we could employ this here on the SEOmoz site, and one way I thought we could do this is to make it easier to sign up for PRO when you want to use a PRO only tool. Check out the example below for our Keyword Difficulty tool.

Sure, you can click on "log in" and from that page you can sign up and create a free account, but there’s no way other than the "Go PRO" link at the top of the navigation to take someone to become a PRO member. If someone found their way to the Keyword Difficulty tool and is ready to use it, let’s motivate them to become a member. Or at the very least, check out a free version.

Ok, honestly we know this happens on our site, and we’re currently in the works of improving a lot of it (plus watch for a wicked awesome new site design next week!). But think about your site, and what you want people to do on it. Are you hindering them in any way, or are you making things easy for them? BJ also discussed the idea that the "lightest touch works." Oftentimes the motivation exists on the user’s side; they just need to be facilitated through the action. Where can you make improvements on your site?

5.    Public Relations, the other PR

Also on the second day, I attended a great session “Search, PR and the Social Butterfly.” I loved that Lisa Buyer focused on ways to attract journalists to your information. She mentioned that 100% of journalists use Google as a tool when working on stories. Think about it. Your PR strategies (and we’re not talking the PageRank ones now) need to be online where the journalists are looking. So if they’re searching, you want to be there!

She talked about today’s PR being a mix of being optimized, publicized and socialized. That means making sure you’ve optimized your content for not only your customers but for the media as well. Make sure you’re using keywords, relevant titles and don’t forget to add social links to your press releases. Lisa had a few great tips I wanted to share on publicizing and socializing to get the information out there. Don’t just sit around waiting for it to come to you. Here are just a few ways to get your content out there:

  • Use a social media newsroom like PRESSfeed
  • Find journalists on muckrack.com (a place to find journalists who are on Twitter)
  • Subscribe to HARO (help a reporter out) and submit pitches directly to journalists
  • Post your Press Releases to PRWeb and watch it get distributed (this is a paid service)
  • Use Social Media to find journalists you want to reach out to
  • Join #journchat Monday nights from 8-10pm EST on Twitter to chat with journalists, PR and bloggers
  • Look at LinkedIn and Facebook

Brett Tabke from WebmasterWorld also spoke on this panel and talked about "the PubCon story." His story about how last year PubCon didn’t spend a dime on marketing ads, and ONLY focused on Twitter, made me absolutely giddy. I had heard rumors of this in the past, but to see the actual statistics was pretty cool. Oh, and not only did they not spend any money, they also saw a 30% increase in attendance. What the… what?!

One of the things that jumped out at me the most was their use of Klout to find the influencers. This is somewhat similar to my first point above, but what they did was look up every person who registered for PubCon in Klout to see their influence and reach on Twitter. They then reached out to those with high Klout, like this guy, and thanked them for signing up, or retweeted them, etc. By contacting the people who can motivate and influence your followers (see how I just tied all my points together there?) while on their mobile phone (ok, I’m stretching it), you end up gaining more reach.

This is actually something we try to do here at SEOmoz every day. How can you motivate your influencers?

Final Takeaways and Actions

  1. Don’t forget analytics. Use the information to find influencers sending you traffic.
  2. What about mobile? Do you have users who would love to use your site on their mobile device but can’t?
  3. UGC is content that rocks. How can you utilize UGC on your site?
  4. Put "Hot Triggers" in the path of motivated people.
  5. Public Relations is social now, so get on it.

This year SES had a ton to offer and I highly recommend you check out some of the live blogging from the event. Check out the recap of Liveblogging for day 1, day 2 and day 3.

Speaking of conferences, we have just a few tickets left for the SEOmoz Seminar next week. Grab them before we’re completely sold out!



SEOmoz Daily SEO Blog

Posted by Paras Chopra

Conversion Rate Optimization (CRO) is the newest darling of Internet marketers; after all, what good is traffic if it doesn’t convert? Unfortunately (or fortunately, depending how you look at it), unlike Pay Per Click (PPC) marketing, CRO isn’t a game of how much money you can throw at it. In fact, this field requires as much creativity as it requires monetary investment. That’s what makes conversion rate optimization a fair arena. Your well-funded, bigger competitors can of course beat you at generating more traffic, but they can’t beat you at the conversion rate game (unless you allow them to).

Every website has unique conversion goals, so the approach to conversion rate optimization is unique for every website. You should not expect to follow tips from a “best-practices” article and boost your conversion rates instantly. Chances are high that what worked for others may not work for you. So, the biggest step in increasing conversion rates is coming up with creative ideas and designs that can work.

Even though conversion rate optimization is a very custom process for every website, over the course of the last couple of years (and more than 1,000 split tests) I have observed a few general patterns which yielded great results. Different ideas for increasing conversion rate are worth discussing because they become a great source of input for coming up with your own ideas. In this article I will discuss such generic ideas for conversion rate optimization, detailed through different case studies. Let’s start by discussing what role design plays in increasing conversion rate.

Role of Design

From a conversion perspective, the design of a website is the most important of all the variables involved. The difference between a better converting design and a worse converting design usually boils down to not confusing the visitor about what he is expected to do on a page. Take a look at the examples below:

Basecamp homepage design: 14% increase in conversions

Basecamp old v/s new design

What made the newer design convert 14% more visitors? A clean design. The new design clearly guides a visitor towards the Plans and Pricing link, while the old design presented a whole lot of choices. Need more proof that having fewer choices on a page can increase conversion rates? Have a look at the case study below:

Gyminee homepage redesign: 20% increase in conversion rate 

Gyminee old v/s new design

In addition to reducing the number of choices for the visitor, having a design that presents you as a professional and trustworthy company can also increase conversions. Take a look at the following case study, where the redesigned sales page has various trust elements (seal, money back guarantee, testimonials) and little tweaks (color scheme, buttons instead of links for download, layout, etc.) which made it look professional. Note that sales (not just conversions) increased by 20% just by changing the design. No additional products, no additional traffic, pure conversion rate optimization:

AquaSoft sales page redesign: 20% increase in sales

Aquasoft old v/s new

There are more such case studies where design played a key role in optimizing conversion rates. Have a look at them below:

Role of Headline and Copy

When you receive an email, it’s the name of the sender and the subject line that influence your decision to open it right away or postpone it for later. Similarly, when a visitor arrives on your website, it’s the design/brand name AND the headline of the page that influence his decision to engage with your page. Visitors’ attention is the costliest commodity on the Internet, and your page’s headline is where it goes right after arriving.

Take a look at the case study below where 37Signals tested different kinds of headlines (and the winning one boosted conversion rate by 30%).

Highrise Headline test – 30% increase in conversions

Winning variation for Highrise

The winning variation said “30-day Free Trial on All Accounts” and the worst performing variation said “Start a HighRise Account”. Note that the clear, no-nonsense headline won. If you think about it, if a visitor is on the Signup page he obviously knows that he is signing up for a HighRise account. The winning headline clearly convinces the already interested visitor that there is nothing to lose, as they offer a 30-day free trial.

Another example of how much headlines matter: CityCliq, a startup in the local marketing industry, split tested the positioning of their product.

CityCliq headline test: 90% increase in conversions

  alt="Citycliq homepage" />

Here are different headlines they tested:

  • Businesses grow faster online! (too fuzzy and so what if they do)
  • Get found faster! (found where?)
  • Online advertising that works! (too generic)
  • Create a webpage for your business (clear, concise and to-the-point)

The winning headline, “create a webpage for your business,” tells the visitor exactly what CityCliq does, and no wonder it increased conversions by 90%. As they say, don’t make your visitors think.

Right after looking at the headline, if his interest is piqued, a visitor looks at the (text/video) copy on the page. That’s why a combined optimization of headline and copy proves to be effective, as it did for SEOmoz:

Conversion Rate Experts’ How we made $1 million for SEOmoz

SEOmoz

They tested a variety of headlines and copy elements on the landing page for Pro subscription. In the end, they found out that a headline that piqued interest and a copy that laid out what exactly constitutes a Pro subscription won (no matter how long it turned out to be).

Other case studies where headline and copy mattered:

Role of Call-to-Action

So, you optimized your design, optimized headlines and page copy. You got the visitor interested and motivated to try whatever you are offering. There is still one last hurdle before you can throw a success party for your CRO project. Yes, the call-to-action is the last hurdle for you to cross. Even though the call-to-action may be considered minutiae for CRO, the following case studies demonstrate that even simple A/B testing of call-to-action can result in great improvements.

A highly motivated visitor will sniff out even the poorest of call-to-action buttons. So, while optimizing this aspect of your page, note that you are optimizing for the busy, semi-interested visitor. If he can’t locate how to try out whatever you are offering, he will hit the back button. (And in CRO, the back button is the greatest enemy of all.)

37signals’ call-to-action – signups increased by 200%

Highrise

The now-omnipresent “See Plans and Pricing” increased signups for HighRise by 200%. I have included this case study not to convince you to replace all your buttons with this text (it may not actually work for you). Rather, the point is to convince you that even small changes in call-to-action can have a dramatic impact on conversion rates. And the best thing about calls-to-action is that they are so easy to test. It literally takes 5 minutes to get such a test up and running.
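Once a test like that is running, the remaining question is whether the difference you see is real or noise. Here is a minimal sketch of one common check, a two-proportion z-test; the visitor and conversion counts below are hypothetical.

```python
# A minimal sketch of judging an A/B test with a two-proportion z-test.
# The traffic and conversion numbers below are hypothetical.
from math import sqrt, erf

def ab_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

p_a, p_b, p = ab_test(conv_a=90, n_a=3000, conv_b=120, n_b=3000)
print(f"control {p_a:.1%}, variation {p_b:.1%}, p-value {p:.3f}")
# A p-value under 0.05 suggests the lift is unlikely to be random noise.
```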

Another oft-repeated test is to see which color works best for a call-to-action (unsurprisingly, a bright color such as red usually works better, perhaps because it catches the eye and draws visitors’ attention). As an example, along with testing “Signup for free” v/s “Get Started Now”, Dmix also tested green v/s red buttons and found that the red button works better.

Dmix case study – 72% increase in conversions

 DMix

To repeat my earlier point, with call-to-action sometimes surprisingly trivial changes can produce significant results. Take a look at the following case study:

Soocial’s homepage – 28% increase in conversions

Soocial

Notice that all they did was add “It’s free” alongside the Sign up now button to boost conversion rate. This is definitely a trivial change, but why wouldn’t you test such trivial changes when they don’t take much effort and have the potential to fatten your bottom line?

Some other case studies where call-to-action helped increase conversion rate:

Role of You

The framework of optimizing design, headline, copy and call-to-action should provide you with a good plan for your CRO program. What matters in increasing conversions is not making your visitor think about what you are offering and how to actually try that offering. Try to make everything obvious and simple, guiding your visitor from headline to copy to call-to-action like a smooth flowing river.

However, no matter how many case studies you read and what theory I propose here, in the end your conversion rate optimization program will turn out to be unique because your website is unique, your audience is unique and your goals are unique.  The real key to increasing conversion rate is to keep experimenting and keep doing tests.

 

Author Bio: Paras Chopra is the founder of Visual Website Optimizer, the world’s easiest A/B split testing software. Thousands of companies and agencies have been able to increase sales and conversions up to 90% within the first few days of using the tool (read the published case studies). You can follow the company on Twitter @wingify.



SEOmoz Daily SEO Blog
You’re All Wrong! Paid Links From Offtopic Sites Do Count

“Paid links from off topic sites get devalued and flagged.” A common claim by otherwise savvy SEOs such as Ross Hudgens.

Ross, would you take three links from the Wall Street Journal if its reporters write about internet marketing?

Would you buy those links?

I would.

Both for the direct traffic (duh) and the SEO boost (duh).

The page is relevant, the readers are interested in the field… what more do you want?

Guys like Ross seem to think that if a site isn’t on your topic, then you don’t get any value. The reasoning is that the site is irrelevant, so the links are bogus.

That’s obviously wrong.

For starters, the Yahoo directory isn’t on any topic. Yet the links are good.

Second, even if the regular audience isn’t core/interested, the author obviously is.

Last I checked, the test for a link’s acceptability to Google was editorial discretion. Not readership discretion.

Lastly, sites are dynamic. You can start out writing about your mother and then end up talking about my mother. Then you no longer have a site. But I digress…

(Those of you who care about paid links and text links, check out my 101 tips on how to buy text link ads.

If that’s your bag, you’re probably also advanced enough for an advanced SEO book free chapter. Also, I personally liked Mike’s bit on buying links for third party sites.)

p.s. I know the title should have been ‘Can Count’, not ‘Do Count’, because links from offtopic sites in a sidebar aren’t getting you anywhere. So sometimes paid links from offtopic sites don’t work. But “do” has more pop to it than the timid “can” … it’s a style issue.




Michael Gray – Graywolf’s SEO Blog

Posted by Danny Dover

This week’s Whiteboard Friday is a little bit different than normal, but a lot more awesome. I took the lead this week and am sharing 5 tips that beginners can use to get links from bloggers. This educational video is full to the brim with helpful tips, odd tangents and one very poor impression of a news anchor.

 


 

The Beginner’s Guide to Getting Links from Bloggers

Try all of the following tactics and focus on building upon whichever one works best for your situation.

Make Lists of Niche Linkers (and post them) – Most bloggers are by nature marketers. Take advantage of this by writing material that helps them market themselves. For example, if you wanted to get a link from a blog in the car space, you might make a list of the top 3 Honda Civic blogs. Remember to go niche and avoid stating the obvious (e.g. TechCrunch is the number one tech blog). "The obvious" doesn’t attract links.

Do Interviews – This tactic has two main benefits. First, by conducting interviews you get interesting content to write about. What could be more interesting than what the industry experts are talking about? (Clever interviewers will realize the answer is the stuff the experts don’t want to talk about.) Secondly, by getting your name/brand into the head of an expert, you have more chances of getting links from them in the future.

Be Virtually Social – Being virtually social is easy and can provide higher short term ROI than talking to people face to face. (e.g. It is easier to get a link from someone who is in the process of writing something online than from someone who is not at a computer.) I use the following three avenues to do this:

  • Facebook – Since Facebook replaced "Fan" with "Like", it is now easier than ever to promote your work via Facebook without being too "salesy". This won’t get you links per se (as the entire process exists within Facebook’s ecosystem) but it can help drive traffic.
  • Twitter – Like Facebook, Twitter won’t necessarily help you build followed links but it does help you spread your brand/product/idea around the net. This makes it a good long term strategy.
  • Blogs – One great way to get links is to find supporting evidence for a given blog post and let its author know about it. If they use it, they are likely to cite you as the source.

funny pictures of cats with captions

In addition to promoting my work on Twitter, I use the platform to spread Internet awesomeness. (Thus the image above)

Make In-Person Connections – This is the best long term way to get links. When bloggers are trying to come up with supporting evidence for a point they are making, they are much more likely to use an example that is already in their head than to go out and search for it. The best way to get into someone’s head (other than a chisel) is to meet them and spend time with them. (Wait, a chisel? Did anyone else read that?)

Send Linkers Demos of Things (Websites, New Products, Games, Etc.) That You Want Links To, and Include the Linker’s Stuff in the Demo – This tactic is newer and has proven to work very well. If you want coverage from a specific blogger, try including their blog in the product demo and sending it to them. This way they can promote themselves while promoting your work.



If you have any other advice that you think is worth sharing, feel free to post it in the comments. This post is very much a work in progress. As always, feel free to e-mail me if you have any suggestions on how I can make my posts more useful. All of my contact information is available on my SEOmoz profile under Danny. Thanks!



SEOmoz Daily SEO Blog

If you have been in the SEO field for any serious length of time you have probably come across (and benefited from) some of Tedster’s work – either directly, or indirectly from others who have repackaged his contributions as their own. He is perhaps a bit modest, but there are few people in the industry as universally well respected as he is. I have been meaning to interview him for a while now, and he is going to be speaking at Pubcon South on April 14th in Dallas, so I figured now was as good a time as any :)

How long have you been an SEO, and how did you get into the field?

I started building websites in 1995, before the word SEO had been invented. I came from a background in retail marketing, rather than technology or graphic design. So my orientation wasn’t just “have I built a good site?”, but also “are enough people finding my site?”

The best method for bringing in traffic seemed to be the search engines, so I began discussing this kind of marketing with other people I found who had the same focus. Ah, the good old days, right? We were so basic and innocently focused, you know?

If you could list a few key documents that helped you develop your understanding of search, which would be the most important ones?

Here are a few documents that acted as major watersheds for me:

Is PageRank still crucial? Or have other things replaced it in terms of importance?

What PageRank is measuring (or attempting to measure) is still very critical — both the quality and number of other web pages that link to the given page. We don’t need to worship those public PR numbers, but we definitely do need quality back-links (and quality internal linking) to rank well on competitive queries.
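For reference, this is the classic formulation from Brin and Page’s original paper; Google’s production signal has surely evolved well past it, so treat it as background rather than the current formula:

$$\mathrm{PR}(p_i) = \frac{1 - d}{N} + d \sum_{p_j \in M(p_i)} \frac{\mathrm{PR}(p_j)}{L(p_j)}$$

where d is the damping factor (traditionally around 0.85), N is the total number of pages, M(p_i) is the set of pages linking to p_i, and L(p_j) is the number of outbound links on page p_j.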

There appears to be something parallel to PR emerging from social media — some metric that uses the model of influencers or thought leaders. But even with that in the mix, ranking would still depend on links; they would just be modified a bit by “followers” and “friends”, since many social sites are cautious with do-follow links.

Let’s play: I’ve got a penalty – SEO edition. Position 6, -30, 999, etc. Are these just bogus excuses from poor SEOs who have no business calling themselves SEOs, or are they legitimate filters & penalties?

If the page never ranked well, then yes – it could well be a bogus excuse by someone whose only claim to being an SEO is that they read an e-book and bought some rank tracking software. However, Google definitely has used very obvious numeric demotions for pages that used to rank at the top.

The original -30 penalty is an example that nailed even domain name “navigational” searches. It affected some sites that did very aggressive link and 301 redirect manipulation.

What was originally called the -950 (end of results) penalty, while never an exact number, most definitely sent some very well ranked pages down into the very deep pages. Those websites were often optimized by very solid SEO people, but then Google came along and decided that the methods were no longer OK.

In recent months, those exact number penalties seem to have slipped away, replaced by something a bit more “floating” and less transparent. My guess is that a negative percentage is applied in the final re-ranking run, rather than a fixed number of positions being subtracted. Google’s patent for Phrase-based Indexing mentions both possible approaches.

But even using percentages rather than a fixed number, when a top-ranked page runs afoul of some spam prevention filter, it can still tank pretty far. We just can’t read the exact problem from the exact number of positions lost anymore.
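To illustrate why the exact numbers stopped being readable, here is a toy comparison (not Google’s actual mechanics, and all numbers are made up) of a fixed-offset demotion versus a percentage-based one. With a fixed offset, every affected page drops by the same telltale amount; with a percentage, the drop varies with the starting position.

```python
# A toy illustration (not Google's real mechanics) of fixed vs. floating
# demotions. All numbers are hypothetical.
def fixed_demotion(rank, offset=30):
    # Old-style "-30": every affected page drops exactly 30 spots.
    return rank + offset

def floating_demotion(rank, total=1000, share=0.5):
    # Percentage-style: drop by a share of the remaining distance, so the
    # number of positions lost depends on where the page started.
    return rank + int(share * (total - rank))

for rank in (1, 5, 50):
    print(rank, "->", fixed_demotion(rank), "or", floating_demotion(rank))
# Fixed drops leave a readable signature (always -30); floating ones don't.
```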

Do you see Google as having many false positives when they whack websites?

Yes, unfortunately I do. From what I see, Google tends to build an algorithm or heuristic that gathers up all the URLs that seem to follow their “spam pattern du jour” — and then they all get whacked in one big sweep. Then the reconsideration requests and the forum or blog complaints start flying, and soon Google changes some factor in that filter. Voila! Some of the dolphins get released from the tuna net.

One very public case was discussed on Google Groups, where an innocent page lost its ranking because a “too much white space” filter misread the effect of an iframe!

Google’s John Mueller fixed the issue manually by placing a flag on that one site to trigger a human inspection if it ever got whacked in the future. I’d assume that the particular filter was tweaked soon after, although there was no official word.

How many false positives does it take to add up to “many”? I’d guess that collateral damage is a single digit percentage at most — probably well under 5% of all filtered pages, and possibly less than 1%. It still hurts in a big way when it hits YOUR meticulously clean website. And even a penalty that incorrectly nails one site out of 300 can still affect quite a lot over the entire web.

When rankings tank, how often do you think it is due to an algorithmic issue versus an editorial action by search employees?

When there are lots of similar complaints at the same time, then it’s often a change in some algorithm factor. But if it’s just one site, and that site hasn’t done something radically new and different in recent times, then it’s more likely the ranking change came from a human editorial review.

Human editors are continually doing quality review on the high volume, big money search results. It can easily happen that something gets noticed that wasn’t seen before and that slipped through the machine part of the algorithm for a long time.

That said, it is scary how often sites DO make drastic errors and don’t realize it. You see things like:

  • nofollow robots meta tags getting imported from the development server
  • robots.txt and .htaccess configurations gone way wrong
  • hacked servers that are hosting cloaked parasite content

Google did a big favor for honest webmasters with their “Fetch as Googlebot” tool. Sometimes it’s the easiest way to catch what those hacker criminals are doing.
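
This kind of check is also easy to script yourself. Here is a minimal sketch using only the Python standard library (the site URL is a placeholder): it flags any stray noindex/nofollow robots meta tag and prints the live robots.txt for eyeballing:

    # Quick self-audit for two of the "drastic errors" above.
    import re
    import urllib.request

    site = "https://www.example.com"  # placeholder: your own site

    # 1. Did a noindex/nofollow robots meta tag escape the dev server?
    html = urllib.request.urlopen(site).read().decode("utf-8", "replace")
    for tag in re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I):
        if re.search(r"noindex|nofollow", tag, re.I):
            print("WARNING:", tag)

    # 2. Eyeball the live robots.txt for rules gone way wrong.
    print(urllib.request.urlopen(site + "/robots.txt").read().decode())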

When does it make sense for an SEO to decide to grovel to Google for forgiveness, and when should they try to fix it themselves and wait out an algorithmic response?

If you know what you’ve been doing that tripped the penalty, fix it and submit the Reconsideration Request. If you don’t know, then work on it — and if you can’t find a danged thing wrong, try the Google Webmaster Forums first, then a Request. When income depends on it, I say “grovel”.

I don’t really consider it groveling, in fact. The Reconsideration Request is one way Google acknowledges that their ranking system can do bad things to good websites.

I’ve never seen a case where a request created a problem for the website involved. It may not do any good, but I’ve never seen it do harm. I even know of a case where the first response was essentially “your site will never rank again” — but later on, it still did. There’s always hope, unless your sites are really worthless spam.

Many SEOs theorize that sometimes Google has a bit of a 2-tier justice system where bigger sites get away with murder and smaller sites get the oppressive thumb. Do you agree with that? If no, please explain why you think it is an inaccurate view. If yes, do you see it as something Google will eventually address?

I’d say there is something like that going on — it comes mostly because Google’s primary focus is on the end user experience. Even-handed fairness to all websites is on the table, but it’s a secondary concern.

The end user often expects to see such and such an authority in the results, especially when it’s been there in the past. So Google itself looks broken to a lot of people if that site gets penalized. They are between a rock and a hard place now.

What may happen goes something like this: an A-list website gets penalized, but they can repair their spam tactics and get released from their penalty a lot faster than some less prominent website would. It does seem that some penalties get released only on a certain time frame, but you don’t see those time frames applied to an A-list.

This may even be an effect of some algorithm factor. If you watch the flow of data between the various Google IP addresses, you may see this: There are times when the domain roots from certain high value websites go missing and then come back. Several data center watchers I know feel that this is evidence for some kind of white-list.

If there is a white-list, then it requires a history of trust plus a strong business presence to get included. So it might also make sense that forgiveness can come quickly.

As a practical matter, for major sites there can easily be no one person who knows everything that is going on in all the business units that touch the website.

Someone down the org chart may hire an “SEO company” that pulls some funny business and Google may seem to turn a blind eye to it, because the site is so strong and so important to Google’s end user. They may also just ignore those spam signals rather than penalize them.

Large authority site content mills are all the rage in early 2010. Will they still be an effective business model in 2013?

It’s tough to see how this could be quickly and effectively reined in, at least not by algorithm. I assume that this kind of empty filler content is not very useful for visitors — it certainly isn’t for me. So I also assume it must be on Google’s radar.

I’d say there’s a certain parallel to the paid links war, and Google’s first skirmishes in that arena gave them a few black eyes. So I expect any response to the cheap content mills to come slowly, and mostly by human editorial review.

The problem here is that NOT every provider of freelance content is providing junk – though some are. As far as I know, there is no current semantic processing that can sort out the two.

Given that free forums have a fairly low barrier to entry, there are false alarms nearly every day ringing in the next major update or some such. How do you know when a change is the real deal? Do you passively track a lot of data? And what makes you so good at taking a sea of tidbits and meshing them into a working theme?

I do watch a lot of data, although not nearly to the degree that I used to. Trying to reverse engineer the rankings is not as fruitful as it used to be — especially now that certain positions below the top three seem to be “audition spots” rather than actually earned rankings.

It helps to have a lot of private communications — both with other trusted SEOs and also with people who post on the forums. When I combine that kind of input with my study of the patents and other Google communications, usually patterns start to stand out.

When you say “audition spots”, how does that differ from “actually earned rankings”? Should webmasters worry if their rankings bounce around a bit? How long does it typically take to stabilize? Are there any early signs of an audition going well or badly? Should webmasters try to adjust mid-stream, and if so, what precautions should they take?

At least in some verticals, Google seems to be using the bottom of page 1 to give promising pages a “trial” to see how they perform. The criteria for passing these trials or “auditions” are not very clear, but something about the page looks good to Google, and so they give it a shot.

So if a page suddenly pops to a first page ranking from somewhere deep, that’s certainly a good sign. But it doesn’t mean that the new ranking is stable. If a page has recently jumped way up, it may also go back down. I wouldn’t suggest doing anything drastic in such situations, and I wouldn’t overestimate that new ranking, either. It may only be shown to certain users and not others. As always, solid new backlinks can help – especially if they are coming from an area of the web that was previously not heard from in the backlink profile. But I wouldn’t play around with on-page or on-site factors at a time like that.

There’s also a situation where a page seems to have earned a lot of recent backlinks but there’s something about those links that smells a bit unnatural. In cases like that, I’ve seen the page get a page one position for just certain hours out of the day. But again, it’s the total backlink profile and its diversity that I think is in play. If you’ve done some recent “link building” but it’s all one type, or the anchor text is too obviously manipulated, then look around for some other kinds of places to attract some diversity in future backlinks.

On large & open forums, lots of people tend to have vastly different experience sets, knowledge sets, and perhaps even motives. How important is your background knowledge of individuals in determining how to add their input into your working theme? Who are some of the people you trust the most in the search space?

I try never to be prejudiced by someone’s recent entry into the field. Sometimes a very new person makes a key observation, even if they can’t interpret it correctly.

There is a kind of “soft SEO” knowledge that is rampant today, and it isn’t going to go away. It’s a mythology mill, and it’s important not to base a business decision on SEO mythology. So I trust hands-on people more than manager types and front people for businesses. If you don’t walk the walk, then for me your talk is highly suspect.

I pay attention to how people use technical vocabulary — do they say URL when they mean domain name? Do they say tag when they mean element or attribute? Not that we don’t all use verbal shortcuts, but when a pattern of technical precision becomes clear, then I listen more closely.

I have long trusted people who do not have prominent “names” as well as some who do. But I also trust people more within their area of focus, and not necessarily when they offer opinions in some other area.

I hate to make a list, because I know someone is going to get left out accidentally. Let’s just say “the usual suspects.” But as an example, if Bruce Clay says he’s tested something and discovered “X”, you can be pretty sure that he’s not blowing sunshine.

Someone who doesn’t have huge name recognition, but who I appreciate very much is Dave Harry (thegypsy). That’s partly because he pays attention to Phrase-based Indexing and other information retrieval topics that I also watch. I used to feel like a lone explorer in those areas before I discovered Dave’s contributions.

What is the biggest thing about Google where you later found out you were a bit off, but were pretty certain you were right?

That’s easy! Using the rel=”nofollow” attribute for PR sculpting. Google made that method ineffective long before I stopped advocating it. I think I actually blushed when I read the comment from Matt Cutts that the change had been in place for over a year.

What is the biggest thing about Google where you were right on it, but people didn’t believe until months or years later?

The reality of the poorly named “minus 950” penalty. I didn’t name it, by the way. It just sort of evolved from the greater community, even though I kept trying for “EOR” or “End Of Results”.

At PubCon South I believe you are speaking on information architecture. How important is site structure to an effective SEO strategy? Do you see it gaining or losing importance going forward?

It is hugely important – both for search engines and for human visitors.

Information Architecture (IA) has also been one of the least well understood areas in website development. IA actually begins BEFORE the technical site structure is set up. Once you know the marketing purpose of the site, precisely and in granular detail, then IA is next.

IA involves taking all the planned content and putting it into buckets. There are many different ways to bucket any pile of content. Some approaches are built on rather personal idiosyncrasies, and other types can be more universally approachable. Even if you are planning a very elaborate, user-tagged “faceted navigation” system, you still need to decide on a default set of content buckets.

That initial bucketing process then flows into deciding the main menu structure. Next come the menu labels; this is the stage where you fix the actual wording and fold in keyword research. But if a site is built on inflexible keyword targets from the start, then it can often be a confusing mess for a visitor to navigate.
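
One lightweight way to keep that sequence honest is to rough out the buckets as plain data before any templates or menus exist. A hypothetical Python sketch (the buckets and slugs are invented for illustration):

    # Hypothetical content inventory: bucket first, derive the menu after.
    buckets = {
        "learn":   ["what-is-seo", "link-building-basics", "glossary"],
        "tools":   ["rank-tracker", "keyword-research"],
        "company": ["about", "contact"],
    }

    # The default menu falls out of the bucketing; the exact labels get
    # fixed later, when keyword research is folded in.
    for bucket, slugs in buckets.items():
        print(f"/{bucket}/")
        for slug in slugs:
            print(f"    /{bucket}/{slug}/")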

As traffic data grows in importance for search ranking, I do see Information Architecture finally coming into its own. However, the value for the human visitor has always been clearly visible on the bottom line.

What are some of the biggest issues & errors you see people make when setting up their IA?

There are two big pitfalls I run into all the time:

  • Throwing too many choices at the visitor. Macy’s doesn’t put everything they sell in their display windows, and neither should a website.
  • Using the internal organization of the business as the way to organize the website. That includes merely exporting a catalog to a web interface.

How would you compare PubCon South against other conferences you have attended in the past?

PubCon South is a more intimate venue than, say, Vegas. That means less distraction and more in-depth networking. Even though people do attend from all over the world, there is a strong regional attendance that also gives the conference a different flavor — one that I find a very healthy change of pace.

In addition, PubCon has introduced a new format — the Spotlight Session. One entire track is made completely of Spotlight Sessions with just one or two presenters, rather than an entire panel. These are much more interactive and allow us to really stretch out on key topics.

Thanks Tedster! If you want to see Tedster speak he will be at Pubcon Dallas on the 14th, and if you want to learn about working with him please check out Converseon. You can also read his latest musings on search and SEO by looking into the Google forums on WebmasterWorld. A few months back Tedster also did an interview with Stuntdubl.

SEO Book.com – Learn. Rank. Dominate.
How to FTP Files From Email Using an iPad

When the iPad was announced in January, it was the very first time Apple created something that I immediately wanted. I’ve never been an Apple fanboy; I don’t work on Macs for my primary workstations or laptop – I use them for testing and ensuring compatibility only. But the iPad was different. I immediately saw its purpose as a consumption device and its promise as a productivity tool.

The naysayers who tagged the iPad as “the iFail” seemed to focus on how it fails as a laptop or netbook, and how it lacks the features expected of those devices. While there are definite limitations to what you can do, and how easily certain tasks can be accomplished, I gave Apple the benefit of the doubt, based on their track record and knowing that this is a first-generation device. As Edgar Bronfman, Warner Music Group CEO, noted: “No one’s got rich betting against Steve Jobs.”

Having never purchased a Kindle or netbook, I was looking forward to the 3G iPad enabling me to essentially ditch my laptop while traveling and be a complete solution for my media consumption needs (reading, music, movies). Most of my colleagues derided the “working exclusively from the iPad” notion as wishful thinking, and told me to be sure to pack the laptop as well. After using the iPad for over a month now, I’m looking forward to proving them wrong this week as I thoroughly road test it during SMX Advanced. I’ve spent the last few weeks assembling the various tools (apps) that I’ll need, setting up accounts, passwords, etc., and can honestly say that, with the benefit of the cloud, there’s nothing I’d need to take care of while on the road that I can’t do on the iPad.

Most of the apps that I use or need relate to development – managing sites, updating sites, and keeping the trains running on time. To that end, these are the tools I employ on the iPad:

The Tools

The lack of multitasking (for now) on the iPad means accomplishing some tasks can be a little tricky or, at best, not intuitive. One such task involves taking an attachment from an email and getting it onto a server to use in a post, on a page, or otherwise link to. To accomplish this, you really just need one $0.99 app – GoodReader. (A rough desktop-scripting equivalent is sketched after the steps below.)

Step-by-Step: How To FTP Email Attachments on the iPad

  1. Download the GoodReader app
  2. Configure your email account(s) under “Connect to Servers” in the right pane
  3. Configure your FTP account(s) under “Connect to Servers”
  4. Connect to the email account with the attachment by selecting that account under “Connect to Servers”
  5. Find the attachment you need (GoodReader will automatically poll your account for any messages with attachments) and download it by selecting/clicking it in the email window
  6. Close the email dialog box – the downloaded file should now appear under “My Documents” in the left pane
  7. Click “Manage Files” in the right pane
  8. Select the attachment in the left pane, then in the right pane, click “copy”
  9. Click “Connect to Servers” and select the FTP server
  10. In the pop up window for the now-connected server, navigate to the folder you want to upload the file to
  11. Click “Paste” (button in lower right corner of server window)
  12. …and you’re done!
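
For the curious, here is roughly the same round trip scripted on a desktop. This is only a sketch of the general email-to-FTP idea, not what GoodReader actually does internally; the hostnames, logins, and upload folder are placeholders, and it assumes an IMAP mailbox and plain (unencrypted) FTP:

    # Sketch: pull the newest attachment over IMAP, push it over FTP.
    import email
    import ftplib
    import imaplib
    import io

    mail = imaplib.IMAP4_SSL("imap.example.com")          # placeholder host
    mail.login("you@example.com", "mail-password")        # placeholder login
    mail.select("INBOX")

    _, ids = mail.search(None, "ALL")
    _, data = mail.fetch(ids[0].split()[-1], "(RFC822)")  # newest message
    msg = email.message_from_bytes(data[0][1])

    for part in msg.walk():
        filename = part.get_filename()
        if filename:                                      # found an attachment
            ftp = ftplib.FTP("ftp.example.com")           # placeholder host
            ftp.login("ftp-user", "ftp-password")
            ftp.cwd("/public_html/uploads")               # placeholder folder
            ftp.storbinary("STOR " + filename,
                           io.BytesIO(part.get_payload(decode=True)))
            ftp.quit()
            break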






Michael Gray – Graywolf’s SEO Blog