
Although I have mentioned head & tail content on this blog a few times, I haven’t devoted an entire post to it, which is probably an oversight on my part.

The first step in creating head and tail content is to write the tail pieces first. I went into some detail about this in the post on keeping your articles focused. If you haven't already read it, you probably should, if only so that this article makes the most sense. Once you have all or most of the tail content written, it's time to think about the head. While I'll often use outsourced services for tail content (see my textbroker.com review), for head pieces I like to use higher quality writers, as this can sometimes become the flagship content of your website. You'll want to have the writer read all the tail sections and give them some ideas about how to link to them. You want links in the body with keyword-rich anchor text (see how to silo the content); you don't want "click here" used for anchor text.

look for ways to give tail pieces more than one head …

In some cases, your head content will be social content and not flagship content. In that case the only difference is that you’ll want to have offsite links mixed in as well. Social content that only links to pages on your website looks very self-serving. In some cases I will push a social piece with all external links then, after it’s run its course, I’ll edit the links and swap some of them for internal links to tail pieces.

Last, look for ways to give tail pieces more than one head. From the original example, let's say you've written tail articles about "Visiting the Empire State Building." You could create multiple head pieces like "Ten Must-See Destinations in NYC," "Historical Skyscrapers in the Big Apple," or "Family Friendly Destinations in New York City." By giving these tail pieces more crawling points, you do a better job of interlinking your website. If your head pieces are social pieces, this turns them into effective link hubs (see How to Diagnose and Improve Website Crawling).

So what are the takeaways from this post?

  • Start with writing tightly-focused tail posts
  • Tie the tail pieces together with an interlinked head post
  • Decide if your head piece will be flagship or social content, and add external links if it is social
  • Look for opportunities to create more than one head to link to the tail pieces


Michael Gray – Graywolf’s SEO Blog

Posted by RobOusbey

This post begins with a particular dilemma that SEOs have often faced:

  • websites that use AJAX to load content into the page can be much quicker and provide a better user experience
  • BUT: these websites can be difficult (or impossible) for Google to crawl, and using AJAX can damage the site’s SEO.

Fortunately, Google has made a proposal for how webmasters can get the best of both worlds. I'll provide links to Google documentation later in this post, but it boils down to some relatively simple concepts.

Although Google made this proposal a year ago, I don’t feel that it’s attracted a great deal of attention – even though it ought to be particularly useful for SEOs. This post is targeted to people who’ve not explored Google’s AJAX crawling proposal yet – I’ll try to keep it short, and not too technical!

I’ll explain the concepts and show you a famous site where they’re already in action. I’ve also set up my own demo, which includes code that you can download and look at.

The Basics

Essentially, sites following this proposal are required to make two versions of their content available:

  1. Content for JS-enabled users, at an ‘AJAX style’ URL
  2. Content for the search engines, at a static ‘traditional’ URL – Google refers to this as an ‘HTML snapshot’

Historically, developers have made use of the 'named anchor' part of URLs on AJAX-powered websites (this is the 'hash' symbol, #, and the text following it). For example, take a look at this demo – clicking menu items changes the named anchor and loads the content into the page on the fly. It's great for users, but search engine spiders can't deal with it.

Rather than using a hash, #, the new proposal requires using a hash and an exclamation point: #!

The #! combination has occasionally been called a ‘hashbang’ by people geekier than me; I like the sound of that term, so I’m going to stick with it.

Hashbang Wallop: The AJAX Crawling Protocol

As soon as you use the hashbang in a URL, Google will spot that you’re following their protocol, and interpret your URLs in a special way – they’ll take everything after the hashbang, and pass it to the site as a URL parameter instead. The name they use for the parameter is: _escaped_fragment_

Google will then rewrite the URL, and request content from that static page. To show what the rewritten URLs look like, here are some examples:

  • www.demo.com/#!seattle/hotels becomes www.demo.com/?_escaped_fragment_=seattle/hotels
  • www.demo.com/users#!name=rob becomes www.demo.com/users?_escaped_fragment_=name=rob

As long as you can get the static page (the URL on the right in these examples) to display the same content that a user would see (at the left-hand URL), then it works just as planned.
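To make this concrete on the server side, here's a minimal PHP sketch of the idea (my own illustration, not code from Google's documentation; render_snapshot() and ajax-shell.html are hypothetical names):

<?php
// Hypothetical snapshot builder: returns the same markup a JS-enabled
// user would see after the AJAX call for this fragment.
function render_snapshot($fragment) {
    $safe = htmlspecialchars($fragment, ENT_QUOTES);
    return "<html><body><h1>Content for: $safe</h1></body></html>";
}

if (isset($_GET['_escaped_fragment_'])) {
    // Googlebot rewrote www.demo.com/#!seattle/hotels into
    // www.demo.com/?_escaped_fragment_=seattle/hotels, so serve the
    // static 'HTML snapshot' for that state.
    echo render_snapshot($_GET['_escaped_fragment_']);
} else {
    // Normal visitor: serve the AJAX shell page, whose JavaScript reads
    // everything after the #! and loads the content on the fly.
    echo file_get_contents('ajax-shell.html');
}
?>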

Two Suggestions about Static URLs

For now, it seems that Google is returning static URLs in its index – this makes sense, since they don't want to damage a non-JS user's experience by sending them to a page that requires Javascript. For that reason, sites may want to add some Javascript that detects JS-enabled users and takes them to the 'enhanced' AJAX version of the page they've landed on.

In addition, you probably don't want your indexed URLs to show up in the SERPs with the '_escaped_fragment_' parameter in them. This can easily be avoided by having your 'static version' pages at more attractive URLs, and using 301 redirects to guide the spiders from the _escaped_fragment_ version to the more attractive one.

E.g., in my first example above, the site may choose to implement a 301 redirect from
www.demo.com/?_escaped_fragment_=seattle/hotels to www.demo.com/directory/seattle/hotels
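In PHP, that 301 could be implemented with a few lines like these (a sketch using the illustrative demo.com paths from above, not a definitive implementation):

<?php
// If a spider requests the _escaped_fragment_ version of a URL,
// permanently redirect it to the more attractive static address.
if (isset($_GET['_escaped_fragment_'])) {
    $fragment = $_GET['_escaped_fragment_'];   // e.g. "seattle/hotels"
    header('Location: http://www.demo.com/directory/' . $fragment, true, 301);
    exit;
}
?>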

 

A Live Example

Fortunately for us, there’s a great demonstration of this proposal already in place on a pretty big website: the new version of Twitter.

If you’re a Twitter user, logged-in, and have Javascript, you’ll be able to see my profile here:

However, Googlebot will recognize that as a URL in the new format, and will instead request this URL: http://twitter.com/?_escaped_fragment_=/RobOusbey

Sensibly, Twitter want to maintain backward compatibility (and not have their indexed URLs look like junk), so they 301 redirect that URL to: http://twitter.com/RobOusbey

(And if you’re a logged-in Twitter user, that last URL will actually redirect you back to the first one.)

 

Another Example, With Freely Downloadable Code

I’ve set up a demo of these practices in action, over at: www.gingerhost.com/ajax-demo

Feel free to have a play and see how that page behaves. If you’d like to see how it’s implemented from a ‘backend’ perspective, hit the download link on that page to grab the PHP code I used. (N.B.: I’m not a developer; if anyone spots any glaring errors, please feel free to let me know so I can correct them!)

 

More Examples, Further Reading

The Google Web Toolkit showcase adheres to this proposal; experimenting with removing the hashbang is left as an exercise for the reader.

The best place to begin further reading on this topic is definitely Google's own help pages. They give information about how sites should work to fit with this proposal, and have some interesting implementation advice, such as using server-side DOM manipulation to create the snapshot (though I think their focus on this 'headless browser' may well have put people off implementing this sooner).

Google’s Webmaster Central blog has the official announcement of this, and John Mueller invited discussion in the WMC Forums.

Between Google’s blog, forum and help pages, you should find everything you need to turn your fancy AJAX sites into something that Google can love, as well as your users. Have fun!

 



SEOmoz Daily SEO Blog

Posted by Fryed7

What is it we SEOs do? Most of our answers probably boil down to this: we help webpages rank higher in search engines by improving each of the three cornerstones of SEO. The first, technical problems – like indexable content, meta robots tags and URL structures – has been cracked by SEOmoz's awesome web app. Suddenly we can get a complete dashboard of errors to go and sort – easy.

Then, of course, there's the "trust" issue: getting authoritative and relevant links. With Open Site Explorer, advanced link analysis and data is now only a click away, and with the huge range of link building tips, strategies, and tactics here, it's fair to say that we've got the SEO ninja skills to go and create "trust-worthy" websites.


So that leaves content…

Content is abstract. It's irrational. It's hard for CEOs, managers and influential decision-makers to get their heads around. It's fantastic.

What’s the point in what you read?

We consume content to solve problems, be entertained and to satisfy curiosity. Based on where you are in a decision making process, you can divide ‘content’ into four different categories. This post is all about defining each category.

In an age of TweetDeck, RSS, five-sentence emails, and the internet supposedly making us stupid, who on earth is hanging around to read meaningful stuff? I mean, it's a bit over-rated when you've got to be checking your inbox every five minutes, keeping current with Twitter, and all these feeds, and then some…


The reason such technology exists is so we can be on the edge of stuff.

We can see and read the latest ideas, news and commentary. We can connect with people who share common interests and start a conversation. That kind of ‘content’ is a) meaningless to those who aren’t in the know and b) not particularly relevant a week or so down the line.

This is what is driving the web at the moment – current conversation. Everyone can chip in on what other people have to say. We all have our own circles of influence where we can share and spread ideas. We're all wittering away with our own little thoughts – it's not cohesive and it's unlikely to be useful to an outsider trying to figure it all out – at least on its own. I call this Blurb.

Blurb Content is conversation.

It’s two way. Blurb is exclusive in that it’s meaningless to those who don’t understand the community, who don’t know the secret handshake and who aren’t clued up on the topic – but for those who are “in the know”, blurb is where discussion, debates and drama define opinions and leads to decision making. Within the club, blurb is awesome.

We’re lucky on blogs like this to have really great conversations, fleshing out theories and the results from experiments; it attracts intelligent two-way conversation. It’s why you might tweet about it more, because there’s so much value in the conversation. It’s why you’re more likely to take action, because you’ve heard it thrashed out by a handful of the industry brains. It’s why you’re more likely to come back for more conversation.

Equally, there’s pretty useless blurb. “Great post” “really enjoyed it” or “tldr” which has no real value to other visitors, and therefore no real value to search engines either. The real power of blurb and UGC is things like this (YOUmoz), Threadless and – dare I say it? – Wikipedia. People have been empowered to go and create their own awesome corner of the web.

The Rule of Blurb – Culture Valuable Two-Way Conversation.

Conversation is the fuel of the web; and with hundreds of millions of us online, that's the potential for a big conversation. The problem we face, both as SEOs and as marketers in general, is initiating that conversation.
 

Who’s Gonna Break the Ice?



 

We can do this two ways:


1) Create content and ask for conversation (tweet this, leave a comment, let’s connect on facebook)

2) Create a system where you encourage other people to initiate conversation

Which way do you think is harder to replicate, will be more scalable, and will have more influence across the web in the long term? You said two, right? The question is: how? Let's go back to the SEOmoz model (because most of us have had a good look around this site and know it well, so it's doubly relevant):

What got you to the point of chipping into the conversation on here? What qualified you to know what you were talking about, and pitch in with something valuable? I bet that this blog post hasn't taught you everything you know about SEO (and if it had, you'd probably be resigned to saying: "great post. really interesting stuff" anyway).

The reason is that at some point in your SEO education, you've stumbled across someone or something with "the answers" – something that answers your questions fully, where somebody has simply communicated the concepts behind SEO to you in one or more pieces of content.

  • A good book…
  • An awesome video…
  • A seminar…

The fundamental difference is it’s a one-way conversation.

Consider this scenario: you're lost in a foreign city – you were supposed to be in an office meeting fifteen minutes ago. What do you do? You ask a local. They tell you how to get there. You listen and do what they say. They're the expert, so you listen.

Example two. You have a medical problem. You go to your doctor. Your doctor examines you and tells you your problem, and prescribes a cure. Sometimes you might be reluctant, but you trust their skills and expertise so you do exactly what they say.

You watch a talent show on TV and want to take up the guitar. You find a teacher and hang on their every word whilst trying to work out how to play chords. You may ask them to go over something again, but it’s still a one-way conversation.

This behaviour is typical of "newbies". Your mind is like a sponge; you're entirely receptive to someone else's ideas and explanations, and because of this you'll be able to understand and talk about the problem and solution – i.e. you can engage in the conversation on the web. This kind of content focuses and concentrates attention on one specific problem.

This is called Definitive Content.

This brings up three things:

1) Definitive content cultures conversation and decision-making

Definitive Content educates people so that, with their expanded knowledge, they can engage in conversation and make informed decisions. This content is educational. People who are searching for information have already identified that they're not comfortable making uninformed decisions. They're looking for "the answer."

2) Definitive content must be remarkable + awesome + white-paper-worthy.

In a world where attention is a scarce resource, your definitive content needs to stand out from the crowd and be worth the time spent consuming it. It must be remarkable in order to have conversation about it. It must also be jaw-droppingly awesome so reactions and remarks are positive. And it must be white-paper-worthy in order to address the problem fully without “selling” (that comes later).

3) Blurb is frustrating for learners because it isn't definitive

That’s why bloggers teaching stuff bitterly frustrates me. Back to basics, a ‘web log’ was originally meant for journalism, commentary and personal tales, and yet the platform has been stretched over other uses. So people now create niche blogs and post about something specific, perhaps offering tips. So far, harmless blurb…

Then they try writing something “definitive”…

This doesn’t work for three main reasons:

  • Bloggers are afraid of completing the article – they thrive on the conversations that evolve from a good blog post which doesn't quite close all the doors.
  • Bloggers are afraid of forcing their readers to spend too much time reading for fear they’ll get bored. Bloggers are dependent on ‘little and often’ readership.
  • Bloggers are possibly even afraid of spending extra time on “definitive content” for fear that they won’t be able to produce enough posts so readers will lose interest.

And what’s sad, is that after the first few days after the post is published, the traffic will drop down to a mere fraction of what it was, since your readership has simply “been there, done that”. Congratulations; you’re now in a business where your ‘product’ becomes worthless practically overnight.

Blogging is about the person, not the problem.

Blogging has it’s place creating blurb content, not definitive content (when you confuse the two, you have a personal problem). In fact, blogging could be considered a response to definitive content; it’s the ultimate example of user-generated content, or rather… user-generated conversation. The early days of SEOmoz saw Rand posting his commentary to SEO news.

Now, that’s not a stab at blogging – more a criticism of how people blog. Some of the best blogs about blogging use definitive content in order to bring newbies up to speed so their regular blurb is both relevant and newbies can talk about it. Darren Rowse’s Problogger is one of the biggest and best blogs about blogging, and even so Darren suggests buying the ProBlogger book in order to get all the details on starting up all in one place. And that makes sense, doesn’t it?

Everyone’s blogging like sheep, churning out loads of mediocre content. The world doesn’t need more content. It needs more remarkable, definitive content. Suddenly, those creating Definitive Content become somebody. Blogging has it’s place in it’s roots; a platform for commentary on news, personal affairs and creating conversation – not being manipulated out of place creating definitive pieces.

(There was a really interesting article about the Death of the Boring Blog Post which essentially outlines this problem from a design perspective. Apparently the answer is 'blogazines' – but this doesn't solve the fundamental problem of answering the question people are typing in. Pretty is impressive, but it isn't necessarily the best.)

Definitive content is the stuff which you reference, re-read, remember and in some cases – recite! Ever been in a position where you’ve been telling someone about an awesome book, or video that you’ve gotten a bit obsessed with? And what’s interesting, is even if it isn’t necessarily “current” or trending on Twitter, you’ll still reference it ‘cause it’s awesome. Hence, Definitive Content is evergreen – which means in the long run it’s a high effort-reward strategy.

Definitive Content Strategy

Step 1) Find an in-demand niche within a niche.
Step 2) Go be king.

In emerging industries, rarely have people launched with awesome definitive content. Instead, as the industry matures and begins to fragment – then the niche players can identify and distinguish themselves. A great example is looking at the search marketing industry:

  1. Cindy Krum created Rank-Mobile.com around 2007: a website selling her mobile marketing consultancy services. She's established herself as the go-to girl for all things mobile. She's reinforced this by literally writing the book on Mobile Marketing, and then supplementing it with her blog commentary on industry news – her blurb.
  2. David Mihm is the 'local search guy'. His collaboration with other top brains in the industry to create the Local Search Ranking Factors (currently in its third volume) not only helps define the fundamentals of search but also positions him and his website as experts. On top of this, he blurbs about local search all around the SEO space.
  3. Perry Marshall wrote the book on Google Adwords in 2006, as businesses began to wake up to Adwords and the program really began to take off. He offers expensive consulting-based direct marketing products to his email list, which he's built up by offering freebie definitive content for signing up (email courses, PDFs, MP3s, etc.).
  4. SEOmoz! Countless Definitive Content pieces like the Beginner's Guide to SEO or the Search Engine Ranking Factors articles, which get referenced by hundreds of SEO blogs and professionals. This is then supplemented with the SEOmoz and YOUmoz blogs, with the weekly definitive 'Whiteboard Friday' videos fueling the fire.
Timing is important when creating Definitive Content – I think there are two important factors:

  • Be the first.
  • Do it yesterday

All four of these people followed these two principles, and suddenly you've got four excellent examples where 'content is king'. No one anointed these people as experts – instead, they've written their way to the top, and they were first to do it.

Definitive content is all well and good, but if no one knows about you and it, then it's not going to be of much benefit. This is where my earlier question – creating content that asks for conversation vs. creating a system that asks for conversation – comes into play.

You’ve created your Definitive Content; now you’ve got to use your network, your social sphere of influence, your ‘leverage’ to promote it. Naturally, they use content – perhaps a review post, video, google ad – or even just a tweet – to introduce your Definitive Content. This is called Manifesto Content and this in itself is a behaviour search engines are also looking for.

Manifesto Content does the simple job of introducing the problem, introducing you, and introducing your way of answering that problem.

It pre-sells your Definitive Content. Think about the weight of links in this context; the origin of your inbound links will contain content of some sort (at least to provide value to a visitor) – that content is Manifesto Content. It’s kinda like a CV for the Definitive Content, and the better the Manifesto Content, the better your first impression – and first impressions count.


Manifesto Content distribution is a better way to consider link building. Link building is a game about numbers; Manifesto Content distribution is about building unmeasurable things like trust and credibility – which shows up to search engines as “link getting”.

  • Do link directories offer great introductory content to you and your website with just a title, few lines of text and dozens of other pieces of similar content around them?
  • Do guest posts or interviews for relevant related blogs offer great introductory content to you and your website?
  • Does a Twitter ‘win a widget’ competition asking for retweets offer great introductory content?

As I said at the beginning, content is abstract – hence the philosophical-esque questions! However, this thinking is essential if you're to come up with your own Manifesto Content marketing strategy. Here's a handful of articles on getting your Manifesto Content shared:

The size, strength, and distribution of your Manifesto Content will determine the overall strength of your web content – along with, of course, good SEO practices: ensuring it gets indexed, targets specific problem keywords, and is "technically tidy", so that your Manifesto Content gets targeted traffic and click-throughs.

Great. Now Show Me the Money.

Now, you’ve been introduced as a credible source of information, you’ve educated them and cultured conversation-making abilities so they can engage in blurb. They’re now in an informed discussion about their problem, and likely, your solution if you target your blurb correctly – and all the while, you’ve been earning trust and credibility as someone who know’s what they’re talking about…

Why wouldn’t they consider your solution you’re selling?

This removes the need to "hard sell". You don't need to be a copywriting jedi, because you've already built a level of equity that can't be copied, even by the best copywriters – they already know you and trust you. To hard sell would simply be a sign of insecurity and stupidity. That said, you need to be able to write sales copy with confidence so you don't fudge the important bit! Luckily, the brains at Copyblogger will teach you how to 'sell without selling' – here's their best definitive article on writing sales letters (with part 2 and part 3).

Roundup

That’s rather a lot to take in; so a quick roundup. The best way to illustrate how content strategy works is by comparing it to a jet engine.

A what…?!

Bear with me on this. A jet engine, at its most basic, has four parts: a front fan, a compressor, an ignition stage, and a back turbine with a nozzle – or, very simply: suck, squeeze, bang, blow (excuse the innuendo) – and these map exactly onto our four-part content funnel.

It’s essential that they all work together in order to produce results, like this:


  • Manifesto content is the Suck. It draws people into your content funnel.
  • Definitive content is the Squeeze. It focuses attention and educates prospects.
  • Blurb is the Bang. It’s where conversation and the magic happens.
  • Copy is the Blow. It’s where decisions become actions and the whole thing moves forwards.

What I particularly like about this analogy is that the actual physics matches the real-life SEO:

  • Most of the power of the engine comes from the front fan – the size, strength and distribution of your Manifesto Content will correlate to the overall output of your web content strategy
  • Without the compression stage, air doesn’t have nearly as much pressure for when it’s ignited – without Definitive Content, your content funnel doesn’t have nearly as much focus and attention to culture conversation
  • The burning reaction releases energy – conversation leads to decisions being made, opinions being formed and CHANGE.
  • In a jet engine, “exploding” gas is only going to go backwards – highly targeted, focused prospects with a problem, who are educated about their options and are engaging in conversation about their problem – are likely to make decisions (and buy).
  • The flow of fuel keeps the engine going round – the flow of conversation keeps the content funnel functioning and growing.

What this also helps explain is why guerilla-content SEO is so much better than 'traditional' advertising, which is more like a rocket: creating a reaction out of advertising bucks and "targeted" prospects and pointing it in some direction is complicated (it's rocket science) and not sustainable without continued effort.

This compares to the Manifesto > Definitive > Blurb > Copy content strategy, which is "evergreen" once you've created it. A 'definitive' piece of content will always be there, as will the articles linking to it. What it means is that your web content strategy (including search) is dependent on how you culture conversation. Let me introduce the concept of Tribes – Tribes are created when you connect people around a cause.

Seth’s talk on TED explains…

(If you haven’t come across Seth Godin before, you’re in for a treat Everyone who I’ve worked with who I’ve asked to watch this video has viewed it all the way through said it was awesome. Net result? We’ve both gotten more done.

So take just 17 minutes out and watch Seth’s talk to understand why Tribes will shape our future. If you really don’t have time now, keep this tab open and watch it over lunch or something.)

Finished the video?

This is what I see SEO as – getting into the problem-solving business… and not just solving your own problems. "I'm not ranking number 1 – I'll go and build some links." Put that in the context of Tribal SEO: "I'm not ranking number 1 – I'll go and promote manifesto content." Creating a tribe will drive your content. Tribes need to connect via blogs, online communities, social networks – in any case, you need to be at the helm and leading.

We have the responsibility to create awesomeness.


You’ve heard the ‘Voice of Google’, Matt Cutts, bangs on and on about creating content for visitors vs. creating content for search engines. He’s absolutely right – if you’re trying to make crummy content and webpages rank, just like trying to sell crummy products and services, then shame on you!

I’m gonna end with a couple of questions and an apology. I’ve broken one of the cardinal unwritten rules of blogging (keep it short, stupid!) and you’ve probably spent waaaay too much time reading and watching all this. Whoops…

But then again, does Definitive Content need a cap on its length? Shouldn't it be as long as it needs to be? Which begs the question: how would you classify this post based on the scale I've talked about?

  • Is it Manifesto Content? Does it introduce you to new problems, people and answers?
  • Is it Definitive Content? Sure, I introduce a few ideas and articulate them in a way you’ve perhaps not seen before – but I haven’t “written the book” on Tribal SEO so to speak. Heck, I’m just a kid – why would you share and bookmark this? So far this is just a hypothesis – I need to enlist help in defining and proving these principles, which leads me to…
  • Blurb. Is this merely a topic for discussion, something that'll be today's topic of conversation and yet will be forgotten by this time tomorrow?
  • Or is it copy? Me, shamelessly trying to promote myself or the Mozzers in a bid for private gain!

Secondly, how do you see this Manifesto > Definitive > Blurb > Copy content cycle fitting in with the Whiteboard Friday concept of 'The Path to Conversion' and your business?

And finally, do you think that ‘Tribes’ make an effective long-term SEO strategy in your business, or any other business that springs to mind?

Let’s chat.



SEOmoz Daily SEO Blog

In yesterday’s post, we spoke about why you would want to change your content based on traffic intent. In today’s post, I’m going to give you a basic framework about how to do it. This post is written as a Thesis Tutorial, because working with Thesis is just easier (see my Thesis review), but you can easily adapt the code to any website or theme.

OK. Like all Thesis customizations, we’re going to need to open up the custom_functions.php file. In this example, we are going to offer different social buttons based on where the user came from. If they came from a social site, we’ll show the interactive buttons with counts/votes. If they came from anyplace else, we’re going to show static graphic icon buttons.

For the website I’m using I put the buttons under the author byline, so my code will go in that function (if you are going to copy and paste, wait until the end for the final code so you get all the semicolons and parentheses).

//this is author byline
function uauthor_byline()

I’m going to make sure the buttons only appear on single pages so I’ll need the following bit of code:

if (is_single())

OK, now we get to the programming. We'll need two variables and one array. The variables will hold the referring URL and a flag that tells whether that condition is true, and the array will hold the list of sites we are checking against.

$CUsocial = false; // this is the flag for social traffic
$CUref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : ''; //this is the referring URL (may be empty if no referrer was sent)
$CUsocialar = array('reddit', 'stumbleupon', 'digg', 'twitter', 'facebook', 'delicious'); // this is an array of social sites

So what we want to do next is take the list of social sites and see if any of them are in the referer. If they are, we set the flag to true.

//check all the social sites
$i = 0;
while ($i < count($CUsocialar)) { //note: < rather than <=, or we'd read past the end of the array
    if (stristr($CUref, $CUsocialar[$i])) {
        $CUsocial = true;
    }
    $i++;
}

Now, if you’re a stickler, you could make the case that, if Twitter was in the filename and not the domain, we could get a false positive, and you would be correct. I just don’t think that’s going to happen often enough to be a real concern. OK so now we know whether the referring site is any of the social websites we want to trap for. If it is, the $CUsocial variable will be ‘true’ so we’ll need this bit of code:

if ($CUsocial) {
    //code for active social websites goes here
} else {
    //code for default condition goes here
}

The code above has a placeholder for the buttons you can get from places like digg, stumbleupon and facebook. Since there are so many tutorials and instructions from the original websites, I just left placeholders. Here’s the full code:

function uauthor_byline() {
    if (is_single()) {
        $CUsocial = false; // this is the flag for social traffic
        $CUref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : ''; //this is the referring URL
        $CUsocialar = array('reddit', 'stumbleupon', 'digg', 'twitter', 'facebook', 'delicious'); // this is an array of social sites
        //check all the social sites
        $i = 0;
        while ($i < count($CUsocialar)) {
            if (stristr($CUref, $CUsocialar[$i])) {
                $CUsocial = true;
            }
            $i++;
        }
        if ($CUsocial) {
            //code for active social websites goes here
        } else {
            //code for default condition goes here
        }
    }
}

The following is just a starting point and can be re-used and expanded upon. For example, if you want to trap for search engines, here’s the extra code you would need:

$CUsearch = false; //this is the flag for search traffic
$CUsearchar = array('google', 'yahoo', 'bing'); //this is an array of search sites

//check all the search sites
$i = 0;
while ($i < count($CUsearchar)) { //count the search array here, not the social one
    if (stristr($CUref, $CUsearchar[$i])) {
        $CUsearch = true;
    }
    $i++;
}

But you can use the code all over the site: content, header, sidebar, etc. You can combine it with date-based triggers (a sketch follows below), and there are many, many different possibilities if you spend time playing with the code.
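As a quick illustration (my own sketch, building on the $CUsocial flag defined above, with made-up campaign dates), a date-based trigger combined with the social check might look like this:

//show a promo to social visitors, but only during a set campaign window
$CUpromoStart = strtotime('2010-12-01');
$CUpromoEnd   = strtotime('2010-12-31');
$CUnow        = time();

if ($CUsocial && $CUnow >= $CUpromoStart && $CUnow <= $CUpromoEnd) {
    //campaign code for social visitors goes here
} elseif ($CUsocial) {
    //regular social button code goes here
} else {
    //code for default condition goes here
}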


Michael Gray – Graywolf’s SEO Blog

A few weeks ago, Brent Payne made a post about "whitehat cloaking" and changing your content based on the referring website. He asked for some feedback on Twitter, causing some follow-up discussions. I had a few people asking for examples of how to do this. In this two-part post, we'll first look at some theory about why you would want to do it, under what circumstances, and how to do it without angering the Google Gods. In tomorrow's post, "How to Conditionally Change Your Content," I'll give you some ideas about how to implement this.

Let’s talk about the high level strategy items first. Why would you want to serve different content to different users:

  • Social media traffic is advertising averse, so show them fewer ads and more social oriented content
  • Search traffic can be goal/purchase oriented, so try to serve them content designed to help you do that
  • Direct traffic can get the full brand treatment designed to build subscribers, regular visitors, or a sense of community

To use a cooking metaphor, I'm not serving each of these people a different meal, but I'm varying the seasoning to suit each guest's individual tastes. Let's get past the superficial. What are some things you could do differently for, say, social media traffic? Under most circumstances, social media visitors don't click AdSense ads, banner ads, and that sort of thing. For social media traffic, your best outcome will be getting them to link to your page, vote for or retweet your page, or visit other pages. What you want to think about is how you can change your content to help you meet those goals.

With Google’s announcement that site speed is a factor, many savvy webmasters opted out of third party buttons and began to use smaller, lightweight, on-site graphics. While this helps with site speed, it doesn’t help with social engagement. If you want more social interaction show bigger buttons up top, especially the third party buttons with active vote/tweet counts. I would remove as much advertising as you could. I would replace this with graphics or sections featuring other social content. If you use tags to isolate your social content it would be easy to pull out using a DB query. How about showing your most popular or most emailed pages. Rather than showing a social media audience 25 pages of your top 25 list, consolidate all of the content onto one page.

What about search traffic? How can you change the content to better suit their needs? In some cases you may want to remove content like the sidebar, making your pages more like single-page squeeze pages. Of course, this will depend on the page content – say, a product page. You may want to be more aggressive with advertising placement if you run an AdSense or affiliate website. You could also vary the advertising a bit. I've spoken before about using tags to target your advertising, but why not use search query terms as well? If someone came to your website searching for [cheap mexico vacations], normally you would just serve them ads about cruises, hotels, or vacation packages to Mexico. However, if you trapped for search queries containing the word [cheap], you might also want to mix in some value-based vacation advertising.
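To sketch how that query trap might look in PHP (my own illustration, following the variable naming style of tomorrow's tutorial; Google and Bing pass the query in a q= parameter, Yahoo used p=):

$CUref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : ''; //referring URL, if any
$CUquery = '';
$CUparts = parse_url($CUref);
if (!empty($CUparts['query'])) {
    parse_str($CUparts['query'], $CUparams); //split the query string into parameters
    if (isset($CUparams['q'])) { //Google and Bing use q= for the search query
        $CUquery = strtolower($CUparams['q']);
    }
}
if (strpos($CUquery, 'cheap') !== false) {
    //show value-based vacation advertising here
} else {
    //show the regular ad mix here
}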

While there are some advantages to doing this, there are some pitfalls as well. This type of behavior makes for a more complicated website to maintain and run, so make sure you have the resources for the long haul. Second, you have to be concerned about the search engines and giving the appearance of cloaking with ill intent. The more dramatically the main content differs from one version to another, the more likely it is to upset a search engine. For example, if you serve a 1,400-word article to direct traffic, a 700-word trimmed-down version to search traffic, and a 400-word version to social traffic, you are taking some risks. I would make sure that search engine bots get a version that is very close, if not identical, to the version that users coming from a search engine will get.

In the next post I’ll walk you through some of the steps on How to Conditionally Change Your Content from a programming perspective.


Michael Gray – Graywolf’s SEO Blog

Posted by Stephen Tallamy

At the end of last year, the website I work on, LocateTV, moved into the cloud with Amazon Web Services (AWS) to take advantage of increased flexibility and reduced running costs. A while after we switched, I found that Googlebot was crawling the site almost twice as much as it used to. Looking into it some more, I found that Google had been crawling the site from a subdomain of amazonaws.com.

The problem is, when you start up a server on AWS it automatically gets a public DNS entry that looks a bit like ec2-123-456-789-012.compute-1.amazonaws.com. This means that the server will be available through this domain as well as the main domain that you will have registered to the same IP address. For us, this problem doubled itself, as we have two web servers for our main domain, and hence the whole of the site was being crawled through two different amazonaws.com subdomains as well as www.locatetv.com.

Now, there were no external links to these AWS subdomains but, Google being a domain registrar, it was notified of the new DNS entries and went ahead and indexed loads of pages. All this was creating extra load on our servers and a huge duplicate content problem (which I cleaned up, after quite a bit of trouble – more below).

A pretty big mess.

I thought I’d do some analysis into how many other sites were being affected by this problem. A quick search on Google for site:compute-1.amazonaws.com and site:compute.amazonaws.com reveals almost 1/2 million web pages indexed (often dodgy stats with this command but it gives some scale of the issue):


My guess is that most of these pages are duplicate content with the site owners having separate DNS entries for their site. Certainly this is the case for the first few sites I checked:

  • http://ec2-67-202-8-9.compute-1.amazonaws.com is the same as http://www.broadjam.com
  • http://ec2-174-129-207-154.compute-1.amazonaws.com is the same as http://www.elephantdrive.com
  • http://ec2-174-129-253-143.compute-1.amazonaws.com is the same as http://boxofficemojo.com
  • http://ec2-174-129-197-200.compute-1.amazonaws.com is the same as http://www.promotofan.com
  • http://ec2-184-73-226-122.compute-1.amazonaws.com is the same as http://www.adbase.com

For Box Office Mojo, Google is reporting 76,500 pages indexed for the amazonaws.com address. That's a lot of duplicate content in the index. A quick search for something specific like "Fastest Movies to Hit $100 Million at the Box Office" shows duplicates from both domains (plus a secure subdomain and the IP address of one of their servers – oops!):


Whilst I imagine Google would be doing a reasonable job of filtering out the duplicates when it comes to most keywords, it’s still pretty bad to have all this duplicate content in the index and all that wasted crawl time.

This is pretty dumb for Google (and other search engines) to be doing. It's pretty easy to work out that both the real domain and the AWS subdomain resolve to the same IP address and that the pages are the same. They could be saving themselves a whole lot of time crawling URLs that exist only because of a duplicate DNS entry.

Fixing the source of the problem.

As good SEOs, we know that we should do whatever we can to make sure that there is only one domain name resolving to a site. There is no way, at the moment, to stop AWS from adding the public DNS entries, so a way to solve this is to make sure that if the web server is accessed via the AWS subdomain, it redirects to the main domain. Here is an example of how to do this using Apache mod_rewrite:

RewriteCond %{HTTP_HOST} ec2-123-456-789-012.compute-1.amazonaws.com
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]

This can be put either in the httpd.conf file or the .htaccess file and basically says that if the requested host is ec2-123-456-789-012.compute-1.amazonaws.com then 301 redirect all URLs to the equivalent URL on www.mydomain.com.
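If you can't edit httpd.conf or .htaccess, a rough PHP equivalent (my own sketch, not from the original setup) placed at the top of your front controller would be:

<?php
//301 any request arriving via the AWS public DNS name to the
//equivalent URL on the canonical domain
$host = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : '';
if (stripos($host, 'amazonaws.com') !== false) {
    $uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '/';
    header('Location: http://www.mydomain.com' . $uri, true, 301);
    exit;
}
?>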

This fix quickly stopped Googlebot from crawling our amazonaws.com subdomain addresses, which took considerable load off our servers, but by the time I’d spotted the problem there were thousands of pages indexed. As these pages were probably not doing any harm I thought I’d just let Google find all the 301 redirects and remove the pages from the index. So I waited, and waited, and waited. After a month the number of pages indexed (according to the site: command) was exactly the same. No pages had dropped out of the index.

Cleaning it up.

To help Google along, I decided to submit a removal request using Webmaster Tools. I temporarily removed the 301 redirects to allow Google to see my site verification file (obviously it was being redirected to the verification file on my main domain) and then put the 301 redirect back in. I submitted a full site removal request, but it was rejected because the domain was not being blocked by robots.txt. Again, this is pretty dumb in my opinion, because the whole of the subdomain was being redirected to the correct domain.

As I was a bit annoyed that the removal request would not work the way I wanted, I thought I'd leave Google another month to see if it found the 301 redirects. After at least another month, no pages had dropped out of the index. This backs up my suspicion that Google does a pretty poor job of finding 301 redirects for stuff that isn't in the web's link graph. I have found this before, where I have changed URLs, updated all internal links to point at the new URLs, and redirected the old URLs. Google doesn't seem to go back through its index and re-crawl pages that it hasn't found in its standard web crawl to see if they have been removed or redirected (or if it does, it does it very, very slowly).

Having had no luck with the 301 approach, I decided to switch to using a robots.txt file to block Google. The issue here is that, clearly, I didn't want to edit my main robots.txt to block bots, as that would stop crawling of my main domain. Instead, I created a file called robots-block.txt that contained the usual blocking instructions:

User-agent: *
Disallow: /

I then replaced the redirect entries in my .htaccess file with something like this:

RewriteCond %{HTTP_HOST} ec2-123-456-789-012.compute-1.amazonaws.com
RewriteRule ^robots\.txt$ robots-block.txt [L]

This basically says that if the requested host is ec2-123-456-789-012.compute-1.amazonaws.com and the requested path is robots.txt, then serve the robots-block.txt file instead. This means I effectively have a different robots.txt file served from this subdomain. Having done this, I went back to Webmaster Tools, submitted the site removal request, and this time it was accepted. "Hey presto", my duplicate content was gone! For good measure, I replaced the robots.txt mod_rewrite with the original redirect commands to make sure any real users are redirected properly.

Reduce, reuse, recycle.

This was all a bit of a fiddle to sort out, and I doubt many webmasters hosting on AWS will have even realised that this is an issue. It is not purely limited to AWS, as a number of other hosting providers also create alternative DNS entries. It is worth finding out what DNS entries are configured for the web server(s) serving your site (this isn't always easy, but you can use your access logs/analytics to get an idea) and then making sure that redirects are in place to the canonical domain. If you need to remove any indexed pages, then hopefully you can do something similar to the solution I described above.

There are some things that Google could do to help solve this problem:

  • Be a bit more intelligent in detecting duplicate domain entries for the same IP address.
  • Put some alerts into Webmaster Tools so webmasters know there is a potential issue.
  • Get better at re-crawling pages in the index not found in the standard crawl, to detect redirects.
  • Add support for site removal when a site-wide redirect is in place.

In the meantime, hopefully I’ve given some actionable advice if this is a problem for you.



SEOmoz Daily SEO Blog

Posted by Aaron Wheeler

I’ve always liked encyclopedias; when I was in middle school I started using Encarta on CD-ROM, and sure, I usually needed it for "help" with my homework, but sometimes I would stray to non-copy-and-pasting-from-encyclopedia activities and watch terribly animated videos of war battles or Shakespearean plays. My poor children will never know the joys of a succinct five page article on the American Revolution with an accompanying 30-second 160 X 200 resolution video! I suppose they’ll have to make due with the way too informative Wikipedia article and an accompanying overly high-def retelling of events – do they really need to be able to see Benjamin Franklin’s hickeys?

Anyways, if my aforementioned future kids do end up needing to write about the American Revolution, and you have a great site about it, how can you make sure they end up seeing your content? There are a lot of reasons why it can be hard to rank for reference content, but fortunately, Whiteboard Friday is here to help! This week, Rand discusses some great ways to get your reference content to the top of the SERPs.


 

Video Transcription

Howdy, SEOmoz fans! Welcome to another edition of Whiteboard Friday. This week we're talking specifically about reference content, which is a type of content that often has a tough time earning external links and a tough time getting rankings and visibility in the search results. Yet a lot of people are (a) interested in it from a searcher perspective, and (b) marketers are interested in ranking for that type of topic so that they can draw in traffic to help brand their site, to sell advertising, to build themselves up as industry authorities, and sometimes even to make direct sales as they relate to that reference content.

So, let’s start with some tips, some specific action items that you can take that will help your reference content get more rankings. When I talk about reference content, I mean everything from, like, dictionary-type definitions to encyclopedic types of content to how-to content. Anything that is sort of less about a news item, an exciting development, or a blog post and more like a piece of content that is simply informational in nature and designed to provide sort of an evergreen long-term resource. It’s tough to get this stuff ranking, but I think we can help.

First off, let’s talk about keyword usage. As you’re building out this content, a lot of people think, "All right. I need to have a certain number of the target keywords and I’m going to use these keyword varia
tions and I’m going to have this keyword density." I talk about the keyword density myth a lot of the time. The problem is, and I think one of the reasons it doesn’t resonate with folks or why people still say, "You know what, I think Rand is full of it on keyword density. It totally works," is because it is true that in many cases you can have a keyword that is used a certain number of times, a small number of times on a page and you can increase the number of times that it is used on a page and see the rankings go up. People say, "Well, that’s proof that keyword density works." In some semantic form, that is technically correct.

The problem is that density itself is almost certainly not the metric that search engines are using. So when you use that metric, you might be conflating different variables. It could indeed be the case that adding more keywords, which technically increases what could be measured as density, is helpful. But density itself is a bad way to measure things. What I’d urge you to think about instead is, "Am I hitting all of these items, and am I doing a good job with them?" If I am, chances are good that increasing or measuring my density is going to add no value. Certainly, it’s the case that the search engines don’t measure it, and we don’t want to be doing things that are known not to be used by the engines.

So, using the keyword in the title element, preferably at the beginning of the title, is really good, particularly for reference content. People want to see right in the title in the search engine results that your page is about the content they’re searching for. The H1 headline may not help all that much; or, more specifically, using the H1 tag to designate your headline, as opposed to just having it big, bold, and at the top of the page, may not help that much. So if it is a pain, I wouldn’t worry about it. But if you can, it is a nice, good-semantics thing to do. Good web standards.

Certainly, having it in the headline, whether you’re using the H1 tag or not, is important, because when someone clicks on that result and reaches your page, you want to reinforce right there at the top of the page, in the headline, that the content they’ve reached is about what they searched for and what they just clicked on. When you have a disconnect between those words and phrases, I really worry that a lot of times your bounce rate will increase and you’ll see people leaving the page. It makes good sense from a usability perspective.

The meta description is certainly a good place to use it. It will get bolded and highlighted in the search result, even though it doesn’t directly help with rankings.

The URL, same story, although URLs do seem to have some nice correlation with rankings; it looked in our ranking models like there may be some causal influence there as well. Certainly, you can see that when you change over to search-friendly URLs that use the keywords, those are very nice for SEO purposes.

The body tag usage. This is where people get super obsessed with keyword density. Most of the time, unless your article is really huge, I don’t worry very much about keyword density or the number of times you use it. You use it a few times, you use it the number of times that it makes sense in the document — two, three, four, five, six, right. Those are fine. But I wouldn’t obsess about like, "Okay. Wait. I think we have it nine times here. We should only have it eight because the average of the top ten is that they’re only using it X many times." Get out of town. Like, no way, man. This stuff is not helping here. It is good to use it in the body tag.

It also is surprisingly good to use it in things like the image alt attribute and in the file name of an image that’s on the page. I don’t know what it is. It could just be correlation rather than causation, but it turns out that the image alt attribute is more highly correlated with rankings than H1s are. So, maybe it’s just the case that people like having images that are on topic. Or maybe the search engines actually do have a preference about this kind of stuff.
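If you want to sanity-check these placement points on your own pages, here is a minimal sketch in Python. It is not a tool mentioned in the video, just an illustration; keyword_placement is a hypothetical helper, and the regex-based parsing is deliberately naive and assumes reasonably well-formed markup.

    import re

    def keyword_placement(url, html, keyword):
        # Report whether the keyword shows up in each on-page element
        # discussed above. Lowercase everything for a case-blind match.
        kw = keyword.lower()
        text = html.lower()

        def in_tag(pattern):
            m = re.search(pattern, text, re.DOTALL)
            return bool(m) and kw in m.group(1)

        return {
            "title": in_tag(r"<title[^>]*>(.*?)</title>"),
            "headline (h1)": in_tag(r"<h1[^>]*>(.*?)</h1>"),
            "meta description": in_tag(
                r'<meta[^>]*name=["\']description["\'][^>]*content=["\'](.*?)["\']'),
            "url": kw.replace(" ", "-") in url.lower(),
            "image alt text": in_tag(r'<img[^>]*alt=["\'](.*?)["\']'),
            # Crude whole-document count; two to six uses is "the number
            # of times that makes sense," not a density target.
            "used 2-6 times": 2 <= text.count(kw) <= 6,
        }

Run it over a handful of your reference pages and the gaps, say a keyword missing from the title or no keyworded alt text, jump out immediately.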

You should definitely be worried about readability. If a normal, average user comes to the page and reads it, but the material is not connecting with them and doesn’t make great sense, get out of there. It’s trouble. This is one of the ways that SEOs and people running independent websites can really compete with Wikipedia, which is oftentimes hard to read, hard to parse, hard to understand, and not tremendously well written. It’s written by a group of authors a lot of the time, and a lot of the material can be dense. The same goes for a lot of professionally published content that just isn’t as accessible.

Completeness. This is one of the things I definitely think about, and it relates back to topic modeling and the LDA stuff, to whatever extent that’s being used. Certainly it seems like it is being used to some substantive effect, but we don’t know exactly how much. Being able to comprehensively cover the topic you’re talking about will mean that more people like your content, reference it, use it, enjoy it, and share it with their friends. It means they are getting value out of it, which means that metrics like time on site and browse rate will go up, which might or might not help your SEO but will certainly help your site metrics. You care about those, too.

Then, I think a lot about the angle that you’re taking with your writing: I’m going to take a research-driven angle, or an opinion-driven angle, or I’m going to show all the different controversial sides of an issue, or I’m going to walk through the history of it. Having an angle that is unique means people say, "Wow, when I visit SEOmoz, I feel like I get a really thorough understanding of all the issues around a particular topic. Or I get a very opinionated piece from Rand about what he thinks about a particular SEO tactic and how people have used it, and then I get different sorts of opinions in the comments." The angle that you take can brand your site, your domain, and your company as having useful information on that topic. All of these things are far better to think about. If you nail those, you’re going to win out over keyword density.

Next item that you do have to worry about with reference content is architecture — internal architecture and internal linking. We talk about this ideal link architecture, the ideal pyramid, a lot. You start with your home page. If you can do this thing where you’ve got a hundred links approximately-ish per page, a hundred unique links, and that’s linking down to the second level with all of your categories and each of those are linking down to subcategories, you can get to a million pages in just one, two, three hops. Three hops from any single page on a site to a million subpages means that even the most robust quantity of reference content can be reached in a small number of clicks. That portends really good things for search engines and for users who are trying to parse through your material and potentially surf your site.

This is a great way to think about organizing your site. You’re never going to get to this perfect layer, but if you can think about this organization as a structure as you’re planning, it would be very helpful. You don’t have to do this with your home page either. If you think about something like a sitemap, an HTML sitemap on your site that you link to in the footer of every page and that page links to all of these and then they all link to these, you’ve accomplished the same thing. You’ve basically made it three or maybe four hops from any page on your site to a million pages. That’s a really good thing.
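The arithmetic behind that pyramid is worth spelling out: with roughly 100 unique links per page, each hop multiplies the number of reachable pages by about 100, so three hops covers about a million URLs. A quick sketch:

    LINKS_PER_PAGE = 100  # the "hundred links approximately-ish" per page

    for hops in range(1, 4):
        print(hops, "hop(s): up to", f"{LINKS_PER_PAGE ** hops:,}", "pages")

    # Output:
    # 1 hop(s): up to 100 pages
    # 2 hop(s): up to 10,000 pages
    # 3 hop(s): up to 1,000,000 pages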

You should also be thinking about things like using categories and subcategories intelligently. You can’t just be listing content. Those category and subcategory pages have to be useful and valuable in and of themselves. We’ve talked about that a little bit in the past here on SEOmoz, too. The relevance and usefulness of those pages is going to predict whether they themselves can draw in links. If these pages can draw in external links, that’s going to help all the pages they point to down below to rank better and to earn more linkages, PageRank, and trust, metrics that will flow down through a site.

I think it is very important and very wise to look at models like what Wikipedia, the NY Times, and About.com have done with cross-referencing content at deep levels. When you get to these deep pages and one has a link back up to its category and over to another page it references in the content, that’s super useful from a visitor’s standpoint because they’ll click more. You might get a higher browse rate and higher pages per session, as well as driving SEO value, in that the search engines might see that a page is linked to, follow those links out from there, and pass more link juice and more crawling power across those pages.

The last one, and I know the most challenging one, is earning external links. Reference content, are you kidding me? It just doesn’t get linked to, you know. How are you going to win with this stuff? But there are ways. Successful companies have done really good things on this front. The first one I recommend is from the content perspective. Multimedia content, visual explanations, these kinds of things rock. I was pointing today on Twitter to a post from King Arthur Flour. Can you think of a more boring company? King Arthur Flour? Are you kidding me? They have an amazing blog. Their blog has earned hundreds, thousands of links because they’ve produced these blog posts that are sort of reference content about how to bake French bread and how to do no-knead bread. What they do is make them highly multimedia intensive. So, every step of the way they’ve got photo after photo after photo after photo. Tons of comments. People just loving it to death. Granted, you know, they’re in a moderately interesting area of recipes, but it is super competitive, and yet they rank for this stuff. They’re able to draw people in. And they can show off the fact that, you know, King Arthur Flour is sort of very highly rated for this kind of thing by other professional chefs, etc. Those visual explanations, the video content, they rock, right. You’re watching Whiteboard Friday, huh?

Next piece that I really like is doing things with research content as well as like charts, graphs, and data. Even if you take your data from third party sources and you reference back to it, if you’re the one who produces the actual visual chart, other people who want to embed that chart, want to use it in presentations, want to use it in blog posts, who want to talk about it, are going to use your materials. You can check out the SEOmoz free charts section where we take a bunch of data that’s from sources like Eightfold Logic and comScore and Nielsen and Hitwise, put them all together, and then put them into interesting charts that other people can reference and embed on their pages. Of course, they’ll link back to those original sources, as well as to us. Those are great ways to get your reference content to actually earn those links.

The last one covers two methods to go out there and do distribution: licensing and translation. These tactics are ideal because all these other sites that are copying your content will be linking in to your work, referencing back to the original. That means that even though these copies might technically be duplicate content, when the engines see them referencing your single source, especially multiple sites referencing your single source, they’re going to know yours is the original. You can do this with licensing, where you say, "Hey, I know you are in this industry and you’d like to license out some content. I’ll be the reference resource for you. You can put this stuff on your site."

It’s brilliant, too, for translation. As the Web is getting more global, more people are interested in this. More people are trying to rank for search content in all sorts of other countries. You can say, "Oh, buon giorno! Would you like to translate this piece into Italiano?" Right? Those kinds of things are absolutely phenomenal.

By the way, I had a great time in Milan with some friends from WebRanking.it and Marco, exceptional experience. The Social Media Conference there had 25,000 people come to it. It’s insane. People care about SEO overseas, and you can leverage that to get these translated articles out there on the Web and then to have the links point back to you. What does it look like to Google when ten sites from all over the world are all pointing back to your reference articles? It looks like you’re going to win at SEO.

All right, everyone. I hope you’ve enjoyed this edition of Whiteboard Friday. I hope you’ll join us again next week for another one. Take care.

Video transcription by SpeechPad.com



Follow SEOmoz on Twitter! Also, you can always follow me, Aaron.




SEOmoz Daily SEO Blog

Posted by Dr. Pete

Post image: red car in a sea of white cars

When you’re a small fish in a sea of competitors, getting noticed by search engines is never easy. If you’re a car dealer, local restaurant, real estate agent, lawyer, doctor, etc., you’re not only competing with hundreds of other businesses just like yours, but when it comes to link-building, everyone is trying to pick the same low-hanging fruit.

Strong content that attracts natural links can really help break the mold of low quality directories, blog comments, and spammy article marketing, but where do you start and how do you stand out? The world only needs so many mortgage calculators. I’d like to offer a few tactics to get you moving (and thinking) in the right direction.

Car Dealers (Tactic: Positive UGC)

Many companies are afraid of user-generated content (UGC). They imagine the worst – negative comments, brand-bashing, customer service horror stories. Although that fear is often overblown, it’s easy to sympathize. It is possible, though, to use UGC and still control the message.

Let me illustrate with a story. In the late 90s, my parents bought a Saturn. Back then, Saturn was known for their unique buying experience – when you signed your paperwork, they took your picture, posted it on the wall, and the employees all came out and cheered. It was a little odd, admittedly, but it was definitely a memorable experience.

Why not use that same approach online? Find your brand evangelists – ask your customers to submit photos of themselves with their cars, for example. This type of positive UGC has a number of advantages:

  • You’ll tend to attract brand loyalists.
  • People will link to their content on your site out of vanity.
  • You’ll create natural testimonials.

Restaurants (Tactic: Positive UGC)

This is another spin on the car dealership idea. If you’re a restaurant, you have to deal with reviews. They can really make or break your business, especially now that there are entire companies dedicated to flooding the internet with positive (or negative) reviews. Why not ask for feedback in a way that naturally spins positive? For example, add a feature to your site where you ask people to post pictures of their favorite dish from your restaurant. No one has a bad favorite dish – the haters will naturally exclude themselves. Meanwhile, the brand evangelists will love seeing their photo posted online and will naturally tell their friends.

Real Estate (Tactic: Local Interest)

Real estate websites and even blogs have a tendency to be generic – they talk about why it’s time to buy, how to find a decent interest rate, etc. This information, done well, is fine, but it’s hard to stand out when you’re saying the same things that 1,000 other realtors are saying.

Why not focus on the local angle? Think more broadly than just real estate – talk about the highlights of the neighborhoods you sell in. This could be everything from the best schools and local tourist attractions to your favorite local restaurants. Don’t be afraid to get a little personal, and you’ll tap into a few advantages:

  • You’ll show you know and like the neighborhood you sell in.
  • Local content will naturally attract local links.
  • You’ll naturally highlight the reasons to live in your neighborhood.

Lawyers (Tactic: Local Expertise)

Lawyers, like realtors, face the problem of how to say the same things as everyone else and still sound unique. Again, focus on your own niche and the local angle (assuming you’re a smaller office). Highlight local stories that show how the law impacts your area – this could be everything from crime stories to civil suits. Discuss these stories in the context of your practice. You could even have fun with it – talk about weird laws in your state or city, for example. The advantages?

  • You’ll show that you’re up to date with current laws and events.
  • People will see that you understand how the law impacts them.
  • Local interest stories naturally attract local links.

Don’t Be Afraid to Get Creative

If there’s a theme here, it’s that you can’t be afraid to start getting creative, even if you think you’re in a "boring" industry. Think about what got you into your business in the first place – there’s always a story, and the more you put your own spin on content, the more authentic and unique it will naturally become. Say something that no one else is saying, and natural links will create themselves.

"Hundreds of cars" image provided by ShutterStock.



SEOmoz Daily SEO Blog
Post image for How to Do A Content Audit of Your Website

If you have a website that’s been around for a few years and you’re looking for ways to make some improvements, one of the tactics I recommend is doing a content audit. 

When you do a content audit you have a few goals in mind:

  • Get rid of any low quality or unimportant pages
  • Look for pages or sections that can be improved or updated
  • Improve your rankings by more effectively using your link equity, internal anchor text, and interlinking your content

Get the Data

your inbound link equity can only support a certain number of pages …

The first thing you need to do is get an understanding of where your website currently stands. You’ll need a list of the pages of your website, the number of inbound links to each, and the amount of traffic each page receives. If you are using Webmaster Central, you can export a spreadsheet of all the pages with the number of links. The next thing you have to do is add a column for page views. I like to use a timeframe of between a year and a year and a half.

Depending on the number of pages your website has, it could take a while to get all this data. This is the perfect task for an intern or outsourced labor from a place like oDesk. I recently performed this task on a website that has 1,800 URLs. It cost me , and I had the data back in just over 24 hours.

Identify the Low Performing Pages

The two primary factors I like to look at are how many links a post/page has and how much traffic it generated in the past 18 months. Any page that generated fewer than 100 page views is a candidate for deletion. Additionally, any page that attracted fewer than 25 links is also a candidate for deletion.

Delete or Rewrite

At this point you’ll have a list of pages that generated minimal links and/or traffic and are therefore candidates for deletion or revision. This is where some decision making comes in. If a page generated a lot of links but not much traffic, I’m probably going to keep it intact. The same is true for pages with high traffic but a low number of links. When pages are low on links and low on traffic, you have to use your judgment. In some cases, the post was a throwaway post, important at the time but not important now. Those are easy to justify deleting. In other cases, you’ll want to keep them.
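As a minimal sketch of that triage, assume two exports you gathered yourself: links.csv (url, inbound_links) from Webmaster Central and pageviews.csv (url, pageviews) from your analytics package. The file names and column headers are hypothetical stand-ins; the thresholds are the ones from this post.

    import csv

    MIN_PAGEVIEWS = 100  # traffic threshold over the audit window
    MIN_LINKS = 25       # inbound-link threshold

    def load(path, value_col):
        # Build a url -> count mapping from a two-column CSV export.
        with open(path, newline="") as f:
            return {row["url"]: int(row[value_col]) for row in csv.DictReader(f)}

    links = load("links.csv", "inbound_links")
    views = load("pageviews.csv", "pageviews")

    for url in sorted(set(links) | set(views)):
        l, v = links.get(url, 0), views.get(url, 0)
        if l < MIN_LINKS and v < MIN_PAGEVIEWS:
            print("delete/rewrite candidate:", url, "links =", l, "views =", v)
        elif l < MIN_LINKS or v < MIN_PAGEVIEWS:
            print("keep, strong on one metric:", url, "links =", l, "views =", v)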

At the very least I would suggest looking at the pages to see if you can improve them. In some cases the information is outdated and needs a complete rewrite. In other cases it just requires a little updating. One of the tools I’ve found to be helpful is Scribe SEO (see my Scribe SEO review). It gives you a quick overview and can sometimes make a few quick, easy suggestions to improve a page. A third option is creating a living-URL-style page. When you rewrite or revise pages, you really want to look for ways to maximize your internal anchor text and interlinking wherever possible.

Why Should You Delete Old Posts or Pages?

When I talk about this practice, a lot of people wonder why you would bother deleting pages. After all, there’s no harm in keeping them around, and you’ve already spent the time and energy having them created. For the answer, we need to look at the concept of link equity. Each website only has a certain amount of links, trust, and authority coming into it … this is called link equity. That link equity can only support a certain number of pages. For example, a brand new website with few links won’t be able to have thousands of pages in the index: the search engines simply don’t have enough signals of quality to support anything more than superficial crawling. Additionally, IMHO, ever since the “Mayday update” the days of “infinite websites” have come to an end.

When I mention deleting old posts, sometimes bloggers look like they are going to break down in tears, as if I asked them to abandon a puppy with no food or water outside in a freezing snowstorm. If you’re the type of person who has a deep emotional attachment to your posts, you aren’t running a business website. You are creating Aunt Millie’s Christmas Letter.

Backups and Redirections

Before you delete a single post, make sure you have multiple backups of all of your posts. You want the ability to bring a post back if you delete one by accident. If you use WordPress, you can trash a page/post and it’s deleted from public view, but it lingers in limbo for 30 days and is easy to bring back. If any of the pages have more than a handful of links, you should set up a redirection. Try to redirect to a similar-themed or revised post if possible; if not, redirect to the homepage, the sitemap, or an archives page. A more controversial step is to redirect to a different commercial page or to create a link hub somewhere else. Let your conscience guide your approach.
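Here is a minimal sketch of that redirect step. The URL mapping is hypothetical; you would build it by hand, pairing each deleted URL with the most similar-themed surviving post and falling back to the homepage or an archives page. The output uses Apache mod_alias syntax; if you use the WordPress Redirection plugin instead, you would enter the same pairs there.

    # Hypothetical mapping from deleted URLs to their redirect targets.
    redirects = {
        "/old-throwaway-post/": "/updated-evergreen-post/",
        "/2007-news-item/": "/archives/",
        "/orphaned-page/": "/",  # nothing similar survived; homepage fallback
    }

    with open("redirects.conf", "w") as f:
        for old, new in redirects.items():
            f.write(f"Redirect 301 {old} {new}\n")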

Lastly, you want to trap for 404 errors and redirect anything you might have missed. Again, if you use WordPress, the Redirection plugin takes care of 404 monitoring and redirects in one spot.
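To trap stragglers programmatically, a minimal sketch: re-request every URL from your pre-audit page list (old_urls.txt is a hypothetical one-URL-per-line file) and flag anything that now returns a 404 instead of following a redirect.

    import urllib.request
    import urllib.error

    with open("old_urls.txt") as f:
        for url in (line.strip() for line in f if line.strip()):
            try:
                # urlopen follows 301/302 redirects transparently, so a
                # properly redirected page raises no error here.
                with urllib.request.urlopen(url):
                    pass
            except urllib.error.HTTPError as e:
                if e.code == 404:
                    print("missing redirect:", url)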

What are the takeaways from this post:

  • Make a list of all your pages with inbound links and traffic stats from the past 12-18 months
  • Identify and isolate the worst performing pages
  • Subdivide the list into pages to delete or pages to revise/rewrite
  • Backup pages before deleting
  • Set up redirections for any pages that are deleted
  • Monitor 404 errors for any deletions or redirections you missed

Creative Commons License photo credit: ansik

This post originally came from Michael Gray who is an SEO Consultant. Be sure not to miss the Thesis WordPress Theme review.

How to Do A Content Audit of Your Website

tla starter kit

Related posts:

  1. How to Silo Your Website: The Content The following is part of the series How To Silo…
  2. Putting a Content Based Website Together We’ve covered long term content and short term content, information…
  3. How Do You Archive Pages on a High Post Volume Website Today’s post is an answer to a question I took…
  4. How To Silo Your Website: The Sidebar The following post is part of a series on How…
  5. How To Figure Out What Parts of Your Website Aren’t Being Crawled When Google took away the supplemental index last year, they…

Advertisers:

  1. Text Link Ads – New customers can get 0 in free text links.
  2. BOTW.org – Get a premier listing in the internet’s oldest directory.
  3. Ezilon.com Regional Directory – Check to see if your website is listed!
  4. Page1Hosting – Class C IP Hosting starting at .99.
  5. Directory Journal – List your website in our growing web directory today.
  6. Content Customs – Unique and high quality SEO writing services, providing webmasters with hundreds of SEO articles per week
  7. Majestic SEO – Competitive back link intelligence for SEO Analysis
  8. Glass Whiteboards – For a professional durable white board with no ghosting, streaking or marker stains, see my Glass Whiteboard Review
  9. Need an SEO audit for your website? Look at my SEO Consulting Services
  10. KnowEm – Protect your brand, product or company name with a continually growing list of social media sites.
  11. Scribe SEO Review – find out how to better optimize your WordPress posts.
  12. TigerTech – Great Web Hosting service at a great price.

Michael Gray – Graywolf’s SEO Blog
Post image for Content Ideas – Creating an Ongoing Series

One of the problems that website owners and bloggers encounter on a regular basis is coming up with ideas for posts. One of the tactics that I like to employ is creating a regular ongoing post series.

So let’s tackle the big question first: what’s an example of a post series? How about “Sandwich Mondays” from NPR. The basic premise is that every Monday they publish a post about sandwiches. Sometimes these are reviews, as in the case of the Denny’s Fried Cheese post. Sometimes there’s a travel/tourism theme, as in the case of the Pop Tarts Restaurant in Times Square. Other times it could be a flashback pop-culture reference, like the Pixy Stix and Cap’n Crunch Cereal sandwich from The Breakfast Club.

[Embedded video]

You can take this concept and use it on lots of different sites. For example, on a real estate blog, how about writing an in-depth post about a school district in a neighborhood where you work? On a clothing site, do a series that features, each week, one pair of fashionable shoes under . This is a pretty easy concept to run with. All it takes is a little imagination.

To get the most out of this approach you should try using an editorial calendar. Now, this doesn’t mean you have to eat a sandwich every Monday; you can get 3-4 weeks or more ahead of yourself and schedule the posts in advance. You can also have multiple series. In the example of the real estate website, maybe you’ll also have a series about local libraries and the programs they offer. If you keep each of the posts narrowly focused, you can tie it all together with the head and tail content approach. These types of posts benefit from having a bit of personality and opinion to them, because that’s what makes them interesting. Additionally, the more opinion you use, the less you are going to be able to outsource. Lastly, use tags and maximize internal anchor text to get the most SEO value, silo your content, and better target your ads.
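If you want to queue a series up programmatically, here is a minimal sketch that generates the next few publish dates for a weekly Monday series; the series name and the four-week horizon are placeholders.

    from datetime import date, timedelta

    def next_mondays(count, start=None):
        # Yield the next `count` Mondays strictly after `start` (default: today).
        d = start or date.today()
        d += timedelta(days=(7 - d.weekday()) % 7 or 7)
        for _ in range(count):
            yield d
            d += timedelta(weeks=1)

    for when in next_mondays(4):
        print("Sandwich Mondays, scheduled for", when.isoformat())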

What are the takeaways here:

  • Choose 1-3 series of weekly, biweekly, or monthly posts
  • Use an editorial calendar to help you plan and publish your content
  • Keep each of the posts narrow and focused
  • Interlink the series with other series using the “head and tail” concept
  • Maximize your internal anchor text by interlinking to other related posts
  • Tag your posts to help you serve the most effective advertisements

Creative Commons License photo credit: adactio

This post originally came from Michael Gray who is an SEO Consultant. Be sure not to miss the Thesis WordPress Theme review.

Content Ideas – Creating an Ongoing Series

tla starter kit

Related posts:

  1. Is Web 2.0 Creating An Ad Trend Towards Promoting Content? If you pay attention to the ads that monetize most…
  2. Creating Better Auto-Generated Text Over the past two weeks we’ve taken a look at…
  3. How to Silo Your Website: The Content The following is part of the series How To Silo…
  4. Creative Link Building Ideas There’s lots of talk on the net right now that…
  5. Creating the Ultimate Personal Hub and Nerve Center Last week I was reading Steve Rubel’s turning Gmail into…


Michael Gray – Graywolf’s SEO Blog