Again there are two obvious questions, and I'll address those first. Not only is this a piece of software I use on a regular basis, it's one I have paid for; I wasn't comped or given a free copy. Secondly, no, I don't think a piece of software can replace a human audit, but software can gather data more quickly and efficiently than a human ever could, and that's the role I use Website Auditor to perform.
Like all of the SEO Power Suite software it’s easy to install (Windows or Mac) and get running. You just input the URL, specify a crawling depth, let it know which parameters you want it to get, set up any exclusions, and let it run.
Now, until you've run it a few times, I'd suggest leaving the options at their default settings. One you may wish to be careful with is crawling depth: if you have a large website, set it to a lower number. As I mentioned before, crawling depth can be a powerful tool to help you figure out crawling problems (see how I diagnose and improve crawling problems). Once the initial crawl is done, it will ask if you want to update the missing info. Again, let it do its thing.
Now, while this software does have human emulation built in, if you selected certain factors like Google PageRank it can trip Google's CAPTCHA flags. The software will prompt you for the info but, if you don't enter it, will time out and skip that data point. You can rerun it later to gather just the missing info. The company also offers a CAPTCHA-breaking service. Depending on how many URLs you are analyzing and which parameters you selected, this can take up to a few hours to run, so again my advice is to run this software on a server, not your main computer.
To be honest I’m not a big fan of the default view. I find the nested tree structure a little hard to read.
I prefer the PageRank view. It’s just a lot easier to deal with IMHO.
What I suggest is to create your own report with the information you want on it that you can export via clipboard.
Additionally, as with all the SEO Power Suite Software, you can produce reports. Here’s a sample report from the software. Bear in mind that I stopped it midway so some of the data is missing. If you use the enterprise version you can also customize the report.
As I said, I don't recommend running the program, spitting out a report, and moving on: you should look at the data and analyze it. A computer is good at gathering data, but it's not good at being insightful or offering up expertise.

Website Auditor is a piece of software I've paid for and have used for a few months. It has a few quirks as far as CAPTCHA handling and display, but I find it helps. If you think you might like it, why not take advantage of the free download?
With the proliferation of smartphones of varying screen sizes, Flash compatibility issues, and most recently Apple TV and Google TV, many website owners are choosing to solve this problem with multiple sites and domains. While this solution can work, there are plenty of ways it can go wrong. In this post I'll try to help you understand why this is usually not the best choice.
First, let's make sure we are talking about the same issue. When I talk about creating multiple websites I mean having example.com for desktops, m.example.com for mobile users, and example.tv for TV-based browsers like Apple TV and Google TV. Additionally, you could also have separate domains, subdomains, or folders for Flash/non-Flash content. The first problem is that this creates a huge maintenance burden. Unless you have the staff and budget, maintaining multiple versions of the same website is going to consume a larger and larger amount of resources. The more pages you have, the more versions you will have to maintain, and the work grows exponentially.
Another negative aspect is the buildup of links. If you have multiple versions, all of those versions will start to build link equity, both internal and external. You could try to do some redirection but, unless you handle the redirection perfectly, it inevitably leads to link trust/equity being divided across multiple resources and lower overall rankings. In my experience you are much better off using one domain with one URL implementation, no matter what/where/how the end user is viewing your content.
That's not to say you shouldn't change your content based on what the user is using to view it or where they came from (see changing your content based on traffic intent). What I am saying is keep the domain/URL consistent and change the presentation via server-side code and style sheets. This is also what Google recommends in their Google TV implementation guide. The one place I would caution against following Google's advice is with 302 redirects. IMHO Google has a sketchy history handling 302s, and I would steer clear of that issue entirely.
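As a sketch of the server-side approach, the snippet below picks a platform-specific stylesheet from a crude User-Agent check while the URL stays the same. The User-Agent substrings and stylesheet filenames are illustrative assumptions, not a complete detection scheme.

```python
# Hypothetical sketch: serve one URL, vary only the presentation layer.

def stylesheet_for(user_agent):
    """Return the stylesheet to serve based on a crude User-Agent check."""
    ua = user_agent.lower()
    if "googletv" in ua or "appletv" in ua or "smart-tv" in ua:
        return "tv.css"       # ten-foot interface for TV-based browsers
    if "iphone" in ua or "android" in ua or "mobile" in ua:
        return "mobile.css"   # small-screen layout
    return "desktop.css"      # default: full desktop presentation
```

In practice you'd plug a check like this into your templating layer so every platform gets the same markup and the same URL, just a different style sheet.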
In addition to maintenance and link equity, you need to think about the user experience. If people are sharing your URL and it crosses platforms, like desktop to mobile, desktop to TV, TV to mobile, or mobile to TV, there is the potential for things to go wrong. Unless you redirect based on browser platform, you run the risk of serving content that's formatted incorrectly and might not be readable or usable. Want a real-life scenario? Let's say I'm reading Facebook on my iPhone and click a link that one of my friends posted. If they posted a link to the TV version and I try to view it on my mobile phone, it's not going to work.
In my experience using multiple websites to solve platform specific content formatting issues is seldom the best choice and leads to bigger problems down the road.
What are the takeaways from this post:
The one instance where I feel it's advisable to use multiple domains or subdomains is for country-level TLDs or different languages. If you own example.com and have a French version, I would use example.fr, example.com/fr/, or fr.example.com to serve that content, especially if you are trying to capture traffic from French-language searches and search engines like google.fr.
Joel Goldstein will teach you the 10 things to keep in mind when building a website. This is applicable whether you currently have a website or are considering building one in the near future. Mr. Goldstein is the President of the Peer Marketing Group as well as a best-selling author on marketing.
1) Attract and hold the viewer's attention immediately. This is different for each generation; however, here is a synopsis of what each likes to see. Baby Boomers enjoy well-put-together material with quotes and credentials. Generation X is attracted to bold words and to advertisements that raise a question. Generation Y, or Millennials, are drawn to large, in-your-face pictures, and they are also more likely to purchase something if their friends do. Don't expect that your customer will look around your site for hours; they will only stay engaged as long as you have their attention. Give a clear path to an end goal, whether that is calling you or making an online purchase. The simpler, the better.
Best practices include avoiding heavy graphics, Flash animation, and large pictures.
2) Target your customers by demographic and generation. Build your website according to the kind of customer you are looking to reach. If your website is directed at professionals, make the site bright and clear. If your site is targeting young teenagers, make the website more informal and relaxed. Give them an action to take from your website.
3) Narrow down your website. Do not offer multiple products on the same page; this will only confuse your customer. Dedicate a separate page to each product. If you wish to advertise your other products, do so in a subtle manner in the footer or sidebar.
4) Build your credibility immediately. Having a great website is a good first step; however, if you are looking to build trust with future customers, state your credentials upfront. The Internet creates an air of mistrust in each website visitor; in order to sell your products you must first build your visitor's trust to a level where they are confident buying from you.
5) Give correct information about yourself including an address, email and phone number. Going a step further and adding a clear privacy statement is a good tool for establishing credibility.
6) Offer a money-back guarantee or a satisfaction guarantee. Give the client control over whether to keep the product. The 5% of clients who return your products will cost far less than the negative blog posts or bad PR that unhappy customers can generate online or in the press.
7) Make the payment transaction on your website as easy as possible. Keeping the process short is essential for eliminating buyer's remorse and confusion. Give your clients as many payment methods as possible, including credit cards, online payments, electronic checks, and a mailing address for customers paying by mail. Amazon.com patented its one-click checkout process precisely because of the power of eliminating hurdles in the buying process.
8) Make navigating your website as simple as possible. Each page should lead seamlessly into the next. Creating a 1-2-3 step process is a great way to streamline the experience.
9) Create the website with a color palette that matches the look and feel of your company. If you are marketing your product or service to professionals, keep the palette neutral and light; if you are targeting a younger generation, you can take liberties with a bolder palette.
10) Keep in mind that most of your customers will find your website through search engines. Design your pages so that each product page has an individual title and set of keywords to give you as much targeted exposure as possible. Use title tags and description tags to narrow the SEO scope. By creating, say, 15 targeted landing pages you multiply the number of queries your site can capture and expand your traffic potential.
Bonus: I saved this obvious but most important tip for last. Give good information, and give your viewer the option to get more of it by signing up to hear from you. To that point, if you would like to be informed of my future articles, please visit my website http://JoelGoldstein.com and sign up to receive emails from me.
While some people consider Foursquare to be nothing more than the latest shiny cat toy for social media gurus and badge collectors, I’ve recently found a few uses for both business and pleasure.
I recently took a trip to Disney World in Florida. After checking in at the Orlando airport, I got a pop-up tip from Rae Hoffman about the expert traveler lane. Foursquare's assumption is that I am more likely to trust information that comes from people I trust and who are my friends. As I traveled around Disney World that week, I got tips from Chris Brogan, Jennifer Laycock, and others and, in most cases, I found the info useful, helpful, and interesting.
A few weeks later I was traveling through Midway Airport. When I was looking at tips, two of the places that stood out as places to eat were Potbelly's and Harry Caray's; both were places I had eaten before, and both were good.
So as a publisher, how can you use this? When you are building a website about a topic you aren't an expert in, or are looking to capture natural-language queries for, one of the places you look is Yahoo Answers ("how do I do X", "what's the best way to do Y", and "is X doing Y" type of stuff). If you're looking to build a website and don't have any on-the-ground, first-hand expertise, using the tips from Foursquare is an excellent idea. For example, here's a list of tips from people who stayed at Disney's Polynesian Resort. Now you'll have to do the obvious spam filtering and reality checking but, to be honest, I haven't seen a lot of that in Foursquare tips, especially tips that have been left by multiple people. For now the social proof seems to be holding.
This post originally came from Michael Gray who is an SEO Consultant. Be sure not to miss the Thesis WordPress Theme review.
How Website Owners and Publishers Can Use Foursquare
Posted by randfish
The process of launching a new website is, for many entrepreneurs, bloggers and business owners, an uncertain and scary prospect. This is often due to both unanswered questions and incomplete knowledge of which questions to ask. In this post, I’ll give my best recommendations for launching a new site from a marketing and metrics setup perspective. This won’t just help with SEO, but on traffic generation, accessibility, and your ability to measure and improve everything about your site.
Nothing can be improved that is not tracked. Keeping these immortal words of wisdom in mind, get your pages firing analytics code before your first visitor. Google Analytics is the obvious choice, and customization options abound (for most sites more advanced than a basic blog, I’d highly recommend at least using first-touch attribution).
Google Analytics, or any other package (see some alternatives here), needs to be placed on every page of your site and verified. Do yourself a favor and install it in a template file you can be sure is on every page (e.g. footer.php). GA's instructions will indicate that placing the code at the top of the page is key, but I'm generally in favor of leaving it at the bottom to help page load time for visitors (though the new asynchronous GA code is pretty fast).
Both Google & Bing have webmaster tools programs that monitor data about your site and message it back to you through online interfaces. This is the heartbeat of your site from the search engines’ perspective and for that reason, it’s wise to stay on top of the data they share.
That said, the numbers inside these tools are not always perfect, and often have serious flaws. The referring keyword and traffic data are, in my experience, often far off from what analytics tools will report (and in those cases, trust your analytics, not the engines' tools). Likewise, crawl, spidering, and indexation data aren't always solid either. Nonetheless, new features and greater accuracy continue to roll out (more of the former than the latter, unfortunately), and it's worth having both set up.
No matter how perfect you or your developers are, there are always problems at launch: broken links, improper redirects, missing titles, pages lacking rel=canonical tags (see more on why we recommend using it and the dangers of implementing it improperly), files blocked by robots.txt, etc.
By running a crawl test with a free tool like Xenu or GSiteCrawler, or leveraging a paid tool like Custom Crawl from Labs or the Crawl Service in the Web App (pictured above), you can check your site's accessibility and ensure that visitors and search engines can reach pages successfully in the ways you want. If you launch first, you'll often find that critical errors are left to rot because the priority list fills up so quickly with other demands on development time. Crawl tests are also a great way to verify contractor or outsourced development work.
In addition to testing for search engine and visitor accessibility, you'll want to make sure the gorgeous graphics and layout you've carefully prepared check out in a variety of browsers. My rule is to test anything that has higher than 2% market share, which currently means (according to Royal Pingdom): Internet Explorer, Firefox, Chrome, Safari and Opera.
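The 2% rule above is easy to express as a tiny filter over whatever market-share figures you trust; the numbers in the usage example are illustrative, not current market data.

```python
# Sketch: which browsers clear the market-share threshold for testing?

def browsers_to_test(shares, threshold=2.0):
    """shares: dict of browser name -> market share percent.

    Returns the sorted list of browsers above the threshold."""
    return sorted(name for name, pct in shares.items() if pct > threshold)
```

For example, `browsers_to_test({"Internet Explorer": 45.0, "Firefox": 30.0, "Chrome": 15.0, "Safari": 5.0, "Opera": 2.5, "Other": 1.5})` drops only the long tail under 2%.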
There’s a great list of browser testing options from FreelanceFolder here, so I’ll just add that in-person testing, on your own PCs & Macs, is also a highly recommended use of an hour.
Virtually every site will have some form of structured data being pushed out through an RSS feed. And, just like visitor analytics, if you want to improve the reach and quality of the feed, you’ll need to leverage data.
Feedburner is the de facto software of choice, and it’s very solid (though, good alternatives do exist). Getting your feed and the analytics to track and measure it is typically a very easy process because there’s nothing to verify – you can create and promote any feed you want with just a few button clicks.
One important recommendation – don’t initially use the counter "chicklet" like:
It has a bad psychological impact to see that no one has subscribed to your new RSS feed. Instead, just provide a standard link or graphic and after you’ve amassed a few hundred or thousand readers, use the numeric readout to provide additional social proof.
No matter what your site is, there are actions you’re hoping visitors will take – from tweeting a link to your post to leaving a comment to buying a product or subscribing to an email list. Whatever those actions might be, you need to record the visits that make them through your analytics tool. Casey Henry’s post on Google Analytics’ Event Tracking will provide a thorough walkthrough.
Once action tracking is in place, you can segment traffic sources and visit paths by the actions that were taken and learn more about what predicts a visitor is going to be valuable. If you're pouring hours each day into Twitter but seeing no actions, you might try a different channel, even if the traffic volume is high.
Before a formal launch, it can be extremely helpful to get a sense of what users see, experience and remember when they browse to your site for a few seconds or try to take an action. There’s some fantastic new software to help with this, including Clue App, screenshot below:
Last week, I set up a Clue App test for SEOmoz's homepage in 30 seconds and tweeted a single link to it, which garnered 158 kind responses with words and concepts people remembered from the visit. This type of raw testing isn't perfect, but it can give you a great look into the minds of your visitors. If the messages being taken away aren't the ones you intended, tweaking may be critical.
No matter what your website does, you live and die by some key metrics. If you’re starting out as a blogger, your RSS subscribers, unique visits, pageviews and key social stats (tweets, links, Facebook shares, etc) are your lifeblood. If you’re in e-commerce, it’s all of the above plus # of customers, sales, sales volume, returning vs. new buyers, etc.
Whatever your particular key metrics might be, you need a single place – often just a basic spreadsheet – where these important numbers are tracked on a daily or weekly basis. Setting this up before you launch will save you a ton of pain later on and give you consistent statistics to work back from and identify trends with in the future.
This may seem non-obvious, but it's shocking how a friendly email blast to just a few dozen of your close contacts can help set the stage for a much more successful launch. Start by building a list of the people who owe you favors, have helped out, and who you can always rely on. If you're feeling a bit more aggressive in your marketing, you can go one circle beyond that to casual business partners and acquaintances.
Once you have the list, you’ll need to craft an email. I highly recommend being transparent, requesting feedback and offering to return the favor. You should also use BCC and make yourself the recipient. No one wants to be on a huge, visible email list to folks they may not know (and get the resulting reply-all messages).
The Alerts service from Google certainly isn't perfect, but it's free, ubiquitous, and can give you a timely heads-up on some of the sites and pages that mention your brand or link to you.
Unfortunately, the service sends through a lot of false positives: spam, scraper sites and low-quality junk. It also tends to miss a lot of good, relevant mentions and links, which is why the next recommendation is on the list.
In order to keep track of your progress and identify the sites and pages that mention or link to your new site, you’ll want to set up a series of queries that can run on a regular basis (or automated if you’ve got a good system for grabbing the data and putting it into a tracking application). These include a number of searches at Google, Twitter and Backtype:
The queries should use your brand name in combination with specific searches, like the example below (using "seomoz" and "seomoz.org"):
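For anyone scripting these recurring searches, here's a minimal sketch that builds Google query URLs from a brand name and domain. The specific query operators chosen are illustrative examples, not the full set of searches the post has in mind.

```python
from urllib.parse import quote_plus

def brand_queries(brand, domain):
    """Build a checklist of recurring brand-monitoring search URLs.

    The operators below are illustrative; add or swap queries as needed."""
    terms = [
        f'"{brand}" -site:{domain}',  # brand mentions off your own site
        f"link:{domain}",             # pages linking to you
        f'"{domain}"',                # raw URL citations
    ]
    return [f"https://www.google.com/search?q={quote_plus(t)}" for t in terms]
```

Run the same list weekly (or feed it to a scheduler) and diff the results against last week's to spot new mentions.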
You can add more to this list if you find them valuable/worthwhile, but these basics should take you most of the way on knowing where your site has been mentioned or referenced on the web.
Capturing the email addresses of your potential customers/audience can be a huge win for the influence you’re able to wield later to promote new content, products or offerings. Before you launch, you’ll want to carefully consider how and where you can offer something in exchange for permission to build an email list.
One of the most common ways to build good lists is to offer a whitepaper, e-book, video, or other exclusive content piece for download/access to those who enter an email address. You can also collect emails through comment registration (these tend to be lower overall quality), through an email newsletter subscription offering (these tend to be very high quality), or via a straight RSS subscription (but you'll need to self-manage if you want full access to those emails). Services like MailChimp, ExactTarget, Constant Contact and iContact are all options for this type of list building and management.
Social media has become popular and powerful enough that any new site should be taking advantage of it. At a minimum, I’d recommend creating accounts on the following networks:
And if you have more time or energy to devote, I’d also invest in these:
Setting up these accounts diligently is important – don’t just re-use the same short bio or snippet over and over. Spend the time to build fleshed out profiles that have comprehensive information and interact/network with peers and those with similar interests to help build up reputation on the site. The effort is worth the reward – empty, unloved social accounts do virtually nothing, but active ones can drive traffic, citations, awareness and value.
BTW – Depending on the size and structure of your site, you may also want to consider creating a Facebook Fan Page, a LinkedIn Company Page and profiles on company tracking sites like Crunchbase, BusinessWeek and the Google Local Business Center.
If you’ve just set up your social account, you’ve likely added your new site as a reference point already, but if not, you should take the time to visit your various social profiles and make sure they link back to the site you’re launching.
Not all of these links will provide direct SEO value (as many of them are "nofollowed"), but the references and clicks you earn from those investigating your profiles based on your participation may prove invaluable. It’s also a great way to leverage your existing branding and participation to help the traffic of your new site.
Depending on your niche, you may have traditional media outlets, bloggers, industry luminaries, academics, Twitter personalities, powerful offline sources or others that could provide your new site with visibility and value. Don’t just hope that these folks find you – create a targeted list of the sites, accounts and individuals you want to connect with and form a strategy to reach the low hanging fruit first.
The list should include as much contact information as you can gather about each target, including Twitter account name, email (if you can find it), and even a physical mailing address. You can leverage all of these to reach out to these folks at launch (or have your PR company do it if you have one). If you tell the right story and have a compelling site, chances are good you'll get at least a few of your targets to help promote you or, at the least, visit and be aware of you.
This is SEO basics 101, but every new site should keep in mind that search engines get lots of queries for virtually everything under the sun. If there are keywords and phrases you know you want to rank for, these should be in a list that you can measure and work toward. Chances are that at launch, you won’t even be targeting many of these searches with specific pages, but if you build the list now, you’ll have the goal to create these pages and work on ranking for those terms.
As you’re doing this, don’t just choose the highest traffic keywords possible – go for those that are balanced; moderate to high in volume, highly relevant in terms of what the searcher wants vs. what your page/site offers and relatively low in difficulty.
See this post for more tips – Choosing the Right Keyphrases – from Sam Crocker.
Without goals and targets, there's no way to know whether you're meeting, beating or failing against expectations, and every endeavor, from running a marathon to cooking a meal to building a company or just launching a personal blog, will fail if there aren't clear expectations set at the start. If you're relatively small and just starting out, I'd set goals for the following metrics:
And each of these should have 3-, 6- and 12-month targets. Don't be too aggressive, as you'll find yourself discouraged or, worse, not taking your own targets seriously. Likewise, don't sell yourself short by setting goals you can easily achieve; stretch at least a little.
Every 3-6 months, you should re-evaluate these and create new goals, possibly adding new metrics if you’ve taken new paths (RSS subscribers, views of your videos, emails collected, etc.)
I know this one's a bit self-serving, but I'd like to think I'd add it here even if it weren't my company (I recently set up my own personal blog and found the crawling, rank tracking and new GA integration features pretty awesome for monitoring the growth of a new site).
The SEOmoz Web App has a number of cool tracking and monitoring features, as well as recommendations for optimizing pages targeting keywords, that make it valuable for new sites that are launching. The crawl system can serve to help with #3 on this list at the outset, but ongoing, it continues to crawl pages and show you your site’s growth and any errors or missed opportunities. Tracking rankings can let you follow progress against item #16, even if that progress is moving from ranking in the 40s to the 20s (where very little search traffic will be coming in, even if you’re making progress). And the new GA integration features show the quantity of pages, keywords and visits from search engines to track progress from an SEO standpoint.
Using this list, you should be able to set up a new site for launch and feel confident that your marketing and metrics priorities are in place. Please feel free to share other suggestions for pre and post-launch tactics to help get a new site on its feet. I’m looking forward to seeing what other recommendations you’ve got.
If you have a website that’s been around for a few years and you’re looking for ways to make some improvements, one of the tactics I recommend is doing a content audit.
When you do a content audit you have a few goals in mind:
The first thing you need to do is get an understanding of where your website currently stands. You'll need a list of the pages of your website, the number of inbound links to each, and the number of visitors each page receives. If you are using Webmaster Central, you can export a spreadsheet of all the pages with the number of links. The next thing you have to do is add a column for page views. I like to use a timeframe of between a year and a year and a half.
Depending on the number of pages your website has, it could take a while to get all this data. This is the perfect task for an intern or outsourced labor from a place like oDesk. I recently performed this task on a website that has 1,800 URLs. It cost me , and I had the data back in just over 24 hours.
The two primary factors I like to look at are how many links does a post/page have and how much traffic did it generate in the past 18 months. Any page that generated less than 100 page views is a candidate for deletion. Additionally, any page that generated less than 25 links is also a candidate for deletion.
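As a rough sketch of this triage, the filter below flags pages that fall under both thresholds from the post (fewer than 100 page views and fewer than 25 links); pages strong on one signal are left for the judgment call. The record structure is a hypothetical assumption, not the Webmaster Central export format.

```python
# Hypothetical audit sketch: flag pages weak on both traffic and links.

def deletion_candidates(pages, min_views=100, min_links=25):
    """pages: list of dicts with 'url', 'pageviews', and 'links' keys.

    Returns URLs that are low on BOTH signals; anything strong on one
    signal is kept for manual review instead."""
    flagged = []
    for page in pages:
        low_traffic = page["pageviews"] < min_views
        low_links = page["links"] < min_links
        if low_traffic and low_links:  # weak on both counts: review it
            flagged.append(page["url"])
    return flagged
```

Feeding in the exported spreadsheet (one dict per row) gives you the short list to walk through by hand.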
At this point you’ll have a list of pages that generated minimal links and/or traffic and are therefore candidates for deletion or revision. This is where it requires some decision making. If a page generated a lot of links but not much traffic, I’m probably going to keep it intact. The same is true for pages with high traffic but a low number of links. When pages are low on links and low on traffic, you have to use your judgment. In some cases, the post was a throwaway post–important at the time but not important now. Those are easy to justify deleting. In other cases, you’ll want to keep them.
At the very least I would suggest looking at the pages to see if you can improve them. In some cases the information is outdated and needs a complete rewrite. In other cases it just requires a little updating. One of the tools I’ve found to be helpful is Scribe SEO (see my Scribe SEO review). It gives you a quick overview and can sometimes make a few quick easy suggestions to improve a page. A third option is creating a living URL style page. When you rewrite or revise pages you really want to look for ways to maximize your internal anchor text and linkage whenever possible.
When I talk about this practice, a lot of people wonder why you would bother deleting pages. After all, there's no harm in keeping them around, and you've already spent the time and energy having them created. For the answer, we need to look at the concept of link equity. Each website only has a certain amount of links, trust, and authority coming into it; this is called link equity. That link equity can only support a certain number of pages. For example, a brand new website with few links won't be able to keep thousands of pages in the index: the search engines simply don't have enough signals of quality to support anything more than superficial crawling. Additionally, IMHO, ever since the "Mayday update" the days of "infinite websites" have come to an end.
When I mention deleting old posts, sometimes bloggers look like they are going to break down in tears, as if I asked them to abandon a puppy with no food or water outside in a freezing snowstorm. If you’re the type of person who has a deep emotional attachment to your posts, you aren’t running a business website. You are creating Aunt Millie’s Christmas Letter.
Before you delete a single post make sure you have multiple backups of all of your posts. You want the ability to bring your posts back if you delete one by accident. If you use WordPress, you can trash a page/post and it’s deleted from public view, but it lingers in limbo for 30 days and is easy to bring back. If any of the pages have more than a handful of links you should set up a redirection. Try to redirect to a similar-themed post or revised post if possible. If not then the homepage, the sitemap, or archives page. A controversial step is to redirect to a different commercial page or to create a link hub somewhere else. Let your conscience be your guide to your approach.
Lastly, you want to trap for 404 errors and redirect anything you might have missed. Again, if you use WordPress, the redirection plugin takes care of the 404 and redirections in one spot.
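As an illustration of the redirect step, here's a minimal redirect map in the spirit of what the Redirection plugin manages: deleted URLs map to a similar-themed page, with the homepage as the catch-all for anything missed. The paths are hypothetical.

```python
# Hypothetical redirect map for deleted pages; catch-all is the homepage.
REDIRECTS = {
    "/2008/old-throwaway-post/": "/2010/revised-post/",
    "/2007/outdated-review/": "/reviews/",
}

def resolve(path, fallback="/"):
    """Return the 301 target for a deleted page, or the fallback."""
    return REDIRECTS.get(path, fallback)
```

Wiring a lookup like this into your 404 handler means old links with equity land somewhere relevant instead of on an error page.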
What are the takeaways in this post:
One of the questions that often comes up is: does Google hate affiliate websites, and are they penalized in the algorithm?
The answer to that is slightly nuanced but, for simplicity's sake: no, Google doesn't hate affiliate websites, nor have I seen any evidence that affiliate sites are penalized. What Google does hate is thin affiliate websites with little or no trust. The better questions to ask are whether Google can detect affiliate websites and whether it can make it harder for affiliate websites to rank ... but those are entirely different questions.
If you've read the leaked quality rater guide from 2009, you'll see that Google has set up a lot of hurdles specifically making it harder for affiliate websites to "pass" the sniff test. One of the quickest and easiest ways Google can identify an affiliate website is through "naked" links to common affiliate programs like LinkShare, CJ, ShareASale, and others. But, really, how good can Google be at detecting those links? Well, here's a publicly available free tool put out by Sitonomy that checks what types of programming tools are being used by a website.
Now, if the folks at Sitonomy can detect that 4% of the links on a page are from CJ, I’m positive that Google can as well. I’m sure Google can tell at the page level and for the site as a whole. I’m also quite sure Google has an idea of the point, whether by percentage or by total number of links, at which a site becomes an affiliate website. It would also be fairly easy to say that, once you cross that threshold, you need a higher level of trust to rank for competitive terms. This is one of the reasons I strongly disagree with Lori Weiman, who says affiliates should never cloak links.
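To make the detection point concrete, here is a minimal Python sketch of the kind of check a tool like Sitonomy (or Google) could run against raw HTML: count what fraction of a page's links point at well-known affiliate network domains. The domain list and threshold are my own illustrative assumptions, not anything Google has published.

```python
import re

# Domains commonly seen in "naked" affiliate links (illustrative, not exhaustive)
AFFILIATE_DOMAINS = (
    "linksynergy.com",   # Linkshare / Rakuten
    "anrdoezrs.net",     # Commission Junction (CJ)
    "dpbolvw.net",       # Commission Junction (CJ)
    "shareasale.com",    # ShareASale
)

def affiliate_link_ratio(html: str) -> float:
    """Return the fraction of href targets that hit a known affiliate network."""
    hrefs = re.findall(r'href=["\'](.*?)["\']', html, flags=re.IGNORECASE)
    if not hrefs:
        return 0.0
    hits = sum(1 for h in hrefs if any(d in h for d in AFFILIATE_DOMAINS))
    return hits / len(hrefs)
```

If one out of two links on a page resolves to ShareASale, the ratio comes back as 0.5; a crawler with this signal per page, aggregated across a whole site, is all it takes to classify a site as affiliate-heavy.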
So what are the takeaways here:
The following is part of the series How To Silo Your Website. Be sure you have read parts one and two: How To Silo Your Website: The Masthead and How to Silo Your Website: The Breadcrumb. In this part, we will be taking a look at the content area.
When I talk about the content, I mean the part of your website template where the information is. At the time of writing, it is my belief (and the belief of many others) that links in this area are weighted differently than links in other parts of your page (i.e., sidebar, footer, and masthead). If you want to get the most value out of your internal site structure and internal anchor text, this is where to do it.
I’ve written before about the value of evergreen content, news-related content, seasonal content, and predictive content. I’ve also written about how to write interesting content around keywords and how to keep posts on topic with narrowly focused subjects, which you can review at your leisure.
What I’d like to talk about is how you link it all together. In the breadcrumb part of this series I spoke about the value of having a flat site architecture. For a silo or theming approach to SEO, this means limiting the links in your posts. You want to link only to other content that is relevant and good for the end user, and to other sections or posts within the same silo. This isn’t so much about conserving PageRank or link equity as it is about funneling it where you want it to go.
I have written before about using tags and categories and auto-linking high-level keywords within the text. This is a critical part of internal linking for maximum effectiveness. You can see this strategy in use at Wikipedia (George Washington, Henry 1st, Marilyn Monroe) and The New York Times (Senate Votes to Confirm Elena Kagan for U.S. Supreme Court, BP Done Pumping Cement Into Well, A Chance to Re-examine Aaron). In the case of the NYT, look first at the words, then look at the SERPs for those words. The NYT has more trust and authority than you do, but smart use of internal anchor text helps them rank well for high-level news concepts and the names of news figures.
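In WordPress, auto-linking keywords is usually handled by a tag/category or auto-link plugin, but the mechanic is simple enough to sketch in a few lines of Python. The keyword-to-URL map below is hypothetical; the point is the `count=1`, which links only the first mention so a page doesn't get stuffed with repeated internal links.

```python
import re

# Hypothetical keyword-to-URL map; a real site would generate this
# from its tags, categories, or a curated list of silo landing pages.
KEYWORD_LINKS = {
    "George Washington": "/wiki/George_Washington",
    "Elena Kagan": "/topic/elena-kagan",
}

def auto_link(text: str) -> str:
    """Wrap the first occurrence of each known keyword in an anchor tag,
    using the keyword itself as the anchor text."""
    for phrase, url in KEYWORD_LINKS.items():
        text = re.sub(
            re.escape(phrase),
            f'<a href="{url}">{phrase}</a>',
            text,
            count=1,  # link only the first mention to avoid over-linking
        )
    return text
```

Because the keyword itself becomes the anchor text every time, every internal link to that page carries identical anchor text, which is exactly the consistency discussed below.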
When you set up the links, it’s critical to use identical or nearly identical anchor text for each of them. Sometimes you will have to deviate a little for sentence structure, but ideally it should only be by one or two words. This is hard if you use a writer who doesn’t write for the web and omits keywords from their posts. A trick I use to get around this is to use parentheses, like this (see Adsense Tips & Tricks for more information). Another tip is to put links to related posts or similar articles as an inset or at the end of each article.
One last point. The one place you want to reduce or eliminate links is on product pages. The only links you should have on those pages are to other products, buying guides, or reviews. Once someone has entered the conversion funnel, you want to limit the number of ways they can leave. As an example of this, notice how the masthead on Amazon becomes unclickable during checkout. It’s not to conserve PageRank or link equity (spiders should never get into your cart … ever): it’s to minimize cart abandonment.
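The "spiders should never get into your cart" point is typically enforced in robots.txt. The paths below are assumptions for illustration; the actual cart and checkout URLs depend on your platform.

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
```

This keeps crawlers from wasting crawl budget on session-specific pages and keeps cart URLs out of the index entirely.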
What are the takeaways: