Tag Archives: Site

Posted by richardbaxterseo

If you think about it, search engines are constantly driving us SEO people to keep our technical SEO strategy in a state of refinement. This “evolution” of the marketing environment we thrive in is a great thing: it challenges us to come up with new ways to improve our traffic generating capabilities, user experience and the overall agility of our websites.

Here are a few ideas (6, to be exact) based on issues I’ve encountered in QA or on our recent client work that I hope will provide a little food for thought the next time you’re planning SEO enhancements to your site.

1) Leverage UGC (review content) beyond the product page

UGC is brilliant, particularly on product, content-thin or affiliate sites. Making it easy for users to leave a review is powerful stuff, but are you making the most of your precious user generated comments? Consider this scenario: so many users leave product reviews that you decide to limit the number visible on the product page. Rather than letting the surplus go to waste, you could cherry-pick some of that UGC for category listings pages, adding to the uniqueness of those otherwise content-thin pages too.

the power of UGC

2) Use “other users found this document for”

I know Tom loves this trick, and rightly so. You can turn insightful recent search data into valuable on-page uniqueness. Internal search and external referrals are great, but how about extending the process to make it easy for users to evaluate, extend, tag or remove terms they feel are irrelevant to the page?

This simple example shows how users of a forum site may have found that thread. I think there’s a whole lot more you can do with this trick, but it’s a start:

users found this page for these keywords

3) Consider delivering search engine friendly URLs in your internal site search results

I know how “out there” this might initially sound, but why settle for search engine unfriendly URLs on your internal site search pages? I have seen lots of examples of links being awarded to unfriendly, internal site search URLs. Why do we spend so much time carefully crafting our external URLs, only to completely forget our internal search URLs? A little extra development work to apply a meaningful pattern to your search result page URLs today could lead to the construction of an entirely new content type down the line.
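For instance, if your platform lets you hook into the search controller, a minimal sketch of the rewrite idea might look like this (the /search/ path and function name are hypothetical, not tied to any particular CMS):

    import re

    def search_query_to_friendly_url(query):
        """Turn a raw internal search query into a crawlable, keyword-rich URL path.

        Hypothetical mapping: "?q=red+curtain+fabric" becomes "/search/red-curtain-fabric/".
        """
        slug = query.strip().lower()
        slug = re.sub(r"[^a-z0-9\s-]", "", slug)  # drop punctuation
        slug = re.sub(r"[\s-]+", "-", slug)       # collapse whitespace into single hyphens
        return "/search/%s/" % slug

    if __name__ == "__main__":
        print(search_query_to_friendly_url("Red Curtain Fabric!"))  # /search/red-curtain-fabric/

The payoff is that any links people point at popular searches accumulate against one clean, consistent URL instead of a parameter soup.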

Look at how folks are linking to these search query pages, and note the first example (where instead of a URL rewrite, this site is using breadcrumbs to make their ranking page URL appear more friendly):

interesting search results pages with links

4) Microformats are really gaining traction – be creative with them

What we’ve found with Microformats is that webmasters tend to apply the markup to the web pages hosting the content, but that’s where they stop. Imagine you have a website that sells tickets. Do you add hCalendar to your event page and walk away? No! You can nest other Microformats such as hProduct and hReview, and syndicate your formatted data to other internal pages, snippets on your homepage and category pages. Any mention of an event, a link to a product or a review snippet should use the appropriate mark-up, consistently across your website.

5) Work hard to resolve errors and improve site speed

Think about how Google have placed site performance at the top of their agenda. I genuinely believe that a site riddled with performance issues and errors is tolerated less by search engines today than ever before. Websites with platform issues can raise serious problems for SEO, users, conversion and repeat visits. Fortunately, there are plenty of tools (including SEOmoz Pro, IIS Toolkit, Pingdom Tools and Webmaster Tools from Bing and Google) to help you identify and tackle these issues head on. Go and set aside some performance maintenance time if you haven’t done so for a while.

6) Watch your homepage title in Google’s SERPs

Google can be pretty aggressive when it comes to choosing the most appropriate text to appear in your title snippets. Sometimes, you might disagree with Google’s choice! Our tests so far indicate that the NOODP meta tag (intended to prevent Google from displaying the DMOZ description in your SERPs) can stop Google from rewriting your title, even if you have no DMOZ listing.

From this:

Without ODP

To this:

better title display in serps
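If you want to check which of your templates already carry the directive, a rough standard-library sketch like the one below will do; the regex is deliberately naive (it assumes the name attribute appears before content), so treat it as a starting point rather than a definitive checker:

    import re
    import urllib.request

    ROBOTS_META = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noodp[^"\']*["\']',
        re.IGNORECASE,
    )

    def has_noodp(url):
        """Fetch a page and report whether a robots meta tag containing NOODP is present."""
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
        return bool(ROBOTS_META.search(html))

    if __name__ == "__main__":
        print(has_noodp("http://www.example.com/"))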

That “penny drop” moment when a new technical SEO strategy idea presents itself has to be my favourite part of SEO work. I’m glad that technical strategy has to evolve as search engines develop. I really can’t see a time in the near future when that will change.

If you’d like to hear more tips, I’ll be speaking at next week’s A4Uexpo in London on exactly this topic. If you’re there, be sure to drop by and say hello. My buddy Dave Naylor will be introducing me (I have no idea what he’s going to say) and hopefully there’s going to be some time to do a preview of the session over on his blog soon.


Posted by Kate Morris

As a consultant, I work with many in-house SEO teams on strategy and other issues that arise throughout the course of the year. One trend we are seeing is that these in-house teams are having a hard time coming up with accurate traffic-centered goals. Traffic is the basis for many metrics, so being able to predict that number semi-accurately for the coming year is important for every business.

I can hear you all now, "Well there is the Google Keyword Tool … use that." Typically, that is my answer too, but there have been major questions about the accuracy of Google’s keyword tools and others available to webmasters, marketers, and search engine optimization teams.

(If you will comment with your favorite keyword tool other than those I mention, I’ll happily test and add it here!)

The Google Keyword Tools (yes, plural)

There was a shift recently with the Google Keyword Tool. The Legacy/API version is showing different numbers than the newest Beta interface. David Whitehouse and Richard Baxter both noticed this shift as well and did a few tests on accuracy. The jury is still out as to which version is more accurate, the legacy or the new keyword tool. Like Mr. Whitehouse, I believe the newer tool holds the more up-to-date data, but that does not necessarily make it more accurate.

To be clear, when I speak of the Legacy, API, and Beta tools, I do mean different versions of the Google Keyword Tool. First, from what I can see using the SEOmoz Keyword Difficulty tool, the Google API pulls from the Legacy tool, so they are one and the same. The Legacy tool is the prior interface for the current Beta version of the Keyword Tool. We had previously assumed that these pulled the same numbers, but my research and that of others proves otherwise.

But wait! *infomercial voice* There is more!

There is also the Search-based Keyword Tool, which aids AdWords advertisers in choosing relevant keywords based on search behavior and a specified website. This tool is explained by Google here and gives more in-depth information on account organization and cost.

But even this tool is not on par with the other two when it comes to impressions. A random query in the Search-based tool returned a suggestion for the keyword "maragogi." The Search-based tool says there should be 12,000 monthly searches. The Legacy tool returns 110 Local Exact match searches, 33,100 Global Exact match, and 201,000 Global Broad match. The new tool returns information only for a global setting (all countries, all languages). That returns 74,000 searches for broad and phrase match, and 12,100 for exact match. It seems like the Search-based tool is closest to the global exact match in this one instance. But what is a business supposed to do with all of these numbers?!?!?

(hint: always use exact match)

Back to Strategy

If these tools are possibly inaccurate, how do our clients go about setting their yearly strategy goals?

Put simply, in search, you never want to rely on one set of results or one ranking report. Data over time and from many sources is best. But with the lack of tools out there and Google bringing in at least 65% of traffic organically for most sites, how do you get the best numbers? 

Impressions

First, you need to start out by figuring out how many impressions a keyword or set of keywords can bring in on average for a specific month. If you are in a cyclical industry, this will have to be done per month of the calendar year. 

1. Pull from both Google Tools and other Keyword Tools

Below is a look at some information I pulled using the tools mentioned for the key phrase "curtain fabric."

The idea here is that if you take into account all of the numbers out there, you might see a trend that you can use for estimating future traffic. If there is no trend, then a median of the numbers can be used as your metric. A few other tools that you might look into include WordTracker and KeywordSpy. You can see that the numbers are all over the place, but looking at these figures, I’d guess that the keyword might bring in around 6,500 impressions a month in the UK.

The downside is that WordTracker and KeywordSpy don’t allow you to look at exact match information versus broad match. When performing keyword research, you always want to look at the local (targeted to your country) exact match information. Too many people pull keyword information using broad match and get inflated numbers for all phrases related to that key phrase.
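To make the aggregation concrete, here is a minimal sketch of the median approach; the per-tool figures below are placeholders for illustration, not the real numbers from the table above:

    from statistics import median

    # Placeholder local exact-match estimates for one key phrase from several tools.
    estimates = {
        "google_new_tool": 5400,
        "google_legacy_tool": 6600,
        "wordtracker": 8100,
        "keywordspy": 7200,
        "search_based_tool": 4900,
    }

    monthly_impressions = median(estimates.values())
    print("Estimated monthly impressions: %d" % monthly_impressions)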

2. Run a PPC campaign if possible.

The absolute best way to get accurate numbers about traffic over time is to run a PPC campaign. I pulled some numbers from a few campaigns (for our clients’ sake we have masked a number of the actual key phrases) in an attempt to see if the new keyword tool is accurate against actual traffic in August. The keywords pulled were all exact match in the campaign, and the information pulled from the keyword tool was Local Exact and set to the country that the campaign was targeting.

As you can see, some of these are higher and some lower. What I found is that there really is no definitive answer as to whether the Google Keyword Tool is accurate. Take a look at the results for the example I used before, curtain fabric. The campaign saw 11,389 impressions, much higher than the new keyword tool, and lower than some other keyword tools. This is why a well-run PPC campaign is important if you want to get a more accurate look at impression numbers.

Please note that I didn’t get a chance to ensure that these accounts were all showing at all times during the month, but they were all accurately geo-targeted and all showed on the top of the first page on average. 

Finding Traffic Based on Rank

After getting a good idea of the number of impressions, you then need to take into account where you are showing for that keyword on average organically (aka your rank). While we cannot know specific click-through numbers for every search done on the web, there have been some studies on what share of those impressions the top organic result gets, the second, and so on. The one I use most often is from Chitika. Using the percentages below and the impression numbers, you should be able to get a good idea of the visitors you can expect per month organically for a specific key phrase.

 

So using the "curtain fabric" example, assuming that the site I am working on has maintained an average ranking over the last few months of #3 organically, I could expect about 1300 visits from Google for the keyword in a month (11.42% of 11,389 impressions).
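That arithmetic, written out as a quick sketch (the 11.42% click-through rate for position three is the figure from the Chitika study cited above; the impressions come from the August PPC campaign):

    # Position-3 organic click-through rate from the Chitika study cited above.
    CTR_POSITION_3 = 0.1142

    monthly_impressions = 11389  # August PPC impressions for "curtain fabric"

    estimated_visits = monthly_impressions * CTR_POSITION_3
    print("Expected organic visits per month at #3: %.0f" % estimated_visits)
    # ~1,301 visits, i.e. the "about 1300" figure above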

Past Metrics

Once you get everything figured out, keep in mind that your past metrics are another good way of seeing how close you are to getting the traffic about right. Assuming that no major changes have occurred (like lack of metrics data in the last year), a look back is the most accurate way to understand traffic flow and trending on your site. Pull the unique visitors for every month of the last year and do some analysis on percent increase month over month. This can be done on any level in most analytics programs - overall traffic trends all the way down to the keyword level.

A look at overall traffic per month in Google Analytics for organic searches from Google:

A look at traffic for a specific keyword over the last year per month from Google organic:
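A small sketch of that month-over-month analysis, using made-up unique visitor counts in place of a real analytics export:

    # Hypothetical monthly unique visitors pulled from your analytics package.
    monthly_uniques = [12400, 13100, 12900, 14250, 15000, 14800,
                       15600, 16200, 17100, 16900, 18300, 19000]

    for prev, curr in zip(monthly_uniques, monthly_uniques[1:]):
        change = (curr - prev) / prev * 100
        print("%6d -> %6d  (%+.1f%% month over month)" % (prev, curr, change))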

Educated Guesses

In the end, though, predictions are just that: educated guesses. Pulling data from all available sources and using your own historical data can assist in making an educated prediction for the next year. Keep in mind, though, that things never stay the same. Google Instant just proved that with one of the biggest changes we have seen in a while.

Alexa, a free and well-known website information tool, recently released a paid service.

For a one-time fee per site, Alexa will audit your site (up to 10,000 pages) and return a variety of different on-page reports relating to your SEO efforts.

It has a few off-page data points but it focuses mostly on your on-page optimization.

Alexa Site Audit Review Homepage

You can access Alexa’s Site Audit Report here:

http://www.alexa.com/siteaudit

Report Sections

Alexa’s Site Audit Report breaks the information down into 6 different sections (some of which have additional sub-sections as well):

  • Overview
  • Crawl Coverage
  • Reputation
  • Page Optimization
  • Keywords
  • Stats

The sections break down as follows:

Site Audit sections and subsections

So we ran Seobook.com through the tool to test it out :)

Generally these reports take about a day or two; ours hit some type of processing error, so it took about a week.

Overview

The first section you’ll see is the number of pages crawled, followed by 3 “critical” aspects of the site (Crawl Coverage, Reputation, and Page Optimization). All three have their own report sections as well. Looks like we got an 88. Excuse me, but shouldn’t that be a B+? :)

So it looks like we did just fine on Crawl Coverage and Reputation, but have some work to do with Page Optimization.

Alexa Site Audit Overview

The next section on the overview page is 5 recommendations on how to improve your site, with links to those specific report sections as well. At the bottom you can scroll to the next page or use the side navigation. We’ll investigate these report sections individually, but I think the overview page is helpful in getting a high-level view of what’s going on with the site.

Alexa Site Audit Overview

Crawl Coverage

This measures the “crawl-ability” of the site, internal links, your robots.txt file, as well as any redirects or server errors.

Reachability

The Reachability report shows you a breakdown of which HTML pages were easy to reach versus which ones were not so easy to reach. Essentially, for our site, the breakdown is:

  • Easy to find – 4 or fewer links a crawler must follow to get to a page
  • Hard to find – more than 4 links a crawler must follow to get to a page

The calculation is based on the following method used by Alexa in determining the path length specific to your site:

Our calculation of the optimal path length is based on the total number of pages on your site and a consideration of the number of clicks required to reach each page. Because optimally available sites tend to have a fan-out factor of at least ten unique links per page, our calculation is based on that model. When your site falls short of that minimum fan-out factor, crawlers will be less likely to index all of the pages on your site.
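Reading that description literally, a fan-out of ten unique links per page means roughly ten times more pages become reachable with each additional click, so the optimal depth grows logarithmically with site size. Here is a small sketch of that interpretation (mine, not Alexa's published method):

    def optimal_path_length(total_pages, fan_out=10):
        """How many clicks deep a crawler must go if every page exposes `fan_out` unique links."""
        depth, reachable = 0, 1
        while reachable < total_pages:
            depth += 1
            reachable *= fan_out
        return depth

    if __name__ == "__main__":
        for pages in (100, 1000, 10000, 100000):
            print("%6d pages -> every page reachable within ~%d clicks" % (pages, optimal_path_length(pages)))

For the 10,000-page crawl limit this works out to a depth of four clicks, which lines up with the "easy to find" cut-off in the report above.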

Alexa Site Audit Reachability Report

A neat feature in this report is the ability to download your URLs plus the number of links the crawler had to follow to find each page in .CSV format.

Alexa Site Audit Reachability Report Download Links

This is a useful feature for mid- to large-scale sites. You can get a decent handle on internal linking issues that may be affecting how relevant a search engine feels a particular page might be. This report can also spot weaknesses in your site’s linking architecture from a usability standpoint.

On-Site Links

While getting external links from unique domains is typically a stronger component of ranking a site, it is important to have a strong internal linking plan as well. Internal links are important in a few ways:

  • They are the only links where you can 100% control the anchor text (outside of your own sites of course, or sites owned by your friends)
  • They can help you flow link equity to pages on your site that need an extra bit of juice to rank
  • Users will appreciate a logical, clear internal navigation structure and you can use internal linking to get them to where you want them to go

Alexa will show you your top internally linked-to pages:

Onsite Links Alexa Site Audit

You can also click the link to the right to expand and see the top ten pages that link to that page:

Expanded Onsite Links Report

So if you are having problems trying to rank some sub-pages for core keywords or long-tail keywords, you can check the internal link counts (and see the top 10 linked from pages) and see if something is amiss with respect to your internal linking structure for a particular page.

Robots.txt

Here you’ll see if you’ve restricted access to these search engine crawlers:

  • ia_archiver (Alexa)
  • googlebot (Google)
  • teoma (Ask)
  • msnbot (Bing)
  • slurp (Yahoo)
  • baiduspider (Baidu)
Site Audit Robots.Txt

If you block out registration areas or other areas that are normally restricted, then the report will say that you are not blocking major crawlers but will show you the URLs you are blocking under that part of the report.

There is nothing groundbreaking about robots.txt checks, but it’s another part of a site that you should check when doing an SEO review, so it is a helpful piece of information.

Redirects

We all know what happens when redirects go bad on a mid-large sized site :)

Redirects Gone Bad

This report will show you what percentage of your crawled pages are being redirected to other pages with temporary redirects.

The thing with temporary redirects, like 302s, is that unlike 301s they do not pass any link juice, so you should pay attention to this part of the report and see if any key pages are being redirected improperly.

Redirect Report Alexa Site Audit

Server Errors

This section of the report will show you any pages which have server errors.

Alexa Site Audit Server Errors

Making sure your server is handling errors correctly (such as a 404) is certainly worthy of your attention.

Reputation

This module covers only external links from authoritative sites and how your site ranks against “similar sites” with respect to the number of sites linking to each.

Links from Top Sites

The analysis is based on the aforementioned formula:

Alexa Reputation

Then you are shown a chart which plots your site and related sites (according to Alexa) against the total links pointing at each site, placing the sites in a specific percentile based on links and Alexa Rank.

Since Alexa is heavily biased towards webmaster-type sites based on its user base, these Alexa Ranks are probably higher than they should be, but it’s all relative since all sites are being judged on the same measure.

Alexa Site Audit Link Chart

The Related Sites area is located below the chart:

Related Sites Link Module Alexa Audit

Followed by the Top Ranked sites linking to your site:

Alexa Site Audit Top Ranked Sites

I do not find this incredibly useful as a standalone measure of reputation. As mentioned, Alexa Rank can be off and I’d rather know where competing sites (and my site or sites) are ranking in terms of co-occurring keywords, unique domains linking, strength of the overall link profile, and so on as a measure of true relevance.

It is, however, another data point you can use in conjunction with other tools and methods to get a broader idea of how your site and related sites compare.

Page Optimization

Checking the on-page aspects of a mid-large sized site can be pretty time consuming. Our Website Health Check Tool covers some of the major components (like duplicate/missing title tags, duplicate/missing meta descriptions, canonical issues, error handling responses, and multiple index page issues) but this module does some other things too.

Link Text

The Link Text report shows a break down of your internal anchor text:

Link Text Report Alexa

Click on the pages link to see the top pages using that anchor text to link to a page (it shows the page the text is on as well as the page it links to):

Link Expansion Site Audit Report

The report is based on the pages it crawled so if you have a very large site or lots and lots of blog posts you might find this report lacking a bit in terms of breadth of coverage on your internal anchor text counts.

Broken Links

Checks broken links (internal and external) and groups them by page, which is an expandable option similar to the other reports:

Alexa Broken Links Report

Xenu is more comprehensive as a standalone tool for this kind of report (and for some of its other link reports as well).

Duplicate Content

The Duplicate Content report groups all the pages that have the same content together and gives you some recommendations on things you can do to help with duplicate content like:

  • Working with robots.txt
  • How to use canonical tags
  • Using HTTP headers to thwart duplicate content issues
Alexa Duplicate Content Overview

Here is how they group items together:

Alexa Duplicate Content Grouped Links

Anything that can give you some decent insight into potential duplicate content issues (especially if you use a CMS) is a useful tool.

Duplicate Meta Descriptions

No duplicate meta descriptions here!

Alexa Site Audit Duplicate Meta Descriptions

Fairly self-explanatory, and while a meta description isn’t incredibly powerful as a standalone metric, it does pay to make sure you have unique ones for your pages as every little bit helps!

Duplicate Title Tags

You’ll want to make sure you are using your title tags properly and not attacking the same keyword or keywords in multiple title tags on separate pages. Much like the other reports here, Alexa will group the duplicates together:

Alexa Site Audit Duplicate Title Tags

They do not currently offer a missing title tag or missing meta description report, which is unfortunate because those are worthwhile metrics to report on.

Low Word Count

Having a good amount of text on a page is a good way to work in your core keywords as well as to help in ranking for longer-tail keywords (which tend to drive lots of traffic to most sites). This report kicks out pages which have (looking at the stats) fewer than 150 words or so on the page:

Alexa Site Audit Low Word Count

There’s no real magic bullet for the number of words you “should” have on a page. You want to have the right balance of word count, images, and overall presentation components to make your site:

  • Linkable
  • Textually relevant for your core and related keywords
  • Readable for humans

Image Descriptions

Continuing on with the “every little bit helps” mantra, you can see pages that have images with missing ALT attributes:

Alexa Site Audit ALT Attribute Overview

Alexa groups the images on a per-page basis, so just click the link to the right to expand the list:

Alexa Site Audit ALT Attribute Groupings

Like meta descriptions, this is not a mega-important item as a standalone metric, but it helps a bit, particularly with image search.

Session IDs

This report will show you any issues your site is having due to the use of session IDs.

Alexa Site Audit Session ID

If you have issues with session IDs and/or other URL parameters, you should take a look at using canonical tags or Google’s parameter handling (mostly to increase the efficiency of your site’s crawl by Googlebot, as Google will typically skip the crawling of pages based on your parameter list).

Heading Recommendations

Usually I cringe when I see automated SEO solutions. The headings section contains “recommended” headings for your pages. You can download the entire list in CSV format:

Automated Headings Alexa

The second one listed, “interface seo”, is on a page which talks about Google adding breadcrumbs to the search results. I do not think that is a good heading tag for this blog post. I suspect most of the automated tags are going to be average to less than average.

Keywords

Alexa’s Keyword module offers recommended keywords to pursue as well as on site recommendations in the following sub-categories:

  • Search Engine Marketing (keywords)
  • Link Recommendations (on-site link recommendations)

Search Engine Marketing

Based on your site’s content Alexa offers up some keyword recommendations:

Alexa Site Audit Keyword Recommendations

The metrics are defined as:

  • Query – the proposed keyword
  • Opportunity – (scales up to 1.0) based on expected search traffic to your site from keywords which have a low CPC. A higher value here typically means a higher query popularity and a low QCI. Essentially, the higher the number the better the relationship is between search volume, low CPC, and low ad competition.
  • Query Popularity – (scales up to 100) based on the frequency of searches for that keyword
  • QCI – (scales up to 100) based on how many ads are showing across major search engines for the keyword

For me, it’s another keyword source. The custom metrics are ok to look at but what disappoints me about this report is that they do not align the keywords to relevant pages. It would be nice to see “XYZ keywords might be good plays for page ABC based on ABC’s content”.

Link Recommendations

This is kind of an interesting report. You’ve got 3 sets of data here. The first is the “source page”: a listing of pages that, according to Alexa’s crawl, appear to be important to search engines and are easily crawled:

Alexa Site Audit Link Recommendations

These are the pages Alexa feels you should link from. The next 2 data sets are in the same table. They are “target pages” and keywords:

Alexa Site Audit Link Recommendations Target

Some of the pages are similar but the attempt is to match up pages and predict the anchor text that should be used from the source page to the target page. It’s a good idea but there’s a bit of page overlap which detracts from the overall usefulness of the report IMO.

Stats

The Stats section offers 3 different reports:

  • Report Stats – an overview of crawled pages
  • Crawler Errors – errors Alexa encountered in crawling your site
  • Unique Hosts Crawled – number of unique hosts (your domain and internal/external domains and sub-domains) Alexa encountered in crawling your site

Report Stats

An overview of crawl statistics:

Alexa Site Audit Report Stats

Crawler Errors

This is where Alexa shows what errors, if any, it encountered when crawling the site:

Alexa Site Audit Crawl Errors

Unique Hosts Crawled

A report showing which sites you are linking to (as well as your own domain/subdomains):

Alexa Site Audit Unique Hosts

Is it Worth the Price?

Some of the report functionality is handled by tools that are already available to you, in some cases for free. Xenu does a lot of what Alexa’s link modules do, and if you are a member here the Website Health Check Tool does some of the on-page stuff as well.

I would also like to see more export functionality, especially for white-label reporting. The crawling features are kind of interesting and the price point is fairly affordable as a one-time fee.

The Alexa Site Audit Report does offer some benefit IMO and the price isn’t cost-prohibitive, but I wasn’t really wowed by the report. If you are ok with spending the fee to get a broad overview of things, then I think it’s an ok investment. For larger sites, sometimes finding (and fixing) only 1 or 2 major issues can be worth thousands in additional traffic.

It left me wanting a bit more though, so I might prefer to spend that money on links, since most of the tool’s functionality is available to me without paying the fee. Further, the new SEOmoz app also covers a lot of these features & is available at a monthly price-point, while allowing you to run reports on up to 5 sites at a time. The other big thing for improving the value of the Alexa application would be if they allowed you to run a before-and-after report as part of their package. That way in-house SEOs can not only show their boss what was wrong, but can also use that same 3rd party tool as verification that it has been fixed.


Posted by fabioricotta

Hi SEOmoz folks,

Sometimes we begin a new SEO consulting job and do not know where to start our link building. We have a lot of options, but the first thing I really like to do is to analyze what my competitors are doing. As we know, one of the best ways to analyze backlinks is by using Open Site Explorer (OSE). With this tool we can submit a domain and see which pages on the web are linking to it, along with some awesome metrics. We can use it to begin our analysis.

The first thing you need to do is create a competitor list. Then go to OSE and insert your competitor(s) domain(s). Then filter by links from "External Pages Only" and "All Pages in the Root Domain", as you can see below. With these filters, we ensure we get an overall look at your competitors’ website backlinks.

Open Site Explorer

After the above steps, we need to export all this data by clicking on "Export to CSV". After that, you will import this data into Excel:

Import CSV to Excel

Next, remove the first 6 lines, as they are only comments. Then select the first line, click on the Data tab and select "Filter". This will give you the ability to filter and sort every column.

Now we can begin our competitor analysis. For this part, I have chosen 9 commonly used link building strategies that you can investigate with OSE to find out what your competitors are doing. So, let’s take a look:

Finding Directories

As some SEOs know, using directories as part of your link building strategy can add good value to your backlink profile. If your competitor is using any directory strategy, we can find it in the OSE data by filtering the Title column with the text filter "directory", or by filtering the URL column with the text "directory". The good part is that you can see the Page Authority and Domain Authority of each directory page that your competitor is listed in and figure out which ones you should submit your website to. A "bonus" filter you can use is to filter by PA above 5 and DA above 20, which will remove all the bad directories from your list.
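The same filtering can also be scripted against the exported CSV if you prefer. The column names below are assumptions about the OSE export headers, so rename them to match your file:

    import pandas as pd

    # Skip the 6 comment lines at the top of the OSE export (see the Excel step above).
    links = pd.read_csv("competitor_backlinks.csv", skiprows=6)

    # Column names are assumptions about the export; adjust to match your CSV.
    is_directory = (
        links["Title"].str.contains("directory", case=False, na=False)
        | links["URL"].str.contains("directory", case=False, na=False)
    )
    quality = (links["Page Authority"] > 5) & (links["Domain Authority"] > 20)

    directories = links[is_directory & quality].sort_values("Domain Authority", ascending=False)
    directories.to_csv("directory_targets.csv", index=False)
    print("%d directory pages worth reviewing" % len(directories))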

Niche Forums

One thing that I really like is forums, maybe because most of my knowledge came from them. Thinking about link building and SEO, when you find a niche forum, you find a community that talks about the same thing (or something related) as you. If those members recommend your services you can get really good leads. So, one thing you can do is investigate which forums your competitors were recommended in, so you can interact with those people. The idea here is to filter the Title column by the text filter "forum", or filter the URL column with the text "forum". This will retrieve all the forums that provide at least 1 link to your competitor. You can use the same tip here that I gave in the previous section.

Powerful Profile Pages

Sometimes when we do a link building strategy we use profiles to post and interact with customers and people who care about our website. And sometimes, the platforms we use for this provide ways to drop a link (e.g. a user website field). Based on this idea, one cool thing to do is check which social networks your competitor is working on. You can do it easily by filtering the URL column or Title column with the text filters "user" or "profile". After identifying those profiles, check how your competitor is working with them: how they interact with the community, whether they are creating new content, and which keywords they are using in that new content.

A good tip here is to check the backlinks to those profile pages. We noticed that some competitors are buying links to their profile pages, so they can get more juice and spread it to their content. I am not telling you to do the same, but maybe you can file a spam report.

Tag Pages

A common and cheap link building tactic is to submit your website to social bookmarking websites. Sometimes social bookmarking does not provide strong enough value on its own, but many SEOs use it as a base for their link building strategy. So, you can find which social bookmarking websites your competitors are using. The good thing (tip) here is to find a niche social bookmarking website. Those kinds of websites can provide you some good leads as they are related to your niche. So, be careful when checking this.

To find the tag pages and then the social bookmarking websites, you can filter the Title column by text filters "tag" or "tagged". Another filter you can use is "tag" in the URL column.

Where They are Submitting Articles

As Rand pointed out in a previous Whiteboard Friday, if you create a good article submission strategy you can get some good links and traffic. For example, you can filter the URL column with some already known article directories ("ezinearticles.com", "amazines.com", "articlealley.com", "articleindex.net", "goarticles.com", "articlesltd.net", "365articles.com", "articletrader.com", "articlesbase.com", "thebestarticles.com", "mycontentbuilder.com", "thinkarticle.com", "articlerumble.com", "gsarticles.com", etc…).

The idea here is to find where your competitors are gaining links and then find their profiles. After that, grab a list of all the articles they posted and run an OSE report for each link (you can do it using the SEOmoz API). Check which ones have a large number of backlinks. Then check why they attracted so many links and use that idea to create some new content.

A bonus tip here is that some article directories enable comments with links… so, try to comment on your competitors’ best articles.

Resource Pages as Good Backlink Sources

Some years ago, one of the common things webmasters did was create pages listing useful links as resources. Nowadays it’s not so common, but the point is that there are a lot of resource pages out there. So you can check if your competitor is listed on any resource page and then ask the webmaster to include your valuable website. It’s really easy, but don’t forget to be generous and really show that your website can help their visitors.

To find the resource pages, you can filter the URL column using the text filter "resources". I’ve tried filtering the Title column too, but I didn’t like the results I found.

Competitors Press Releases

When we talk about press releases we need to be careful about our objectives. The first thing here is to identify which company your competitor is using to distribute their press releases. You can filter the URL column by the common PR distribution companies ("prweb", "send2press", "prnewswire", etc…), and since those companies sometimes publish the press release on their own domain, you can find your competitor’s press releases. The second step is to grab a list of all the press releases they published and do the same thing I described for article directories. Find which press releases attracted the most links and why. This will give you some advantage in your next press release.

Linkbait with InfoGraphics

One of the latest link building tactics is to create amazing infographics. The cool thing for link building is that if you create a good infographic it can go viral and provide a lot of backlinks. So the point here is to see if your competitors are using infographics to get links. To check, just filter the Title column by the text filter "infographic" and you will find the list of pages with infographics that link to your competitors.

You might tell me, "Hey, when I create an infographic I post it at my site, not on someone else’s blog". You are right, but some websites can’t use or post those kinds of images within their own structure, so they need to publish them as guest posts.

A tip here: if you find an infographic inside a blog, don’t forget to leave a comment in the comments area. You can get some value there.

Trusted links: Any .EDU or .GOV links?

Most link builders love .edu and .gov links. They are strong, they are trusted and they really rock. Based on that, you can check if your competitors have any links coming from those TLDs. You can find them by filtering the URL column with the text filter ".edu" or ".gov".

You need to check why your competitors have those links and then try to find a way to get them. Don’t forget to avoid those spammy .edu networks.

Wikipedia Links

Known worldwide, Wikipedia is a great source of visitors and leads. We can’t count its backlinks for ranking purposes because of nofollow, but they still provide value by sending you traffic. We have run Wikipedia strategies for some clients and those links keep growing their referral visits. You can find the Wikipedia pages that link to your competitors by filtering the URL column with the text filter "wikipedia.org".

One thing to remember is that Wikipedia’s moderators do not like spam or commercial stuff. So the easiest way we have found to get a link from them is by adding some valuable content, especially notes about statistics that you published in your press release. This really rocks, and in most cases they allow you to reference your data source (you).

Conclusions

We saw in this article that an SEO tool such as Open Site Explorer can help you find out what your competitors are doing, providing insights on how to create your own SEO strategy. It is important to highlight that I am not telling you to get the same backlinks your competitors have; what I am trying to show is that you can begin your strategy by taking the best of what your competitors did, and then improve on it with your own ideas.

Hope you liked this post!

Fabio Ricotta is the Co-Founder of MestreSEO, a Brazilian SEO company.


Posted by Danny Dover

Want happier website visitors and higher rankings? This week’s Whiteboard Friday is about how and why to speed up your website. It is more technical than previous videos, so I tried to spice it up with an ode to one of my favorite canceled TV shows, Pop-up Video. Can’t stand the content? At least the added commentary is entertaining. (It is the perfect plan ;-p)



7 Ways to Take Advantage of Google’s Site Speed Algorithm

The following are seven proven techniques well-known websites use to boost their site speed.

1. Enable Gzip

Gzip is an open source compression algorithm that can be used to compress your website’s content before your server sends the data to a visitor’s browser. This makes your server’s job easier and makes pages load faster for your users. You can learn how to enable Gzip here.
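A quick way to confirm compression is actually being served is to request a page while advertising gzip support and inspect the response headers; here is a standard-library sketch:

    import urllib.request

    def served_with_gzip(url):
        """Request a page with gzip advertised and report whether the server compressed the response."""
        req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            return "gzip" in (resp.headers.get("Content-Encoding") or "").lower()

    if __name__ == "__main__":
        print(served_with_gzip("http://www.example.com/"))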

2. Minify Javascript/CSS

Minification is the process of removing unnecessary formatting characters from code (Minify is also the name of software that does this). This makes your files smaller and your visitors happier. You can learn all about this process here.

3. Use a CDN (Content Distribution Network)

CDNs are systems of interconnected server resources that spread content and assets around the globe to shorten the distance between server and prospective user. They are commonly used by the Web’s most popular websites. You can find a list of free CDNs here.

4. Optimize Images

You can take advantage of the countless man hours that have been devoted to image compression and make your users happier by simply saving your images as the appropriate type. As a very general rule of thumb, I recommend saving photos as JPEGs and graphics as PNGs.

5. Use External Javascript/CSS

When a browser requests a website from a server, it can only download a set number of files of the same type at any given point. While this isn’t true of all file types, it is a good enough reason to host applicable files on alternative subdomains. This is only recommended for sites where the pros of speed will outweigh the SEO cons of creating a new subdomain.

6. Avoid Using Excess Redirects

While redirects can be extremely useful, it is important to know that implementing them does force your servers to do slightly more work per applicable request. Always avoid redirect chains (301 -> 301 -> 200 or, even worse, 301 -> 302 -> 200) and use these tools sparingly.
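To audit this at scale, a small sketch using the third-party requests package (assuming it is installed) will list every hop for a URL so chains stand out:

    import requests  # third-party: pip install requests

    def redirect_chain(url):
        """Follow a URL and return every hop, so 301 -> 302 chains are easy to spot."""
        resp = requests.get(url, allow_redirects=True, timeout=10)
        return [(r.status_code, r.url) for r in resp.history] + [(resp.status_code, resp.url)]

    if __name__ == "__main__":
        for status, hop in redirect_chain("http://example.com/old-page"):
            print(status, hop)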

7. Use Fewer Files

The most straightforward way to speed up your website is to simply use fewer files. Fewer files mean less data. My favorite method of doing this is utilizing CSS sprites. You can read how popular websites are using this trick here.


Google’s Mission to Speed Up the Web

Fueled by the massive potential of the Internet, Googlers are working on many projects in their attempt to speed up the Web:



Follow me on Twitter, Fool!
or
Follow SEOmoz on Twitter (who is slightly less blunt)

If you have any other advice that you think is worth sharing, feel free to post it in the comments. This post is very much a work in progress. As always, feel free to e-mail me if you have any suggestions on how I can make my posts more useful. All of my contact information is available on my SEOmoz profile under Danny. Thanks!


Posted by randfish

After last week’s Whiteboard Friday on the penalties paid links can incur, I got several questions about whether paid/spammy links could be used as a weapon to potentially harm someone else’s rankings. In this post, I’ll walk through why this is rarely the case, how you can defend yourself from potential scenarios and why this isn’t a great tactic to employ against your competitors.

Can Paid Links Be Used as Weapons in the SERPs?

The short answer is "almost never." But, as is typical in the SEO world, there’s a lot more in the long version.

In general, it’s very, very hard to bring down a white hat site/page ranking well in the search results. Although Google isn’t perfect at catching spam (e.g. our recent video featuring the success of some very obvious paid links in a well known network), they seem to be surprisingly excellent (almost prescient) at detecting the intent of links. My suspicion is that sites who buy links to prop up their own rankings have very different patterns than those who have competitors buying links to them. These patterns exist on the sites themselves, in other sites registered to the owners, in link footprints and in usage/search behavior.

Effect of Spammy/Paid Links on Websites

It could, in fact, be that the "penalties" many SEOs often ascribe to paid links are in fact the result of a much more sophisticated analysis by Google, looking at multiple aspects of a site’s presence before making a determination of the link intent. Given that, in nearly 10 years of SEO, I’ve only heard of two reasonably verifiable instances of "Google-bowling" (the process of pointing bad links at a site or page to hurt its rankings) working, my guess is that Google’s webspam team has developed some very impressive methods here.

Many SEOs have also suggested that a certain "bar of trust" can be achieved in Google, after which, negative links may be devalued, but likely don’t cause penalties or rankings drops. This makes a lot of sense to me (though it’s nearly impossible to prove), since "Google-bowling" is largely defeated and even good sites who stray into black/gray hat link building will simply find themselves wasting money, rather than being removed from the results (which could, for many popular brands/sites, cause a loss of relevance in the results for users).

Thus, if you are trying to wield paid links as a weapon against your ranking competitors, it’s far more likely to work against the new(ish) site ranking #65 for your keywords rather than those who’ve earned their way to the top spots with white hat techniques.

Defending Yourself from Potential Link Attacks

Have you recently broken the heart of a black hat link broker’s son or daughter? Stepped on a link farmer’s superhero cape? Talked smack about a nefarious panelist at an SEO conference not realizing they were just around the corner? The best defense, in this case, is a good defense (don’t go buying and renting links to others; you’re only enriching the spammers).

Many, many SEOs and webmasters worry a tremendous amount about spammy links pointing to their sites and pages. By and large, this isn’t a concern and it happens to every site on the web. Just look at some of the spamtastic links that point to SEOmoz (via this Yahoo! query):

Spammy Links to SEOmoz

If you see a collection of scraper sites filled with pharmaceutical, financial, legal, real estate and other questionable links with surprisingly well-optimized anchor text appearing in Google Alerts or your 24-hour reputation monitoring queries (e.g. http://www.google.com/search?as_q=seomoz&as_qdr=d&num=100 - which queries Google for all pages mentioning "seomoz" in the past 24 hours) don’t panic. If you exist on the web, you’re going to attract these types of links and the search engines will not punish you for it, even if you’re a relatively new, untrusted site.

However, if you start acquiring links that look an awful lot like they’re part of an intentional, paid link network (great anchor text, pointing to internal pages on the site, coming from footers and sidebars that contain other irrelevant, anchor-text rich links), there may be some cause for concern. Your best course of action is to submit a spam report to Google from your own, verified, Webmaster Tools account, noting that you have nothing to do with the links and want to make sure Google doesn’t think you’ve created, endorsed or paid for them.

This action is rarely necessary or worthwhile, but if you’re highly concerned about competitive conduct, it’s not a bad route to take. Of course, you’ll want to make sure you don’t actually engage in any black/gray hat activity yourself or it could trigger the wrong kind of review by a webspam team member.

Should I Buy Links to Push Down My Competitors?

Not unless you feel the link brokers of the world are more worthy than your favorite charity.

Seriously, the chances you’ll have a negative impact are far lower than the chances you’ll actually help (again, I refer back to our paid link WB Friday experiment in which the obvious link network had positive effects, even on the brand new site). The money is far better spent on editorial content, public relations, social media campaigns and white hat SEO efforts for your own site. Bringing someone else down may seem temporarily, emotionally satisfying, but it’s the wrong way to approach SEO (and life in general, if I may be so bold).

Looking forward to the discussion in the comments and happy to talk through the filtration processes and failsafes (or at least, my speculation) Google may employ.

p.s. The new Beginner’s Guide to SEO has more on understanding + recovering from search spam penalties.


Posted by Lindsay

A typical SEO site audit takes me around 50 hours to complete. If it is a small site (<1000 pages), I am working efficiently, and the client hasn’t requested a lot of extra pieces, this figure can come in as low as 35 hours. If the site is large and has a lot of issues to document, the time investment inches closer to 70 hours.

At SEOmoz, we usually asked for a project timeline of six weeks to complete a full site audit. You need the extended schedule for resource coordination, editing for uniform voice and additional considerations when a team is involved. Even working on my own I prefer a six-week timeline because it allows me to juggle several projects simultaneously and to put down and pick up various pieces as the mood strikes.

Regardless of how much time I spend on an audit, the best stuff is usually revealed in the first day. At the beginning of a project you’re excited, the client is excited and there is so much undiscovered opportunity! In this post, I’ll outline my recommendations for making the most of day one on a new SEO audit project. I’ve organized it by retro digital clock time stamp for your visual pleasure.

8:00 Template Prep

Template Preparation

You have a 9:00 client call, so you had better get cracking! Take the time upfront to get your documents ready. The first thing I do once I’ve received a signature on the dotted line is prepare two files: my Excel scorecard and the Word audit document.

The audits I’ve worked on have always been extremely custom. Even so, the base document without client content is around 20 pages. This may sound like a lot, but once you prepare a cover sheet, table of contents, the appropriate headings and sub-headings for all the important SEO factors, and short (reusable) descriptions about each factor… it adds up to a hearty file.

I recommend that you create the base Word and Excel files and save them. Try not to work backwards off of an existing audit that you have on hand. Before I was an SEO myself, I was an SEO client of several smart folks. More than once the deliverables I received included other client names. It happens! ‘CTRL+F’ is not foolproof.

9:00 Client Call

The Client Call

Whether you closed the deal yourself or you are lucky enough to have a fleet of salespeople doing that type of leg-work for you, a client kick-off call once the deal has been signed is important. Spend an hour getting to know your primary contacts. Hopefully this includes a senior stakeholder, a marketing lead, and a development lead. More often than not, these meetings are over the phone with the assistance of a web conferencing tool like GoToMeeting.

A sample agenda is as follows:

  • Introductions (all)
  • Site Tour (client)
  • Past & Present SEO Initiatives (client)
  • Key Areas of Concern (client)
  • What is Required to Get Things Implemented (client)
  • Review of Statement of Work & Deliverables Schedule (you)

When you come out of this meeting, you should have an excellent understanding of the website, business needs, and key pain points from the client. You’ll also have had an opportunity to set expectations.

Bonus Tip: If you are working with an in-house SEO person, find out about the projects they have been trying to push through. You may be able to help them get that SEO enhancement moved up the development pipeline and make them look good in the process.

Coffee Break!

Use this time to recharge your caffeine and make notes about the call.

10:15 Leverage Coworkers

Leverage Your Coworkers

If you are part of a consulting team, like we had at SEOmoz, ping the other SEOs. This is especially true if you will be tackling this particular project solo. Send them an email and request that they conduct a quick 15 minute assessment of the site. We did this with great success at SEOmoz. With a dream team that included Rand, Jen and Danny, the output of 45 combined quick-assessment minutes was incredible.

If you are an independent SEO, you can still use a system like this. Form a group of trusted SEOs and provide this support for each other. Be mindful of NDAs and potential conflicts of interest (see Sarah’s post on consulting contracts for more great details).

10:30 Free Form Exploration

Free Form Exploration

I’m pretty structured in my approach to SEO auditing, but there is nothing structured about my process during the free form exploration phase. I’m all about creating efficiencies through discipline and a deliberate work plan. That is what gets the project done and brings home the bacon. However, I always set aside at least three hours for unstructured play and exploration. SEO is part art and part science. The actions I’m attempting to describe here are definitely more Pablo Picasso than Marie Curie.

I fire up all of my Firefox plugins and browse the site, start GSiteCrawler, hit up Google with a flurry of search operators, run Linkscape/Open Site Explorer, have a grand ol’ time in SEOmoz Labs, and check out the keyphrase landscape with Quintura and SEMrush. One find leads to another and I never know where I’ll end up. No two sites are alike and I’m still coming across things I’ve never seen with each new audit.

CFA Page Analysis
Analyze Page via the mozBar showing a less-than-fantastic title tag

I’d say I find 80% of a site’s issues and opportunities during this brief free form exploration. Most of the remaining 45+ hours of a project are spent elaborating on the findings and detailing the action plan to support my original finds.

Be sure to take notes and screen shots as you go. Bonus points if you manage to input them directly into your master Word file. Huge time saver.

1:30 Lunch

Lunch

Try to step away from the laptop, but bring a notepad with you. No doubt your brain will still be working as your hands work to fill your belly.

2:30 Client Email

Client Email

Based on the morning’s kick-off call and your findings in the free form exploration process you no doubt have a few questions for the client. If you don’t already have access to Webmaster Tools and analytics, now is a good time to ask. I usually have questions for the client about things that aren’t always apparent from an external view of the site such as how their expiring content policies work. This follow-up email keeps the communication lines open, impresses the client because you’ve uncovered so much opportunity already, and gives them a chance to ask additional questions or provide more info.

3:00 Populate Some Data

Populate Some Data

At the end of a busy day I like to shift my focus to something that requires less brain power and benefits from simple functions like copy & paste. I usually wrap up my day by populating things like the current robots.txt file (for analysis later), the top 25 links from Open Site Explorer, etc.
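As one example, snapshotting the live robots.txt into your audit folder is easy to script; a minimal sketch (the file naming is just my own convention) looks like this:

    import urllib.request
    from datetime import date

    def snapshot_robots_txt(domain, out_dir="."):
        """Save a dated copy of the current robots.txt so the audit records what it looked like today."""
        url = "http://%s/robots.txt" % domain
        body = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
        path = "%s/robots_%s_%s.txt" % (out_dir, domain, date.today().isoformat())
        with open(path, "w", encoding="utf-8") as fh:
            fh.write(body)
        return path

    if __name__ == "__main__":
        print(snapshot_robots_txt("www.example.com"))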

Top Pages via OSE
Top Pages via OSE – Yikes! They need to fix those 404s…

Action Items

  1. Take the time to set-up your templates first.
  2. Schedule a call with the client to kick off the project.
  3. Ping your coworkers or a small private SEO network to give a quick assessment.
  4. Give yourself time to play and explore freely.
  5. Get key follow-up questions into the client early.
  6. Choose something easy for the end of the day.

Thanks for giving me a read! I’m working on a bi-weekly series that covers all things audit. If you liked this, you might also like 4 Ways to Improve your SEO Site Audit. You can find me in SEOmoz’s PRO Q&A and on Twitter as @Lindzie.
