Posted by richardbaxterseo
Search engines are more or less constantly driving us SEO people to keep our technical SEO strategy in a state of refinement. This “evolution” of the marketing environment we thrive in is a great thing: it challenges us to come up with new ways to improve our traffic-generating capabilities, user experience, and the overall agility of our websites.
Here are a few ideas (6, to be exact) based on issues I’ve encountered in QA or on our recent client work that I hope will provide a little food for thought the next time you’re planning SEO enhancements to your site.
1) Leverage UGC (review content) beyond the product page
UGC is brilliant, particularly on product, content-thin, or affiliate sites. Making it easy for users to leave a review is powerful stuff, but are you making the most of your precious user-generated comments? Consider this scenario: so many users leave product reviews that you decide to limit the number of reviews visible for each product. You could cherry-pick some of that surplus UGC for category listings pages, adding to the uniqueness of those otherwise content-thin pages too.
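As a rough sketch of the idea, here is one way you might pick which surplus reviews to surface on a category page. The scoring rule (rating first, then review length) and the data shape are my assumptions for illustration, not anything prescribed in the post:

```python
def pick_category_snippets(reviews, limit=3):
    """Choose the highest-rated (then longest) reviews to reuse as
    unique snippets on a category listing page."""
    ranked = sorted(
        reviews,
        key=lambda r: (r["rating"], len(r["text"])),
        reverse=True,
    )
    return [r["text"] for r in ranked[:limit]]

reviews = [
    {"rating": 5, "text": "Excellent build quality, arrived quickly."},
    {"rating": 3, "text": "Average."},
    {"rating": 5, "text": "Great."},
]
print(pick_category_snippets(reviews, limit=2))
# -> ['Excellent build quality, arrived quickly.', 'Great.']
```

In practice you would also rotate or cap the excerpts so the category page stays unique rather than duplicating the product page wholesale.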
2) Use “other users found this document for”
I know Tom loves this trick, and rightly so. You can turn insightful recent-searches data into valuable on-page uniqueness. Internal search and external referrals are great sources, but how about extending the process to make it easy for users to evaluate, extend, tag, or remove terms they feel are irrelevant to the page?
This simple example shows how users of a forum site may have found that thread. I think there’s a whole lot more you can do with this trick, but it’s a start:
3) Consider delivering search engine friendly URLs in your internal site search results
I know how “out there” this might initially sound, but why settle for search-engine-unfriendly URLs on your internal site search pages? I have seen plenty of examples of links being awarded to unfriendly internal site search URLs. Why do we spend so much time carefully crafting our external URLs, only to completely forget our internal search URLs? A little extra development work to apply a meaningful pattern to your search result page URLs today could lead to the construction of an entirely new content type down the line.
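One hedged sketch of such a pattern: rewrite each internal query into a keyword-rich slug under a crawlable path. The `/search/` prefix and the slug rules here are assumptions for illustration:

```python
import re


def search_url(query, base="/search/"):
    """Rewrite an internal search query into a crawlable,
    keyword-rich URL (hypothetical URL pattern)."""
    slug = re.sub(r"[^a-z0-9]+", "-", query.lower()).strip("-")
    return base + slug


print(search_url("Red Leather Sofas!"))  # -> /search/red-leather-sofas
```

You would still want to control which of these pages are indexable (e.g. only queries with results and real demand) so you don't flood the engines with thin pages.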
Look at how folks are linking to these search query pages, and note the first example (where instead of a URL rewrite, this site is using breadcrumbs to make their ranking page URL appear more friendly):
4) Microformats are really gaining traction – be creative with them
What we’ve found with Microformats is that webmasters tend to apply the markup to web pages hosting the content, but that’s where they stop. Imagine you have a website that sells tickets. Do you add hCalendar to your event page and walk away? No! You can nest other Microformats such as hProduct and hReview, and syndicate your formatted data to other internal pages, snippets on your homepage and category pages. Any mention of an event, a link to a product or a review snippet should use the appropriate mark-up, consistently across your website.
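To make the nesting and syndication idea concrete, here is a minimal Python sketch that renders a review mention using the classic hReview/vCard class names, so the same markup can be reused wherever the review is quoted (homepage, category page, snippet). The helper function and its fields are hypothetical:

```python
def hreview(item, rating, summary, reviewer):
    """Render a minimal hReview microformat snippet. The class names
    (hreview, item, fn, rating, summary, reviewer, vcard) are the
    standard classic-microformats classes."""
    return (
        '<div class="hreview">'
        f'<span class="item"><span class="fn">{item}</span></span> '
        f'<span class="rating">{rating}</span>/5 - '
        f'<span class="summary">{summary}</span> '
        f'<span class="reviewer vcard"><span class="fn">{reviewer}</span></span>'
        '</div>'
    )


print(hreview("Gig Tickets", 4, "Great night out", "Alice"))
```

Generating the snippet from one template is what keeps the markup consistent across every page that mentions the review.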
5) Work hard to resolve errors and improve site speed
Think about how Google have placed site performance at the top of their agenda. I genuinely believe that a site riddled with performance issues and errors is tolerated less by search engines today than ever before. Websites with platform issues can raise serious problems for SEO, users, conversion, and repeat visits. Fortunately, there are plenty of tools (including SEOmoz Pro, IIS Toolkit, Pingdom Tools, and Webmaster Tools from Bing and Google) to help you identify and tackle these issues head on. Go and set aside some performance maintenance time if you haven’t done so for a while.
6) Watch your homepage title in Google’s SERPs
Google can be pretty aggressive when it comes to choosing the text that appears in your title snippets, and sometimes you might disagree with Google’s choice. Our tests so far indicate that the NOODP meta tag (normally used to stop Google displaying the DMOZ description in your SERPs) can prevent Google from rewriting your title, even if you have no DMOZ listing.
That “penny drop” moment when a new technical SEO strategy idea presents itself has to be my favourite part of SEO work. I’m glad that technical strategy has to evolve as search engines develop. I really can’t see a time in the near future when that will change.
If you’d like to hear more tips, I’ll be speaking at next week’s A4Uexpo in London on exactly this topic. If you’re there, be sure to drop by and say hello. My buddy Dave Naylor will be introducing me (I have no idea what he’s going to say) and hopefully there’s going to be some time to do a preview of the session over on his blog soon.
Posted by Kate Morris
As a consultant, I work with many in-house SEO teams on strategy and other issues that arise throughout the course of the year. One trend we are seeing is that these in-house teams are having a hard time coming up with accurate traffic-centered goals. Traffic is the base for many metrics, so being able to semi-accurately predict that number for the coming year is important for every business. I can hear you all now: "Well, there is the Google Keyword Tool … use that." Typically, that is my answer too, but there have been major questions about the accuracy of Google’s keyword tools and others available to webmasters, marketers, and search engine optimization teams.
(If you will comment with your favorite keyword tool other than those I mention, I’ll happily test and add it here!)
There was a shift recently with the Google Keyword Tool: the Legacy/API version is showing different numbers than the newest Beta interface. David Whitehouse and Richard Baxter both noticed this shift and ran a few tests on accuracy. The jury is still out as to which version is more accurate. Like Mr. Whitehouse, I believe the newer tool is the updated one, but that does not necessarily make it more accurate.
To be clear, when I speak of the Legacy, API, and Beta tools, I do mean different versions of the Google Keyword Tool. First, from what I can see using the SEOmoz Keyword Difficulty tool, the Google API pulls from the Legacy tool, so they are one and the same. The Legacy tool is the prior interface for the current Beta version of the Keyword Tool. We had previously assumed that these pulled the same numbers, but my research and that of others proves otherwise.
But wait! *infomercial voice* There is more!
There is also the Search-based Keyword Tool, which aids AdWords advertisers in choosing relevant keywords based on search behavior and a specified website. This tool is explained by Google here and gives more in-depth information on account organization and cost.
But even this tool is not on par with the other two when it comes to impressions. A random query in the Search-based tool returned a suggestion for the keyword "maragogi." The Search-Based tool says there should be 12,000 monthly searches. The Legacy tool returns 110 Local Exact match searches, 33,100 Global Exact match, and 201,000 Global Broad match. The new tool returns information only for a global setting (all countries, all languages). That returns 74,000 searches broad and phrase match, and 12,100 for exact match. It seems like the Search-based tool is more like the exact global match in this one instance. But what is a business supposed to do with all of these numbers?!?!?
(hint: always use exact match)
If these tools are possibly inaccurate, how do our clients go about setting their yearly strategy goals?
Put simply, in search, you never want to rely on one set of results or one ranking report. Data over time and from many sources is best. But with the lack of tools out there and Google bringing in at least 65% of traffic organically for most sites, how do you get the best numbers?
First, you need to start out by figuring out how many impressions a keyword or set of keywords can bring in on average for a specific month. If you are in a cyclical industry, this will have to be done per month of the calendar year.
Below is a look at some information I pulled using the tools mentioned for the key phrase "curtain fabric."
The idea here is that if you take into account all of the numbers out there, you might see a trend that you can use for estimating future traffic. If there is no trend, then a median of the numbers can be used as your metric. A few other tools that you might look into include Word Tracker and Keyword Spy. You can see that the numbers are all over the place, but looking at these figures, I’d guess that the keyword might bring in around 6,500 impressions a month in the UK.
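For instance, taking the median across tools is a one-liner with Python's standard library. The per-tool numbers below are hypothetical placeholders, not the actual "curtain fabric" figures:

```python
from statistics import median

# Hypothetical monthly-search estimates for one phrase from several tools
estimates = {
    "Google (new, exact)": 5400,
    "Google (legacy, exact)": 6600,
    "WordTracker": 8100,
    "KeywordSpy": 6000,
}

print(median(estimates.values()))  # -> 6300.0
```

The median is preferable to the mean here because one wildly inflated tool estimate won't drag the whole figure up.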
The downside is that WordTracker and KeywordSpy don’t allow you to look at exact match information versus broad match. When performing keyword research, you always want to look at the local (targeted to your country) exact match information. Too many people pull keyword information using broad match and get inflated numbers for all phrases related to that key phrase.
The absolute best way to get accurate numbers about traffic over time is to run a PPC campaign. I pulled some numbers from a few campaigns (for our clients’ sake we have masked a number of the actual key phrases) in an attempt to see whether the new keyword tool is accurate against actual traffic in August. The keywords pulled were all exact match in the campaign, and the information pulled from the keyword tool was Local Exact and set to the country that the campaign was targeting.
As you can see, some of these are higher and some lower. What I found is that there really is no definitive answer as to whether the Google Keyword Tool is accurate. Take a look at the results for the example I used before, curtain fabric: the campaign saw 11,389 impressions, much higher than the new keyword tool and lower than some other keyword tools. This is why a well-run PPC campaign is important if you want a more accurate look at impression numbers.
Please note that I didn’t get a chance to ensure that these accounts were all showing at all times during the month, but they were all accurately geo-targeted and all showed on the top of the first page on average.
After getting a good idea of the number of impressions, you then need to take into account where you show for that keyword on average organically (aka your rank). While we cannot know specific click-through numbers for every search done on the web, there have been some studies on how much of those impressions the top organic result gets, the second, and so on. The one I use most often is from Chitika. Using the percentages below and the impression numbers, you should be able to get a good idea of the visitors you can expect per month organically for a specific key phrase.
So using the "curtain fabric" example, assuming that the site I am working on has maintained an average ranking over the last few months of #3 organically, I could expect about 1300 visits from Google for the keyword in a month (11.42% of 11,389 impressions).
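That back-of-the-envelope calculation looks like this in Python; the 11.42% CTR for position 3 is the figure from the Chitika study mentioned above:

```python
def estimated_visits(impressions, position_ctr):
    """Monthly organic visits = impressions x CTR for your average
    ranking position (CTR expressed as a fraction)."""
    return round(impressions * position_ctr)


# 11,389 impressions at position 3 (11.42% CTR per the Chitika study)
print(estimated_visits(11389, 0.1142))  # -> 1301
```

Treat the result as "about 1,300," not a precise forecast: both the impression count and the CTR curve carry real uncertainty.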
Once you get everything figured out, keep in mind that your past metrics are another good way of checking whether your traffic estimate is about right. Assuming that no major changes have occurred (like a gap in your metrics data over the last year), a look back is the most accurate way to understand traffic flow and trending on your site. Pull the unique visitors for every month of the last year and do some analysis on percent increase month over month. This can be done at any level in most analytics programs – from overall traffic trends all the way down to the keyword level.
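The month-over-month analysis is simple to script. The visitor counts below are hypothetical:

```python
def mom_changes(monthly_visits):
    """Month-over-month percent change for a series of monthly
    unique-visitor counts, rounded to one decimal place."""
    return [
        round((curr - prev) / prev * 100, 1)
        for prev, curr in zip(monthly_visits, monthly_visits[1:])
    ]


visits = [9800, 10200, 11900, 11400]  # hypothetical numbers
print(mom_changes(visits))  # -> [4.1, 16.7, -4.2]
```

Averaging those percentages over a full year gives you a seasonality-aware growth rate to apply to next year's forecast.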
A look at overall traffic per month in Google Analytics for organic searches from Google:
A look at traffic for a specific keyword over the last year per month from Google organic:
In the end, though, predictions are just that: educated guesses. Pulling data from all available sources and using your own historical data can help you make an educated prediction for the next year. Keep in mind, though, that things never stay the same; Google Instant just proved that with one of the biggest changes we have seen in a while.
For 9 per site, Alexa will audit your site (up to 10,000 pages) and return a variety of on-page reports relating to your SEO efforts.
It has a few off-page data points but it focuses mostly on your on-page optimization.
You can access Alexa’s Site Audit Report here: http://www.alexa.com/siteaudit
Alexa’s Site Audit Report breaks the information down into 6 different sections (some of which have additional sub-sections as well).
The sections break down as follows:
So we ran Seobook.com through the tool to test it out.
Generally these reports take about a day or two; ours hit some type of processing error, so it took about a week.
The first section you’ll see is the number of pages crawled, followed by 3 “critical” aspects of the site (Crawl Coverage, Reputation, and Page Optimization). All three have their own report sections as well. Looks like we got an 88. Excuse me, but shouldn’t that be a B+?
So it looks like we did just fine on Crawl Coverage and Reputation, but have some work to do with Page Optimization.
The next section on the overview page is 5 recommendations on how to improve your site, with links to those specific report sections as well. At the bottom you can scroll to the next page or use the side navigation. We’ll investigate these report sections individually but I think the overview page is helpful in getting a high-level overview of what’s going on with the site.
This measures the “crawl-ability” of the site, internal links, your robots.txt file, as well as any redirects or server errors.
The Reachability report shows you a breakdown of which HTML pages were easy to reach versus which ones were not so easy to reach. Essentially, for our site, the breakdown is:
The calculation is based on the following method used by Alexa in determining the path length specific to your site:
Our calculation of the optimal path length is based on the total number of pages on your site and a consideration of the number of clicks required to reach each page. Because optimally available sites tend to have a fan-out factor of at least ten unique links per page, our calculation is based on that model. When your site falls short of that minimum fan-out factor, crawlers will be less likely to index all of the pages on your site.
A neat feature in this report is the ability to download your URLs, plus the number of links the crawler had to follow to find each page, in CSV format.
This is a useful feature for mid-large scale sites. You can get a decent handle on some internal linking issues you may have which could be affecting how relevant a search engine feels a particular page might be. Also, this report can spot some weaknesses in your site’s linking architecture from a usability standpoint.
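The click-depth number in that CSV can be reproduced with a breadth-first walk of your internal link graph. Here is a sketch over a toy, hypothetical site map:

```python
from collections import deque


def click_depths(links, home="/"):
    """Breadth-first walk of an internal-link graph, returning the
    minimum number of clicks from the homepage to each page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths


site = {  # hypothetical link graph: page -> pages it links to
    "/": ["/tools", "/blog"],
    "/blog": ["/blog/post-1"],
}
print(click_depths(site))
# -> {'/': 0, '/tools': 1, '/blog': 1, '/blog/post-1': 2}
```

Pages that only appear at depth 4+ (or not at all) are the ones to fix with better internal linking.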
While getting external links from unique domains is typically a stronger component to ranking a site it is important to have a strong internal linking plan as well. Internal links are important in a few ways:
Alexa will show you your top linked to (from internal links) pages:
You can also click the link to the right to expand and see the top ten pages that link to that page:
So if you are having problems trying to rank some sub-pages for core keywords or long-tail keywords, you can check the internal link counts (and see the top 10 linked from pages) and see if something is amiss with respect to your internal linking structure for a particular page.
Here you’ll see if you’ve restricted access to these search engine crawlers:
If you block out registration areas or other areas that are normally restricted, then the report will say that you are not blocking major crawlers, but it will show you the URLs you are blocking under that part of the report.
There is nothing groundbreaking about robots.txt checks, but it’s another part of a site that you should check when doing an SEO review, so it is a helpful piece of information.
We all know what happens when redirects go bad on a mid-to-large-sized site.
This report will show you what percentage of your crawled pages are being redirected to other pages with temporary redirects.
The thing with temporary redirects like 302s is that, unlike 301s, they do not pass any link juice, so you should pay attention to this part of the report and check whether any key pages are being redirected improperly.
This section of the report will show you any pages which have server errors.
Making sure your server is handling errors correctly (such as a 404) is certainly worthy of your attention.
The only parts of this module are external links from authoritative sites and where your site ranks, relative to “similar sites,” with respect to the number of sites linking to each.
The analysis is given based on the aforementioned formula:
Then you are shown a chart which correlates to your site and related sites (according to Alexa) plus the total links pointing at each site which places the sites in a specific percentile based on links and Alexa Rank.
Since Alexa’s user base is heavily biased towards webmaster-type sites, these Alexa Ranks are probably higher than they should be, but it’s all relative since every site is being judged on the same measure.
The Related Sites area is located below the chart:
Followed by the Top Ranked sites linking to your site:
I do not find this incredibly useful as a standalone measure of reputation. As mentioned, Alexa Rank can be off and I’d rather know where competing sites (and my site or sites) are ranking in terms of co-occurring keywords, unique domains linking, strength of the overall link profile, and so on as a measure of true relevance.
It is, however, another data point you can use in conjunction with other tools and methods to get a broader idea of how your site and related sites compare.
Checking the on-page aspects of a mid-large sized site can be pretty time consuming. Our Website Health Check Tool covers some of the major components (like duplicate/missing title tags, duplicate/missing meta descriptions, canonical issues, error handling responses, and multiple index page issues) but this module does some other things too.
The Link Text report shows a break down of your internal anchor text:
Click on the pages link to expand and see the top pages using that anchor text to link to a page (it shows the page the text is on as well as the page it links to):
The report is based on the pages it crawled so if you have a very large site or lots and lots of blog posts you might find this report lacking a bit in terms of breadth of coverage on your internal anchor text counts.
Checks broken links (internal and external) and groups them by page, which is an expandable option similar to the other reports. Xenu is more comprehensive as a standalone tool for this kind of report (and for some of its other link reports as well).
The Duplicate Content report groups all the pages that have the same content together and gives you some recommendations on things you can do to help with duplicate content like:
Here is how they group items together:
Anything that can give you some decent insight into potential duplicate content issues (especially if you use a CMS) is a useful tool.
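Exact-duplicate grouping of this kind can be approximated by hashing page bodies and bucketing identical digests. This Python sketch (the URLs and copy are made up) shows the idea:

```python
import hashlib
from collections import defaultdict


def group_duplicates(pages):
    """Group URLs whose body text hashes identically -- a cheap way
    to spot exact-duplicate content across a crawl."""
    groups = defaultdict(list)
    for url, body in pages.items():
        digest = hashlib.md5(body.strip().lower().encode()).hexdigest()
        groups[digest].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]


pages = {
    "/widgets?sort=asc": "Buy our widgets today.",
    "/widgets": "Buy our widgets today.",
    "/about": "About us.",
}
print(group_duplicates(pages))  # -> [['/widgets?sort=asc', '/widgets']]
```

Real audit tools go further (near-duplicate detection, boilerplate stripping), but exact hashing already catches the common CMS cases like sort-order parameters.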
No duplicate meta descriptions here!
Fairly self-explanatory and while a meta description isn’t incredibly powerful as standalone metric it does pay to make sure you have unique ones for your pages as every little bit helps!
You’ll want to make sure you are using your title tags properly and not attacking the same keyword or keywords in multiple title tags on separate pages. Much like the other reports here, Alexa will group the duplicates together:
They do not currently offer a missing title tag or missing meta description report which is unfortunate because those are worthwhile metrics to report on.
Having a good amount of text on a page is a good way to work in your core keywords as well as to help in ranking for longer-tail keywords (which tend to drive lots of traffic to most sites). This report kicks out pages which have (judging by the stats) less than 150 words or so on the page:
There’s no real magic bullet for the amount of words you “should” have on a page. You want to have the right balance of word counts, images, and overall presentation components to make your site:
Continuing on with the “every little bit helps” mantra, you can see pages that have images with missing ALT attributes:
Alexa groups the images on per page, so just click the link to the right to expand the list:
Like meta descriptions, this is not a mega-important item as a standalone metric but it helps a bit and helps with image search.
This report will show you any issues your site is having due to the use of session IDs.
If you have issues with session IDs and/or other URL parameters, you should look at using canonical tags or Google’s parameter handling (mostly to increase the efficiency of your site’s crawl by Googlebot, as Google will typically skip the crawling of pages based on your parameter list).
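As a sketch of the clean-up logic, here is how session-style parameters can be stripped so one canonical URL remains (the same URL you would then declare in a rel=canonical tag). The parameter names treated as noise are assumptions for this example:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Parameters assumed to be session/tracking noise on this hypothetical site
NOISE = {"sessionid", "sid", "utm_source"}


def canonical_url(url):
    """Drop session-style query parameters, keeping meaningful ones,
    so every variant collapses to one canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in NOISE]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))


print(canonical_url("http://example.com/shoes?color=red&sessionid=abc123"))
# -> http://example.com/shoes?color=red
```

Whether you do this server-side, via rel=canonical, or via parameter handling in Webmaster Tools, the goal is the same: one crawlable URL per piece of content.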
Usually I cringe when I see automated SEO solutions. The headings section contains “recommended” headings for your pages. You can download the entire list in CSV format:
The second one listed, “interface seo”, is on a page which talks about Google adding breadcrumbs to the search results. I do not think that is a good heading tag for this blog post. I suspect most of the automated tags are going to be average to less than average.
Alexa’s Keyword module offers recommended keywords to pursue as well as on site recommendations in the following sub-categories:
Based on your site’s content Alexa offers up some keyword recommendations:
The metrics are defined as:
For me, it’s another keyword source. The custom metrics are ok to look at but what disappoints me about this report is that they do not align the keywords to relevant pages. It would be nice to see “XYZ keywords might be good plays for page ABC based on ABC’s content”.
This is kind of an interesting report. You’ve got 3 sets of data here. The first is the “source page” and this is a listing of pages that, according to Alexa’s crawl, are pages that appear to be important to search engines as well as pages that are easily crawled by crawlers:
These are pages Alexa feels should be pages you link from. The next 2 data sets are in the same table. They are “target pages” and keywords:
Some of the pages are similar but the attempt is to match up pages and predict the anchor text that should be used from the source page to the target page. It’s a good idea but there’s a bit of page overlap which detracts from the overall usefulness of the report IMO.
The Stats section offers 3 different reports:
An overview of crawl statistics:
This is where Alexa would show what errors, if any, they encountered when crawling the site
A report showing which sites you are linking to (as well as your own domain/subdomains).
Some of the report functionality is handled by free (in some cases) tools that are available to you. Xenu does a lot of what Alexa’s link modules do and if you are a member here the Website Health Check Tool does some of the on-page stuff as well.
I would also like to see more export functionality, especially for white-label reporting. The crawling features are kind of interesting and the price point is fairly affordable as a one-time fee.
The Alexa Site Audit Report does offer some benefit IMO and the price point isn’t overly cost-prohibitive but I wasn’t really wowed by the report. If you are ok with spending 9 to get a broad overview of things then I think it’s an ok investment. For larger sites sometimes finding (and fixing) only 1 or 2 major issues can be worth thousands in additional traffic.
It left me wanting a bit more though, so I might prefer to spend that 9 on links, since most of the tool’s functionality is available to me without paying the fee. Further, the new SEOmoz app also covers a lot of these features and is available at a monthly price point, while allowing you to run reports on up to 5 sites at a time. The other big thing that would improve the value of the Alexa application is a before-and-after report as part of the package. That way in-house SEOs could not only show their boss what was wrong, but could also use that same 3rd-party tool as verification that it has been fixed.
SEO Book.com – Learn. Rank. Dominate.
Posted by Danny Dover
Want happier website visitors and higher rankings? This week’s Whiteboard Friday is about how and why to speed up your website. It is more technical than previous videos, so I tried to spice it up with an ode to one of my favorite canceled TV shows, Pop-up Video. Can’t stand the content? At least the added commentary is entertaining. (It is the perfect plan ;-p)
The following are seven proven techniques well known websites use to boost their site speed.
Gzip is an open-source compression algorithm that can be used to compress your website’s content before your server sends the data to a visitor’s browser. This makes your server’s job easier and makes pages load faster for your users. You can learn how to enable Gzip here.
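You can get a feel for the savings with Python's standard gzip module; HTML is repetitive, so it compresses dramatically (the snippet below is a made-up page body):

```python
import gzip

# A deliberately repetitive, hypothetical HTML body
html = b"<html><body>" + b"<p>Hello, visitor!</p>" * 200 + b"</body></html>"
compressed = gzip.compress(html)

print(len(html), len(compressed))  # the compressed size is a small fraction
print(f"saved {100 - 100 * len(compressed) // len(html)}%")
```

On a real server you would enable this in the web server config (e.g. mod_deflate on Apache) rather than compressing by hand; the point here is just how much smaller markup gets over the wire.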
Minification is the process of removing unnecessary formatting characters from code (Minify is one popular tool for it). This makes your files smaller and your visitors happier. You can learn all about this process here.
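To show what minification actually does, here is a deliberately naive toy minifier for CSS. Real minifiers (Minify, YUI Compressor) handle far more edge cases safely; this is only a sketch of the principle:

```python
import re


def naive_minify(css):
    """Toy CSS minifier: strips comments, tightens whitespace around
    punctuation, and collapses remaining whitespace. Illustration only."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # drop comments
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)       # tighten punctuation
    return re.sub(r"\s+", " ", css).strip()


print(naive_minify("""
/* header */
h1 {
    color : red ;
}
"""))  # -> h1{color:red;}
```

Every byte removed this way is a byte the visitor never has to download.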
CDNs are systems of interconnected server resources that spread content and assets around the globe to shorten the distance between server and prospective user. They are commonly used by the Web’s most popular websites. You can find a list of free CDNs here.
You can take advantage of the countless man hours that have been devoted to image compression and make your users happier by simply saving your images as the appropriate type. As a very general rule of thumb, I recommend saving photos as JPEGs and graphics as PNGs.
When a browser requests a website from a server, it can only download a limited number of files from the same hostname at any given point. While this isn’t true of all file types, it is a good enough reason to host applicable files on alternative subdomains. This is only recommended for sites where the pros of speed will outweigh the SEO cons of creating a new subdomain.
While redirects can be extremely useful, it is important to know that implementing them forces your servers to do slightly more work per applicable request. Always avoid redirect chains (301 -> 301 -> 200, or even worse 301 -> 302 -> 200) and use these tools sparingly.
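Auditing for chains is easy to script if you model your redirect rules as a mapping; anything longer than one hop is worth collapsing into a single 301. The rules below are hypothetical:

```python
def redirect_chain(redirects, url, limit=10):
    """Follow a url -> target mapping and return the full hop chain.
    A chain of more than two entries means an avoidable extra redirect."""
    chain = [url]
    while url in redirects and len(chain) <= limit:
        url = redirects[url]
        chain.append(url)
    return chain


hops = {"/old": "/interim", "/interim": "/new"}  # hypothetical rules
print(redirect_chain(hops, "/old"))  # -> ['/old', '/interim', '/new']
```

Here `/old` should be repointed straight at `/new`, saving every visitor (and crawler) one round trip.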
The most straightforward way to speed up your website is to simply use fewer files. Fewer files means less data. My favorite method of doing this is utilizing CSS sprites. You can read how popular websites are using this trick here.
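With a sprite, many icon requests collapse into one image request plus background-position offsets. This sketch generates those offsets for icons stacked vertically on a 32px grid; the file name, grid size, and icon names are assumptions:

```python
def sprite_css(icons, size=32):
    """Emit background-position rules for icons stacked vertically
    in one hypothetical sprite image (sprite.png, 32px grid)."""
    rules = []
    for row, name in enumerate(icons):
        rules.append(
            f".icon-{name} {{ background: url(sprite.png) 0 -{row * size}px; }}"
        )
    return "\n".join(rules)


print(sprite_css(["home", "search", "cart"]))
```

Three icons now cost one HTTP request instead of three, which is exactly the "fewer files" win described above.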
Fueled by the massive potential of the Internet, Googlers are working on many projects in their attempt to speed up the Web:
If you have any other advice that you think is worth sharing, feel free to post it in the comments. This post is very much a work in progress. As always, feel free to e-mail me if you have any suggestions on how I can make my posts more useful. All of my contact information is available on my SEOmoz profile under Danny. Thanks!
Posted by Lindsay
A typical SEO site audit takes me around 50 hours to complete. If it is a small site (<1000 pages), I am working efficiently, and the client hasn’t requested a lot of extra pieces, this figure can come in as low as 35 hours. If the site is large and has a lot of issues to document, the time investment inches closer to 70 hours.
At SEOmoz, we usually asked for a project timeline of six weeks to complete a full site audit. You need the extended schedule for resource coordination, editing for a uniform voice, and other considerations when a team is involved. Even working on my own I prefer a six-week timeline because it allows me to juggle several projects simultaneously and to put down and pick up various pieces as the mood strikes.
Regardless of how much time I spend on an audit, the best stuff is usually revealed in the first day. At the beginning of a project you’re excited, the client is excited and there is so much undiscovered opportunity! In this post, I’ll outline my recommendations for making the most of day one on a new SEO audit project. I’ve organized it by retro digital clock time stamp for your visual pleasure.
You have a 9:00 client call, so you’d better get cracking! Take the time upfront to get your documents ready. The first thing I do once I’ve received a signature on the dotted line is prepare two files: my Excel scorecard and the Word audit document.
The audits I’ve worked on have always been extremely custom. Even so, the base document without client content is around 20 pages. This may sound like a lot, but once you prepare a cover sheet, table of contents, the appropriate headings and sub-headings for all the important SEO factors, and short (reusable) descriptions about each factor… it adds up to a hearty file.
I recommend that you create the base Word and Excel files and save them. Try not to work backwards off of an existing audit that you have on hand. Before I was an SEO myself, I was an SEO client of several smart folks. More than once the deliverables I received included other client names. It happens! ‘CTRL+F’ is not foolproof.
Whether you closed the deal yourself or you are lucky enough to have a fleet of salespeople doing that type of leg-work for you, a client kick-off call once the deal has been signed is important. Spend an hour getting to know your primary contacts. Hopefully this includes a senior stakeholder, a marketing lead, and a development lead. More often than not, these meetings are over the phone with the assistance of a web conferencing tool like GoToMeeting.
A sample agenda is as follows:
When you come out of this meeting, you should have an excellent understanding of the website, business needs, and key pain points from the client. You’ll also have had an opportunity to set expectations.
Bonus Tip: If you are working with an in-house SEO person, find out about the projects they have been trying to push through. You may be able to help them get that SEO enhancement moved up the development pipeline and make them look good in the process.
Use this time to recharge your caffeine and make notes about the call.
If you are part of a consulting team, like we had at SEOmoz, ping the other SEOs. This is especially true if you will be tackling this particular project solo. Send them an email and request that they conduct a quick 15-minute assessment of the site. We did this with great success at SEOmoz. With a dream team that included Rand, Jen, and Danny, the output of 45 combined quick-assessment minutes was incredible.
If you are an independent SEO, you can still use a system like this. Form a group of trusted SEOs and provide this support for each other. Be mindful of NDAs and potential conflicts of interest (see Sarah’s post on consulting contracts for more great details).
I’m pretty structured in my approach to SEO auditing, but there is nothing structured about my process during the free form exploration phase. I’m all about creating efficiencies through discipline and a deliberate work plan. That is what gets the project done and brings home the bacon. However, I always set aside at least three hours for unstructured play and exploration. SEO is part art and part science, and the actions I’m attempting to describe here are definitely more Pablo Picasso than Marie Curie.
I fire up all of my FireFox Plugins and browse the site, start GSiteCrawler, hit-up Google with a flurry of search operators, run LinkScape/Open Site Explorer, have a grand ol’ time in SEOmoz Labs, and check out the keyphrase landscape with Quintura and SEMrush. One find leads to another and I never know where I’ll end up. No two sites are alike and I’m still coming across things I’ve never seen with each new audit.
Analyze Page via the mozBar showing a less-than-fantastic title tag
I’d say I find 80% of a site’s issues and opportunities during this brief free form exploration. Most of the remaining 45+ hours of a project are spent elaborating on the findings and detailing the action plan to support my original finds.
Be sure to take notes and screen shots as you go. Bonus points if you manage to input them directly into your master Word file. Huge time saver.
Try to step away from the laptop, but bring a notepad with you. No doubt your brain will still be working as your hands work to fill your belly.
Based on the morning’s kick-off call and your findings in the free form exploration process you no doubt have a few questions for the client. If you don’t already have access to Webmaster Tools and analytics, now is a good time to ask. I usually have questions for the client about things that aren’t always apparent from an external view of the site such as how their expiring content policies work. This follow-up email keeps the communication lines open, impresses the client because you’ve uncovered so much opportunity already, and gives them a chance to ask additional questions or provide more info.
At the end of a busy day I like to shift my focus to something that requires less brain power and benefits from simple functions like copy & paste. I usually wrap up my day by populating things like the current robots.txt file (for analysis later), the top 25 links from Open Site Explorer, etc.
Top Pages via OSE – Yikes! They need to fix those 404s…
Thanks for giving me a read! I’m working on a bi-weekly series that covers all things audit. If you liked this, you might also like 4 Ways to Improve your SEO Site Audit. You can find me in SEOmoz’s PRO Q&A and on Twitter as @Lindzie.