July 2nd, 2012
Have you put off testing your website to see if you can improve conversions? Do you think it will cost you too much money? You’ve just run out of excuses. Last month, Google introduced Google Analytics Content Experiments, a free tool that can help you improve your website.
You can read Google's blog post about the new offering. Google Analytics Content Experiments lets you quickly and easily show your visitors different variations of your website and check which ones lead to more conversions. For example, you can test up to six variations of a specific product page to find out which one converts best.
Assuming you already use Google Analytics, you won’t need to do much (aside from creating the page variations) to use this new feature. Google Analytics provides a wizard to guide you through creating the experiment. A video in the blog post explains all the details. You can control what percentage of visitors see the new page(s), and the goal on which you wish to focus. You can monitor the data and the results from the experiment while it’s in progress, and even view predictive data to see which pages you can expect to convert better.
What this means, as Mike Fleming notes, is that if you’re not testing your website to see which versions of your product pages perform better, you’re telling everybody that you think you’re making enough profit. “Your site exists to persuade visitors to take actions, right? Well, you don’t know how good it is, or how good it could be, until you fully embrace experimentation and testing with it,” he pointed out.
Even if you’ve built your website with all of the “best practices” to be as good as it can be, and you’ve learned all you think you can from examining and dissecting other websites (especially those in your own field), you may be missing one vital point: your business is unique. “The truth is that what works for one business doesn’t necessarily work for another, even if they sell the same thing,” Fleming explained. You need to build pages that are informed by your own understanding of your individual audience.
For example, assume that you and I both sell electronic devices. If we’re going after a different demographic, different kinds of pages will perform better. Even if we’re both selling iPads, if you’re selling them to college students, you’re going to use a different approach than I would if I’m selling them to people like my neighbors in the “active adult” community in which I live!
Or let’s look at this idea just a bit differently. You and I sell the same electronic devices to different audiences. We both make assumptions based on what we know about those audiences as to what kinds of pages will perform better. But we could BOTH be wrong. And we’ll never know unless we replace those assumptions with data. Google Analytics Content Experiments offers what could be the cheapest and easiest way of getting that data. What are you waiting for?
July 2nd, 2012
Sifting through server logs has made me infinitely better at my job as an SEO. If you’re already using them as part of your analysis, congrats – if not, I encourage you to read this post.
In this post we're going to look at what a server log "hit" actually contains, grab a manageable sample of logs, analyze them to see exactly what Googlebot is crawling, and fix the crawl problems we find.
It's critical to SEOs because Google can only rank the pages it actually crawls, and server logs are the only first-hand record of what Googlebot is really doing on your site.
I’m going to casually assume that you at least know what server logs are and how to obtain them. Just in case you’ve never seen a server log before, let’s take a look at a sample “hit”.
Each line in a server log represents a "hit" to the web server: a single request for a single file, such as brochure_download.pdf. A request for /page-a.html will likely end up as multiple hits, because the browser also has to fetch the images, CSS and any other files needed to render that page. (Image credit: Media College)
Every server logs hits a little differently, but most record similar information organized into fields. Below is a sample hit to an Apache web server; I've purposely cut down the fields to make this simpler to understand:
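Here's an illustrative hit in Apache's common log format, along with a short Python sketch that splits it into its fields. The IP, timestamp and request are invented for the example:

import re

# An illustrative hit: IP, identity, user, timestamp, request line, status code, bytes sent.
# Real logs often append more fields such as the referrer and user agent.
sample_hit = '66.249.66.1 - - [02/Jul/2012:09:15:32 +0000] "GET /page-a.html HTTP/1.1" 200 5120'

pattern = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+)'
)

match = pattern.match(sample_hit)
if match:
    print(match.groupdict())
    # {'ip': '66.249.66.1', 'time': '02/Jul/2012:09:15:32 +0000',
    #  'request': 'GET /page-a.html HTTP/1.1', 'status': '200', 'bytes': '5120'}

The request line and the status code are usually the first things to check when you're trying to work out what a crawler asked for and whether it got it.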
Specifically with regard to SEO, we want to make sure that Google is crawling the pages we want crawled on our site – because we want them to rank. We already know what we can do internally to help pages rank in search results.
Ask your client for logs, or download a set of server logs from your hosting company. The point is to capture Googlebot visiting your site, but we don't know exactly when that's going to happen – so you might need a few days' worth of logs, or just a few hours.
To give you a real example:
The example domain has a PageRank of 6 and a DA of 80, and receives 200,000 visits a day. Its IIS server logs amount to about 4 GB a day, but because the site is so popular, Googlebot visits at least once a day.
In this case, I would recommend a full day's worth of logs to ensure we catch Googlebot.
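Before loading anything into a log analysis tool, a quick sanity check is to scan the raw files for Googlebot's user agent string. A minimal sketch, assuming the file is named access.log:

# Count hits whose user-agent string mentions Googlebot.
# "access.log" is a placeholder filename; adjust it to however your server names its logs.
googlebot_hits = 0
with open("access.log", encoding="utf-8", errors="replace") as log_file:
    for line in log_file:
        if "Googlebot" in line:
            googlebot_hits += 1

print(f"Lines containing 'Googlebot': {googlebot_hits}")

If the count is zero, widen the capture window before going any further.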
Head over to http://www.splunk.com, sign up and download the product – free edition.
Note: the free edition will only let you upload 500 MB per 24 hours.
I would recommend that you put your server logs on your local machine to make this process nice and easy.
I've put together a few quick screencasts. I know they sound cheesy, but whatever.
Simply click on the Export link and wait for your massive CSV to download. (Note: if the link doesn't appear, it's because the search isn't finished yet.)
Every time Googlebot came by the site, it spent most of its time crawling PPC pages and internal JSON scripts, wasting a significant amount of time and crawl budget.
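You can build the same kind of breakdown from the Splunk CSV export with a short script that tallies Googlebot hits by top-level path. The file name and the 'useragent'/'uri' column names below are assumptions – match them to the fields in your own export:

import csv
from collections import Counter
from urllib.parse import urlsplit

# Tally Googlebot requests by top-level path from the Splunk CSV export.
path_counts = Counter()
with open("splunk_export.csv", newline="", encoding="utf-8") as csv_file:
    for row in csv.DictReader(csv_file):
        user_agent = row.get("useragent") or ""
        if "Googlebot" not in user_agent:
            continue
        path = urlsplit(row.get("uri") or "").path
        segments = path.strip("/")
        top_level = "/" + segments.split("/")[0] if segments else "/"
        path_counts[top_level] += 1

for section, hits in path_counts.most_common(20):
    print(f"{section}: {hits} Googlebot hits")

A breakdown like this makes it obvious when a section such as /cppcr or /json is eating a disproportionate share of the crawl.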
The real problem was that we had pages on the site that hadn't been indexed, and this wasted crawl budget was the cause. I wouldn't have found this without the server logs, and I'm very grateful I did.
It's possible for anyone to crawl or visit a site using the Googlebot user agent, and even worse, it's possible to spoof a Googlebot IP. I always double-check the IPs I see in the server log report using the verification method officially outlined by Google.
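That method is a reverse DNS lookup on the requesting IP, confirming the host sits on googlebot.com or google.com, followed by a forward lookup to confirm the hostname resolves back to the same IP. A minimal sketch of the check:

import socket

def is_real_googlebot(ip_address):
    # Reverse lookup: a genuine Googlebot IP resolves to a googlebot.com or google.com host.
    try:
        hostname = socket.gethostbyaddr(ip_address)[0]
    except socket.herror:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    # Forward lookup: the hostname must resolve back to the original IP.
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False
    return ip_address in forward_ips

# Example: an IP in the 66.249.x.x range Googlebot commonly crawls from.
print(is_real_googlebot("66.249.66.1"))

Anything that claims to be Googlebot but fails this check can safely be ignored in your analysis.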
1) Crawling PPC pages
I checked that these pages weren't indexed or receiving any traffic first, then used robots.txt to block only Googlebot from these pages. I was very careful about this, since I wanted to make sure I didn't block AdsBot-Google (the robot that needs to crawl PPC landing pages).
User-agent: Googlebot
Disallow: /*/cppcr/
Disallow: /cppcr
2) Infinite GET requests to JSON scripts
This was another simple robots.txt block, because Google didn't need to request these scripts. Googlebot basically got caught in a form and kept requesting it over and over again. Realistically, there's no reason for any bot to crawl this, so I set the user-agent to all (*).
User-agent: *
Disallow: /*/json/
Disallow: /json
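Before shipping rules like these, it's worth checking them against the URLs you actually saw Googlebot requesting. The sketch below approximates robots.txt matching – prefix match, with '*' matching any run of characters – and ignores Allow rules and other edge cases; the sample paths are made up:

import re

# The Disallow patterns from the two blocks above.
disallow_patterns = ["/*/cppcr/", "/cppcr", "/*/json/", "/json"]

def is_blocked(url_path, patterns=disallow_patterns):
    # Approximate robots.txt matching: each pattern is a prefix, '*' matches anything.
    for disallow in patterns:
        regex = "^" + re.escape(disallow).replace(r"\*", ".*")
        if re.match(regex, url_path):
            return True
    return False

# A few made-up paths to illustrate the check.
for path in ["/en/cppcr/widget", "/search/json/results", "/products/page-a.html"]:
    print(path, "->", "blocked" if is_blocked(path) else "allowed")

Running your exported Googlebot URLs through a check like this confirms the rules catch the junk without catching anything you actually want crawled.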
I'm pretty happy to say that a week later, there was an increase of 7,000 pages in the index, as reported by Webmaster Tools.
Rand wrote about some good tips to prevent crawling issues, so I recommend checking it out. Special thanks as well to the folks at ratedpeople.com for being kind enough to let me analyze and experiment on their site.
Feel free to follow me on Twitter @dsottimano, and don't forget to randomly hug a developer – even if they say they don't like it.
July 2nd, 2012
In a post-Penguin world, the importance of quality links and solid on-site optimization hasn't necessarily changed. However, the urgency with which many SEOs try to acquire them should have. While there are still many types of links and tactics that work despite their shade of gray and low quality, Google's efforts are making it more difficult to achieve long-term success with them.
Going forward, SEOs will have to recognize and adapt to the shifting link ecology to ensure that their clients achieve success.
Let’s take a look at the old link ecology compared to the new one and examples of places that SEOs can seek the best link opportunities in a post-Penguin world.
For simplicity’s sake, let’s classify all links broadly into three buckets as explained below.
These are links built solely for the purpose of improving rankings and SEO value. There are many examples, and you can pick your own poison: mass submissions to low-quality directories, automated or manual blog comment spam, social profiles built only for a link, and excessive article marketing for exact-match anchor text.
The next type of link is one that provides SEO value but, in most cases, generates only minimal traffic to the target site. An example of this type of link is a guest blog post on a semi-relevant blog, or a business listing on a popular and legitimate vertical search engine that is rarely actually seen.
The best kind of link is one that not only carries SEO weight and sends qualified traffic to a website but also generates online or offline conversions. These links can lead to goal completions, lead generation or revenue, depending on your KPIs for the target site. An easy example of an ROI link is a definitive piece of content on a relevant and authoritative website other than the client's site.
In the old link ecology, link profiles often looked like this:
Link profiles in this old ecology were dominated by "Pure SEO" links that are easy to obtain and rarely provide any traffic or conversions. A few "SEO + Traffic" links would be sprinkled into the profile, and very few, if any, "ROI Links." With this type of link profile, is it any surprise that many companies were hit by Penguin? It's easy to see why SEOs fear algorithm updates when they know their link building efforts are constructing pyramids with this profile.
Simply put, the new link ecology requires SEOs to flip the old ecology on its head as shown below:
In the new ecology, SEOs need to shift their focus to the types of links they spend their resources acquiring, as explained below. Although this new ecology contains significantly fewer links than the old one, it will deliver longer-lasting SEO and marketing benefits for the business.
The truth of the matter is that "Pure SEO" links still work. Acquiring these types of links shouldn't be at the top of your priority list, but putting a little effort toward them still serves a purpose. For example, limited and targeted article marketing can still work for diversifying old and unnatural link profiles and for acquiring anchor text variations of keywords and long tail queries. But because Google will eventually devalue these types of links, you should add them to your portfolio in small amounts and focus most of your efforts on acquiring the other two types of links.
As alluded to above, great examples of Traffic + SEO links are business listings on legitimate sites that may not send a lot of traffic back to your site but have the potential to generate business. Business listings on Yelp or Brownbook are great examples. These listings provide great SEO value – their SEOmoz PA and DA bear this out – because they are trusted websites that allow business owners to list their business, and that trust exists in large part because the listings aren't built specifically for the benefit of SEOs. Further, in some cases they may drive some traffic and/or revenue, but in most cases both are minimal.
Another example would be a listing on a resource page like the Utah Health Department's list of hospitals. Most people already know where their local hospital is; if they don't, they'll search for hospitals in their area and most likely find one in an integrated SERP. Only a very small percentage of people will find an official hospital site after visiting the Health Department's official list. Despite this, there's no reason why a hospital left off the list wouldn't want to be on it. Don't ignore these kinds of legitimate link opportunities: they are easy links to acquire and they help balance your link portfolio.
The ultimate link target provides SEO value, qualified traffic and conversions. The major distinction between ROI links and Traffic + SEO links is the quantity and quality of traffic, and the conversions that flow from them. While Traffic + SEO links may send occasional traffic, few conversions result from those links. ROI links are more valuable because they are built with the buying process in mind, which results in quality traffic and conversions. In other words, these links are built in places where your potential customers not only gather but also visit often while they are researching, and where they have the highest potential to convert.
A great example for a local client is a "best of" list for restaurants. How often do you find yourself reading through "best of" lists just to pass the time? Probably not very often: in most cases, when you are searching for a local restaurant, you aren't performing an informational search but a transactional one. This type of query often comes in preparation for a visit to a new city, or when looking for a place to eat in the immediate future. People running these searches will most likely visit the restaurant's website for the menu, pricing, location and hours of operation, and then visit the restaurant itself, which results in an offline conversion.
Obviously you can't buy your way onto these types of lists or place yourself on them unilaterally, because they are editorially given. These links are difficult to obtain and, as a result, they are trusted; that is what makes them algorithm-proof and the most sought after.
Another great example, for a business that operates in a more technical, business-to-business niche with a much more complicated buying process, is to publish authoritative articles targeted at decision makers. This content should be hosted where there is potential for conversions. Example here:
An excellent way to find opportunities like these is to use Google Analytics. Examine the specific sites within your paid channels that have a high number of conversions and/or often assist in conversions. From this information you can export a list and identify the sites and the types of content you can build out to gain an increased presence. This will not only help you obtain an ROI link but also allow you to reduce spend on that particular site and shift budget to more efficient marketing channels.
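As a rough sketch of that last step, once you've exported the data from Google Analytics you can rank the sites by conversions in a few lines of Python. The file name and the 'Source'/'Goal Completions' column names are assumptions based on a typical export – adjust them to match yours:

import csv

# Rank referring sites by conversions from a Google Analytics CSV export.
ranked = []
with open("ga_referrals.csv", newline="", encoding="utf-8") as csv_file:
    for row in csv.DictReader(csv_file):
        source = row.get("Source") or "unknown"
        try:
            conversions = int((row.get("Goal Completions") or "0").replace(",", ""))
        except ValueError:
            continue
        ranked.append((source, conversions))

for source, conversions in sorted(ranked, key=lambda item: item[1], reverse=True)[:25]:
    print(f"{source}: {conversions} conversions")

The sites at the top of that list are the ones worth pitching content to first.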
Building link profiles according to this new link ecology is a win-win situation. This is especially true if you work in an agency as it will help you build better relationships with your clients who see not only improved rankings (a means not an end) but also increased traffic and a healthy ROI. This type of link building strategy will help insulate SEOs in their efforts against constantly evolving algorithms.
Also, while this post waited in the publishing queue Danny Sullivan at Search Engine Land published a very similar post entitled: Link Building Means Earning “Hard Links” Not “Easy Links.” It is well worth a read!
I’d love to hear your thoughts and more ROI link examples in the comments below.
Article source: http://www.seo.com/blog/link-ecology-post-penguin-world/
July 2nd, 2012
We conduct regular webinars on SEO, keyword research, press releases, videos, conversion optimization and more.