July 7th, 2012
Some companies notoriously use jargon everywhere – in their press releases, on their product pages, and even in their blog posts. Do I really need to spell out why writing in plain English (or Spanish or German or whatever language your audience speaks) will win you more conversions?
It’s a basic rule of SEO: searchers enter the same terms into search engines that they use in everyday language to describe what they need. If you don’t use those terms on your website, it won’t come up for those searches. This means you’ll get less traffic from the search engines, and fewer conversions.
If you want to win over your audience, you need to speak their language. Fortunately, even bloggers who write corporatese like a native can stop incentivising their buzzwords long enough to see a real ROI. Recently, Neil Patel offered some great tips at Search Engine Journal for using social media to help you get that common touch.
Start by reading the language in all the right places. Don’t go to industry blogs; they’re full of the buzzwords you want to avoid. Instead, visit Twitter, Facebook, Google+ and LinkedIn. Find out what words people use when they’re posting and commenting about products and services in your field. Read the blogs your audience reads, and check out forums. Make note of the language posters use to ask and answer questions. Seek out websites with user-generated reviews, and study the words reviewers use. Write down specific keywords and key phrases.
Another way to find out your audience’s language is to search for trending hashtags on Twitter. You’ll often find these cropping up during a conference or similar event. Pay attention to the language used in these tweets to get a feel for relevant topics, words and phrases.
One of the most direct ways to find out the language your audience speaks, of course, is to ask them. “Interviewing people from your target market will be the best way to learn about their needs, what they think of your product and brand, and, most importantly, the language they use,” Patel notes. You can interview people in your target market in various ways: in person, over the phone, or even via email.
When you conduct these interviews, Patel reminds you to do four things. First, target your interviewees carefully, choosing several people from each of your market segments. Second, ask a lot of specific questions; you want to get exact language, not just a vague idea of what they think of your brand and products. Third, make sure you interview enough people over a period of time; what you learn from these interviews can fuel future blog posts. And finally, use appropriate tools to help you. “Open up Excel, create columns for each target market, and drop in these themes and common words and phrases,” Patel suggests.
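Patel’s Excel suggestion amounts to one column per market segment. If you’d rather script it, here is a small sketch using Python’s standard csv module; the segment names and phrases below are made-up examples, not anything from Patel’s post:

```python
import csv
import io

# Hypothetical interview notes: common words and phrases heard,
# grouped by target market segment. (Patel suggests Excel; a CSV
# with one column per segment opens in any spreadsheet.)
segments = {
    "small-business owners": ["easy to set up", "affordable", "no IT staff"],
    "enterprise buyers": ["integration", "compliance", "vendor support"],
}

# Write one column per segment, padding shorter columns with blanks.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(segments.keys())
depth = max(len(phrases) for phrases in segments.values())
for i in range(depth):
    writer.writerow([v[i] if i < len(v) else "" for v in segments.values()])

print(buffer.getvalue())
```

Each new interview just appends phrases to the right list, and the spreadsheet view makes the recurring language per segment easy to scan.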
Finally, don’t take your own people for granted. Some of them already speak the language of your customers, out of necessity. I’m talking about anyone who deals directly with your customers. These include customer service reps, sales reps, retail reps, and even social media reps. Sit down and talk with these people. They can tell you what your customers are saying, and how they are saying it. Ask them to tell you what words and phrases your customers and audience are using, and what language will resonate with them.
Going forward, make a commitment to using this language on your website, in your blog posts, and anywhere your target audience will see it. A good relationship starts with both parties speaking the same language; you can’t expect your customers to start speaking your language, so you need to make the effort to start speaking theirs. Good luck!
July 7th, 2012
I’ll start with a simple question: have you ever thought that linking to your Twitter profile could be difficult? Your answer is probably “not really!”, in which case you may find what I’m about to show you useful.
But let’s start from the beginning…
A while ago I was re-reading a post by Kristi Hines on SEOgadget about using your Twitter profile for link building: I had bookmarked it because it contained good, immediately actionable tips for building links just by having a Twitter profile, but in all these months I had forgotten to put them into practice. Working on it, though, I realized that while building links to your website with those resources, you also indirectly build links to your Twitter profile, so you would expect a significant impact on your personal branding management too, particularly for your name/nickname SERP. Is that so? Yes, in most cases it is, but there are some problems. Twitter’s poor search engine optimization, third-party links, and our own mistakes often make this link building less effective than it could be.
Let’s search Google for [Kristi Hines], for example. Kristi is a very well-known professional and there are a lot of references to her name; her Twitter profile ranks well and helps her personal branding:
I’m not as well known as she is, but if I search for my name on Google.it (I’m Italian), there’s no trace of my Twitter profile (@zen2seo), even though it’s linked in several articles and in my Google+ profile. It only appears on the fourth page of results.
I asked myself why, and found something interesting. Just look at the two screenshots and you can see one of the issues I was talking about: my URL contains the escaped fragment (#!), while Kristi’s doesn’t. So, simply put, Twitter is duplicating its pages, diluting their strength, and Google is indexing different versions of the same content.
How many variations are there? I’ve spotted a lot of them, and reached only a few firm conclusions. To better understand this confusing situation, let’s check the most common causes of duplication on a website.
If you search for a www version of your profile, you won’t find any results:
This is because www URLs are 301-redirected to the non-www ones:
So, our first conclusion, for now, is that you should link to the non-www version.
Using a few advanced search operators, the first duplication I found comes from the http/https URL versions.
As you can see, Google is indexing both the http and https versions of the site. Which one is it better to link to? It’s too early to answer that question, but I’ll try to give you some suggestions in this post.
Since we commonly refer to our Twitter profiles using the @ sign (e.g. @Zen2Seo), I wondered whether I could find URLs containing it. I haven’t found this duplication for my own profile, but it exists in other cases.
As in the previous case, I’ve found some duplicated URLs ending with a slash (“/”).
My nickname is Zen2Seo, with a capital Z and S, but in the first screenshot you can see only lowercase letters. Does Twitter handle this difference properly? Not really. A slightly deeper query shows that URLs with capital letters get indexed too.
It’s quite easy to notice that Twitter duplicates its pages (at least statuses) across several subdomains. I stopped checking after I found EN, IT, ES, DE and FR, and each of them is affected by the same problems we’ve already described.
As you’ve seen, there are several causes of duplication (and they can be combined in any way you like); on top of that, I’ve found that Twitter also duplicates its content via IP address:
Amid this huge URL confusion, you’d have to be a (good?) SEO to figure out the right URL to link to. But the majority of people who use Twitter aren’t aware of these kinds of issues. And Twitter doesn’t help them at all.
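The duplicate variants described above (www, language subdomains, IP addresses, http/https, the hashbang, trailing slashes, capitalization) can all be collapsed by one normalization rule. Here’s a rough Python sketch; note that picking https, no www and a lowercase nickname as the “canonical” shape is my own arbitrary assumption, not Twitter’s, so in practice you’d adjust it to whichever URL actually ranks for you:

```python
from urllib.parse import urlsplit

def normalize_twitter_profile(url: str) -> str:
    """Collapse the duplicate profile URL variants into one form.

    A rough sketch, not Twitter's own canonicalization: it discards
    the host entirely (folding away www, language subdomains and IP
    addresses), folds the #! fragment back into the path, strips the
    trailing slash, and lowercases the nickname.
    """
    parts = urlsplit(url.strip())
    path = parts.path
    # Fold the AJAX hashbang (#!/nickname) back into the path.
    if parts.fragment.startswith("!"):
        path = parts.fragment[1:]
    nickname = path.strip("/").lower()
    return "https://twitter.com/" + nickname

# All of these indexed variants collapse to the same URL:
variants = [
    "http://www.twitter.com/Zen2Seo/",
    "https://it.twitter.com/zen2seo",
    "http://twitter.com/#!/ZEN2SEO",
]
print({normalize_twitter_profile(u) for u in variants})
```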
Remember Kristi’s URL and mine: Twitter uses AJAX and URLs with the escaped fragment, so the average webmaster has yet another choice (or rather, another parameter to combine), and since the browser’s address bar shows the /#!/ part, many people link to it.
In this case, things are far more complex than in the previous situations. Vanessa Fox’s interesting post about Twitter’s infrastructure issues shows how Twitter redirects the “normal” URL to the escaped one with a 302 redirect; from there, search engines crawl twitter.com/?_escaped_fragment_=/YOURNICKNAME and receive a 301 redirect to http://twitter.com/YOURNICKNAME.
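Under Google’s AJAX crawling scheme, the hashbang maps mechanically to an _escaped_fragment_ URL. A minimal sketch of that mapping, for illustration only (it’s the crawler, not your code, that performs this rewrite):

```python
from urllib.parse import quote, urlsplit

def to_escaped_fragment(url: str) -> str:
    """Map a #! (hashbang) URL to the _escaped_fragment_ form
    that crawlers request under the AJAX crawling scheme."""
    parts = urlsplit(url)
    if not parts.fragment.startswith("!"):
        return url  # not an AJAX URL, nothing to rewrite
    base = f"{parts.scheme}://{parts.netloc}{parts.path}"
    return base + "?_escaped_fragment_=" + quote(parts.fragment[1:], safe="/")

print(to_escaped_fragment("http://twitter.com/#!/YOURNICKNAME"))
# http://twitter.com/?_escaped_fragment_=/YOURNICKNAME
```

That crawlable URL is what then 301s back to the plain profile URL in Twitter’s setup.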
I bet this is confusing for some SEOs too, but – without investigating further – we can conclude that Twitter needs the AJAX URLs yet probably wants the HTML URL to be indexed, so we should link to that. This becomes almost a certainty given that they’ve recently announced they’re getting rid of the hashbang (though just to give users more speed, not because of SEO issues…).
Another hint comes from Twitter trying to canonicalize URLs via the canonical link tag:
As you can see, they choose as canonical URL the one with:
The previous screenshots, however, demonstrate that Twitter isn’t succeeding with canonicalization, so if we link to the wrong profile URL we can expect that link won’t help our personal (or brand) reputation management.
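If you want to check for yourself which URL a page declares as canonical, you can parse the link tag directly. A small sketch using Python’s standard html.parser, run here against a toy page rather than Twitter’s real markup:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of a <link rel="canonical"> tag, if any."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "link" and attr_map.get("rel") == "canonical":
            self.canonical = attr_map.get("href")

# A toy page standing in for a fetched profile page.
html = '<html><head><link rel="canonical" href="https://twitter.com/zen2seo"></head><body></body></html>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)
```

Comparing the declared canonical against the URL Google actually shows in the SERP is exactly the mismatch this post is about.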
Of course, if you’re as well known as Rand Fishkin, you don’t have to worry about your Twitter profile appearing in the SERPs for your name/brand. But if you aren’t, something like this could very well be a problem:
So, how should we link? You could link to your canonical URL, but at the moment, with Twitter unable to resolve its duplications, that may not be a universal suggestion. I think it makes sense to choose by looking at what Google prefers in its SERPs, and that can differ from case to case.
So, check your ranking URL and link well!
Now, before you go, just a final note: if you’ve appreciated my SEOmoz post, feel free to follow my Twitter profile (zen2seo) or visit my SEO blog. Clearly, I expect a lot of new followers now that I’m linking to my “right” Twitter profile URL!
July 7th, 2012
Howdy, SEOmoz fans. Welcome to another special edition of Whiteboard Friday. This week we’re talking about slide decks. A lot of folks talk about how different sorts of content can be used and can be powerful on the Web for content marketing, for SEO, for social. Slide decks are a particularly powerful and useful format, one that I’ve made great use of and seen used in lots of different spheres. I think they’re what I’d call underexploited or underused on the Web today, particularly in industries outside of technology.
Slide decks are easy for virtually anyone to view. They’re a simple, powerful way to present content. You can present visual content. You can present charts and graphs. You can even embed video. You can do all sorts of stuff, and they’re easy to make, partly because you can screen capture elements from all sorts of websites and then quickly show attribution. If I want to say, “Hey everyone, here’s how you do keyword research, and here’s how to watch out for the exact match portion in the AdWords tool,” I can screenshot AdWords. I can screenshot the exact match setting. I can point that out. I can make that very visual and compelling, and I can have a progression that tells the story.
This is a great way to show off not just technical stuff, but anything where there’s photography, where there are visuals, where there’s information that lends itself to a narrative format. This common format that slide decks have, usually PowerPoint, is something that all readers can download and share, and that’s another excellent thing because it gives your content the ability to spread further and wider.
I’d use this in all sorts of places. I recommend using it on the slide sharing platforms (we’ll talk about those), embedding it in content that you’ve got on your site, and possibly making specific landing pages for it. If you’re recording audio or video over it, you can add those in as webinars or viewable video. There are just a lot of options for this type of content.
I wanted to provide some best practices and some tips that we’ve seen. A few things here. Number one, I want to talk about the process. Typically, what I recommend for classic slide creation is to create your slide deck and upload it to one of the major services: SlideShare, Scribd or Docstoc, all of which have reasonably good audiences. My favorite right now is SlideShare, and the reason is that if you have a good presentation and get it some traffic and attention, particularly in the social world (a lot of tweets, a lot of Facebook shares, a lot of LinkedIn shares), SlideShare will put content that does well on its homepage, and being featured means a lot more visitors who never would have seen your content otherwise. If you have a compelling title that’s interesting to your particular audience and a good first slide that captures attention even in thumbnail format, you can do really, really well on SlideShare. This is true on Scribd and Docstoc as well.
The other option I recommend is Box.net or Dropbox. You can upload and embed from those services, and remember, you don’t just have to put the slides on these services. You can then embed them on a page on your own website if you want most of the traffic, the attention, and the experience to be controlled and owned by you. We do this a lot. I’ll upload to SlideShare with one title, then create a page on SEOmoz, just a static page, embed the slides there at whatever size I want, and make that the URL that gets shared. That works tremendously well.
Once you’ve uploaded, give your presentation publicly, whether that means a webinar you do online or a talk you give in person. If you’re not going to, you can skip this step. But if you do, there’s something really, really powerful about being in front of even just a small audience. Once you start your presentation, say, “Here’s my presentation. I’ve made all the slides available for download at this URL,” and give them a quick, easy-to-remember URL. I usually use bit.ly to shorten whatever the URL is, so I can say it’s at bit.ly/mytalk or bit.ly/inbound2012 or bit.ly/seoforstartups, and I’ve got a lot of these. This process is phenomenal because you can get the audience sharing that content right away. Super, super cool.
Now, when you do that, make sure that you don’t just say, “Hey, here’s my URL,” but also say, “If you enjoyed this talk,” so you have it at the start, you finish your presentation, you go to the end slide and you say,
“If you enjoyed this talk, I would love if you shared the presentation download link on social media.” Super cool way to go.
Number three, you can use and reuse the slides on your website or blog, in a post or on a page, through the embed, and then invite others who see them there to use the content, as long as they reference back to it. This is a great way to get something we all need: links.
Number four, watch your stats. Watch your stats from your blog post, that kind of thing. Watch your stats on SlideShare Pro if you’re using that. I’ve upgraded to SlideShare Pro so I can kind of see where things go and which presentations perform better, but they’ll show you number of views regardless. From there, you can get a sense of what’s performing well, what’s not performing well. Keep doing the good stuff, not doing the bad stuff, and you can find other people’s presentations and see, “Hey, what’s been really successful for them?”
Finally, a few tips for the slides themselves. Number one, link to your content. Let’s say I’ve got a slide here; see how I’ve got the URL below the graphic? That’s what you really want to do, because it will send a lot of traffic. People get curious: “Huh, where did that chart come from? What site is that from? How can I learn more?” Click. Now they come to your website, and you’ve captured them there.
Number two, let your slides do a lot of the storytelling work. If you’re going to use this format, remember that the vast majority of people are not going to be in the audience listening to you present. They’re going to be on the Web just looking at these slides. That means you want to do number three: if you’ve got some extra narration, some context you need to give (say I’ve got a big visual but no explanation for it), go ahead and put some text at the bottom of the slide. Upload the version that has the text at the bottom, and present the clean version when you present in person. This works phenomenally well, because then someone who gets the slides will see that context without having to listen to any audio. If you can explain the slide in one or two sentences, that’s perfect. Honestly, you shouldn’t usually have slides that take 10 minutes, 5 minutes, or even a paragraph to explain.
Finally, make sure you have your download URL on the first and last slide of the deck, like I mentioned, because if you do that, you can get people sharing at the start of your talk and people sharing at the end of your talk, and people will always be asking you for that download link. This is a great way to make sure that lots of people are reaching these pages and getting your stuff.
Next week, I would like to talk with you about some of my tips for presentations, tips for building slide decks, tips for delivering presentations, and hopefully that will help. I’m even planning to send that video to the MozCon speakers. Hopefully, it will be some good stuff. Until then, hope to see lots more slide content from you all, and we’ll see you again next week for another edition of Whiteboard Friday.
July 7th, 2012
It’s time for another Mozscape index update. New data is now available in Open Site Explorer, the Mozbar, our tools and through the API. July’s update comes with some good news, and potentially some bad news, too. As you’re likely aware, the previous two indices, while huge in size (150B+ URLs each), suffered from a lack of freshness due to the additional processing time required to calculate our link graph and metrics over such phenomenally big numbers of links and pages. Today’s index is relatively large by prior standards (~72B URLs, larger than almost anything we launched before April 2012), and it’s slightly fresher: the link data in today’s index was crawled almost entirely in May.
This index was originally scheduled to launch earlier, but ran into troubles, including Amazon’s AWS outage and plenty of hardware failures, too. As we’ve mentioned in the past, SEOmoz is in the process of building a new private hybrid cloud datacenter that will replace AWS for Mozscape and should provide us with much greater reliability. We know how important it is to have regular data updates you can count on, and we’re putting people and money to work as fast as possible to move off the unreliable systems Amazon has provided.
Let’s take a look at the full metrics for this index:
And here are the latest correlations between Mozscape metrics and Google’s search results:
Because this update is much smaller in total URL count (~50% of the prior, 165 billion URL index), your link count totals will likely be much smaller, even if you’ve grown your link building efforts. Below is an example of the numbers for various Seattle startups across May’s larger index and July’s smaller one:
Above: May’s 165 Billion URL index data
Above: July’s smaller, 78 Billion URL index data
Note that, as one might expect, link counts are between 50-75% of their former value. This percentage will be lower for sites that get many links from the far corners of the less-traversed, less-popular pages and sites on the web, and higher for sites with links from more popular/well-linked-to sites and pages.
We’re working hard to grow the index back up to 100 billion+ URLs in the future. Our crawlers can already handle vastly more; it’s just the unreliability of Amazon’s hardware that holds us back. Our engineers and sysops folks are working around the clock to get there as soon as we can.
We’ve also done some work recently to update the scoring systems for the Keyword Difficulty/SERPs Analysis Tool. You’ll now see a more accurate and usable algorithm applied to results where very fresh pages are ranking, e.g. news, sports, trending topics, etc. Here’s an example query that previously would have produced a keyword difficulty score of 1:
Libor Rate Scandal is a SERP that, until a few days ago, had virtually no traffic and very different results. All of these pages were produced in the last day or two, and thus don’t have Page Authority scores. However, Domain Authority is now being used to help calculate keyword difficulty, which should seriously help those of you who analyze fresh results.
The next 2-4 Mozscape index updates will continue to be on AWS, but we’re now running 3-4 indices in parallel (which costs a fortune, but gives us fallback options if/when Amazon’s failures lose an index or massively delay it). In the next 3-4 months, we hope to be operating indices off our new hybrid cloud environment and see much greater reliability, which will enable us to produce larger, fresher and more consistent updates.
July 7th, 2012
Have you recently updated your URL structure from underscores to hyphens? Are you constantly updating products and pages on your website? If so, it might be time to dig into XML Sitemaps, and get yours updated. Let’s take a look at this important and sometimes underutilized resource we have at our disposal as Webmasters and SEOs.
So what exactly is an XML Sitemap? A Sitemap (capital S) is an XML file that lists the URLs on a website that you want crawled and indexed, along with a priority and a recommended crawl frequency for each specified page. Google, Yahoo and Bing all support the XML Sitemaps protocol.
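A minimal Sitemap following the sitemaps.org protocol can be generated in a few lines; the URLs, priorities and change frequencies below are placeholder examples, not recommendations:

```python
import xml.etree.ElementTree as ET

# Build a tiny <urlset> per the sitemaps.org 0.9 protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for loc, freq, prio in [
    ("http://www.yourwebsite.com/", "daily", "1.0"),
    ("http://www.yourwebsite.com/products/", "weekly", "0.8"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "changefreq").text = freq
    ET.SubElement(url, "priority").text = prio

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

Saving this output as sitemap.xml in your site’s root directory is the upload step described below.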
To give you a little background: just as with robots.txt and markup (schema.org), all the major search engines came together to form a protocol for us to follow, creating a standard way to let the search engines know about pages that might not get indexed in the regular crawling process.
There are many resources available on the web for creating an XML Sitemap, and depending on the size of your website, these files can typically be created in a short amount of time. Once you have created an XML Sitemap and uploaded it to the root directory of your website (http://www.yourwebsite.com/sitemap.xml), you will want to let Google and Bing know about its existence. You should have Webmaster Tools accounts set up and verified for both search engines. Having these accounts not only allows you to submit the location of your XML Sitemap but also gives you valuable insight into how these search engines crawl and index your website, most notably URLs submitted versus URLs indexed. If you don’t currently have these accounts, starting the process is very easy. For both Bing and Google, one option for verifying your account is to add the meta tag they provide into the head section of your site’s home page. If your site is not yet verified in Google Webmaster Tools and you are using the asynchronous Google Analytics snippet (which you should be!) in your head tag (Google won’t verify the account if the tracking code resides in the body), you can save yourself a step: click on alternate methods, choose the Google Analytics option, then verify.
Now that we have an XML Sitemap created and uploaded, we need to let Google and Bing know that it exists and is ready to be crawled. In Google Webmaster Tools, click on Sitemaps, click the ADD/TEST Sitemap button in the upper right-hand corner of the page, and add the location of your XML Sitemap, www.yourwebsite.com/sitemap.xml. Google is kind enough to let you know if there are issues with your Sitemap. You will also be shown the date your Sitemap was processed and how many URLs were submitted and indexed.
The process for adding and verifying your XML Sitemap in Bing Webmaster Tools is very similar to Google’s. Log into your account and click on your profile, which will direct you to the dashboard. Select Crawl in the top right navigation, then Sitemaps in the left navigation, from which point you can add, remove or re-submit your XML Sitemap.
In addition to making sure the search engines know about and have verified the location and format of your XML Sitemap, it should also be listed at the bottom of your robots.txt file, which is the first file a search engine bot hits on your website for instructions on which directories and files to ignore during its crawl.
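For example, the end of a robots.txt might look like this (the disallowed path and domain are placeholders):

```
User-agent: *
Disallow: /admin/

Sitemap: http://www.yourwebsite.com/sitemap.xml
```

The Sitemap line gives any crawler that reads robots.txt the full URL of your Sitemap, even if you never submitted it through a Webmaster Tools account.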
After completing these steps you will have sent Google and Bing a roadmap of the pages on your site you would like crawled. I hope this quick overview has been helpful to anyone not familiar with XML Sitemaps and look forward to any additional insights or comments.
July 7th, 2012
Hi, I’m Nate Babbel and today I will answer the question: What is Conversion Rate Optimization?
Conversion rate is the percentage of total visitors who “convert” while visiting your site. Depending on what type of site you have, a “conversion” can mean different things: a sale, a lead capture, a registration, a sign-up for a service, or any other call to action you choose to implement. So, if 100 people visit your site and 1 of them completes your call to action, you have a 1% conversion rate; if 2 of those 100 convert, that’s a 2% conversion rate.
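The arithmetic above is simple enough to sketch as a function:

```python
def conversion_rate(conversions: int, visitors: int) -> float:
    """Conversion rate as a percentage of total visitors."""
    if visitors == 0:
        return 0.0  # avoid dividing by zero on a site with no traffic
    return 100.0 * conversions / visitors

# The examples from the text: 1 of 100 visitors converting is a 1%
# conversion rate; 2 of 100 is 2%.
print(conversion_rate(1, 100))  # 1.0
print(conversion_rate(2, 100))  # 2.0
```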
As you can imagine, conversion rate is extremely important. If you go from a 1% to a 2% conversion rate, you effectively double your total output; if your target conversion is a sale, you double your total revenue!
Conversion rate optimization, or CRO, is the art/science of increasing your website’s conversion rate.
We will discuss the best methodology for website conversion rate optimization in a later FAQ video.
Do you have a question about SEO, internet marketing or social media? If so, post your questions on the SEO.com Facebook page, Tweet us, or leave a comment on the SEO.com Google+ page. For Twitter and G+ use the hashtag #SEOCOMFAQ. Maybe we will use one of your questions in a future video.
Article source: http://www.seo.com/blog/conversion-rate-optimization-video-faq/