February 24th, 2012
Until recently, those who wanted to search the Internet but didn’t want to have Google tracking their activities could use Scroogle. Now that owner Daniel Brandt has taken the site offline permanently, searchers deeply concerned about privacy must look elsewhere.
Barry Schwartz covered the full story of Scroogle’s closing for Search Engine Land. Here’s the short version: about a week ago, Google began blocking the site. The privacy search engine, online since 2003, worked by “scraping” Google’s results without passing along any identifiable information or keeping any itself – thus giving users a measure of privacy they lacked when searching directly at Google itself. Getting blocked in this way might not have killed Scroogle so quickly, but the site also fell victim to a stream of distributed denial of service attacks that started back in December.
With Scroogle gone forever, where can a privacy-conscious searcher go to find what she’s looking for without giving away who she is? Fortunately, there are other options. Matt McGee provides a short list. I wish I could say that you can expect it to grow longer, but I’m not convinced; it seems as if search privacy has become a minor issue for most of us.
The best-known privacy-focused search engine is DuckDuckGo. Founded three years ago by Gabriel Weinberg and self-funded until October of last year, DuckDuckGo bluntly states on its privacy page that it does not collect or share personal information. Period. To shed light on the tracking problem, the search engine launched DontTrackUs, which shows how Google tracks searchers – and what happens as a result of that tracking. Thought of in some circles as “the little search engine that could,” DDG recently enjoyed its first million-search day.
Yippy bills itself as a “family friendly” portal site. Yippy states on its privacy page that it never requires personally identifiable information, and never seeks it out “unless you request a Yippy Service where that information is required.” It’s intended to be an anonymous service, though it does collect “limited not-personally identifying information that your browser makes available.” It also explicitly states that it won’t sell users’ personal information to advertisers for profit. Users outside the U.S. “are subject to forms of minimal tracking” as required by their countries of origin.
There’s also hope for private searching from a couple of sources you might not have considered. The first is Ask.com, which offers AskEraser, a tool that can be enabled on the search site. Turning it on deletes your search activity from Ask.com, though not from third-party servers. In certain circumstances, such as when law enforcement or government officials request that the company retain search activity data, the tool will not delete it; Ask implies, however, that this occurs very rarely.
The second unlikely source for private searching is Google itself. The search giant offers Google Encrypted Search. Keep in mind, however, that it only offers secure search, not truly private search. As McGee notes, “Even when you’re using Google’s encrypted search, Google will know who you are (if you’re logged in to a Google account) and will still save your search activity if you’ve enabled the search history feature.”
Now that you know your options, you can make an informed decision as to where to perform search queries online, if your privacy matters to you. Good luck!
Article source: http://www.seochat.com/c/a/Search-Engine-News/Searching-Privately-in-a-PostScroogle-World/
February 24th, 2012
I’ve been running an experiment with some dark-hatted links for several months, consistently hoping Google will catch them and remove their value. So far… Nothing. Well, except top 3 rankings for all the anchor text pointed at those pages. Google’s webspam team has all the incentive, brainpower and money in the world, yet their bets seem to be centered firmly on Google+ and the social graph eventually subsuming the “natural” results with those biased to what our friends and connections share/+1. Fine. I get it. Link buying isn’t going away, no matter how much we wish it would.
Even if link buying is working in the short-term and webspam’s being less aggressive, I still think it’s a waste of money for three reasons:
Rankings are tactical: Earning your way to the top rankings is awesome, because it brings with it the branding, familiarity, trust, social sharing and dozens of other positive marketing signals that “earned” links carry. Spam and paid links just give you some more traffic (and not even as much as a trusted brand could earn in the same position). Conversion rates are lower than your peers’, and the secondary traffic benefits from other sources, word-of-mouth, etc. never come into play.
It’s Overpriced: My wife’s travel site gets offers of several hundred dollars to put in a few links on a single post, and that’s not even an efficient market like those created by professional link sellers and link platforms. Playing the link buying game in the big leagues takes thousands to tens of thousands of dollars each month.
There’s Always Risk: You’re already familiar with the horrific pain of Google’s Kafka-esque penalties, but maybe you’re banking on not getting hit, given their relative ineffectiveness over the past couple years. Problem is, Google+ has created two new kinds of risk for link spammers. The first is that social search results, which have virtually no ties to the link graph, will overwhelm “natural” results and make those purchased links largely useless. The second is that Google+ gains enough momentum and data to leverage for webspam analysis. If you’ve been pointing lots of links at sites and pages that earn no social traction, get ready to feel some pain. Maybe you’re risk-tolerant enough to scoff at both of these, but I don’t think Google+ is going anywhere, and I give them even odds to have a social content/sharing graph big enough to pull off both within 24 months.
“Blah, blah, blah, I’ve heard your white hat evangelism before, Rand” Yeah, you have. Fair enough. So how about instead of just warning about what not to do, I give you somewhere to spend all those earmarked-for-spam dollars.
Here are some rough calculations on link purchasing in a moderately competitive vertical:
Ranking goal: single keyword phrase plus some slightly modified phrases
Required: minimum of 50 unique root domains
35 will be one-time payments for relatively low-quality links, at an average price of $100 (like I said, low quality)
15 will require ongoing payments to maintain the link, $100/month (on average) will probably do it
Total cost over 12 months: ($100*35) + ($100*12*15) = $3,500 + $18,000 = $21,500
So, for $21,500, you can probably buy your way into the top 3 rankings for a moderately competitive phrase in a vertical like niche travel, low-volume e-commerce products, etc. Many black hats I know would argue they can get it cheaper, and they can, but that’s usually because they own networks and properties or have relationships for which they wouldn’t pay directly. A marketing guy working in-house at a brand has none of the connections, no networks of spamfarms, nothing except dollars and a business model that can turn $21.5K in spammy links into $100K in CLTV at 50% margins for a net of $28.5K.
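If you want to sanity-check that arithmetic, here’s a minimal sketch in Python. All the numbers are the illustrative ones above (the link counts, prices, $100K CLTV and 50% margin are examples, not real market data):

```python
# Replaying the link-buying math with the post's illustrative numbers.

ONE_TIME_LINKS = 35      # low-quality links bought with one-time payments
ONE_TIME_PRICE = 100     # average price per link ($)
RENTED_LINKS = 15        # links that require ongoing monthly payments
RENTED_PRICE = 100       # per link, per month ($)
MONTHS = 12

link_cost = ONE_TIME_LINKS * ONE_TIME_PRICE + RENTED_LINKS * RENTED_PRICE * MONTHS
print(f"Total link cost over {MONTHS} months: ${link_cost:,}")  # $21,500

# The in-house scenario: $100K in customer lifetime value at 50% margins.
cltv, margin = 100_000, 0.50
print(f"Net after link spend: ${cltv * margin - link_cost:,.0f}")  # $28,500
```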
Now let’s try an alternative: Buying a blog.
Say you’re LastWear Clothing (a site one of my favorite Moz engineers, Marty, particularly likes). They could buy some links to key pages (in spite of all the many good reasons not to) and try to get rankings for queries like men’s hakama or womens underbust corset. There’s a small amount of existing search query demand, and they’re one of the only sources on the web selling those precise garments, so there’s a good chance that would turn into sales.
But, let’s try another thought experiment. I’ll head over to Google Reader and run a search for “steampunk” (the aesthetic of LastWear’s clothing):
The second site that pops up has a blog with 6,647 subscribers… And it’s talking about the fashion of steampunk! I think we’re on to something.
The Steampunk Workshop blog has thousands of subscribers, and they’re already clear proponents of LastWear (I know, at this point you’re thinking I planned all this from the start, but I swear, it just fell into place as I was searching/writing). That Workshop site is also running ads on the sidebar and between posts, which suggests an attempt at monetization. While not every site like this is a potential option, many are likely to be interested in an acquisition.
Here’s one way I might structure it:
Steampunk Workshop moves their blog to LastWear.com/blog
They continue blogging about all the things they normally would – no editorial interference or direction needed
LastWear helps with a more professional design, subscription buttons, some marketing polish, etc. to help the blog earn more traffic, visibility and fans
In exchange for the move, LastWear offers a monthly stipend to the blogger(s) and a lump sum payment at the end of 3 years. After those 3 years, they own the blog and the content therein, and both parties can decide how they’d like to proceed with the relationship.
If LastWear went down this road, I can promise two things: #1) they’ll get far greater short- and long-term ROI than buying links, and #2) it will be less expensive in the long run.
To my mind, this is a no-brainer. When you buy a blog or any form of online community, you’re not simply acquiring links, you’re getting:
An engine for brand building and indirect customer acquisition
An ongoing methodology to pull in links, tweets, shares, +1s, likes, etc.
Brand evangelists who will help expand your reach and credibility
A PR opportunity like few others, even in fields where PR is hard to come by (acquisitions are talked-about, blogged-about, and make the news, even those of relatively small blogs)
Content that’s already been proven to attract an audience
All the organic signals that search engines love to see – from links to social to usage to content to branding
I honestly don’t understand why this disconnect persists: bloggers and community builders are earning all the signals the engines want, while commercial sites sit on the budgets. It makes you want to yell, “Why don’t you just go get married already?!”
Here are five questions I’d ask brands considering online marketing to answer before choosing link purchasing tactics over a blog investment strategy:
Which is more likely to be scalable in the long term?
Which is more likely to work across multiple channels (content, social, SEO, referring links, etc)?
Which carries a greater risk-reward ratio?
Which is more likely to increase conversion rate and customer lifetime value?
Which is more likely to earn you accolades from your community and which is more likely to earn you a rankings penalty one morning when you really need to hit your quarterly traffic numbers?
To be fair, there are plenty of challenges and hoops to jump through in these types of transactions, and some won’t work out. But I see a huge disconnect between those who are naturally earning all the signals engines say they want (blogs and online communities) vs. those who need them (commercial sites), and no reason the two can’t co-mingle. If you’re a marketer looking to invest dollars into earning a presence in the search, social and web world, you can either build it yourself or you can buy it. I hope to see lots of dollars flowing to the content pioneers who’ve already proven themselves effective earners of inbound marketing signals — the bloggers.
p.s. In the future, I hope to cover this topic in more depth and detail and provide tools and methodologies to structure discovery, transactions, value-creation, etc., but for now, I hope this post offers at least a little inspiration and an alternative use for capital that can do far more good in the hands of bloggers than fly-by-night spam operations.
Article source: http://feedproxy.google.com/~r/seomoz/~3/CRPYXB6wC2o/buying-links-is-shallow-buying-blogs-now-thats-a-strategy
February 24th, 2012
Conferences are always a great way to get out and meet the SEOmoz community. Luckily we have SMX West coming up next week and quite a few Mozzers will be attending, speaking, live blogging and tweeting. We want to get a chance to meet as many of you as possible so I forced everyone to decide on a schedule so you’ll know where to find us!
Before I get into talking about where we’ll be, it’s probably best to first introduce the Mozzers so you know who to look for.
Keri Morgret – @KeriMorgret
You all learned a bit about Keri a few weeks ago when we spilled the beans about the Community Team. In addition to all her great community work, she’s also a freelance marketer and helps clients with both SEO and PPC. You will find her speaking on Wednesday at 3:30 pm on the Beyond the Google AdWords Tool: Advanced Keyword Research Tactics panel. She’ll be talking about negative keywords and some ingenious ways to make sure you’re not spending money on unnecessary keyword targeting.
You’ll also find her live blogging for http://www.seroundtable.com/. Catch her live blogging schedule below.
Michael King – @iPullRank
As an Associate for SEOmoz, Mike focuses on answering questions in Q&A and writing for the blog (you may have seen his epic post yesterday). He’s also done a Whiteboard Friday (or two) and is a great contributor to the SEOmoz Community. In his regular life, Mike’s the SEO Manager at Publicis Modem in NYC.
You’ll find him all over SMX West this year! He’ll be speaking on two panels: What Search Data Reveals About Customer Needs & Desires – And How To Use It, and he’ll be on the Link Building Clinic. Two panels you’ll surely not want to miss!
One thing you may already know about Mike is that he loves to get to know people, so if you see him walking by, say hi! I promise, he doesn’t bite.
Everett Sizemore – @balibones
Another grand Associate, Everett helps out by answering Q&A, and now and then I twist his arm to write for the blog. He’s the Director of SEO Strategy at seOverflow.com and will be speaking on the panel Driving Ecommerce & Retail Sales Through Search, Thursday at 1pm. If you saw his post about building deep links into e-commerce sites, then you know a bit about how his mind works.
If you’re working on an e-commerce or even just a really large site, I’d highly recommend not just going to this panel but also seeking Everett out in person. He’s a ridiculous wealth of knowledge and we shouldn’t let him keep all that inside.
Charlene Inoncillo – @charcillo
As our Marketing Admin, Charlene pretty much knows everything going on at all times on the marketing team (ok, not literally). She’s new to the industry, so reach out and say hello!
She’ll be attending all of the SMX Bootcamp sessions on the first day and in general learning all about search marketing. Be sure to stop her to say hello and show her how amazing this industry is!
Justin Vanning – @JustinVanning
Justin does Paid Search Marketing for Moz, so you’ll probably see him spending much of his time in the PPC and retargeting sessions. He’ll be looking for ways to help out the SEOmoz Marketing team in addition to meeting our community.
Want to know more about our retargeting efforts, or how we do Facebook advertising? Justin’s your man. Give him a holler and ask him about his Twitter strategy.
Jen Sable Lopez – @jennita
*waves hello* If you haven’t met me yet, I’m the Community Manager here at SEOmoz. I’ll be live-tweeting the heck out of SMX so be sure to watch out for my tweets from @jennita.
I’m hitting up a lot of the SEO and social media panels. I love to sit in the front row and make faces at the speakers, so beware! If you’re not able to make it to SMX, follow my tweet stream and I’ll attempt to keep you up-to-date.
If you’re at the conference, please say hello! I love meeting our community members and really try to make it my goal to meet as many of you as possible.
Now that you know who you should be looking for, let’s see where you can find us! Remember, some of us are speaking, others are live blogging (or tweeting) and some of us are just attending. Also, we reserve the right to change our minds and attend different sessions as necessary.
Monday, February 27 – 6:00pm to 7:30pm
SMX Meet & Greet
Most of us will be attending the networking event on Monday night, so find us and say hello! We’re also planning on going for drinks after so let us know if you’d like to join us.
Tuesday, February 28
SMX Boot Camp: Keyword Research & Copywriting For Search Success – Charlene
Getting Personal, Part 1: How Google & Bing Personalize With Social Connections – Jen (live tweet) + Keri (live blog)
Maximizing Paid Search Campaigns With Google’s AdWords Extensions – Justin
SMX Boot Camp: Link Building Fundamentals – Charlene
Getting Personal, Part 2: How Google & Bing Personalize With Search History & Geography – Jen (live tweet)
SMX Boot Camp: Paid Search Fundamentals – Charlene
Solving Problems & Seeing Success In Google Places – Jen (live tweet)
Power Tools For The Paid Search Pro – Justin
SMX Boot Camp: Search Engine Friendly Web Design – Charlene
Don’t Panic! A Hitchhiker’s Guide To Surviving SEO Changes – Jen (live tweet) + Keri (live blog)
Retargeting & Remarketing: The New Behavioral Ads – Justin
Wednesday, February 29
SEO For Google+ & Google Search – Charlene + Jen (live tweet)
Search Ads: Taking Your Ads From Good To Great! – Justin
Real Answers For Technical SEO Problems – Mike (Q&A Moderating)
Building Buzz On Twitter: Getting Followed & Retweeted – Charlene + Jen (live tweet)
Best Practices For Paid Search Testing – Case Study Panel – Justin
Schema.org, Rel=Author & Meta Tagging Best Practices – Keri (live blog)
Building Buzz On Facebook: Getting Liked & Shared – Charlene + Jen (live tweet)
Beyond The Google AdWords Tool: Advanced Keyword Research Tactics – Justin, Keri (speaking)
Creative Facebook Ad Tactics – all of us will be there!
9pm-11pm: SMX After Dark @ Motif
Thursday, March 1
The “New” Killer Content – Charlene
Justifying The Investment: Analytics For Social Media – Jen (live tweet)
Maximizing Enterprise PPC ROI – Justin
Enterprise SEO – Challenges & Solutions – Charlene
What Search Data Reveals About Customer Needs & Desires – And How To Use It – Jen (live tweet) + Justin + Keri (live blog) + Mike (speaking)
Driving Ecommerce & Retail Sales Through Search – Everett (speaking)
Link Building Clinic – Mike (speaking)
I’m serious here. If I find out that you were at SMX and didn’t say hello, I’m going to be sad. Just think: if you find us, you may even get a lovely picture with some of us!
See you at SMX!
PS. If you haven’t bought your ticket yet, use the code smx10seomoz to get a discount when you register for SMX.
Article source: http://feedproxy.google.com/~r/seomoz/~3/IrGU6vJuMOI/meet-mozzers-at-smx-west-2012
February 24th, 2012
In the wake of Google’s Panda updates, there’s been a lot of fear regarding user metrics and how they impact SEO. Many people are afraid that “bad” signals in analytics data, especially high bounce rates and low time-on-site, could potentially harm their rankings.
I don’t think Google is tapping into analytics data directly (I’ll defend that later), and I don’t think they have to. There are two user metrics that both Google and Bing have direct access to: (1) SERP CTR, and (2) “Dwell time”, and I think those two metrics can tell them a lot about your site.
The official word from Google is that analytics data is not used for ranking. Whether or not you believe that is entirely up to you, and I’m not here to argue about it. I’ll only say that it’s rare to hear Matt Cutts say something that emphatically. I think the arguments against using analytics directly as a ranking factor are much more practical in nature…
(1) Not Everyone Uses GA
Usage stats for GA are tough to pin down, but a large 2009 study placed the adoption rate at about 28%. I’ve seen numbers as high as 40% being quoted, but it’s likely that somewhere around 2/3 of all sites don’t have GA data. It’s tough for Google to penalize or devalue a site based on a factor that only exists on 1/3 of all sites. Worse yet, some of the largest sites don’t have GA data, because those are the sites that can afford traditional, enterprise analytics (WebTrends, Omniture, etc.).
(2) GA Can Be Mis-installed
Even for sites using GA, Google can’t control how it’s installed. I can tell you from consulting and from QA here on SEOmoz that GA is often installed badly. This can elevate bounce rates, reduce time-on-site, and generally add a lot of noise to the system.
(3) GA Can Be Manipulated
Of course, there’s a malicious version of (2) – you can mis-install GA on purpose. There are ways to manipulate most user metrics, if you want to, and there’s no scalable way for Google to double-check everyone’s installation and setup. Once the GA tags are in your hands, they’ve lost a lot of control.
To be fair, others disagree and think that Google will use any data they can get their hands on. Some have even produced indirect evidence that bounce rate is in play. I’m going to argue a simple point – that Google and Bing don’t need analytics data or bounce rate. They have all the data they need from their own logs.
The 1 Argument I Don’t Buy
One argument you hear all the time is that Google can’t possibly use something like bounce rate as a ranking signal, because bounce rate is very site-dependent and unreliable by itself. I hear it so often that I wanted to take a moment to say that I don’t buy this argument, for one simple reason. ANY ranking signal, by itself, is unreliable. I don’t know a single SEO who would argue that TITLE tags don’t matter, for example, and yet TITLE tags are incredibly easy to manipulate. On-page factors in general can be spammed – that’s why Google added links to the mix. Links can be spammed – that’s why they’re adding social metrics and user metrics. With over 200 rankings factors (Bing claims over 1,000), no single factor has to be perfect.
The first metric I think Google makes broad use of is direct Click-Through Rate (CTR) from the SERPs themselves. Whether or not a result gets clicked on is one of Google’s and Bing’s first clues about whether any given result is a good match to a query. We know Google and Bing both have this data, because they directly report it to us.
In Google Webmaster Tools, you can find CTR data under “Your site on the web” > “Search queries” – the report shows impressions, clicks and CTR for each query.
Bing reports similar data – from the “Dashboard”, click on “Traffic Summary”.
Of course, we also know that Google factors CTR heavily into their paid search quality score, and Bing has followed suit over the past year. While the paid search algorithm is very different from organic search, it stands to reason that they value CTR. Relevant results drive more clicks.
Last year, Bing’s Duane Forrester wrote a post called “How to Build Quality Content”, and in it he referenced something called “dwell time”:
Your goal should be that when a visitor lands on your page, the content answers all of their needs, encouraging their next action to remain with you. If your content does not encourage them to remain with you, they will leave. The search engines can get a sense of this by watching the dwell time. The time between when a user clicks on our search result and when they come back from your website tells a potential story. A minute or two is good as it can easily indicate the visitor consumed your content. Less than a couple of seconds can be viewed as a poor result.
Dwell time, in a sense, is an amalgam of bounce rate and time-on-site metrics – it measures how long it takes for someone to return to a SERP after clicking on a result (and it can be measured directly from the search engine’s own data).
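To make that concrete, here’s a minimal sketch of how an engine could derive dwell time from its own click logs. The log fields and the threshold are hypothetical, purely for illustration; neither Google nor Bing has published such details:

```python
from datetime import datetime

# Hypothetical SERP click log: each entry records when a user clicked a
# result and when (if ever) they returned to the same results page.
click_log = [
    {"url": "http://example.com/a", "clicked_at": datetime(2012, 2, 24, 10, 0, 0),
     "returned_at": datetime(2012, 2, 24, 10, 0, 4)},   # quick bounce back
    {"url": "http://example.com/b", "clicked_at": datetime(2012, 2, 24, 10, 5, 0),
     "returned_at": None},                               # never came back
]

def dwell_time_seconds(entry):
    """Seconds between the SERP click and the return to the SERP.
    None means the user never returned (a good sign for the result)."""
    if entry["returned_at"] is None:
        return None
    return (entry["returned_at"] - entry["clicked_at"]).total_seconds()

for entry in click_log:
    dwell = dwell_time_seconds(entry)
    if dwell is None:
        print(entry["url"], "-> no return to SERP (likely satisfied)")
    elif dwell < 5:  # threshold is an illustrative guess
        print(entry["url"], f"-> {dwell:.0f}s dwell (possible poor result)")
    else:
        print(entry["url"], f"-> {dwell:.0f}s dwell")
```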
Google hasn’t been quite so transparent, but there’s one piece of evidence that suggests strongly to me that they use dwell time as well (or something very similar). Last year, Google tested a feature where, if you clicked a listing and then quickly came back to the SERP (i.e. your dwell time was very low), you would get the option to block that site.
This feature isn’t currently available for all users – Google has temporarily scaled back site blocking with the launch of social personalization. The fact that low dwell time triggered the ability to block a site, though, clearly shows Google is factoring in dwell time as a quality signal.
Where these 2 metrics really shine is as a duo. CTR by itself can easily be manipulated – you can drive up clicks with misleading titles and META descriptions that have little relevance to your landing page. That kind of manipulation will naturally lead to low dwell time, though. If you artificially drive up CTR and then your site doesn’t fulfill the promise of the snippet, people will go back to the SERPs. The combo of CTR and dwell time is much more powerful and, with just 2 metrics, removes a lot of quality issues. If you have both high CTR and high dwell time, you’re almost always going to have a quality, relevant result.
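As a toy illustration of why the duo is stronger than either metric alone, consider a check like this (the CTR and dwell thresholds are invented for the example):

```python
# Toy quality check combining SERP CTR with median dwell time.
# High CTR alone can be gamed with clickbait snippets; high CTR plus
# healthy dwell time is much harder to fake. Thresholds are invented.

def result_quality(ctr, median_dwell_seconds):
    if ctr >= 0.10 and (median_dwell_seconds is None or median_dwell_seconds >= 60):
        return "strong"       # clicked often, and users stay (or never return)
    if ctr >= 0.10 and median_dwell_seconds < 5:
        return "suspicious"   # clicked often, but users bounce straight back
    return "neutral"

print(result_quality(0.15, 120))  # strong
print(result_quality(0.15, 2))    # suspicious
print(result_quality(0.02, 90))   # neutral
```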
I’m not suggesting that bounce rate and other user metrics don’t matter. As I said, dwell time is connected (and probably well correlated) to both bounce rate and time-on-site. Glenn Gabe had a nice post on “actual bounce rate” and why dwell time may represent an improvement over bounce rate. I’m also sticking to traditional user metrics from analytics and leaving out broader metrics, like site speed and social signals, which clearly tie into user behavior.
What I want you to do is to take a broader view of these user metrics, from the search engine’s perspective, and not get obsessed with the SEO impact of your analytics data. I’ve seen people removing and even manipulating GA tags lately, for fear of SEO issues, and what they usually end up doing is just destroying the reliability of their own data. I don’t think either Google or Bing is using direct analytics data, and even if they do down the road, they’ll probably combine that data with other factors.
You should create search snippets that drive clicks to relevant pages and build pages that make people stay on your site. At the end of the day, it sounds pretty obvious, and it’s good for both SEO and conversion. Specifically, think about the combo – driving clicks is useless (and probably even detrimental to SEO) if most of the people clicking immediately leave your site. Work to find the balance and to target relevant keywords that drive the right clicks.
Article source: http://feedproxy.google.com/~r/seomoz/~3/p9Mrt-UsUKg/the-2-user-metrics-that-matter-for-seo
February 24th, 2012
Two weeks ago, Tom Critchlow suggested that we work to close the gap between inbound marketing and content marketing communities. It’s time to build bridges again, this time between inbound marketing and research. In this post, you’ll find research on participation patterns, how to spot high-value users, seeding content in a new community, how to bring new life to old content, and a little bit of gamification.
Some research is already being shared with the inbound community. Bill Slawski from SEO By The Sea does a great job reading and condensing patents from the search industry. But there is so much more research waiting to be tapped.
I am currently in a PhD program and therefore attend academic conferences. They are different from MozCon, SearchLove, SMX, Blueglass and the other conferences we usually go to. And different means different perspectives. Last week at CSCW, researchers from private companies and universities presented 160 papers. Topics included social media analysis, collaboration, gamification, incentives, recommender algorithms and online communities. For better or worse, I did not attend 160 presentations, so this will be a very limited summary, focusing on online communities.
Why Should You Care?
Universities and private companies like IBM, Microsoft and Google do some legit research. Being familiar with this research is a competitive advantage and will help generate new ideas.
In this post I focus primarily on community building. At SearchLove last year, Rand had a slide showing 34% growth in 4 months, driven primarily by Q+A, YouMoz, the blog and user profiles. Add to this that community members are some of the best link builders you’ll ever find, and getting community right is a huge win.
Who Participates In Online Communities?
Previous research offers two perspectives on participation patterns in online communities:
Some people contribute, and others do not. It is an inherent, personal trait like hair color.
Lurking is a development stage toward being an active member. All people potentially contribute, after the learning/socialization phase: users lurk for a while before participating.
Michael Muller from IBM presented fascinating research on 8,711 online communities covering diverse topics, with 224,232 unique users. The findings show a completely different pattern than the conventional wisdom above: 84% of users who participate in one or more communities lurk in others. However, the majority of a member’s lifetime contributions come at the beginning of their membership. Thus, many users start off contributing like mad, then stop. This means retention is key.
(Graph is printed in Muller, 2012. See references at the bottom of this post.)
Design implications: Do whatever you can to hold on to new members. There are many ways to do this: Make sure they get encouraging feedback on their initial comments/contributions. Assign them a mentor. Send them nice emails. Reach out to them on social media.
Despite the overall participation trend identified by Michael Muller, some people are more likely to contribute to new communities than others. In fact, only a few people end up participating in the first place. Google+ VP Bradley Horowitz once wrote about the 90-9-1 principle, describing how 1% of community members are creators, 9% are synthesizers, and the remaining 90% are users/lurkers who do not directly add anything to the community.
Rosta Farzan and colleagues from Carnegie Mellon University and the University of Minnesota developed an algorithm to identify potential high-contributing members. The algorithm uses the following metrics to spot a potential high-value member:
Motivation (quantity, frequency, and commitment)
Ability (knowledge, trustworthiness, politeness and clarity)
Those identified as potential high-contributing members participated 10 times more actively than those not classified.
Design implications: sometimes the gold is right in front of us without our knowing it. Identifying high-potential members early on can help us reach out and retain these creators.
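As a thought experiment only (this is not the Farzan et al. model, whose details are in the paper), here’s a sketch of how motivation and ability signals might be combined into a simple score; every feature, weight and threshold below is hypothetical:

```python
# A hypothetical scoring sketch inspired by the motivation/ability features
# above -- NOT the Farzan et al. algorithm, just an illustration of the idea.

def contributor_score(member):
    motivation = (
        0.4 * member["posts_per_week"] / 10           # quantity
        + 0.3 * member["active_days_per_month"] / 30  # frequency
        + 0.3 * member["months_active"] / 12          # commitment
    )
    ability = (
        0.5 * member["avg_post_quality"]   # e.g., rated 0..1 by moderators
        + 0.3 * member["trust_rating"]     # 0..1
        + 0.2 * member["clarity_rating"]   # 0..1
    )
    return motivation * ability

new_member = {"posts_per_week": 6, "active_days_per_month": 20,
              "months_active": 2, "avg_post_quality": 0.8,
              "trust_rating": 0.9, "clarity_rating": 0.7}

if contributor_score(new_member) > 0.25:  # threshold chosen arbitrarily
    print("Flag for outreach: likely high-value contributor")
```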
Starting A New Community
In inbound marketing, one often hears the advice: go build a community. Yes, we’d all love to have flourishing communities, right? But how to get critical mass? One solution often used is seeding a site with (third party) content. This is supposed to show that the community is lively and thereby encourage users to contribute. Jacob Solomon and Rick Wash from Michigan State University tried this form of bootstrapping when starting a new wiki.
The results show that users contribute more when they are given a blank page than when they see a seeded page. This makes sense, as there is more work to do on a blank page. However, contributions made on a blank page tend to be unstructured. If users see a page with some content (e.g. headers, text chunks, objective content, opinionated content, etc.), they tend to contribute content similar to the seeded content.
Design implication: If you want users to create a particular kind of content (e.g. replies of a certain length or with a specific focus), seeding can be good. The bad news: seeding content is not a shortcut to starting a community, as it might actually reduce contributions. Two weeks ago, Rand and Dharmesh launched Inbound. When the site launched, it was already seeded with many good articles. According to this paper, that seeding reduced contributions, but made them more focused on the kind of articles Rand and Dharmesh want. Sounds plausible.
New Life To Old Content
This one might require a bit of engineering power, but it is really neat. Aditya Pal and colleagues from the University of Minnesota created an algorithm to detect expired content on a Q&A site. The algorithm uses metrics such as the following (a toy version is sketched after the list):
Reference to a specific time (e.g. date, month)
Fixed vs relative time reference (ago, after, before, today, tomorrow)
Reference to a date in the past
Tense of the question
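Here’s that toy sketch: a crude detector (not the Pal et al. algorithm, which is described in the paper) that flags text leaning on fixed time references rather than relative ones:

```python
import re

# A toy version of the idea -- NOT the Pal et al. algorithm. It flags text
# that leans on fixed time references, which hint that content may expire.

FIXED_TIME = re.compile(
    r"\b(january|february|march|april|may|june|july|august|september|"
    r"october|november|december|20\d\d|yesterday|last (week|month|year))\b",
    re.IGNORECASE,
)
RELATIVE_TIME = re.compile(r"\b(ago|before|after|today|tomorrow)\b", re.IGNORECASE)

def looks_expired(text):
    """Crude heuristic: fixed date references outweighing relative ones."""
    fixed = len(FIXED_TIME.findall(text))
    relative = len(RELATIVE_TIME.findall(text))
    return fixed > relative

print(looks_expired("What is the best SMX session in February 2012?"))  # True
print(looks_expired("What should I do today to speed up my site?"))     # False
```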
Design implications: Such algorithms are not only useful on Q&A sites. On enterprise websites, they can be used to flag content that ought to be updated, removed, rel=canonicalized or 301 redirected to new content. This creates better and fresher content on websites, and helps you avoid having old and irrelevant pages rank in Google. It can also help scale some of Cyrus Shepard’s advice on fresh content, and help you rank for QDF keywords.
(This illustration is made by Dawn Shepard for Cyrus’ post mentioned above)
Gamification
Gamification has been a hot topic for the last couple of years. For many websites, the question is no longer whether gamification systems should be implemented, but whether they should be kept. Jennifer Thom and collaborators from IBM studied the removal of gamification points from IBM’s internal social network. The researchers found that removing the points system made users contribute significantly less than before.
Design implications: You might (also) be tired of hearing about gamification. But it kinda works… So you might want to take a look at Richard Baxter’s slides on gamification.
Curious for more?
The ACM Digital Library is very good. In fact, so good that Matt Cutts blogs about it. To access the articles, you might have to go to a library or a university. But many researchers are happy to share their work and link to it directly from their personal websites (authors retain the right to share their own articles for free), so a little Googling can usually turn up the article.
Michael Muller (2012): Lurking as Personal Trait or Situational Disposition? Lurking and Contributing in Enterprise Social Media. Proceedings of CSCW 2012.
Aditya Pal, James Margatan, Joseph Konstan (2012): Question Temporality: Identification and Uses. Proceedings of CSCW 2012.
Jacob Solomon, Rick Wash (2012): Bootstrapping Wikis: Developing Critical Mass in a Fledgling Community by Seeding Content. Proceedings of CSCW 2012.
Rosta Farzan, Robert Kraut, Aditya Pal, Joseph Konstan (2012): Socializing Volunteers in an Online Community: A Field Experiment. Proceedings of CSCW 2012.
Jennifer Thom, David Millen, Joan DiMicco (2012): Removing Gamification from an Enterprise SNS. Proceedings of CSCW 2012.
Article source: http://feedproxy.google.com/~r/seomoz/~3/mbWo-SRRYiw/what-community-builders-can-learn-from-research
February 24th, 2012
Hi, I’m Rick Perreault. I am cofounder and CEO of Unbounce.com. Unbounce is a platform that allows marketers to create and A/B test landing pages without having to rely on IT. That is really a part of conversion rate optimization, which is today’s subject.
When Rand asked me to come up here and do one of these Whiteboard Fridays and he asked me to do it on conversion rate optimization, you know, I run a software company. I am not a conversion rate optimization expert. However, I see hundreds of people do it every day, and I have experienced this through much of my career. So I am going to share with you what I’ve seen and how I’ve seen conversion rate optimization actually bring far more ROI to your online campaigns than not doing it.
So, let me begin. Let’s imagine this as a three-month campaign. In the old days, and to some degree still today, marketers were really concerned with what happens getting people to click an ad. Let’s just say I’m using . . . these are all sample numbers, just to keep the math simple. So, let’s just say month one I’ve got a $1,000 budget and I generate 1,000 clicks, and I convert at a 1% conversion rate. I get 10 customers at a cost of acquiring each customer of about $100.
Now, so month two, I say, “Okay, that’s pretty good.” Now if I am going to get more sales, generate more customers, I just need to increase how much I am spending on advertising. So I increase that to $2,000 and I get 2,000 clicks. I’m still converting at 1%. The result of that, I get 20 customers at a cost of acquisition of still $100 per customer. That hasn’t changed.
Month three, now I am going to increase that to $3,000. I get my 3,000 clicks. Again, convert at 1%, generates me 30 customers, again CPA stays at $100. Over a three-month period, a total spent of $6,000 generates me 60 sales. That’s pretty good.
Now, as time has gone on, something has changed. Smart marketers realized they could actually get even more ROI from this online advertising by focusing on what happens after the ads are clicked and focusing on moving this number higher. This is what we call conversion rate optimization. That’s A/B testing, using unique landing experiences, using analytics, and really understanding what happens after somebody clicks your ad.
So, in this example, I use the same thing. I spend my $1,000 to get my 1,000 clicks. But this time I am going to spend $200 on conversion rate optimization, and by using analytics and some A/B testing, quite quickly I am able to push my conversion rate up. So now, I push it up to 1.5%. What we see happen here, now I’ve generated 15 customers, but more importantly, my cost of acquisition has gone down to $80, a 20% improvement on the ROI.
So the next month, I continue and I spend my $1,000. I generate my 1,000 clicks. Again, I continue with my budget, my conversion rate optimization budget. Again, I do some more A/B testing, do some more analytics, create some more landing pages, improve it, test buttons, test messaging. I am able to push it up. Get 2% conversion rate. Now, look what happens here, again 20 customers at $60 to acquire them. So, again, even a better saving.
Then finally, okay, now I am really going for it. Month three, I am going to spend $3,000. I am going to get my 3,000 clicks, continue with my conversion rate optimization, my $200 here, maintain my 2% conversion rate, generate 60 customers, and the cost of acquiring them somewhere around $53, $53 something.
In this case, I spent $5,600 to generate 95 sales. Here I spent $6,000 to generate 60. The reason I was able to generate more sales on relatively the same ad spend is because I stopped worrying about just what was going on here and started focusing on what was going on here. This is the math of conversion rate optimization and this is why it is important.
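If you’d like to replay the whiteboard math yourself, here’s a minimal sketch using the same sample numbers from the video (assuming $1 per click, so clicks equal ad spend):

```python
# Replaying the whiteboard math with the video's sample numbers.
# Each tuple is (ad_spend, cro_spend, conversion_rate) for one month;
# $1 per click is assumed, so clicks == ad spend.

without_cro = [(1000, 0, 0.01), (2000, 0, 0.01), (3000, 0, 0.01)]
with_cro    = [(1000, 200, 0.015), (1000, 200, 0.02), (3000, 200, 0.02)]

def summarize(months):
    total_spend = sum(ad + cro for ad, cro, _ in months)
    customers = sum(ad * rate for ad, _, rate in months)  # clicks * conv. rate
    return total_spend, customers, total_spend / customers

for label, months in [("Without CRO", without_cro), ("With CRO", with_cro)]:
    spend, customers, cpa = summarize(months)
    print(f"{label}: ${spend:,} spent, {customers:.0f} customers, CPA ${cpa:.2f}")

# Without CRO: $6,000 spent, 60 customers, CPA $100.00
# With CRO:    $5,600 spent, 95 customers, CPA $58.95
```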
So next time you are talking to your boss or a client and they’re trying to understand the value of A/B testing or using a landing page or just spending any time on thinking of what happens after their ads are clicked, the landing experiences, walk them through this exercise.
I hope that was helpful. Thank you very much.
Article source: http://feedproxy.google.com/~r/seomoz/~3/e1ESr4hjbXI/the-math-of-cro-whiteboard-friday
February 24th, 2012
In the world of search engine optimization, competitive intelligence is an invaluable source of campaign sustenance. Knowing where competitors are ranking for key terms, and more importantly, why they might have better rankings than your site, is information you simply cannot go without. I am going to walk through a few different ideas that will help you effectively analyze your search competitors.
Of the many competitive research tools available, Open Site Explorer shines brightly above the alternatives. It’s very intuitive – first-time users will find it incredibly easy to use. You simply enter your URL, along with up to 4 competitors, and you get lots of data to start analyzing.
Total External Links
External link counts are useful for seeing how you compare to competitors in online conversation and presence, and they will help you set goals for your SEO strategy.
Link building can be a numbers game, and this report will help you determine whether it’s quantity or quality (meaning high-value) links that you lack compared to other sites in your industry.
Total Linking Root Domains
The Total Linking Domains report gives you a look at the diversity of domains sending links to your competitor’s websites. If you see that they have a high number of links, but a low number of total linking root domains, you know that they are getting most of their links from the same domains. It is better to have both a high number of links as well as a high number of linking root domains.
Linking C Blocks
The Linking C Blocks (or C class IPs) report refers to the IP addresses of linking websites.
While very similar to the Linking Root Domains report, this isn’t the same thing. C Blocks are your way of identifying the true backlink diversity of any given website. Getting 500 links from 500 unique C Block IP addresses is more impressive to search engines than 500 links from 500 different domains that are hosted on the same IP (or even C Block IP), which may look suspiciously like a linking network.
It is always best to have both a high number of linking root domains and a high number of linking C blocks in your link portfolio.
Evaluating the ratio between C blocks and linking root domains is also a quick way to identify whether a particular competitor is using a network of links.
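Here’s a small illustration, on made-up backlink data, of counting root domains versus C blocks; a C block to root domain ratio well below 1.0 hints that many of those domains share infrastructure:

```python
# A toy diversity check on hypothetical backlink data. A C block is the
# first three octets of an IPv4 address (e.g. 203.0.113.x -> "203.0.113").

backlinks = [
    ("alpha-blog.com",  "203.0.113.10"),
    ("beta-news.com",   "203.0.113.11"),   # same C block as above
    ("gamma-forum.net", "198.51.100.7"),
    ("delta-site.org",  "198.51.100.7"),   # same IP entirely
]

def c_block(ip):
    return ".".join(ip.split(".")[:3])

root_domains = {domain for domain, _ in backlinks}
c_blocks = {c_block(ip) for _, ip in backlinks}

print(f"Linking root domains: {len(root_domains)}")   # 4
print(f"Linking C blocks:     {len(c_blocks)}")       # 2
print(f"C block / domain ratio: {len(c_blocks) / len(root_domains):.2f}")  # 0.50
```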
Backlink Portfolio
After you have looked through these stats and set some goals for your strategy, pull the backlink portfolio to see exactly where your competitors are getting their links. It is best to sort the portfolio by page authority, from highest to lowest. That way you can start with their most authoritative links, work out how they earned each one, and figure out a way to get a link from the same site, whether it’s a directory where you can place a link or a blog you can offer a guest post to. It helps to think outside the box.
Hopefully these tips will help you the next time you analyze your competitors and you will be able to find some high quality links for your site.
Tags: Competitive Analysis, competitor analysis, seo, SEO Tips
Article source: http://www.seo.com/blog/competitive-analysis/competitive-intelligence-die-4-pieces-data/
February 24th, 2012
Since I work for a Search Engine Marketing Company, people are always asking me what it takes to get a website ranking on Google. A successful SEO campaign is not made up of one or two things. I like to look at it like a chemistry project where different elements are added together at certain times, in various quantities in order to create the secret SEO potion.
Press releases are an important ingredient to add to the mix because they can help you build high quality backlinks from reputable news websites and blogs and potentially help you gain media coverage.
Here are some dos and don’ts about how to write press releases for SEO purposes:
Creating Story Ideas
Don’t send out a press release if you don’t have anything to announce. It sounds simple enough, but so many times SEO campaigns follow time guidelines and send out releases that aren’t timely, newsworthy or high-quality. This will actually work against your campaign. Coming up with an idea for a great press release isn’t as hard as it sounds. Here are topic ideas for timely and newsworthy press releases:
- Grand opening of a business
- New product, service or website and how it fills void in industry
- Market research, statistics, polls and results pertaining to your industry or business
- Tips or ideas for an upcoming holiday or event
- Announcement of winning a prestigious award
- Celebration of a company milestone
- How your business is “Green”
- Charity work, scholarship programs, community outreach programs or charitable contribution
- Stock offerings or financial updates
- What bigger companies are doing in your industry and how your services compare
SEO Friendly Titles
Do use keywords in your title. The title of your press release should ideally contain one or more of your keywords and the name of your company. Thousands of sites that run RSS feeds will publish your release, generating headline impressions. Because your title becomes the title tag, it’s a very important part of what helps a website rank for your keyword. Your title should also be fun and “clickable.” A clickable title is informative, witty or interesting – and hopefully all three.
Using Keyword Rich Anchor Text
Do use strategic, keyword-rich anchor text. The best practice is to use no more than two keyword links per press release and to place the most important link in the first sentence. Read 5 Steps to Killer Keyword Research to learn more about the keywords and phrases you should target. The linked keywords you choose to use in a press release depend on the campaign. Most of the time, you should use bigger keywords that are showing great movement, to keep the momentum building. Other times, it might be necessary to give keywords that have grown stale a boost by adding them to the release in a relevant way. If a company is not ranking for any keywords, going after low-hanging fruit early in a campaign can help ROI immediately. Branded keywords work well in press releases, especially compared to using them in other link building strategies, such as blog posts. You can fit keywords in naturally by writing your release first, and then adding the links so they don’t feel forced.
Quotes and Bragging Rights
Do use compelling quotes. Incorporating quotes into press releases is an easy way to toot your own horn. You should never call your company or product the greatest, best or most unique. When you are using quotes it’s okay to brag a little bit as long as you can back it up. Having a quotation come directly from the authorities within the company will help your release get picked up, and bloggers and writers will appreciate that they can pull the quotes directly from your release to use in their story.
Example of a bad quote:
“We’re excited that our seat belts will be available to a lot of people,” said SeatBelt Company CEO. “Our seatbelts are the safest in the world.”
Example of a good quote:
“With the release of the SafeBelt system, Seatbelt Company will be able to help decrease the impact that a human takes during a car crash,” said SeatBelt Company CEO. “Our ultimate goal is to help save lives. We will continue to do research in order to engineer the safest seatbelts in the world.”
Optimizing your Content
Do think like a Public Relations professional. Press releases are news announcements, regardless of whether they are for SEO purposes or not. By writing high-quality, rich content that directly relates to your business, your release will be naturally optimized. All releases should include the name of your organization, geographical information, a contact e-mail address and phone number. Don’t forget you are writing for the web, so keep your release at 400-600 words and include a boilerplate at the end. You’re allowed to use fun language, but avoid colloquialisms and dragged-out sentences at all costs. Press releases typically receive the most impressions and reads if they are sent out first thing on Monday or Tuesday. However, it’s most important to send your release out at a time that makes sense for the topic you’re writing about.
Don’t forget about logos, photos, embedded videos and infographics. By including multimedia, you increase the likelihood that a news source or blog will take your whole press release package and publish it on their site (gaining you high-quality links). Including logos and photos in your press release is also a great way to get your company ranked in Google Images. When doing this, make sure that your file name contains a concise, clear description of the image and a keyword. YouTube videos have a great social following, and they will help increase the reach of your video and press release.
Completing the Circle
Don’t think that after you submit your release you’re done. Press releases, social media and other SEO strategies should all come together on a unified front. Take the time to publish your press releases on your Twitter feed, Facebook page and other social platforms. After at least two weeks, do an analysis of your release and see the reach it achieved. PRWeb has a great analytical report you can download. If the press release site you use doesn’t offer those statistics, an easy way to track your reach is to place quotes around the title and search for it in Google. Pay attention to where your release ranks in search engines for certain keywords and your company name. Track how well it drives traffic and sales to your company’s website and how many views it receives. The goal is to get it picked up by as many reputable news organizations as possible, so that your linked keywords point back to your page and boost your SEO. Your press release will also have a chance of ranking in Google and Yahoo News for your keywords.
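For the exact-match trick mentioned above, a small snippet can build the search URL for you (the release title here is the hypothetical one from the quote examples):

```python
from urllib.parse import quote_plus

# Build an exact-match Google search for a press release headline.
title = "SeatBelt Company Releases SafeBelt System"  # hypothetical title
query = f'"{title}"'  # quotes force an exact-phrase match
print("https://www.google.com/search?q=" + quote_plus(query))
```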
Don’t waste your time and money on sending out a release that is filled with jargon that no one wants to read; all you’ll get is a few links from the same root domains, unnatural linking patterns and no social attributes. If you follow these press release ideas, you’ll be adding a special link building element to your SEO potion.
I’d love to hear about other tips and ideas that you use when writing press releases and some success stories where press releases have helped your SEO campaign. Please leave your comments below.
Tags: press releases, SEO press releases
Article source: http://www.seo.com/blog/secret-potion-seo-dos-donts-writing-seo-press-releases/