May 16th, 2012
Just ten days after the revelation that Scott Thompson did not, in fact, hold the computer science degree he claimed on his resume, the new Yahoo CEO is resigning his position. Thompson, who replaced Carol Bartz, held the position barely four months. He will in turn be replaced by Ross Levinsohn, the company’s global media head, as interim CEO.
Activist Yahoo shareholder and hedge fund manager Daniel Loeb brought the truth about Thompson to light, discovering that the CEO did hold the financial degree he claimed on his resume, but not the computer science one. Normally, one would think that Thompson’s experience would make this a non-issue. As Greg Sterling pointed out, “he had many years of experience as a successful tech executive in Silicon Valley…the paper was a technicality of sorts – not to minimize the ethics issue.”
Apparently it’s the ethics issue that Loeb couldn’t abide. He pushed hard, not letting the issue die; it became a big public relations headache for the beleaguered media company. In addition to effectively winning Thompson’s resignation, Loeb gained seats for three of the four people on his slate for Yahoo’s board of directors. Loeb himself will take one of the seats; Loeb’s other two winning nominees include media executive Michael Wolf and turnaround specialist Harry Wilson.
With everything that Yahoo has been through, it hardly seems that matters could get any worse for the company. Shortly before the scandal, Thompson laid off 2,000 workers – about 14 percent of Yahoo’s headcount – as part of his plan to change the firm’s direction. He wanted to move Yahoo away from its historical emphasis (some would say “over-dependence”) on display advertising and shift focus toward data and personalization. One wonders how those Thompson laid off must feel now – to say nothing of those who kept their jobs!
That appears to be almost beside the point now, as Yahoo’s new interim CEO boasts a background very different from Thompson’s. Ross Levinsohn served as president of Fox Interactive Media before coming to Yahoo. He offers a strong advertising and media-focused perspective; he’s also popular at the company. Jason Hirschhorn, a former MTV digital executive, notes that Levinsohn is “well-respected in the Valley, Hollywood and on Madison Avenue…Yahoo has to lean into media and he has the plan.”
But does Levinsohn really have a plan for navigating Yahoo through this crisis? Thompson barely got a chance to put the first stage of his own plan into action. Kara Swisher posted the first memo from Levinsohn to Yahoo’s rank and file after accepting the interim CEO position. It’s hard to get anything substantial from a first message, of course. Still, the tone of that note sounds very much like he wants to revitalize the company while moving in its original, advertising-focused direction rather than change things in the way Thompson planned.
Taking a look at Levinsohn’s background, reputation, and history of on-the-job accomplishments, Peter Kafka at AllThingsD made two predictions. First, he expects Levinsohn to try to revitalize Yahoo’s ad business. The executive spent most of 2011 trying to shore up that business and bring back the company’s glory days, when Yahoo had one of the web’s best sales operations. Levinsohn put those efforts on hold when Thompson became CEO, but now that he’s in the captain’s chair, Kafka thinks he’ll try to restart those efforts.
Second, Kafka pointed to Levinsohn’s reputation as a negotiator and deal-maker (he helped News Corp. buy MySpace back in 2005) as a sign that Yahoo will probably get involved in some merger and acquisition deals. The company can’t bring a huge war chest to bear for this, unless Levinsohn can sort out the morass of Yahoo’s Asian holdings and turn a nice profit. Kafka thought Levinsohn might make a play for Hulu; it’s a deal he’s wanted to work before, but then-CEO Carol Bartz wouldn’t go for it. If he can’t acquire Hulu, “I don’t see him chasing after Instagram-like companies with big price tags and no near-term revenue plans,” Kafka muses. “I do see him making some plays on cheaper start-ups, as well as some technology plays, to shore up/replace the company’s very old infrastructure/platforms.”
These moves might help Yahoo. Will they be enough to reverse the company’s trend of losing market share? A large organization like this could continue on a downward spiral for years; indeed, Yahoo already has. I would like to see it recover, but I’m finding it difficult to believe that an approach some observers have dubbed “back to the future” will halt, let alone reverse, the company’s painfully fascinating slow-motion train wreck. I’ve been wrong before, though, and I wouldn’t mind being proven wrong again.
May 16th, 2012
Semantics concerns the meaning of words – historically a weak area for search engines. Over the years we’ve seen vast improvement in Google’s ability to understand what searchers mean when they enter keywords. You can capitalize on this fact by changing the way you conduct keyword research. Following these tips will also strengthen your website’s content.
Sujan Patel wrote a fascinating article on this topic for Search Engine Journal. After explaining how Google figures out what searchers mean when they enter keywords, he discussed five steps you can take in your keyword research that will help you get found more often by your target audience.
I’m unspeakably grateful that we’re past the days when optimizing your keywords meant “pick a single target keyword and cram it into your web content as many times as you can,” as Patel characterizes the obsolete style. He correctly notes that “That ship has sailed.” Thanks in part to Google’s Penguin update, the search engine is better at spotting keyword stuffing. But more importantly, Google “understands” words used in context better than it used to.
Patel used the word “fan” as an example. Most searchers don’t use a single keyword anymore; they’ll put in several, which gives Google some kind of context. It knows that “stargate fan site” is not the same thing as “industrial fan review.” Because of this, it can return relevant results to searchers.
But it goes deeper than just returning websites with the specific phrase. Patel notes that “Google and the other search engines use their semantic indexing capabilities to pull results from related SERPs” and deliver the goods. What does this mean? A searcher entering the phrase “industrial fan review” will see results for that phrase, but the search engine might also include results for the phrases “industrial fan comparison,” “industrial fan guide” or “commercial fan review,” among others, to ensure relevance.
Does this mean that your website will show up for related phrases that you haven’t necessarily targeted? Quite possibly, but wouldn’t it be better to target those phrases anyway? Of course it would. The good news is, you can use the search engines’ own semantic indexing behaviors to help you do your research and plan your content around keyword phrases.
We’ll start by building what Patel referred to as “Level 1” core keywords. These are keywords that vary from our target phrase only slightly, without straying far from its meaning. We’ll enlist Google’s help for this task. Just put your key phrase into Google, and wait for the results to display. Then look at the left-hand sidebar; you’ll find a link for “Related searches.” Hit that, and Google will generate a list, complete with links. When Patel tried it for “industrial fan review,” he got 15 slightly different key phrases, including “industrial fan guide,” “drum fan review,” “industrial fan manual,” and more.
Not all of these may be relevant, so you may want to click through to check any that look a little questionable. One “related search” Google suggested was “industrial fan lyrics;” that key phrase sounds a lot more like it’s related to music than commercial fans! But that caveat aside, the advantage of starting your keyword research this way is that Google recognizes all of these phrases as semantically related; you know it does, because it just said so. Patel notes that this makes them “a powerful starting point for our keyword research.”
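To make scanning a long “Related searches” list easier, here is a small, hypothetical Python helper (not from Patel’s article) that isolates the word each suggestion varies on relative to your seed phrase, so oddballs like “lyrics” stand out immediately:

```python
def variant_words(seed, candidates):
    """For each related phrase, list the words that differ from the seed phrase."""
    seed_words = set(seed.lower().split())
    return {c: sorted(set(c.lower().split()) - seed_words) for c in candidates}

# Example phrases taken from the article's "industrial fan review" test.
variants = variant_words(
    "industrial fan review",
    ["industrial fan guide", "drum fan review", "industrial fan lyrics"],
)
for phrase, extras in variants.items():
    print(phrase, "->", extras)
# industrial fan guide -> ['guide']
# drum fan review -> ['drum']
# industrial fan lyrics -> ['lyrics']
```

A human still makes the relevance call; the helper just surfaces the one word worth judging in each phrase.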
May 16th, 2012
Friends and I were recently debating the finer points of serving a 410 versus a 404 response code when a brick and mortar retail analogy was born. I hope you’ll have half as much fun reading through these amateur comics as I’ve had putting them together. You might also come away with an extra line of lingo when explaining HTTP Response Codes to clients or colleagues.
When a search engine or website visitor makes a request to a web server, a three-digit HTTP Response Status Code is returned. This code indicates what is about to happen. A response code of 200 means “OK, here is the content you were asking for.” A 301 says, “Gotcha. That page has moved, so I’ll send you there now.” And so on.
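To see these codes in action without leaving your desk, here is a minimal sketch using only Python’s standard library: a throwaway local server that answers with a 200, a 301, or a 404, and a client checking what comes back. The paths are made up for the demo:

```python
import threading
import urllib.request
import urllib.error
from http.server import BaseHTTPRequestHandler, HTTPServer

class DemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/home":            # 200: "here is the content"
            body = b"Welcome!"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        elif self.path == "/old-page":      # 301: "that page has moved"
            self.send_response(301)
            self.send_header("Location", "/home")
            self.end_headers()
        else:                               # 404: "no idea what that is"
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):           # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), DemoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

ok = urllib.request.urlopen(base + "/home")
print(ok.getcode())                         # 200

moved = urllib.request.urlopen(base + "/old-page")
print(moved.status, moved.geturl())         # 200 and the /home URL: the
                                            # redirect was followed silently

try:
    urllib.request.urlopen(base + "/missing")
except urllib.error.HTTPError as err:
    missing_code = err.code
print(missing_code)                         # 404

server.shutdown()
```

Note how the client never “sees” the 301 directly; just like a browser, urllib follows it and hands back the final page, which is exactly why most users never notice redirects.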
Einstein once said, “If you can’t explain it simply, you don’t know it well enough.” It is in this spirit that I present to you my brick-and-mortar retail store analogy.
A man walks into a store looking for a particular model water gun. In each scenario, he is greeted by a different Sales Associate (our response codes).
A 200 is the most common type of response code, and the one we experience most of the time when browsing the web. We asked to see a web page, and it was presented to us without any trouble.
We were expecting to find a web page in a particular location, but it has been moved. No worries though, the web server has sent us to the new location. Most users won’t notice that this has happened unless they watch the URL change.
You’re in the right place, but the page has moved temporarily to a new location. Just like a 301, the user doesn’t usually notice anything because the web server seamlessly moves them to the new URL.
Important SEO Implication: A 302 isn’t a permanent move. Any SEO strength that the original page had won’t be granted to the new URL.
We’ve requested a page, but a username and password are required to access it. We’re presented with a way to log in.
Important SEO Implication: Search engines won’t submit a username and password for entry. If you have content hidden behind a login, it won’t show up in the search results.
We’ve requested a page that we don’t have permission to access at all. This page isn’t for us.
We’ve requested a page, but the web server doesn’t recognize our request. The page can’t be shown because the server doesn’t know what it is.
Important SEO Implication: Most default 404 pages are a dead end for users and search engines. Look at using a custom 404 for these cases.
We’ve requested a page and the web server knows what we’re asking for, but the page is gone.
Important SEO Implication: There is some debate in the SEO world as to the advantage (if any) of using a 410 over a 404 in certain cases. This post by Barry Schwartz is a good place to start your own research.
I prefer to use a 410 when removing unfavorable (perhaps penalized) content from a website. Perhaps the website has some bad links pointing to a bad neighborhood within an otherwise quality site. I’d use a 410 to say, “We know what you’re asking for, but we’ve deliberately removed it from the site, permanently.”
We’ve requested a page, and in return, we get a generic error message. No information is given. It is like looking a sales associate in the eye, asking a question, and receiving a blank stare in return.
We asked for a page, but are told that it is temporarily unavailable. Something is wrong. Perhaps the website is down for maintenance.
If you’re like me, you came to SEO out of an interest and background in Marketing, rather than approaching it from a start on the Technology side. I understood the meaning of the basic response codes for SEO (301, 302, 404) long before I understood what was technically happening. I needed to see it before I really got it. If you’re feeling the same way, you can use a browser plugin to watch the communication between your browser and a website behind the scenes as you browse the web.
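If you’d rather watch those hops in code than in a browser plugin, this sketch (one possible approach, not the only one) logs each redirect that Python’s urllib follows; the commented-out URL is hypothetical:

```python
import urllib.request

class LoggingRedirectHandler(urllib.request.HTTPRedirectHandler):
    """Print every redirect hop before letting urllib follow it."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        print(f"{code} redirect: {req.full_url} -> {newurl}")
        return super().redirect_request(req, fp, code, msg, headers, newurl)

opener = urllib.request.build_opener(LoggingRedirectHandler)

# Hypothetical usage -- any URL that 301s or 302s will print its hops:
# resp = opener.open("http://example.com/old-page")
# print(resp.status, resp.geturl())   # final code and landing URL
```

Each printed line is one “the page has moved” conversation between client and server that a normal browsing session hides from you.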
There are a number of excellent resources available to help you better understand HTTP Status Codes and determine when to use them to your best advantage for user experience and SEO.
May 16th, 2012
If the last few months of ranking changes have shown me anything, it’s that the poorly executed link building strategies many of us call white hat can be more dangerous than black-hat strategies like buying links. As a result of well-intentioned but short-sighted link building, many sites have seen significant drops in rankings and traffic. Whether you employ link building tactics that are black, white, or any shade of grey, you can do yourself a favor by avoiding the appearance of link spam.
It’s become very obvious that recent updates hit sites that had overly aggressive link profiles: almost exclusively the types of sites that fell within what I called the “danger zone” in a post about one month before Penguin hit. Highly unnatural anchor text and low-quality links are highly correlated, but anchor text appears to have been the focus.
I was only partially correct, as the majority of cases appear to be devalued links rather than penalties. Going forward, the wise SEO would want to take note of the types of link spam to make sure that what they’re doing doesn’t look like a type of link spam. Google’s response to and attitude towards each type of link spam varies, but every link building method becomes more and more risky as you begin moving towards the danger zone.
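As a toy illustration of that “danger zone” idea, here is a sketch that measures what share of a backlink profile uses exact-match commercial anchor text. The anchors are invented for the example, and nobody outside Google knows what ratio actually trips a filter:

```python
def exact_match_ratio(anchors, money_terms):
    """Share of backlinks whose anchor text is exactly a target keyword."""
    money = {t.lower() for t in money_terms}
    hits = sum(1 for a in anchors if a.strip().lower() in money)
    return hits / len(anchors)

# A hypothetical backlink profile: brand names and bare URLs mixed with
# repeated exact-match commercial anchors.
profile = [
    "cheap car insurance", "Acme Insurance", "click here",
    "cheap car insurance", "https://acme.example", "cheap car insurance",
]
ratio = exact_match_ratio(profile, ["cheap car insurance"])
print(f"{ratio:.0%} exact-match anchors")  # 50% exact-match anchors
```

A natural profile is dominated by brand names, bare URLs, and junk like “click here”; a profile where half the anchors are a money keyword is the kind of skew this post keeps calling unnatural.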
While not technically a form of link building, 301 “cleansing” domains are a dynamic of link manipulation that every SEO should understand. When you play the black hat game, you know the chance of getting burned is very real. Building links to a domain that redirects to a main domain is one traditionally safe way to quickly recover from Google actions like Penguin. While everyone else toils away attempting to remove scores of exact-match anchor text, the spammers just cut the troubled redirected domains loose like anchors, and float on into the night with whatever treasure they’ve gathered.
When Penguin hit, this linkfarm cleansing domain changed from a 301 to a 404 almost overnight.
Link building through redirects should be easy to catch, as new links to a domain that is currently redirecting are hardly natural behavior. To anyone watching, it’s like shooting up a flare that says, “I’m probably manipulating links.” The fact that search engines aren’t watching closely right now is no guarantee of future success, so I’d avoid this and similar behavior if future success is a goal.
I’ve already covered the potential risks of blog networks in depth here. Google hates blog networks - fake blogs that members pay or contribute content to in order to get links back to their or their clients’ sites. Guest blogging and other forms of contributing content to legitimate sites is a much whiter tactic, but consider that a strategy that relies heavily on low-quality guest blogging looks a lot like blog network spam.
With blog networks, each blog has content with a constant ratio of words to links. Each posts externally to random sites multiple times, with a lot of “inorganic” anchor text for commercially valuable terms. Almost all backlinks to blog networks are also spam.
I cringe when I see low-quality blogs with questionable backlinks accepting guest blog posts that meet rigid word length and external link guidelines. Quality blogs tend not to care if the post is 400-500 words with two links in the bio, and quality writers tend not to ruin the post with excessive linking. Most of us see guest blogging as a white-hat tactic, but a backlink profile filled with low-quality guest posts looks remarkably similar to the profile of a site using automated blog networks.
I’d obviously steer clear of blog networks, but I’d be just as wary of low-quality inorganic guest blogs that look unnatural. Guest blog on sites with high quality standards and legitimate backlink profiles of their own.
Article link addiction is still a real thing for new SEOs. You get one or two links with anchor text of your choice, and your rankings rise. You’re not on the first page, but you do it again and get closer. The articles are easy and cheap, and they take no creativity or mental effort. You realize that you’re reaching diminishing returns on the articles, but your solution isn’t to stop – you just need to do more articles. Before you know it, you’re searching for lists of the top article sites that give followed links and looking for automated solutions to build low-quality links to your low-quality links.
Most articles are made for the sole purpose of getting a link, and essentially all followed links are self-generated rather than endorsements. Google has accordingly made article links count for very little, and has hammered article sites for their low-quality content.
Maybe you’re wondering how to get a piece of that awesome trend, but hopefully you’ll join me in accepting that article directories aren’t coming back. Because they can theoretically be legitimate, article links are generally devalued rather than penalized. As with all link spam, your risk of receiving more harsh punishment rises proportionate to the percentage of similar links in your profile.
Ironically named “Web 2.0 Blogs” by some spam peddlers, these two-page blogs on Tumblr and WordPress sub-domains never see the light of day. After setting up the free content hub with an article or two, the site is then “infused” with link juice, generally from social bookmarking links (discussed below).
Despite their prevalence, these sites don’t do much for rankings. Links with no weight come in, and links with no impact go out. They persist because with a decent free template, clients can be shown a link on a page that doesn’t look bad. Google doesn’t need to do much to weed these out, because they’re already doing nothing.
Site-wide footer links used to be all the rage. Google crippled their link-juice-passing power because most footer links pointing to external sites are either Google Bombs or paid links. Where else would you put a site-wide link that you don’t want your users to click?
To my point of avoiding the appearance of spam, Penguin slammed a number of sites with a high proportion of site-wide (footer) links that many would not have considered manipulative. Almost every free WordPress theme that I’ve seen links back to the creator’s page with choice anchor text, and now a lot of WordPress themes are desperately pushing updates to alter or remove the link. Penguin didn’t care if you got crazy with a plugin link, designed a web site, or hacked a template; the over-use of anchor text hit everyone. This goes to show that widespread industry practices aren’t inherently safe.
There will never be a foolproof way to detect every paid link. That said, it’s easier than you think to leave a footprint when you do it in bulk. You have to trust your sellers not to make it obvious, and the other buyers to keep unwanted attention off their own sites. If one buyer that you have no relationship with buys links recklessly, the scrutiny can trickle down through the sites they’re buying from and eventually back to you.
If you do buy links, knowing what you’re doing isn’t enough. Make sure everyone involved knows what they’re doing. Google is not forgiving when it comes to buying links.
Speaking of footprints, I believe it’s possible to build a machine learning model to start with a profile of known links violating guidelines, which you can acquire from paid link sites and link wheel middlemen with nothing more than an email address. You can then assess a probability of a site being linked to in that manner, corroborating potential buyers and sellers with a link graph of similar profiles. I have no idea what kind of computing/programming power this would take, but the footprint is anomalous enough that it should be possible.
Exchanging links through link schemes requires a lot more faith in a bunch of strangers than I can muster. In a link wheel, you’re only as strong and subtle as your “weakest links.” My opinion is that if you’re smart enough to avoid getting caught, you’re probably smart enough to build or write something awesome that will have superior results and lower risk than link wheels.
High-quality syndication and wire services possess a few unattractive attributes for spammers: there are editorial guidelines, costs, and even fact checking. Low-quality syndication services will send almost anything through to any site that will take it. You’ll end up with a bunch of links, but not many that get indexed, and even fewer that get counted.
My experience has been that press releases have rapidly diminishing returns on syndication only, and the only way to see ROI is to generate actual, real coverage. I still see link-packed press releases all over the web that don’t have a chance of getting coverage – really, your site redesign is not newsworthy. I’m not sure whether to attribute this to bad PR, bad SEO, or both.
In this context, we’re talking about creating a real piece of linkbait for credible links, and later replacing the content with something more financially beneficial. Tricking people into linking to content is clearly not something Google would be ok with. I don’t see linkbait and switch done very often, but I die a little every time I see it. If you’re able to create and spread viral content, there’s no need to risk upsetting link partners and search engines. Instead, make the best of it with smart links on the viral URL, repeat success, and become a known source for great content.
Directories have been discussed to death. The summary is that Google wants to devalue links from directories with no true standards. Here’s a Matt Cutts video and blog post on the topic. Directory links often suffer from a high out/in linking ratio, but those worth getting are those that are actually used for local businesses (think Yelp) and any trafficked industry directories.
If the answer to any of these questions is no, don’t bother with a link. This immediately excludes all but a handful of RSS or blog feed directories, which are mostly used to report higher quantities of links. When I was trained as an SEO, I was taught that directories would never hurt, but they might help a tiny bit, so I should go get thousands of them in the cheapest way possible. Recent experience has taught us that poor directory links can be a liability.
Even as I was in the process of writing this post, it appears that Google began deindexing low-quality directories. The effect seems small so far – perhaps testifying to their minimal impact on improving rankings in the first place – but we’ll have to wait and see.
I honestly can’t speak as an authority on link farms, having never used them personally or seen them in action.
“I’m telling you right now, the engines are very very smart about this kind of thing, and they’ve seen link farming over and over and over again in every different permutation. Granted, you might find the one permutation – the one system – that works for you today, but guess what? It’s not going to work tomorrow; it’s not going to work in the long run.” – Rand in 2009
My sense is that this prediction came true over and over again. I’d love to hear your thoughts.
Links from the majority of social bookmarking sites carry no value. Pointing a dozen of them at a page might not even be enough to get the page crawled. Any quality links that go in have their equity immediately torn a million different directions if links are followed. The prevalence of spam-filled and abandoned social bookmarking sites tells me that site builders seriously over-estimated how much we would care about other people’s bookmarks.
Sites focusing on user-generated links and content have their own ways of handling trash. Active sites with good spam control and user involvement will filter spam on their own while placing the best content prominently. If you’d like to test this, just submit a commercial link to any front-page sub-Reddit and time how long it takes to get the link banned. Social sites with low spam control stop getting visitors and incoming links while being overrun by low quality external links. Just ask Digg.
Forum spam may never die, though it is already dead. About a year ago, we faced a question about a forum signature link that was in literally thousands of posts on a popular online forum. When we removed the signature links, the change was similar to the effect of most forum links: zero. It doesn’t even matter if you nofollow all links. Much like social sites, forums that can’t manage the spam quickly turn into a cesspool of garbled phrases and anchor text links. Bing’s webmaster forums are a depressing example.
From time to time you’ll hear of a new way someone found to get a link on an authoritative site. Examples I have seen include links in bios, “workout journals” that the site let users keep, wish lists, and uploaded files. Sometimes these exploits (for lack of a better term) go viral, and everyone can’t wait to fill out their bio on a DA 90+ site.
In rare instances, this kind of link spam works – until the hole is plugged. I can’t help but shake my head when I see someone talking about how you can upload a random file or fill out a bio somewhere. This isn’t the sort of thing to base your SEO strategy around. It’s not long-term, and it’s not high-impact.
While similar to unintended followed links on authority domains, profile spam deserves its own discussion due to its abundance. It would be difficult for Google to take any harsh action on profiles, as there is a legitimate reason for reserving massive numbers of profiles: to prevent squatters and imitators from using a brand name.
What will hurt you is when your profile name and/or anchor text doesn’t match your site or brand name.
“The name’s Insurance. Car Insurance”
When profile links are followed and indexed, Google usually interprets the page as a user page and values it accordingly. Obviously Google’s system for devaluing profile links is not perfect right now. I know it’s sometimes satisfying just to get an easy link somewhere, but profile link spam is a great example of running without moving.
If I were an engineer on a team designed to combat web spam, the very first thing I would do would be to add a classifier to blog comments. I would then devalue every last one. Only then would I create exceptions where blog comments would count for anything.
I have no idea if it works that way, but it probably doesn’t. I do know that blogs with unfiltered followed links are generally old and unread, and they often look like this:
Let’s pretend that Google counts every link equally, regardless of where it is on the page. How much do you think 1/1809th of the link juice on a low-authority page is worth to you? Maybe I’m missing something here, because I can’t imagine spam commenting being worth anything at any price. Let’s just hope you didn’t build anchor text into those comments.
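That 1/1809th figure is easy to sanity-check. Under the deliberately over-simplified assumption that a page splits its link equity evenly across every outbound link:

```python
def equity_per_link(page_equity, total_links):
    """Naive even split of a page's link equity across its outbound links."""
    return page_equity / total_links

# One comment link among 1,809 links on a low-authority page.
share = equity_per_link(1.0, 1809)
print(f"{share:.6f} of the page's equity per link")  # 0.000553 ...
```

Roughly five ten-thousandths of an already-weak page’s equity: which is to say, effectively nothing, before Google devalues the comment link further.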
Buying domains for their link juice is an old classic, but I don’t think I have anything to add beyond what Danny Sullivan wrote on the matter. I’m also a fan of Rand’s suggestion to buy blogs and run them rather than pulling out the fangs and sucking every ounce of life out of a once-thriving blog.
Domain buying still works disgustingly well in the (rare) cases where done correctly. I would imagine that dozens of redirected domains will eventually bring some unwelcome traffic to your site directly from Mountain View, but fighting spam has historically been much easier in my imagination than in reality.
This list is not meant to be comprehensive, but it should paint a picture of the types of spam that are out there, which ones are working, and what kinds of behaviors could get you in trouble.
I have very deliberately written about what spam links “look like.” If you believe that black hat SEO is wrong, immoral, or in any way unsavory, that’s fine – just make sure your white hat links don’t look like black hat links. If you think that white hat SEOs are sheep, or pawns of Google, the same still applies: your links shouldn’t look manipulative.
I’m advising against the tactics above because the potential benefits don’t outweigh the risks. If your questionable link building does fall apart and your links are devalued, there’s a significant cost of time wasted building links that don’t count. There’s also the opportunity cost – what could you have been doing instead? Finally, clearing up a manual penalty can take insane amounts of effort and remove Google’s revenue stream in the meantime.
May 16th, 2012
Every SEO campaign can be broken down into five fundamentals, then segmented into off-site (link building) and on-site (content) approaches so you can identify and understand how each applies. This webinar recap and slide deck (at the bottom of this post) are a top-level guide to help you set the correct content marketing strategy today.
A good content strategy consists of a plan rooted in specific value-based principles: the values that make a website good, worth visiting, and long-lasting. I determined what these values should be by drawing parallels between human success and marketing success, gleaned from the well-known self-improvement book The 7 Habits of Highly Effective People.
At the beginning of the book, Stephen Covey explains that all success literature of the last two centuries can be broken into two approaches to achieving success. One of these approaches should be considered primary, and the other secondary.
Character Ethic: This is the primary way to achieve success, an approach based on principled solutions. Covey calls it the “inside-out approach,” meaning that if you focus on the core traits that make up who you actually are (honesty, integrity, creativity, and so on) and work to improve yourself, success will naturally follow.
Personality Ethic: This is the secondary approach. These are often quick fixes: superficial tactics, even instant gratification. They are the appearance-driven techniques found in self-help books that focus only on the surface of who you really are. They can gain you short-term success, but the effect often wears off and deters your overall success in the long run.
While approaches to gaining success under the personality ethic are valid, they really only have long term value when they are utilized by a person who has mastered his character and established a personality rooted in strong value-based principles.
So it is with a website. No matter how many links you build (personality ethic) or how flashy your website looks, its long-term success (traffic, leads, sales, etc.), and the sustainability of that success, can only be secured if you first focus on the character of your website.
What this means is that if you focus on your website’s internal content strategy (character ethic), applying the value-based principles below to establish a well-thought-out strategy, the longevity of your website will be secured.
Here is a quick mind map of what typically should branch off of these values and develop into your content marketing strategy:
And if you were to sequence these in order of priority I would lay them out as follows:
In the end, links are only as valuable as the website you pass them to. To put it in other words, no one cares if the directions you got to a party were great if the party isn’t any good.
As you focus on these core principles that make up a quality content marketing strategy I suggest approaching the plan with 3 specifics:
As you approach these 3 things look at how they apply to the characteristics that make up an ideal content strategy by asking questions like:
As you tailor a content strategy to your specific audience or customer, you will achieve greater success in content marketing. People enjoy making a connection. Understand the type of people your website is most directed toward and put those people first by ensuring you build a website with content that is:
There is no doubt that people like easy, which means that as soon as users open a browser, they are looking for a simple process.
People love to be entertained. From the wording in your copy to the design to the specific media used on the page, each piece of content is an opportunity to entertain your visitor in one way or another. As you plan your content, look to marry your brand with some form of entertainment that speaks to the targeted audience. No matter the audience, that entertainment will best be delivered through a combination of:
Just as much as we all love to have entertainment handed to us, we are drawn to learn more. The informational form of content is often one of the greatest for SEO purposes as well. Look at the questions your industry is currently asking and seek a unique way to deliver the right answers to the web.
Once your content is quality in terms of design and user experience then look to basic SEO implementations to ensure the nitty gritty is in order:
Page titles are placed within the head area of the source code and should:
Here are various acceptable formats to approach your page titles:
<title>Branding: Page Title With Keyword Here</title>
<title>Keyword Here In Title With Branding Included In Middle</title>
<title>Page Title Keyword Inclusion Here | Branding</title>
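As a rough illustration of reviewing titles against guidelines like these, here is a minimal sketch of a checker script. The 70-character limit and the function name are my own assumptions for the example, not rules from a specification:

```python
def check_page_title(title, keyword, max_length=70):
    """Flag common page-title problems: excessive length and a
    missing target keyword.

    The 70-character limit is an assumed rule of thumb, not a
    hard specification.
    """
    issues = []
    if len(title) > max_length:
        issues.append("title exceeds %d characters" % max_length)
    if keyword.lower() not in title.lower():
        issues.append("target keyword missing from title")
    return issues

# An empty list means the title passes both checks.
print(check_page_title("Acme Widgets | Buy Blue Widgets Online", "blue widgets"))
# prints []
```

Run against a list of your page titles, a report like this makes it easy to spot titles that drifted away from their mapped keyword.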
Your page headings (<h1>) are given more prominence and importance than paragraph content (<p>) by search engines. This means that the keywords and phrases used within headings are considered indicators of the page's relevance. In approaching the use of your page headings you should:
Body content (or words) should be integrated into nearly every area of your website. Search engines view this as quality, and users benefit from it. Words on a page are not placed there solely for search engine optimization purposes; if they are, you have got it all wrong. The content you use should:
Using keywords and variants of those keywords within the content is an integral part of making your website findable. Do not concern yourself with using the phrases a certain number of times. In fact, using keywords correctly on a page will often happen naturally and without intention.
In approaching your content creation for a web page, it is best to write the content first to be readable, purposeful, and succinct. Once this is done, review the content to determine whether the correct key terms have been used. If they have not, revise as needed.
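That after-the-fact review can be sketched as a small script that counts how often each target term actually appears in the finished copy. This is a hypothetical helper of my own, meant for review rather than for hitting any quota:

```python
import re

def keyword_report(text, keywords):
    """Count whole-word occurrences of each target term in finished copy.

    Intended for reviewing copy after it is written, not for stuffing
    keywords to meet a count.
    """
    lowered = text.lower()
    return {kw: len(re.findall(r"\b%s\b" % re.escape(kw.lower()), lowered))
            for kw in keywords}

report = keyword_report(
    "Our content strategy guide covers content planning.",
    ["content", "strategy", "links"],
)
print(report)
# prints {'content': 2, 'strategy': 1, 'links': 0}
```

A zero next to a term you meant to target is the signal to revise; high counts everywhere may be a sign of the over-optimization discussed later.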
One of the most underutilized areas of a website for optimization is images and graphics. When placing your images, there are two opportunities to optimize for a topic:
<img src="keyword-rich-file-name.jpg" alt="Highly Descriptive Alternate Text With Keyword Included" />
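Auditing an existing page for those two opportunities can be automated. As a sketch, this uses Python's standard-library HTML parser to list images with no alt text; the class name and sample filenames are illustrative assumptions:

```python
from html.parser import HTMLParser

class ImageAudit(HTMLParser):
    """Collect the src of every <img> tag missing descriptive alt text."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Empty or absent alt attributes both count as missing.
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

audit = ImageAudit()
audit.feed('<img src="keyword-rich-file-name.jpg" alt="Descriptive text">'
           '<img src="IMG_0042.jpg">')
print(audit.missing_alt)
# prints ['IMG_0042.jpg']
```

A similar check on the src values (camera defaults like IMG_0042.jpg versus keyword-rich file names) covers the second opportunity.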
To correctly optimize a website's navigation, it is important that you know what keywords you are targeting for each individual page. Typically this is referred to as establishing keyword-mapped URLs. To ensure your navigation is best optimized it must be:
Beautifully designed static pages will become stagnant if you don't have active content regularly supporting your website's relevance to major search terms and pushing visitors to landing pages. A major factor enabling web properties to gain ever more search engine real estate is active content production.
The idea is that you need to keep your website up-to-date and interesting both for people and for search engines by:
Above all else, the best way to ensure your content stays active is to lay out a schedule broken down by week, or by day if needed. Often this is referred to as an editorial calendar. Itemizing the content pieces you plan to publish helps you look at everything with a bird's-eye view, ruling out overuse of a specific topic or type of media. A well-planned content calendar also integrates successfully with other marketing initiatives. For most internal blogs I find that laying out a 12-month editorial calendar works best.
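The skeleton of such a 12-month, week-by-week calendar can be generated programmatically. This is a minimal sketch; the rotating topic types and the start date are placeholder assumptions to fill in with your own plan:

```python
from datetime import date, timedelta
from itertools import cycle

def editorial_calendar(start, weeks=52,
                       topics=("how-to", "industry news", "case study")):
    """Lay out one publish date per week for a year, rotating topic
    types so no single topic or media type is overused.

    Topic names here are placeholders for your own categories.
    """
    topic_cycle = cycle(topics)
    return [(start + timedelta(weeks=i), next(topic_cycle))
            for i in range(weeks)]

calendar = editorial_calendar(date(2012, 6, 4))
print(calendar[0])
# prints (datetime.date(2012, 6, 4), 'how-to')
```

Exporting this list to a shared spreadsheet gives the whole team the bird's-eye view the calendar is meant to provide.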
Over-optimization is an increasingly detrimental issue in search marketing. It is very common for professionals to latch on to one specific strategy, or become accustomed to a few singular approaches, that, if carried on over a long period, can in turn hurt the website's performance. While over-optimization has always been an issue, with the latest algorithm updates from Google it has been publicly announced as a target.
There are many ways over-optimization can creep into your website's SEO campaign, but looking specifically at your on-site content, there are 3 items to watch out for:
And if all else fails, here is a quick checklist of major items every great content strategy consists of:
What is your thought on this content strategy? Anything you would like to add? Please leave your thoughts in the comments below.
You can also view the recording of my webinar on this subject by clicking here
Article source: http://www.seo.com/blog/content-marketing-strategy-guide/