Archive for September, 2011

How to Measure Your Social Media Lead Generation Efforts


This is a guest post by Ashley Jane Brookes of HootSuite, the social media dashboard. Follow her on Twitter @ashjbee.

Social media is no longer in its infancy and is no longer an optional task for businesses and brands. Indeed, building and engaging with audiences on established social networks is now a key part of business interaction. So, as social media grows into its awkward teenage years, the next step is for these companies to understand not only how to use it, but also how to derive value from it.

There is no set-in-stone standard for social media measurement, but various guidelines for unique business cases can show how social media benefits an organization by solving problems across departments, from recruiting to promotion.

So much of social media measurement hinges on social-specific statistics that vary by network: Likes, RTs, followers, etc. But viewed alone, these measurements don’t show the true value social media provides to the bottom line, so they need to be tied to broader business metrics. As businesses strive to track how tweets turn into transactions, it’s important to understand how to measure lead generation.

When it comes to measuring social media for ROI, the following example of a sales funnel shows that lead generation sits below brand awareness – defined as the potential for your brand to be seen by audiences – and before customer retention.

social media funnel

As evident by the funnel, the social media lead generation cycle starts with a conversation and moves down a path:

  1. Engagement
  2. Opportunity
  3. Conversion

As a starting point, lead generation can be viewed as an indication of audience participation with your social messaging. By understanding this top level engagement, you’ll be able to improve the return on your investment by adjusting your messaging and campaigns based on audience activities.

1. Analytics 

Using social analytics is particularly important for gathering lead gen metrics. For example, pairing shortened URLs with Google Analytics and Facebook Insights will give you in-depth insight into how your audience interacts with your social marketing initiatives.
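For that pairing to work, each link you share needs campaign tags Google Analytics can pick up. Here is a minimal Python sketch; the channel and campaign names are hypothetical, and the tagged URL would then go through whatever shortener you use:

```python
from urllib.parse import urlencode, urlparse

def tag_link(url, source, medium, campaign):
    """Append Google Analytics UTM parameters to a landing-page URL
    so clicks from each social channel show up as distinct campaigns."""
    params = urlencode({
        "utm_source": source,      # e.g. "twitter", "facebook"
        "utm_medium": medium,      # e.g. "social"
        "utm_campaign": campaign,  # e.g. "fall_promo" (hypothetical name)
    })
    # Use "&" if the URL already carries a query string, "?" otherwise.
    sep = "&" if urlparse(url).query else "?"
    return url + sep + params

tagged = tag_link("http://example.com/landing", "twitter", "social", "fall_promo")
print(tagged)
```

Run each tagged link through your shortener afterward; the UTM parameters survive the redirect, so Analytics attributes the visit to the right channel.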

2. Engagement

At the top level, measuring participation will show how successful your messaging is at driving your audience down the sales funnel toward being a lead ready to convert. This measurement counts any actions that your audience takes, including:

Retweets, @Replies – These gestures on Twitter show that your content resonated with the audience, and they were inspired to share, ask a question, or associate themselves with your brand.

Likes, Comments, Wall Posts – Likes on Facebook posts are good, and comments on your Facebook page are even better. These actions are broadcast to the fan’s friends, which gives you a chance to reach out, reply, and build an audience.

Click-Throughs – Conversions don’t happen on Twitter or Facebook per se, but clicks from your social channels to your landing pages, ecommerce store, order page, etc. indicate that your content was interesting and relevant to your audience and resulted in them moving further down the sales funnel into the opportunity stage.

3. Opportunity

The next metric captures the opportunity for conversion and is measured through questions like: How deep did your potential customer dig into your site after click-through? Where did they drop off? Did they opt in to receive additional information by filling out a form and downloading materials? Or did they start filling in a form but leave?

You’ll also want to look at the length of time spent at each stage of their path through your site, and identify drop-off points. Identifying who spends the most time, visits the most pages on your site, and fills out conversion forms will show where your strongest opportunities lie.
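As a rough sketch of how those drop-off points can be quantified, here is a small Python helper; the visitor counts are hypothetical:

```python
def drop_off_rates(step_counts):
    """Given visitor counts at successive funnel steps (landing page,
    form start, form submit, ...), return the fraction lost at each
    transition. The largest values mark your weakest points."""
    rates = []
    for before, after in zip(step_counts, step_counts[1:]):
        rates.append(round(1 - after / before, 2) if before else 0.0)
    return rates

# Hypothetical numbers: 1,000 landings, 400 form starts, 100 submissions.
print(drop_off_rates([1000, 400, 100]))  # [0.6, 0.75]
```

Here the second transition loses 75% of visitors, so the form itself, not the landing page, is where the fix belongs.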

4. Conversion

Finally, the last stage of lead gen metrics is conversion, which defines how successful your actions are and allows you to tie social media details like tweets and Facebook posts into real dollars.

Depending on your business and campaigns, track questions like: Did they redeem a coupon from Foursquare? Did they sign up for your program when they clicked on a Twitter link? Did they buy the product you posted on your Facebook page? Did they add on to their purchase? If yes, which demographic do they fit into?

Be sure to compare the average cost of purchases/units made by customers originating from social media channels with customers from other channels. Identifying these patterns will give you a clear picture of each channel’s profitability.
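That comparison is simple arithmetic once each order is tagged with its originating channel. A sketch, with hypothetical order data:

```python
from collections import defaultdict

def average_order_value(orders):
    """orders: list of (channel, amount) pairs. Returns the average
    purchase value per channel so social-sourced customers can be
    compared with customers from other channels."""
    totals, counts = defaultdict(float), defaultdict(int)
    for channel, amount in orders:
        totals[channel] += amount
        counts[channel] += 1
    return {ch: round(totals[ch] / counts[ch], 2) for ch in totals}

# Hypothetical orders tagged by channel.
orders = [("social", 40.0), ("social", 60.0), ("search", 90.0)]
print(average_order_value(orders))  # {'social': 50.0, 'search': 90.0}
```

In practice the channel tag would come from your analytics tool (e.g. the UTM source), but the comparison itself is just this per-channel average.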

5. Set Goals, Test, and Measure

Remember, each business and brand has unique problems to solve. Rather than measuring your campaigns by traditional metrics, work back from your ultimate goal – whether it be to increase resume submissions or increase purchases – and ask the questions that help you understand your audience’s origins and motivations. Since you should measure nearly every action from your social media-centric campaigns, this is a great chance to experiment with your messaging, including tone, voice, offers, photos, and time of day. Try different tactics, measure everything, then adjust, and you’ll watch your charts go up and to the right.

How are you measuring the effectiveness of your social media lead generation efforts?


Connect with HubSpot:

HubSpot on Twitter HubSpot on Facebook HubSpot on LinkedIn HubSpot on Google Buzz 


Go to Source

Title Tags – Is 70 Characters the Best Practice? – Whiteboard Friday

Posted by Aaron Wheeler

It’s often pretty difficult to make a short title for a webpage that offers a lot of varied or super-specific information. At SEOmoz, we say that the best practice for title tag length is to keep titles under 70 characters. That’s pretty pithy considering that the title also includes your site or brand name, spaces, and other nondescript characters. So, does it matter if you go over 70 characters? How important is it to strictly adhere to this best practice? Cyrus Shepard does SEO for us here at SEOmoz, and he’ll answer that very question in this week’s Whiteboard Friday. Think title tags could or should be longer? Shorter? Let us know in the comments below!


Video Transcription

Howdy SEOmoz! Welcome to another edition of Whiteboard Friday. My name is Cyrus. I do SEO here at SEOmoz. Today we’re talking about title tag length. How long is your title tag?

Bad title tag joke. For years, we’ve been telling people, the length of your title tag should be 70 characters or less. That this is best practices. But what does this really mean? Is it absolutely true? What happens if your title tags are longer than 70 characters? For example, the title of today’s post within the meta description is 77 characters. Not this title, but the actual HTML title tag, if you look at the source code, you’ll find that the title tag of today’s Whiteboard Friday is 77 characters. We’re actually over the 70 character title tag limit. Is that bad? Are we going to go to SEO hell for that? What does that mean?

Well, recently people have been doing some experiments to see just how many characters Google will index within a title tag. For years, we thought it was 70. It’s fluctuated. But recent experiments have shown that Google will index well over 150 characters; one person even showed that they will index over 1,000 characters, and I will link to these experiments in the post. But does this mean that you should use all of those characters to your advantage? Can you use them to your advantage? Well, I got really curious about this. So I decided to perform some experiments here on the SEOmoz blog with super long title tags. We’re talking extreme title tags, like 200 characters long, 250 characters long, just blew them out of the water just to see what would happen.


On the first experiment, I took 10 posts that did not get a lot of traffic, but that got pretty consistent traffic from week to week. I kept the old title tags and just extended them with relevant keywords up to about 250 characters long. The results blew me away. In that first experiment, my traffic rose 136% over about a 12-week period. I’ll try to include a screenshot of the Google Analytics in the comments below. It exploded. I got really excited. So, I tried a second experiment. (Correction: the experiment took place over a 6-week period, not 12 like I stated in the video.)


The second experiment I tried with existing successful pages, pages that were already getting a fairly high volume of traffic, that were getting a consistent level of traffic every week. On that experiment, over about the same 12-week period, traffic rose 8%. Cool, but overall site traffic rose 9%. So it was actually 1% below the site average.

For a third experiment, I tried again on a completely different site, a personal site. I changed a few pages’ title tags. Traffic actually went down 11% over a 12-week period. On that site, overall traffic went down 15%.

So, in one of these experiments, the long title tag seemed to work really well. In the other two, it just seemed to be a wash. Why did this happen, but not here? I am going to get to that in a minute.

Title Tags less than 70 Characters

Now, what are the arguments for short title tags? The best practices that you always hear about, keep it less than 70 characters. There are reasons why this is best practices and why we recommend it time and time again.

The first reason is that Google will only display the first 70 characters, in general, in their SERPs. After that, they’re truncated. Users aren’t going to see them. So, if you are writing title tags longer than 70 characters, you’re basically writing it for the search engines, and time and time again we’ve found that if you’re doing something specifically for search engines and not for users, there is probably not a lot of search engine value in it. There might be some, but probably not much.
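To see where that cutoff would hit your own titles, you can preview them roughly the way a SERP renders them. A quick Python sketch (70 is the guideline discussed here; Google’s exact truncation behavior varies):

```python
def serp_preview(title, limit=70):
    """Truncate a title tag roughly the way a search result would,
    cutting at the last whole word before the limit and adding an
    ellipsis so you can see what users actually see."""
    if len(title) <= limit:
        return title
    cut = title[:limit].rsplit(" ", 1)[0]
    return cut + " ..."

title = "Title Tags - Is 70 Characters the Best Practice? - Whiteboard Friday"
print(len(title), "->", serp_preview(title))
```

Running every title on a site through a check like this is an easy way to find the pages where the part users never see is carrying your keywords.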

The second reason is our Correlated Ranking Factors, a survey that we perform every couple of years. Our highest on page correlation value for keyword specific usage was if it is found, if the keyword is found in the first word of the title tag, that was a 0.09 positive correlation. It is not a huge correlation, but it was our largest on page keyword factor. Year after year after year when we perform these correlation studies, we see a direct correlation between the position of the keyword in the title tag and how important it is in the query. So, the closer the keyword is to the beginning of the title tag, the more likely it is to be important in the query. You’re going to see this time and time again. It’s very consistent. Hundreds of webmasters know this from personal experience. You want your keywords at the beginning of the title tag to rank for those keywords. The further out you do it, at 220 characters, those keywords aren’t going to count for very much.

Title Tag Best Practices

Now the third reason is kind of new in today’s world, and that is the rise of social media. Twitter limits tweets to 140 characters. So, if you have a 220 character title tag and you’re trying to share it on Twitter through automatic tweets, or on Facebook or whatever, it looks spammy and it’s not shareable; people don’t want to share it. Shorter title tags, snappy, work really well.

For all these reasons, and because most of the time we’ve found that longer title tags don’t help you, we say that less than 70 is best practices. Now, people get confused about what we mean when we say best practices. Does it mean an absolute rule? No. It just means best practices work most of the time. It’s going to be your best bet. All other things being equal, it’s going to be what you want to implement, what you want to teach people to do, and generally how you want to practice.

So, what happened here? Why did this experiment rise 136%? Well, if you remember, these were low volume pages, pages that weren’t getting a lot of traffic anyway. The reason it rose, we suspect, is because those title tags were poorly optimized in the first place. They didn’t match the content. When we added a few keywords to the end, Google interpreted that as, hey, these match a little bit better to the content, and that’s why it rose. It was a fluke. If we had written the title tags better in the first place, we could have seen this traffic all along.

So, with this in mind, I have some suggestions for your future title tag use, and best practices is going to continue to be less than 70 characters.

Best Practices are Guidelines, Not Rules

The first rule is always experiment. Like I said, if we had tried something else, if we had written different title tags in the first place, it could have helped us. What did it cost us to change those title tags? Zero. If your pages aren’t performing well, you can always try something different, and you should try something different. I still see sites all the time, large eCommerce sites, that put their brand name in the first 20 characters of the title tag on thousands of pages where they shouldn’t necessarily do that. SEOmoz did that for a number of years, up until a few months ago. So, always experiment; not too much, but always try different things to see what title tags are going to work best for you.

Second is write for users. Here at SEOmoz our title tag is the same as the title of our post on our blog because we think it is important to meet users’ expectations. When they see a title tag in the SERP and they click through to your page, you want them to feel like they’ve arrived where they thought they were going to arrive. So, it doesn’t always have to match the title of your post, but something similar, something to make them comfortable, and something that speaks to the users.

Third, remember to keep your important keywords first. Putting your important keywords way out at the end isn’t going to help you much, unless your titles are so poorly optimized in the first place that you really should rewrite them. So, put your important keywords up front; they don’t always have to be in the very first position, but as close to that first position as you can get them.

Lastly, what happens if your title tag is over 70 characters, such as the title tag of today’s Whiteboard Friday post at 77? Don’t sweat it. In our Pro Web App, if you go over 70 characters, we issue a warning. It is not an error. It’s a warning. We just want you to know that if your title tag is over that limit, it might not be the best written title tag. You might want to have a look at it. But here at SEOmoz we have thousands of title tags that go over the 70 character limit, and for the most part, we’re going to be fine. Best practices means that it’s best most of the time, but you can go outside of best practices if it’s warranted.

Remember, experiment, try different things out, find out what works best for you.

That’s it for today. Appreciate your comments below. Thanks everybody.

Video transcription by



5 Best Practices of Facebook eCommerce Stores

Part of the reason that Facebook has outpaced MySpace as the largest, most important social networking site in the world is that Facebook has opened its doors to external programmers. Games like Mafia Wars and Farmville have been a tremendous success, and F-Commerce sites (Facebook commerce sites) are now catching up to the games and apps.

Here are five examples of how they are doing so:

1.  Be Interactive

Example: Starbucks
Starbucks has a highly established market share and name recognition but they are also renowned for their open management style and the speed with which they implement new ideas.

At first glance, it would seem impossible to sell coffee online but Starbucks did not ignore the prospect of a Facebook store. Instead, they developed eGift cards, which allow users to designate an amount and event (birthday, holiday, etc.) and easily send the notice to a Facebook friend. It’s easy and quick to give and it is virtually guaranteed (pun intended) to bring customers. 

2. Offer Incentives

Example: Best Buy
Best Buy Facebook
The first thing you see on the Best Buy Facebook store page is a large banner for Weekly Specials. These offers are exclusive to Facebook fans and, therefore, serve as great incentives for users to like the page and follow the weekly ads closely.

With so many electronics stores available online, it is crucial for Best Buy to keep their customers engaged regularly. If they lose one sale to another site, they run the risk of that customer becoming a regular patron of the opposing site.

3. Understand Trends

Example: Old Spice
Oldspice Facebook
Old Spice sells body-cleansing products primarily but that didn’t stop them from pouncing on the opportunity to engage their Facebook audience with t-shirts. Old Spice understood that they had struck gold with commercials that have become viral and they were flexible enough to produce t-shirts with popular quotes like, “I’m on a horse,” which appeals to the exact demographic that follows them on Facebook.

When the commercials are no longer trending, the company will surely remove the shirts and begin to look for the best way to optimize their next opportunity.

4. Optimize Your Layout

Example: Liverpool Football Club
Liverpool FC Facebook
If you’re not a soccer fan, you’ll have to take my word that Liverpool is not the most famous soccer team in England. For example, Manchester United, Chelsea and Manchester City have larger fan bases but Liverpool’s Facebook store outpaces theirs, in part because of the layout.

The store is set up very cleanly, with three distinct categories that allow the user to maneuver directly to the product they are looking to purchase. The layout effectively eliminates “noise” and gives the user direct access to the product, which decreases bounce rates and increases conversion rates.

5.  Become Lady Gaga. Or Keep Things Enticing.

Example: Lady Gaga
Facebook Lady Gaga
When you visit Lady Gaga’s store page, you’ll notice that she doesn’t upload products as often as you’d expect. She only has 12 products displayed, compared with Justin Bieber who has over 20 products displayed but still sells less than Gaga. This is partly because she builds anticipation by releasing new products sporadically.

This strategy may be different for you because you are not Lady Gaga. But you can do something similar: use your fan page to build hype around a new product; then, once anticipation is built, release the product and direct users to your store page at the optimal time.

Of course, you are not Starbucks and you don’t have Lady Gaga’s wardrobe but the lessons still hold true. Build on these examples and you will find yourself moving up in the F-Commerce world.

Transform your eCommerce Marketing Strategy with Social Media





Using Social Media to Get Ahead of Search Demand

Posted by iPullRank

Before I even start saying anything about keyword research, I want to take my hat off to Richard Baxter, because the tools and methodologies he shared at MozCon make me feel silly for even thinking about bringing something to the keyword research table. Now, with that said, I have a few ideas about using data sources outside of those that the search engines provide to get a sense of what needs people are looking to fulfill right now. Consider this the first in a series.

Correlation Between Social Media & Search Volume

The biggest problem with the search engine-provided keyword research tools is the lag time in data. The web is inherently a real-time channel, and in order to capitalize upon that you need to leverage any advantage you can to get ahead of the search demand. Although Google Trends will give you data when there are huge breakouts on keywords around current events, there is a three-day delay with Google Insights, and AdWords only gives you monthly numbers!

However, there is often a very strong correlation between the number of people talking about a given subject or keyword in social media and the amount of search volume for that topic. Compare the trend of tweets posted containing the keyword “Michael Jackson” with search volume for the last 90 days.

Michael Jackson Trendistic Graph
"Michael Jackson" Tweets


Michael Jackson Google Insights Graph
"Michael Jackson" Search Volume

The graphs are pretty close to identical, with a huge spike on August 29th, which is Michael Jackson’s (and my) birthday. The problem is that, given the limitations of tools like Google Trends and Google Insights, you may not be able to find this out until September 1st for many keywords, and beyond that you may not be able to find complementary long tail terms with search volume.

The insight here is that subjects people are tweeting about are ultimately keywords that people are searching for. The added benefit of using social listening for keyword research is that you can also get a good sense of the searcher’s intent to better fulfill their needs.

Due to this correlation, social listening allows you to uncover which topics and keywords will have search demand, and which topics are going to have a spike in search demand, in real time.
Before we get to the methodology for doing this, I have to explain one basic concept: N-grams. An N-gram is a contiguous subsequence of length N. In the case of search queries, N is the number of words in the query. For example (I’m so terrible with gradients):

Michael King SearchLove NYC 5-gram

is a 5-gram. The majority of search queries fall between 2-grams and 5-grams; anything beyond a 5-gram is most likely a long tail keyword that doesn’t have a large enough search volume to warrant content creation.
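The N-gram idea is easy to express in code. A minimal Python sketch:

```python
def ngrams(text, n):
    """Return all word-level n-grams (runs of n consecutive words)
    from a piece of text, lowercased for easy counting later."""
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

print(ngrams("Michael King SearchLove NYC 5-gram", 5))
# ['michael king searchlove nyc 5-gram']
```

A four-word tweet yields three 2-grams, two 3-grams, and one 4-gram, which is why the counting step later in the post considers every N from 2 to 5 at once.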

If this is still unclear, check out the Google Books Ngram Viewer; it’s a pretty cool way to get a good idea of what N-grams are. Also, you should check out John Doherty’s Google Analytics Advanced Segments post, where he talks about how to segment N-grams using RegEx.

Real-Time Keyword Research Methodology

Now that we’ve got the small vocabulary update out of the way let’s talk about how you can do keyword research in real-time. The following methodology was developed by my friend Ron Sansone with some small revisions from me in order to port it into code.

1.  Pull all the tweets containing your keyword from Twitter Search within the last hour. This part is pretty straightforward; you want to pull down the most recent portion of the conversation right now in order to extract patterns. Use Topsy for this. If you’re not using Topsy, pulling the last 200 tweets via Twitter Search is also a good-sized data set to use.

2.  Identify the top 10 most repeated N-grams, ignoring stop words. Here you identify the keywords with the highest (ugh) density. In other words, the keywords that are tweeted the most are the ones you are considering for optimization. Be sure to keep this between 2-grams and 5-grams; beyond that, you’re most likely not dealing with a large enough search volume to make your efforts worthwhile. Also be sure to exclude stop words so you don’t end up with N-grams like “jackson the” or “has michael.” Here’s a list of English stop words, and Textalyser has an adequate tool for breaking a block of text into N-grams.

3.  Check to see if there is already search volume in the AdWords Keyword Tool or Google Insights. This process is not just about identifying breakout keywords that aren’t being shown yet in Google Insights; it’s also about identifying keywords with existing search volume that are about to get a boost. Therefore, you’ll want to check the search engine tools to see if any search volume exists in order to prioritize opportunities.

4.  Pull the Klout scores of all the users tweeting them. Yeah, yeah, I know Klout is a completely arbitrary calculation, but you want to know that the people tweeting the keywords have some sort of influence. If you find that a given N-gram has been used many times by a bunch of spammy Twitter profiles, then that N-gram is absolutely not useful. Also, if you create content around the given term, you’ll know exactly who to send it to.
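Steps 1 and 2 above boil down to a single counting pass once you have the tweets in hand. A Python sketch; the stop-word list is a tiny illustrative subset and the sample tweets are hypothetical (in practice you would pull them from Topsy or Twitter Search):

```python
from collections import Counter

STOP_WORDS = {"the", "a", "an", "is", "has", "of", "to", "and"}  # tiny subset

def top_ngrams(tweets, n_min=2, n_max=5, k=10):
    """Count every 2- to 5-gram across a batch of tweets, skipping any
    n-gram that contains a stop word, and return the k most repeated."""
    counts = Counter()
    for tweet in tweets:
        words = tweet.lower().split()
        for n in range(n_min, n_max + 1):
            for i in range(len(words) - n + 1):
                gram = words[i:i + n]
                if not any(w in STOP_WORDS for w in gram):
                    counts[" ".join(gram)] += 1
    return counts.most_common(k)

# Hypothetical sample of the last hour's tweets.
tweets = ["michael jackson trial starts", "watching the michael jackson trial"]
print(top_ngrams(tweets))
```

The stop-word filter is what keeps junk like “the michael” out of the ranking; the remaining counts are the candidate keywords to check against the AdWords and Google Insights numbers in step 3.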

Methodology Expanded

I expanded on Ron’s methodology by introducing another data source. If you were at SMX East, you might have heard me express the love that low budget hustlers (such as myself) have for SocialMention. Using SocialMention allows you to pull data from up to 100+ social media properties and news sources. Unlike Topsy or Twitter, there is an easy CSV/Excel file export, and they give you the top 1-grams being used in posts related to that topic. Be sure to exclude images, audio and video from your search results, as they are not useful.
Michael Jackson Social Mention
"Michael Jackson" Social Mention
One quick note: The CSV export will only give you a list of URLs, sources, page titles and main ideas. You will still have to extract the data manually or with some of the ImportXML magic that Tom Critchlow debuted earlier this year.
So What’s the Point?
So what does all of this get me? Well, today it got me "michael jackson trial," "jackson trial," "south park" and "heard today." So if I were looking to do some content around Michael Jackson, I’d find out what news came to light in court, illustrate the trial and the news in a blog post using South Park characters, and fire it off to all the influencers that tweeted about it. Need I say more? You can now easily figure out what type of content would make viral link bait, in real time.
So this sounds like a lot of work to get the jump on a few keywords, doesn’t it?
Well, I can definitely relate; since I am a programmer, it’s quite painful for me to do any repetitive task. Seriously, am I really going to sit in Excel and remove stop words? No, I’m not, and neither should you. Whenever a methodology like this pops up, the first thing I think about is how to automate it. Ladies and gentlemen, I’d like to introduce you to the legendary GoFish real-time keyword research tool.
GoFish Screenshot
I built this from Ron’s methodology, and it uses the Topsy, Repustate and SEMrush APIs. When I get some extra time, I will include the SocialMention API, and hopefully Google will cut the lights back on for my AdWords API as well.

I seriously doubt it will handle the load that comes with being on the front page of SEOmoz, as it is only built on 10 proxies and each of these APIs has substantial rate limitations (Topsy: 33k/day, Repustate: 50k/month, SEMrush: I’m still not sure), but here it is nonetheless. If anyone wants to donate some AWS instances or a bigger proxy network to me, I’ll gladly make this weapons grade. Shout out to John Murch for letting me borrow some of his secret stash of proxies, and shout out to Pete Sena at Digital Surgeons for making me an all-purpose GUI for my tools.
Anyway, all you have to do is put in your keyword, press the button, wait for a bit, and voila, you get output that looks like this:
GoFish Screenshot 2
The output is the top 10 N-grams, the combined Klout scores of all the users that tweeted the given N-gram versus the highest combined Klout score possible, all of the users in the data set that tweeted them, and the search volume if available.
So that’s GoFish. Think of it as a work in progress but let me know what features will help you get more out of it.
Until Next Time…
That’s all I’ve got for this week, folks. I’ll be back soon with another real-time keyword research tactic and tool. If you haven’t checked out my keyword-level demographics post yet, please do! In the meantime, look for me in the chatroom for Richard Baxter’s Actionable Keyword Research for Motivated Marketers webinar.



Marketing Update: Marketing Testing & PETA PORN w/ Guest Anne Holland

Episode #167 – September 23rd, 2011

This week on the Marketing Update, Karen and Mike sit down and talk with their guest Anne Holland, the publisher of Anne shares her expertise and opinion on this week’s most talked about topics, including PETA.XXX and Facebook’s ‘TimeLine’.  


1. PETA.XXX

PETA is planning to shock everyone with a pornographic website called PETA.XXX. PETA’s goal is to utilize shock-value/pornographic photos and videos to attract and redirect their audience’s attention to their animal rights and vegan diet message. However, this sexually suggestive endeavor may not provide the results PETA is looking for. Hopefully, PETA understands the consequences of taking such measures to gain attention. This decision may cost PETA their reputation and image.

Marketing Takeaway: Out of the box ideas and remarkable content attract attention. PETA has been successful in the past with radical strategies to gain PR attention. Mike Volpe mentions in this week’s episode that even if PETA chooses to not follow through with PETA.XXX, they have already created a buzz by just mentioning their plans on taking this step.

2. Facebook’s Timeline

Facebook’s new profile design, ‘Timeline’, is a type of “digital scrapbook” that enables you to go back in time and look at all of your past posts. You can navigate through your profile’s history and find specific posts from your past by clicking on any year in the ‘Timeline’. With ‘Timeline’, you are also capable of adjusting the privacy settings of past posts and are able to delete any videos, pictures, and comments that you regretted posting. This allows Facebook users to have more control over their ‘digital identity’ and how they represent themselves and display their life history on Facebook.

What Facebook needs to worry about is negative feedback from users who are unwilling to accept these changes and reject the new design. However, there will definitely be users willing to accept and utilize the new ‘Timeline’ features.

Here is a recent interview with Vadim Lavrusik discussing the details and capabilities of ‘Timeline’.

Marketing Takeaway: In the past, getting “likes” on Facebook would help you spread your message to people. However, now your Facebook strategy might need to be aimed at engaging and keeping people on Facebook. The new changes to Facebook, including ‘Timeline’, are focusing on keeping users logged in and using Facebook for longer periods of time during their sessions. 





Conquer Link Directory Best Practices for SEO

Posted by Cyrus Shepard

Good news to all you link builders out there. SEOmoz just updated the PRO SEO Web Directory List. The long overdue update includes over 400 directories (up from 180) separated into three categories – Web, Local and Social.

Wait, aren’t link directories dead?

The practice of obtaining links from online directories has changed dramatically over the past 10 years. The stereotypes of the past are both true and dangerous. Spammy, low quality directories flood the lower cesspools of the Internet. An unbridled strategy of obtaining links from these indiscriminate directories can actually hurt your SEO.

But times have changed, and strategies have evolved.

Rethinking Directories

We can move beyond thinking about directories as nothing more than paid links. For webmasters who take that narrow approach, the links they obtain may not be worth the effort. An intelligent directory strategy adds depth to your SEO campaigns and offers tangible benefits, including:

  • A more diverse link profile
  • Qualified referral traffic
  • Citations for different vertical ranking algorithms
  • Trust/Authority Signals
  • … and, of course, the link.

If you use different directories for different purposes, you can achieve this and more.

A. Web Directories

Design Flavor

If links were easy for everyone, they would be less valuable for all of us. In general, the lower the bar of entry into any directory, the less inclusion into that directory is valued by the search engines.

Some editorial directories raise the bar by charging a high cost of inclusion (Yahoo, Best of the Web) and being somewhat selective about whom they include. Three hundred dollars is a lot to pay for “inclusion review” but the truth is that for an established business, these links are like bread and butter.

Other directories focus on a particular area and are thus harder to get into. Design Flavor, shown above, is one example of this type of “niche” directory.

When pursuing directory links, keep the following tips in mind.

  1. Not every directory link is right for your site. Be selective and don’t go for every link out there.
  2. Pace yourself. A good hint I got from Ian Lurie’s Fat Free Guide is the 2:1 rule: for every two directory links you build, make sure to build one genuinely natural link. This helps keep your link profile looking “natural.”
  3. Research. Understand where your link will be placed before you go after it. Check Google’s cache of the page to make sure they are indexing the page. Large directories are often plagued by bad SEO, and not every link carries the weight it should.
  4. Seek relevancy. Ask yourself if this link has the potential to send qualified traffic to your site. Even if it’s a small amount of traffic, it’s probably worth the effort.
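
Tip 2’s pacing rule lends itself to a quick sanity check. Here is a minimal sketch of how you might track it against a hand-maintained link log; the `check_pacing` function, the `link_type` labels, and the example URLs are all hypothetical illustrations, not part of any tool mentioned in this post.

```python
from collections import Counter

def check_pacing(link_log, max_ratio=2.0):
    """Return True if directory links stay within the 2:1 rule.

    link_log: list of (url, link_type) tuples, where link_type is
    'directory' or 'natural'.
    """
    counts = Counter(link_type for _, link_type in link_log)
    directory = counts.get("directory", 0)
    natural = counts.get("natural", 0)
    if natural == 0:
        # No natural links yet: only acceptable if no directory links either.
        return directory == 0
    return directory / natural <= max_ratio

# Example log: 4 directory links against 2 natural ones sits exactly at 2:1.
log = [
    ("http://example-directory-1.com", "directory"),
    ("http://example-directory-2.com", "directory"),
    ("http://blog.example.com/mention", "natural"),
    ("http://example-directory-3.com", "directory"),
    ("http://news.example.com/story", "natural"),
    ("http://example-directory-4.com", "directory"),
]
print(check_pacing(log))  # True: ratio is 4/2 = 2.0, within the rule
```

Adding a fifth directory link without a matching natural one would push the ratio to 2.5 and fail the check, which is exactly the signal to pause directory submissions.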

I’d often rather have a hard-to-get link from a smaller niche site than an easy-to-get link from a larger, well-known directory.

B. Social Directories


We know that the rise of Social SEO means sharing your content on sites like Facebook and Twitter can have a positive impact on your site’s traffic. But far too many people limit sharing to the big three (Twitter, Facebook and Google+) without considering other social sharing sites. This plethora of specialized social sites offers several benefits.

1. Member Profiles – Here’s a random profile from Mister Wong, a social bookmarking site. (thank you rgonzalo!) Mr. Rgonzalo appears to be an authority on the site. Any content he shares will carry weight with the Mister Wong audience.

2. Content Publishing – Instead of a single website listing, social sites allow you to promote individual pieces of content. Using the above example, whenever Rgonzalo shares a URL, that content is likely to be linked to and noticed by search engines.

3. Increased Visibility – Increased participation increases visibility. You probably can’t participate in every social site out there, so for web marketers, focusing on a few sites where you can devote your time, like Quora or CrunchBase, may be a good strategy. Those who become trusted authorities within their community are often rewarded with increased visibility for the content they share.

C. Local Citation Directories

Judy's Book

Using local directories requires a shift in thinking for many webmasters, because it’s not always about the link, but about the citation. As David Mihm points out, the search engines’ local ranking algorithms work quite differently from their general web search algorithms.

For local SEO, search engines trust verifiable information from local portals such as Superpages and Judy’s Book. In most cases, if you are a verified business, gaining a citation from these sources is worth the effort and time.

For more on local SEO directories, I highly recommend reading Mike Blumenthal and Andrew Shotland (and David Mihm, of course).

Directories = Diversity in Your Link Profile

Just as you shouldn’t rely solely on directory links, you shouldn’t ignore them either. The goal is a diverse, blended link profile. Many webmasters have abandoned directory links because of the bad reputation they have gained in recent years. In truth, the variety and quality of directories available today offer unique opportunities to expand your SEO reach and diversify your link profile in future-proof ways.
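
One simple way to see how blended your profile actually is: break your known links down by source type and look at the mix. This is a minimal sketch under the assumption that you keep a list of links tagged by category; the function name, category labels, and example URLs are illustrative, not from any specific tool.

```python
from collections import Counter

def link_profile_mix(links):
    """Return each source type's share of the link profile.

    links: list of (url, source_type) tuples.
    """
    counts = Counter(source for _, source in links)
    total = sum(counts.values())
    return {source: round(n / total, 2) for source, n in counts.items()}

# Hypothetical profile mixing the directory types discussed above.
profile = [
    ("http://dir.example.com/listing", "web_directory"),
    ("http://social.example.com/share", "social"),
    ("http://local.example.com/citation", "local_citation"),
    ("http://blog.example.com/review", "editorial"),
    ("http://another-blog.example.com/post", "editorial"),
]
print(link_profile_mix(profile))
# → {'web_directory': 0.2, 'social': 0.2, 'local_citation': 0.2, 'editorial': 0.4}
```

A profile dominated by a single source type (say, 90% web directories) is the “unbridled strategy” warned about earlier; a roughly even spread across types is the diversity this section argues for.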

Web Directory

I encourage you to check out the new PRO Directory list; it’s an awesome resource. That said, any SEO can take advantage of the tactics in this post. Even with a curated list, using web directories takes time and research.

There are no shortcuts in link building, but the effort is worth it.


