Archive for the ‘web traffic’ Category

How to Customize Your Social Updates for Facebook and Twitter


Like every aspect of inbound marketing, social media is constantly evolving. Once upon a time, when marketing on Twitter and Facebook was in its infancy, auto-uploading the same posts to both platforms was considered acceptable and efficient. For the savvy inbound marketer, those days are over. While efficiency is important, it should never come at the expense of quality content and relevant social media posts. Twitter and Facebook speak different languages — what your audience looks for on one is not what they look for on the other.


What Community Builders Can Learn From Research

Posted by thogenhaven

Two weeks ago, Tom Critchlow suggested that we work to close the gap between inbound marketing and content marketing communities. It's time to build bridges again, this time between inbound marketing and research. In this post, you'll find research on participation patterns, how to spot high-value users, seeding content in a new community, how to bring new life to old content, and a little bit of gamification.

Some research is already being shared with the inbound community. Bill Slawski from SEO By The Sea does a great job reading and condensing patents from the search industry. But there is so much more research waiting to be tapped.

I am currently in a PhD program and therefore attend academic conferences. They are different from MozCon, SearchLove, SMX, Blueglass and the other conferences we usually go to, and different means different perspectives. Last week at CSCW, 160 researchers from private companies and universities presented papers on topics including social media analysis, collaboration, gamification, incentives, recommender algorithms and online communities. For better or worse, I did not attend 160 presentations, so this will be a very limited summary, focusing on online communities.

Why Should You Care?

Universities and private companies like IBM, Microsoft and Google do some legit research. Being familiar with this research is a competitive advantage and will help generate new ideas.

In this post I focus primarily on community building. At SearchLove last year, Rand had a slide citing 34% growth in four months, driven primarily by Q+A, YouMoz, the blog and user profiles. Add to this that community members are some of the best link builders you'll ever find. Getting community right is a huge win.


 

Who Participates In Online Communities?

Previous research offers two perspectives on participation patterns in online communities:

  1. Some people contribute, and others do not. It is an inherent, personal trait like hair color.
  2. Lurking is a development stage toward being an active member. All people potentially contribute, after the learning/socialization phase: users lurk for a while before participating.

Michael Muller from IBM presented a fascinating study of 8,711 online communities covering diverse topics, with 224,232 unique users. The results show a completely different pattern from the conventional wisdom above: 84% of users who participate in one or more communities lurk in others. However, the majority of a member's lifetime contributions come at the beginning of their membership. In other words, many users start off contributing like mad, then stop. This means retention is key.

(The graph is printed in Muller, 2012. See the references at the bottom of this post.)

Design implications: Do whatever you can to hold on to new members. There are many ways to do this: make sure they get encouraging feedback on their initial comments and contributions, assign them a mentor, send them nice emails, or reach out to them on social media.

Spotting Talent

Despite the overall participation trend identified by Michael Muller, some people are more likely than others to contribute to new communities. In fact, only a few people end up participating in the first place. Google+ VP Bradley Horowitz once wrote about the 90-9-1 principle, describing how 1% of community members are creators, 9% are synthesizers, and the remaining 90% are users/lurkers who do not directly add anything to the community.

Rosta Farzan and colleagues from Carnegie Mellon University and the University of Minnesota developed an algorithm to identify potential high-contributing members. It uses the following metrics to spot a potential high-value member:

  • Quality
  • Motivation (quantity, frequency, and commitment)
  • Ability (knowledge, trustworthiness, politeness, and clarity)

Those identified as potential high-contributing members participated 10 times more actively than those not classified.

Design implications: Sometimes the gold is right in front of us without our knowing it. Identifying high-potential members early on lets us reach out to and retain these creators.
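Farzan and colleagues trained their classifier on real community data, but the basic idea translates into a few lines. Here is a toy sketch, not their algorithm: every metric value, weight, username, and the spot_high_value_members helper below are invented for illustration.

    def spot_high_value_members(members, weights=None, top_n=20):
        """members: list of dicts with 0-1 scores for each proxy metric."""
        weights = weights or {"quality": 0.4, "motivation": 0.3, "ability": 0.3}
        scored = [
            (sum(member[metric] * weight for metric, weight in weights.items()),
             member["username"])
            for member in members
        ]
        return [name for _, name in sorted(scored, reverse=True)[:top_n]]

    # Made-up data: in practice the scores might come from editor ratings,
    # posting frequency, moderator flags, and so on.
    community = [
        {"username": "alice", "quality": 0.9, "motivation": 0.7, "ability": 0.8},
        {"username": "bob",   "quality": 0.3, "motivation": 0.9, "ability": 0.4},
    ]
    print(spot_high_value_members(community, top_n=1))  # ['alice']

In practice you would feed the flagged names into whatever outreach process you already use for promising new members.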

Starting A New Community

In inbound marketing, one often hears the advice: go build a community. Yes, we'd all love to have flourishing communities, but how do you reach critical mass? One common solution is seeding a site with (third-party) content, which is supposed to show that the community is lively and thereby encourage users to contribute. Jacob Solomon and Rick Wash from Michigan State University tried this form of bootstrapping when starting a new wiki.

The results show that users contribute more when they are given a blank page than when they see a seeded page. This makes sense, as there is more work to do on a blank page. However, contributions made on a blank page tend to be unstructured. If users see a page with some content (e.g. headers, text chunks, objective content, opinionated content), they tend to contribute content similar to what is already there.

Design implication: If you want users to create a specific kind of focused content (e.g. replies of a certain length or with a particular focus), seeding can be good. The bad news: seeding content is not a shortcut to starting a community, as it may actually reduce contributions. Two weeks ago, Rand and Dharmesh launched Inbound. The site was seeded with many good articles at launch. According to this paper, that seeding reduced contributions but made them more focused on the kind of articles Rand and Dharmesh want. Sounds plausible.

New Life To Old Content

This one might require a bit of engineering power, but it is really neat. Aditya Pal and colleagues from the University of Minnesota created an algorithm to detect expired content on a Q&A site. The algorithm uses metrics such as:

  • TF/IDF
  • References to a specific time (e.g. a date or month)
  • Fixed vs. relative time references (ago, after, before, today, tomorrow)
  • References to a date in the past
  • Tense of the question

Design implications: Such algorithms are not only useful on Q&A sites. On enterprise websites, they can flag content that ought to be updated, removed, rel=canonicalized or 301 redirected to newer content. This keeps the site's content better and fresher, and helps stop old, irrelevant pages from ranking in Google. It can also help scale some of Cyrus Shepard's advice on fresh content and help you rank for QDF keywords. (A toy version of this kind of staleness check is sketched below.)

(This illustration was made by Dawn Shepard for Cyrus' post mentioned above.)
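Pal and colleagues' model is more sophisticated than this, but even a crude heuristic built on two of the signals above (explicit past years and reliance on relative time words) can surface candidates for review. A rough sketch, not their algorithm; the thresholds and word list are invented:

    import re
    from datetime import datetime

    # Flag text that mentions a year in the past, or that leans on relative time
    # words which stop making sense as time passes.
    RELATIVE_WORDS = re.compile(r"\b(today|tomorrow|yesterday|this week|next month)\b", re.I)
    YEAR = re.compile(r"\b(19|20)\d{2}\b")

    def looks_expired(text, now=None):
        now = now or datetime.now()
        years = [int(match.group()) for match in YEAR.finditer(text)]
        mentions_past_year = any(year < now.year for year in years)
        relies_on_relative_time = len(RELATIVE_WORDS.findall(text)) >= 2
        return mentions_past_year or relies_on_relative_time

    print(looks_expired("Which SEO conferences should I attend in 2010?"))  # True once 2010 has passed

Anything flagged should go into a review queue rather than being changed automatically.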

Gamification Over?

Gamification has been a hot topic over the last couple of years. For many websites, the question is no longer whether a gamification system should be implemented, but whether it should be kept. Jennifer Thom and collaborators from IBM studied the removal of gamification points from IBM's internal social network. They found that removing the points system made users contribute significantly less than before.

Design implications: You might (also) be tired of hearing about gamification. But it kinda works… So you might want to take a look at these gamification slides from Richard Baxter:

Curious for more?

The ACM Library is very good. In fact, it is so good that Matt Cutts blogs about it. To access the articles you might have to go to a library or a university, but many researchers are happy to share their research and link directly to their own work from their personal websites (authors have the right to share their own articles for free). So a little Googling can usually turn up the article.

References

Michael Muller (2012): Lurking as Personal Trait or Situational Disposition? Lurking and Contributing in Enterprise Social Media. Proceedings of CSCW 2012.

Aditya Pal, James Margatan, Joseph Konstan (2012): Question Temporality: Identification and Uses. Proceedings of CSCW 2012.

Jacob Solomon, Rick Wash (2012): Bootstrapping wikis: Developing critical mass in a fledgling community by seeding content. Proceedings of CSCW 2012.

Rosta Farzan, Robert Kraut, Aditya Pal, Joseph Konstan (2012): Socializing volunteers in an online community: A field experiment. Proceedings of CSCW 2012.

Jennifer Thom, David Millen, Joan DiMicco (2012): Removing Gamification from an Enterprise SNS. Proceedings of CSCW 2012.



Go to Source

8 Great Apps to Make Inbound Marketing Easier [HubSpot Software]

If you’re using HubSpot’s Inbound Marketing Software, you’re probably happy to have all your core inbound marketing tools — Lead Nurturing, Lead Intelligence, Email Marketing, Blogging, Social Media, SEO, Marketing Analytics, and more — in one place.

But what about all the other stuff? All the tools that aren’t absolutely core to inbound marketing but that you use enough to care about.

Turns out many of these tools are now available in HubSpot’s growing app marketplace. Here are eight of our favorites:

8 Awesome Inbound Marketing Apps

1. Pay-Per-Click Analysis (Developer: Website Publicity) — Google AdWords can tell you a campaign’s click-through rate. And with an additional level of setup, it can tell you the conversion rate on the corresponding landing page. But you still won’t know if those conversions are turning into customers. This app gives you that data, allowing you to focus on the keywords that generate customers. Install PPC Analysis for HubSpot

2. Lead Grader (Developer: Lynton Web Solutions) — If you get any kind of lead volume, you need to make sure your sales team is focusing on the leads that are most likely to close. This app makes it easy to do that. In under five minutes, anybody can install the app and set up custom rules to grade their leads. Once the rules are set up, the app starts grading leads, and your sales team can start focusing on the ones that are most likely to close. Install Lead Grader

3. Content Marketplace (Developer: Zerys) — Many inbound marketers have trouble keeping up with the pace of the content creation needed to be successful. The Zerys content marketplace helps solve this problem. It gives marketers access to a skilled pool of freelance writers who can help to create blog articles, ebooks, and whitepapers, all within HubSpot. Install Zerys

4. iReach Blog Distribution (Developer: PR Newswire) — The iReach Blog Distribution App integrates with HubSpot’s Blog API, pulling blog posts from your HubSpot portal and pushing that content through PR Newswire’s press release distribution network to thousands of sites. No more cutting and pasting your blog posts all over the place; the iReach app automates the work of transforming your blog posts into press releases. Install the iReach Blog Distribution App

5. Marketing Contests (Developer: SnapApp) — Most marketers know that quizzes, polls, surveys, and sweepstakes are more likely to be forwarded and shared with friends and family than static web pages. The SnapApp app allows you to create and measure all of the above from within HubSpot. No more trying to stitch together contests and surveys with a series of different apps. Install the Marketing Contests App

6. Ecommerce – Shopify (Developer: Lynton Web Solutions) – Using a Shopify Store for your ecommerce? This app will help you get more out of it by connecting the blogging, SEO, email marketing, social media, and analytics you do in HubSpot with your shopping engine. Install the Shopify Ecommerce App

7. Facebook Landing Pages (Developer: Convert Social) — Have a good following on Facebook, and trying to convert it into lead growth? This app will help, allowing you to set up landing pages and calls-to-action in Facebook. Leads collected from the landing pages will populate right into your HubSpot account! Install the Facebook Landing Pages App 

8. Vocalyze Blog Voice Widget (Developer: Vocalyze) — As an inbound marketer, you want your content to be as easy to consume as possible. This app takes your blog to the next step — making it simple for your readers to listen to your blog content created on HubSpot. Check out the app on the sidebar of this very blog for an example. Install the Vocalyze Blog Voice Widget

9. [Bonus] APIs – Is there something you’re trying to do that’s not available in the marketplace? No problem! It’s easy to build your own apps inside of HubSpot. Learn how



Go to Source

SEO Monitoring Tools and Tips

Posted by willcritchlow

In the real world, things go wrong. While we might all wish that everything we did was "fix once, stay fixed", that's rarely the case.

Things that were previously "not a problem"(TM) can become "a problem"(TM) rapidly for a variety of reasons:

  • someone changes something unrelated, doesn't realise it will impact you, or just screws up (e.g. deploys a staging version of robots.txt or an old version of a server config)
  • the world changes around you (there was a Google update named after a black and white animal a while back)
  • the technical gremlins gang up on you (server downtime, DDoS etc.)

In all of these cases, you'd rather know about the issue sooner rather than later because in most of them your ability to minimise the resulting issues declines rapidly as time passes (and in the remaining cases, you still want to know before your boss / client).

While many of us have come round to the idea that we should be making recommendations in these areas, we are too often still creating spectacularly non-actionable advice like:

  • make sure you have great uptime
  • make sure your site is quick

Today, I want to give you three pieces of directly actionable advice that you can start doing for your own site and your key clients immediately that will help you spot problems early, avoid knock-on indexing issues and quickly get alerted to bad deploys that could hurt your search performance.

#1 Traffic drops

Google analytics intelligence alerts

Google Analytics has a feature that spots significant changes in traffic or traffic profile. It can also alert you. The first of these features is called "intelligence" and the second "intelligence alerts".

Rather than rehash old advice, I'll simply link to the two best posts I've read on the subject:

This is the simplest of all the recommendations to implement and is also the most holistic in the sense that it can alert you to traffic drops of all kinds. The downside of course is that you're measuring symptoms not causes so you (a) have to wait for causes to create symptoms rather than being alerted to the problem and (b) get an alert about the symptom rather than the cause and have to start detective work before paging the person who can fix it.

#2 Uptime monitoring

It doesn't take a rocket surgeon to realise that SEO is dependent on your website. And not only on how you optimise your site, but also on it being available.

While for larger clients, it shouldn't be your job to alert someone if their website goes down, it does no harm to know and for smaller clients there is every chance you'd be adding significant value by keeping an eye on these things.

I have both good and bad reasons for knowing a lot about server monitoring:

  • the good: we made a small investment in Server Density in May last year (and scored our only link from Techcrunch in the process)
  • the bad: we've been more enthusiastic users of our portfolio company's services than we might have hoped – some annoying server issues have resulted in more downtime for distilled.net than I care to think about. To add insult to injury, we managed to get ourselves hit with a DDoS attack last week (see speed chart below)

There are three main elements you might want to monitor (a bare-bones check for the first is sketched just after this list):

  1. Pure availability (including response code)
  2. Server load and performance
  3. Response speed / page load time
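Here is what that first check boils down to in practice. A minimal sketch, assuming the Python requests library is installed and using a placeholder URL; the services below add the alerting, history and multiple check locations that make this genuinely useful:

    import time
    import requests

    # Bare-bones availability check: record the status code and rough response
    # time for a single URL. Run it from cron and route failures into email/chat.
    def check(url, timeout=10):
        start = time.time()
        try:
            response = requests.get(url, timeout=timeout)
            return response.status_code, round(time.time() - start, 2)
        except requests.RequestException as exc:
            return None, str(exc)

    status, detail = check("https://www.example.com/")
    if status == 200:
        print(f"OK: 200 in {detail}s")
    else:
        print(f"ALERT: status {status} ({detail})")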

Website availability

There are two services I recommend here:

  • Pingdom's free service monitors the availability and response time of your site
  • Server Density's paid service provides more granular alerting and graphing as well as tying it together with your server performance monitoring

Here's what the Server Density dashboard looks like:

Server Density dashboard

And here is the response time graph from pingdom:

Pingdom website speed report

You can see the spike in response time during the DDoS attack, and the lower average response time over the last few days since we implemented Cloudflare.

Incidentally, you may not have noticed (it had passed me by until Mike gave me the heads-up the other day) that Google rolled out site speed reporting to all Analytics accounts without the previously required change to the GA snippet, so you can get some of this data from your GA account now. Here's the technical breakdown for some of Distilled's pages:

Site speed in GA

#3 Robot exclusion protocols, status codes

This was the most ambitious of my ideas for SEO monitoring. It came out of a real client issue. A major client was rolling out a new website and managed to deploy an old / staging version of robots.txt on a Saturday morning (continuous integration FTW). It was essentially luck that the SEO running the project was all over it, spotted it quickly, called the key contact and got it rolled back before it did any lasting harm. We had a debrief the following week where we discussed how we could get alerted to this kind of thing automatically.

I went to David Mytton, the founder of Server Density and asked him if he could build some features in for you lot to alert when this kind of thing happens – if we accidentally noindex our live site or block it in robots.txt. He came up with this ingenious solution that uses functionality already present in their core platform:

Monitoring for any change to robots.txt

First create a service to monitor robots.txt – here's ours:

Monitor robots.txt with server density

Then create an alert to tell you if the MD5 hash of the contents of robots.txt changes (see a definition of MD5 here):

robots md5 alert

If you copy and paste the contents of your robots.txt into an MD5 generator you get a string of gobbledegook (ours is "15403cbc6e028c0ec46a5dd9fffb9196"). What this alert is doing is monitoring for any change to our robots.txt so if we deploy a new version I will get an alert by email and push notification to my phone. Wouldn't it be nice to get alerted in this way if a client or dev team pushed an update to robots.txt without telling you?
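If you don't use Server Density (or just want a second line of defence), the same check is easy to script yourself. A minimal sketch, assuming the Python requests library; the URL and the stored hash are placeholders you would swap for your own known-good values:

    import hashlib
    import requests

    # Fetch robots.txt, hash it, and compare against the hash recorded when the
    # file was last known to be correct. URL and stored hash are placeholders.
    ROBOTS_URL = "http://www.example.com/robots.txt"
    KNOWN_GOOD_MD5 = "paste-your-known-good-md5-here"

    def robots_changed():
        body = requests.get(ROBOTS_URL, timeout=10).content
        current = hashlib.md5(body).hexdigest()
        return current != KNOWN_GOOD_MD5, current

    changed, current_hash = robots_changed()
    if changed:
        print(f"ALERT: robots.txt changed, md5 is now {current_hash}")

Scheduled from cron, that gives you roughly the same tell-me-when-it-changes behaviour, minus the push notifications.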

Spotting the inclusion of no-index meta tags

In much the same way, you can create alerts for specific strings of text found on specific pages. I've chosen to get an alert if the string "noindex" is found in the HTML of the Distilled homepage, so if we ever deployed a staging version or flipped a setting in a WordPress plugin, I'd get a push notification:

Server Density homepage noindex monitoring

Doing this kind of monitoring is essentially free to me because we are already using Server Density to monitor the health of our servers so it's no extra effort to monitor checksums and the presence / absence of specific strings.
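The DIY version of the string check is just as short. A sketch that looks for "noindex" in both the HTML and the X-Robots-Tag header; the URL is a placeholder, and since a plain substring match can false-positive (the word might appear in visible copy), treat a hit as a prompt to look rather than proof of a problem:

    import requests

    # Flag a page if "noindex" appears in its HTML or in an X-Robots-Tag header.
    def page_noindexed(url):
        response = requests.get(url, timeout=10)
        in_header = "noindex" in response.headers.get("X-Robots-Tag", "").lower()
        in_html = "noindex" in response.text.lower()
        return in_header or in_html

    if page_noindexed("https://www.example.com/"):
        print("ALERT: possible noindex on the homepage")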

#4 Bonus – why stop there?

Check out all the stuff that Etsy monitors and has alerts for. If you have a team that can build the platform and infrastructure, there is almost no limit to the things you could monitor and alert on. Here are some ideas to get you started:

  • status codes – 404 vs. 301 vs. 302 vs. 500 etc. (a quick log-parsing sketch follows below)
  • changes in conversion rates / cart abandonment
  • bot behaviour – crawling patterns etc. Given how disproportionately interested I was in the simple "pages crawled" visualisation available in Cloudflare (see below – who'd have guessed we get crawled more by Yandex than Google?), I feel there is a lot more that could be done here:

Cloudflare crawl stats
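If you have raw access logs to hand, even a crude tally gets you started on the status-code and bot-behaviour ideas above. A sketch for a combined-format Apache/nginx log; the log path and the list of crawlers are just examples:

    import re
    from collections import Counter

    # Tally responses by status code and count hits from a few well-known
    # crawlers. Path and bot names are illustrative; adjust for your own setup.
    STATUS = re.compile(r'" (\d{3}) ')   # the status code follows the quoted request
    BOTS = ("Googlebot", "bingbot", "YandexBot")

    status_counts, bot_hits = Counter(), Counter()
    with open("/var/log/nginx/access.log") as log:
        for line in log:
            match = STATUS.search(line)
            if match:
                status_counts[match.group(1)] += 1
            for bot in BOTS:
                if bot in line:
                    bot_hits[bot] += 1

    print("Status codes:", status_counts.most_common())
    print("Crawler hits:", bot_hits.most_common())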


PS – today is the last day for early bird discounts on our Linklove conferences in London and Boston at the end of March / beginning of April. (There's also a sign-up form on that page if you want to make sure you always hear about upcoming conferences and offers). I hope to see many of you there.



Go to Source

How To Get The Equivalent Of $100K in PPC Ads For Free

Posted by scanlin

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of SEOmoz, Inc.

We launched our site in July 2010. By the end of 2011 we ranked in page-one organic results for 108 relevant phrases, and during 2011 we went from four phrases in the top three results to 44. Here are the SEO tactics we used to get the equivalent of $100K in PPC ads in 2011 for free.

Starting in early 2009, we took 18 months to build a subscription-based information service for investors. Halfway through that process we started thinking about marketing and joined SEOmoz to learn about SEO. (First and foremost, thanks to the SEOmoz team and community for educating us on how to do SEO, as we were total novices!) Based on what we learned, we made changes to our site architecture, URL naming conventions, image naming conventions, and content strategy before we launched.

Because we are a self-funded startup we knew we wouldn't have a big (or any, really) PPC budget. In our sector (financial services) many of the phrases we wanted are $10/click because we are bidding against well funded competitors (online brokers mostly). Given our conversion rates and lifetime customer value we can't make money by buying visitors at $10/click. We had to rely on organic traffic and SEO.

SEO Results

We made solid progress with our SEO in 2011. We are analytical types and like to graph the number of phrases we have in Top 3 and Page 1 organic results each week.

For Page 1 results, we went from 14 phrases at the beginning of 2011 to 108 phrases at the end of 2011.

For Top 3 organic rankings, we went from four phrases at the start of the year to 44 at the end of the year.

The impact of these ranking improvements was significant: we quadrupled our Google-referred organic traffic during the year. At the start of 2011 we were getting 2,000 visitors per month from Google organic search; by the end of the year we were getting 8,000 per month.

For us, this increase in organic search traffic helped us grow our business nicely during 2011.

Over $100,000 Of PPC Ads Equivalent

We wanted to know how much that organic traffic was worth to us in terms of equivalent PPC ad spend. So we went to the Google Keyword Tool and looked up the Exact Match estimated CPC for each phrase where we ranked. Then we multiplied that number by the actual visits we received for that Exact Match phrase.

For example, we rank for "call option" which has an estimated CPC (for Exact Match) of $13.66. We got 286 clicks from that phrase in 2011, which would have cost us 286 x $13.66 or $3907 if we had purchased those clicks via PPC. Do that same exercise for all of the phrases that sent us organic traffic during 2011 and you get a number in excess of $100,000. Those are visits we got for free because of our SEO. (Did I mention how much we appreciate our training from SEOmoz yet?)
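The same arithmetic takes only a few lines if you want to run it across your own keyword export. Only the "call option" row below uses figures from this post; the other phrases and their CPC/visit numbers are invented:

    # Multiply each phrase's exact-match CPC estimate by the organic visits it
    # sent, then sum. Only the "call option" row reflects real figures.
    phrases = {
        "call option":  {"cpc": 13.66, "visits": 286},   # from this post
        "covered call": {"cpc": 9.20,  "visits": 410},   # hypothetical
        "put option":   {"cpc": 11.05, "visits": 150},   # hypothetical
    }

    total = 0.0
    for phrase, data in phrases.items():
        value = data["cpc"] * data["visits"]
        total += value
        print(f"{phrase}: ${value:,.0f}")
    print(f"Equivalent PPC spend: ${total:,.0f}")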

Cool. So How Did You Get Those Rankings?

Ah, yes. The secret sauce. Because we are grateful to the community here, we are going to share our tactics. None of this is rocket science or breaking new ground. But rather than vague assurances, we can say for certain these tactics worked for us.

On-page optimization. We created an Excel file and mapped our site so we knew which phrase was mapped to which URL. We limited ourselves to one phrase per URL (okay, maybe two phrases if one was the plural of the other). Then we used the Report Card feature of the On Page tools here until we got an 'A' grade for every phrase/URL pair. We did this for about 200 phrases we care about. Yes, it took a while (a little bit of time each day spread over six months).

Internal linking. If a blog article on one concept mentions a concept we have another blog article for, we make sure the first points to the second with appropriate anchor text. We also interlink our Tutorial with our Blog. We repeat this process about once every 90 days, to make sure that older content refers to newer content (and vice versa) as we add more content pages.

New content. We add at least one page of unique content per week to the site (300-500 words written by us and relevant to our audience). We have a list of phrases we'd like to rank for that we don't currently rank for and tend to create content around one of those phrases each week.

Link building. We build deep links to every page. For some pages, optimized for long tail phrases, it only takes 1 or 2 links with appropriate anchor text to get a decent ranking. But for most of our phrases it requires many more links than that. We wrote a ton of guest blog articles and article marketing articles (non-spun, non-spammy) and posted them on themed (investment related) blogs and sites. An example is this guest post on a PR5 site.

BLU. Blogger Link Up is a free email list where people post requests for articles every day (there are a few of these kinds of sites). If you write something they will give you a link back. Before spending time creating new content for someone else we always check their traffic stats and look at their site. If their site is spammy looking then forget it. But many of them are quality, well-curated sites that will provide a decent link in exchange for quality content.

HARO. If you aren't using HARO you should be. It stands for Help A Reporter Out. You sign up (free) and then get a daily email from journalists looking for sources on articles. If you are relevant to the article they are working on and offer them some expert answers or content they may cite you in their article (and give you a link back). Major publications use HARO and we have successfully gotten links on sites like American Express's OpenForum (PR6 site) through this process. It's not the same as having an expensive PR firm, but it will give you at least some access to the same kind of publications a PR firm would.

Press releases. Never underestimate the links you will get if you issue a press release. We use PRWeb but there are others. Make sure the release is SEO optimized (put in a few links to deep pages on your site). Seems like no matter what you issue at PRWeb there are dozens of sites that will republish your release, creating dozens of new links. Yes, you have to pay for the releases. Do it a couple times a year, minimum.

Forum participation. This does not mean posting spam in forums; it means finding where your audience hangs out and providing meaningful participation. After you've established yourself as credible (posted a certain number of non-spam postings), most forums will let you have a do-follow link in your signature line for each post. Yes, it takes time to read and participate in the forums, but you will not only get some link love (for the bots), you will eventually also get human visitors who simply like what you're saying in the forums and come check you out.

YouTube videos. We weren't sure about this one until we did it, but it's totally worth it. Create a channel on YouTube (which will get you one do-follow link from a PR9 site) and post some videos. We saw a noticeable increase in rankings once we did this. We think that PR9 link really helped.

Facebook, Twitter, LinkedIn, Google+: Set up profiles and every time you write a blog entry post it to these outlets.

You Had Better Like To Write

The bottom line is we spend a ton of time writing. Writing for our own site, writing guest blogs and articles for other sites, writing to answer HARO requests, answering questions in forums, etc. We probably spend half our time on new content creation and writing in general. Yes, you can outsource the writing, but (1) it costs money, and (2) much of what you get back won't be of high enough quality to use (at least, that has been our experience in our financial niche). Better to write it yourself.

We've definitely come to realize that SEO is not a sprint; it's a marathon. Even though we made good progress in 2011 we have another hundred phrases we want to rank for in 2012. That's over eight per month. Time to get back to writing!



Go to Source

How Customer-Centric Analytics Will Change the Future of Marketing

We have no qualms about beating the following concept into the ground, which is why you’ve likely heard us say it before: analytics are critically important to inbound marketing success. Measuring and analyzing the performance of every inbound marketing channel you use to drive traffic, generate leads, and convert those leads into customers—then making adjustments to your marketing strategy and tactics based on the insights you glean from them—is what separates good inbound marketing from truly remarkable inbound marketing.

So if you’re leaning toward the side of remarkable inbound marketing, you likely have some type of marketing analytics tool in place to track and measure how your marketing programs are performing. And that’s all well and good, but there’s a deficiency in many of these analytics platforms.

So exactly what is missing from most analytics tools these days? A canonical identity.

Putting the Person at the Heart of Analytics 

The biggest thing missing from many present-day analytics solutions is the customer. While it’s great to have aggregate data—like the overall number of page views, leads, etc.—it’s also important to remember that each individual view or lead represents an actual person. When you take this person-centric approach, you can go back in time and look at every interaction an individual person has had with your marketing.


The Role of Cohort Analytics

It’s easy to see why person-centric analytics are a huge advantage, especially for companies whose marketing and sales teams are very closely tied together. However, to make truly useful strategic decisions, what businesses really need are cohort analytics.

This is not to be confused with aggregate data or basic segmentation. Cohort analytics let you focus on a group of people who shared a particular experience at a specific point in time. For example, you can compare the visitors who saw Campaign A in January to those who saw it in February, while ignoring everyone who saw Campaign B or C.
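As a toy illustration of that comparison (every record below is invented), you could group people by the month they first saw Campaign A and compare conversion rates between cohorts, ignoring the other campaigns entirely:

    from collections import defaultdict

    # Group people by the month they first saw Campaign A, then compare
    # conversion rates between the January and February cohorts.
    people = [
        {"id": 1, "campaign": "A", "first_seen": "2012-01", "converted": True},
        {"id": 2, "campaign": "A", "first_seen": "2012-01", "converted": False},
        {"id": 3, "campaign": "A", "first_seen": "2012-02", "converted": True},
        {"id": 4, "campaign": "B", "first_seen": "2012-01", "converted": True},  # ignored
    ]

    cohorts = defaultdict(list)
    for person in people:
        if person["campaign"] == "A":
            cohorts[person["first_seen"]].append(person["converted"])

    for month, outcomes in sorted(cohorts.items()):
        rate = sum(outcomes) / len(outcomes)
        print(f"Campaign A, {month} cohort: {rate:.0%} converted ({len(outcomes)} people)")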

Even better, with person-level analytics, you can identify customer personas to help you find out what marketing tactics work well for each persona. For instance, you’ll be able to see that people like Robbie respond better to email campaigns, while people like Joe convert better through social media.

The Future of Analytics Is Integrated

The two concepts above are patterns that other analytics products are likely to follow very soon. Kissmetrics has already started to adopt the canonical identity approach, and Google is making headway on cohorts. However, an analytics product on its own isn’t going to be enough to give you all the answers you want. For example, neither Kissmetrics nor Google can give you good conversion data on the entire history of an A/B-tested landing page, which will have variations starting and stopping at different times. As that gets more complex, it’ll become nearly impossible for those analytics products to keep track of your cohorts without being deeply integrated with your CMS.

Things get even more complicated if you want to integrate your email analytics, browser data, error logging data, usage tracking, etc. Once you start going down that road, be ready with an army of engineers and a fat checkbook.

In order to create a truly powerful analytics system, all of a marketer’s analytics need to live in one place, and they need to talk to each other. The HubSpot software currently offers this capability. Marketers should be hopeful that other solutions will also follow suit.

Having an all-in-one solution is valuable for marketers not just because it’s “convenient” to have everything in one place, but also because the richest insights come from the intersection between different channels. Once marketers have these insights at their fingertips, they’ll be able to make their marketing even more tailored to suit the individual needs and behaviors of their prospects and leads.



Go to Source
