Archive for November, 2011

YouTube Launches New and Improved YouTube Analytics

It looks like the marketer’s cry for more data has reached YouTube’s ears, not long after Facebook beefed up its analytics with new features for Facebook Insights.

While YouTube had previously offered an Insights tool that let you view data for videos on your channel, the Google-owned online video giant announced on its blog this morning that, over the course of today, it will be replacing YouTube Insights with YouTube Analytics. Here are four new features YouTube Analytics tacks on to the almost-defunct Insights.

New Dashboard Overview

When you visit youtube.com/analytics, you’ll be greeted with a page that provides all your important data on a single screen. You’ll still be able to see a summary of your views, video demographics, and popularity by region (though it looks much prettier!), but now your dashboard will also display overall channel performance, engagement, and how people find and view your videos. You can click into each report to see more detailed information, which brings us to the next awesome YouTube Analytics update.


More Detailed Reports

YouTube Analytics will offer more detailed statistics than Insights that are intended to give a more precise understanding of your audience and how well your content performs. Right from the overview page, you can access several reports from the left navigation. You also now have a data filter that lets you filter by content, geography, and custom date ranges.

Charts can display data points in daily, weekly, or monthly increments, and you can also compare two metrics on one chart. You can even toggle between the Line Chart view and Map view on most reports. If a report has map functionality, you can hover over countries to see more detailed data for that area.

Once you run a report, download it…and show your boss what an analytical marketing superstar you are!

Audience Retention

Formerly called Hot Spots, this report will show you how far viewers watch through your video. This will be relevant for older videos with 600 views or more, or newer videos with 300 views or more, as YouTube has set these as the minimum number needed for statistical significance.

youtube analytics audience retention

YouTube measures your ability to retain your audience based on both absolute and relative retention. Absolute audience retention shows the views of every moment of the video as a percentage of the number of views at the beginning of the video. So if someone rewinds and re-watches your video, you’ll see your graph go up; if someone starts to fast forward, your graph will go down. Relative audience retention shows your video’s ability to retain viewers during playback relative to all YouTube videos of similar length. The higher the graph at any moment, the more viewers kept watching your video over other videos at that same moment in playback.
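
The arithmetic behind the absolute curve is simple enough to sketch. Here is a minimal illustration with invented numbers; it shows only the percentage math the report describes, not YouTube’s actual implementation:

    // Hypothetical per-moment view counts for a short video.
    // A rewind can push a later moment above the starting count,
    // which is why the absolute graph can climb past 100%.
    var viewsAtMoment = [1000, 900, 1050, 400, 250];

    var absoluteRetention = viewsAtMoment.map(function (views) {
        return (views / viewsAtMoment[0]) * 100; // percent of starting views
    });

    console.log(absoluteRetention); // [100, 90, 105, 40, 25]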

What all this fancy math allows you to do is see where in your video people get bored. YouTube has even made it possible to embed your video in the report so you can align the peaks and valleys in the report with the moments in the video people are most engaged, and least engaged, so you can adjust your content to capitalize on what people seem to enjoy.

Estimated Earnings

Some marketers are trying to monetize their YouTube presence. If that’s you, you’ll enjoy the Estimated Earnings tab of YouTube Analytics. Total Estimated Earnings tells you the net revenue from Google-sold advertising, AFV Earnings provides the estimated earnings from auction-sold advertising via AdSense for Video, and YouTube Earnings shows the estimated earnings from DoubleClick advertising and any other YouTube-sold sources.

Why Marketers Should Measure Video Performance

If you didn’t spend much time in Insights, take these positive changes as an opportunity to reengage with your video marketing analytics. While video content is highly coveted by consumers, it is far more costly to produce than web content, in terms of both time and money. What goals do you have with your video marketing? Whether it’s driving engagement, ad revenue, subscribers, or a whole mix of metrics, having access to daily insights and reporting into your wins and losses will give you the data you need to make educated decisions about your video marketing investment.

The changes will roll out to YouTube account holders throughout the day, so if you haven’t noticed a change yet, hold steady. YouTube Analytics was launched to help marketers extend their reach and, straight out of YouTube’s mouth, “earn more money.” I’m sure we all hope these changes deliver on that!

Have you seen your Insights account replaced by YouTube Analytics? What do you think of the new features?

Image credit: Rego – d4u.hu


Just How Smart Are Search Robots?

Posted by iPullRank

Matt Cutts announced at Pubcon that Googlebot is “getting smarter.” He also announced that Googlebot can crawl AJAX to retrieve Facebook comments, coincidentally only hours after I unveiled Joshua Giardino's research at SearchLove New York suggesting that Googlebot is actually a headless browser based on the Chromium codebase. I'm going to challenge Matt Cutts' statements: Googlebot hasn't just recently gotten smarter; it actually hasn’t been a text-based crawler for some time now, nor have BingBot or Slurp for that matter. There is evidence that Search Robots are headless web browsers, and the Search Engines have had this capability since 2004.

Disclaimer: I do not work for any Search Engine. These ideas are speculative, based on patent research done by Joshua Giardino and me, some direction from Bill Slawski, and what can be observed on Search Engine Results Pages.

What is a Headless Browser?

A headless browser is simply a full-featured web browser with no visual interface. Similar to the TSR (Terminate and Stay Resident) programs that live in your Windows system tray, they run without you seeing anything on your screen, but other programs may interact with them. With a headless browser you can interact via a command line or scripting language, and therefore load a webpage and programmatically examine the same output a user would see in Firefox, Chrome or (gasp) Internet Explorer. Vanessa Fox suggested in January 2010 that Google may be using these to crawl AJAX.

However, Search Engines would have us believe that their crawlers are still similar to Unix’s Lynx browser and can only see and understand text and its associated markup. Basically, they have trained us to believe that Googlebot, Slurp and Bingbot are a lot like Pacman: you point him in a direction and he gobbles up everything he can without being able to see where he’s going or what he’s looking at. Think of the dashes that Pacman eats as webpages. Every once in a while he hits a wall and is forced in another direction. Think of SEOs as the power pills, and think of ghosts as the technical SEO issues that might trip up Pacman and cause him not to complete the level that is your page. When an SEO gets involved with a site, they help the search engine spider eat the ghost; when they don’t, Pacman dies and starts another life on another site.

Pac-Man as a Crawler

That’s what they have been selling us for years; the only problem is it’s simply not true anymore and hasn’t been for some time. To be fair, though, Google normally only lies by omission, so it’s our fault for taking so long to figure it out.

I encourage you to read Josh’s paper in full but some highlights that indicate this are:

  • A patent filed in 2004 entitled “Document Segmentation Based on Visual Gaps” discusses methods Google uses to render pages visually and traverse the Document Object Model (DOM) to better understand the content and structure of a page. A key excerpt from that patent says “Other techniques for generating appropriate weights may also be used, such as based on examination of the behavior or source code of Web browser software or using a labeled corpus of hand-segmented web pages to automatically set weights through a machine learning process.”
     
  • The wily Mr. Cutts suggested at Pubcon that GoogleBot will soon be taking into account what is happening above the fold as an indication of user experience quality, as though it were a new feature. That’s curious, because according to the “Ranking Documents Based on User Behavior and/or Feature Data” patent from June 17, 2004, they have been able to do this for the past seven years. A key excerpt from that patent describes “Examples of features associated with a link might include the font size of the anchor text associated with the link; the position of the link (measured, for example, in a HTML list, in running text, above or below the first screenful viewed on an 800.times.600 browser display, side (top, bottom, left, right) of document, in a footer, in a sidebar, etc.); if the link is in a list, the position of the link in the list; font color and/or attributes of the link (e.g., italics, gray, same color as background, etc.);” This is evidence that Google has visually considered the fold for some time. I would also say that this is live right now, as instant previews show a cut-off at the point Google considers the fold.
     
  • It is no secret that Google has been executing JavaScript to a degree for some time now, but “Searching Through Content Which is Accessible Through Web-based Forms” indicates that Google is using a headless browser to perform the transformations necessary to dynamically fill out forms. “Many web sites often use JavaScript to modify the method invocation string before form submission. This is done to prevent each crawling of their web forms. These web forms cannot be automatically invoked easily. In various embodiments, to get around this impediment, a JavaScript emulation engine is used. In one implementation, a simple browser client is invoked, which in turn invokes a JavaScript engine.” Hmmm…interesting.

Google also acquired a considerable number of IBM patents in June and August of 2011, and with them comes a lot of IBM’s awesome research into remote systems, parallel computing and headless machines, such as the “Simultaneous network configuration of multiple headless machines” patent. Google has clearly done extensive research of its own in these areas, though.

Not to be left out, Microsoft has a patent entitled “High Performance Script Behavior Detection Through Browser Shimming” that leaves little room for interpretation; in so many words, it says Bingbot is a browser. "A method for analyzing one or more scripts contained within a document to determine if the scripts perform one or more predefined functions, the method comprising the steps of: identifying, from the one or more scripts, one or more scripts relevant to the one or more predefined functions; interpreting the one or more relevant scripts; intercepting an external function call from the one or more relevant scripts while the one or more relevant scripts are being interpreted, the external function call directed to a document object model of the document; providing a generic response, independent of the document object model, to the external function call; requesting a browser to construct the document object model if the generic response did not enable further operation of the relevant scripts; and providing a specific response, obtained with reference to the constructed document object model, to the external function call if the browser was requested to construct the document object model."(emphasis mine) Curious, indeed.

Furthermore, Yahoo filed a patent on February 22, 2005 entitled "Techniques for crawling dynamic web content" which says "The software system architecture in which embodiments of the invention are implemented may vary. FIG 1 is one example of an architecture in which plug-in modules are integrated with a conventional web crawler and a browser engine which, in one implementation, functions like a conventional web browser without a user interface (also referred to as a "headless browser")." Ladies and gentlemen, I believe they call that a "smoking gun." The patent then goes on to discuss automatic and custom form filling and methods for handling JavaScript.

Super Crawling Pac-Man

Search Engine crawlers are indeed like Pacman, but not the floating mouth without a face that my parents jerked across the screens of arcades and bars in the mid-’80s. Googlebot and Bingbot are actually more like the ray-traced Pacman with eyes, nose and appendages that we’ve continued to ignore on console systems since the ’90s. This Pacman can punch, kick, jump and navigate the web with lightning speed in four dimensions (the fourth is time; see the freshness update). That is to say, Search Engine crawlers can render pages as we see them in our own web browsers, and they have achieved a level of programmatic understanding that allows them to emulate a user.

Have you ever read the EULA for Chrome? Yeah, me neither, but as with most Google products, it asks you to opt in to a program in which your usage data is sent back to Google. I would surmise that this usage data is not just used to inform the ranking algorithm (slightly) but also serves as a means to train Googlebot’s machine learning algorithms to fill in certain form fields. For example, Google can use user form inputs to figure out what type of data goes into which field and then programmatically fill forms with generic data of that type. If 500 users put an age into a form field named “age,” Google has a valid data set that tells it to input an age. Therefore Pacman no longer runs into doors and walls; he has keys and can scale the face of buildings.
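
As a purely hypothetical sketch of that idea (the field names, values and mapping here are invented for illustration, not taken from any Google system), learned field types could drive generic form filling like this:

    // Hypothetical sketch, meant to run in a page context: once enough
    // users have typed the same kind of value into fields with a given
    // name, a crawler could fill those fields with generic data.
    var learnedFieldTypes = {
        age:  function () { return '30'; },
        zip:  function () { return '90210'; },
        name: function () { return 'John Smith'; }
    };

    function fillForm(form) {
        Array.prototype.forEach.call(form.elements, function (input) {
            var guess = learnedFieldTypes[input.name];
            if (guess) { input.value = guess(); }
        });
    }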

Evidence

  • Instant Previews – This is why you’re seeing annotated screenshots in Instant Previews of the SERPs. The instant previews are in fact an impressive feat in that they not only take a screenshot of a page but they also visually highlight and extract text pertinent to your search query. This simply cannot be accomplished with a text-based crawler. 
     
  • Flash Screenshots – You may have also noticed screenshots of Flash sites in Google Webmaster Tools. Wait, I thought Google couldn’t see Flash?
     
  • AJAX POST Requests Confirmed – Matt Cutts also confirmed that GoogleBot can in fact handle AJAX POST requests, coincidentally a matter of hours after the “Googlebot Is Chrome” article was tweeted by Rand, made its way to the front of HackerNews and brought my site down. By definition, AJAX is content loaded by JavaScript when an action takes place after a page loads. Therefore it cannot be crawled by a text-based crawler, because a text-based crawler does not execute JavaScript; it only pulls down the existing code as rendered at the initial load.
     
  • Google Crawling Flash – Mat Clayton also showed me some server logs where GoogleBot has been accessing URLs that are reachable only via links embedded in Flash modules on Mixcloud.com:

     66.249.71.130 "13/Nov/2011:11:55:41 +0000" "GET /config/?w=300&h=300&js=1&embed_type=widget_standard&feed=http%3A//www.mixcloud.com/chrisreadsubstance/bbe-mixtape-competition-2010.json&tk=TlVMTA HTTP/1.1" 200 695 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

     66.249.71.116 "13/Nov/2011:11:51:14 +0000" "GET /config/?w=300&h=300&js=1&feed=http%3A//www.mixcloud.com/ZiMoN/electro-house-mix-16.json&embed_type=widget_standard&tk=TlVMTA HTTP/1.1" 200 694 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

    Granted, this is not new, but another post from 2008 explains that Google "explores Flash files in the same way that a person would, by clicking buttons, entering input, and so on." Oh, you mean like a person would with a browser?
     

  • Site Speed – Although Google could potentially get site load times from toolbars and usage data from Chrome, it’s far more dependable to get them by crawling the web themselves. Without actually executing all the code on a page, there is no realistic way to calculate page load time accurately.

 

So far this might sound like Googlebot is only a few steps from SkyNet, and after years of SEOs and Google telling us the search crawler is text-based, it might sound like science fiction to you. I assure you that it’s not, and that a lot of the things I’m talking about can easily be accomplished by programmers far short of the elite engineering team at Google.

Meet PhantomJS

PhantomJS is a headless WebKit browser that can be controlled via a JavaScript API. With a little bit of script automation, a browser can easily be turned into a web crawler. Ironically, the logo is a ghost similar to the ones in Pacman, and the concept is quite simple, really: PhantomJS loads a webpage as a user sees it in Firefox, Chrome or Safari, extracts features and follows the links. PhantomJS has infinite applications for scraping and otherwise analyzing sites, and I encourage the SEO community to embrace it as we move forward.
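
To give you a feel for how little code this takes, here is a minimal sketch (the URL is a placeholder): it loads a page, executes its JavaScript, lists every link in the rendered DOM, and saves a screenshot, the very things a text-based crawler cannot do.

    // Minimal PhantomJS sketch. Run with: phantomjs crawl.js
    var page = require('webpage').create();
    var url = 'http://example.com/'; // placeholder URL

    page.open(url, function (status) {
        if (status !== 'success') {
            console.log('Failed to load ' + url);
            phantom.exit(1);
        }

        // page.evaluate() runs inside the rendered page, after its
        // JavaScript has executed, so AJAX-injected links show up too.
        var links = page.evaluate(function () {
            return Array.prototype.map.call(
                document.querySelectorAll('a[href]'),
                function (a) { return a.href; }
            );
        });
        console.log(links.join('\n'));

        page.render('screenshot.png'); // impossible for a text-based crawler
        phantom.exit();
    });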

Josh has used PhantomJS to prepare some proof of concepts that I shared at SearchLove.

I mentioned when I released GoFish that I’d had trouble scraping the breakout terms from Google Insights using a text-based crawler, due to the fact that they’re rendered using AJAX. Richard Baxter suggested that they were easily scrapable using an XPath string, which leads me to believe that the ImportXML crawling architecture in Google Docs is based on a headless browser as well.

In any event, here Josh pulls the breakout terms from the page using PhantomJS:

Creating screenshots with a text-based crawler is impossible, but with a headless WebKit browser it’s a piece of cake. Here’s an example Josh prepared to show screenshots being created programmatically using PhantomJS.

Chromium is Google’s open source fork of the WebKit browser, and I seriously doubt that Google’s motives for building a browser were altruistic. The aforementioned research suggests that GoogleBot is a multi-threaded headless browser based on that same code.

Why Don't They Tell Us?


Well, actually, they do, but they say the "instant preview crawler" is a completely separate entity. Think of the instant preview crawler as Ms. Pacman.

A poster on Webmaster Central complained that they were seeing "Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/9.0.597 Safari/534.14" rather than "Mozilla/5.0 (en-us) AppleWebKit/525.13 (KHTML, like Gecko; Google Web Preview) Version/3.1 Safari/525.13" as the Google Web Preview user agent in their logs.

John Mu reveals "We use the Chrome-type user-agent for the Instant Previews testing tool, so that we're able to compare what a browser (using that user-agent) would see with what we see through Googlebot accesses for the cached preview image."

While the headless browser and the Googlebot we know may be semantically separate, I believe they always crawl in parallel and inform indexation and, ultimately, rankings. In other words, it's like a two-player simultaneous version of Pacman, with a 3D Ms. Pacman and a regular Pacman playing the same levels at the same time. After all, it wouldn't make sense for the crawlers to crawl the whole web twice independently.

So why aren't they more transparent about these capabilities as they pertain to rankings? Two words: search quality. As long as Search Engines can hide behind the deficiencies of a text-based crawler, they can continue to use it as a scapegoat for their inability to serve up the best results. They can continue to move toward things such as the speculated AuthorRank and lean on SEOs to literally optimize their Search Engines. They can continue to say vague things like “don’t chase the algorithm,” “improve your user experience” and “we’re weighing things above the fold” that force SEOs to scramble and make Google’s job easier.

Google’s primary product (and only product, if you’re talking to Eric Schmidt in court) is Search, and if it were publicly revealed that their capabilities are far beyond what they advertise, they would be held responsible for a higher level of search quality, if not for the indexation of supposedly impossible rich media like Flash.

In short, they don’t tell us because with great power comes great responsibility.

What Does That Mean For Us?

A lot of people have asked me, as Josh and I led up to unveiling this research, “What is the actionable insight?” and “How does it change what I do as far as SEO?” There are really three things as far as I’m concerned:

  1. You're Not Hiding Anything with JavaScript – Any content you thought you were hiding with post-load JavaScript: stop it. Bait-and-switching is now 100% ineffective. Pacman sees all.
     
  2. User Experience is Incredibly Important – Google can literally see your site now! As Matt Cutts said, they are looking at what's above the fold, and therefore they can consider how many ads are rendered on the page in determining rankings. Google can leverage usage data in concert with the design of the site as a proxy to determine how useful a site is to people. That's both exciting and terrifying, but it also means every SEO needs to pick up a copy of "Don't Make Me Think" if they haven't already.
     
  3. SEO Tools Must Get Smarter – Most SEO tools are built on text-based scrapers, and while many are quite sophisticated (SEOmoz is clearly leading the pack right now), they are still very much the ’80s Pacman. If we are to understand what Google is truly considering when ranking pages, we must include more aspects in our own analyses. For example:
  • When discussing things such as Page Authority and the likelihood of spam, we should be visually examining pages programmatically rather than limiting ourselves to metrics like keyword density and the link graph. In other words, we need a UX Quality Score that is influenced by visual analysis and potentially spammy transformations.
     
  • We should be comparing how much the rendered page differs from what would otherwise be expected from the code. We could call this a Delta Score (see the sketch after this list).
     
  • When measuring the distribution of link equity from a page, dynamic transformations must also be taken into account, as Search Engines are able to understand how many links are truly on a page. This could also be included within the Delta Score.
     
  • On another note, Natural Language Processing should also be included in our analyses, as it is presumably a large part of what makes Google’s algo tick. This is not so much for scoring as for identifying the key concepts that a machine will associate with a given piece of content, and for truly understanding what a link is worth in the context of what you are trying to rank for. In other words, we need contextual analysis of the link graph.
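
A crude Delta Score is already within reach of the PhantomJS approach above. Here is a hypothetical sketch that compares the link count in the raw HTML source with the link count in the rendered DOM; the URL is a placeholder and the regex is deliberately naive:

    // Hypothetical Delta Score sketch: how many links exist only after
    // rendering? The raw source is re-fetched with a synchronous XHR
    // inside the page; the rendered count comes from the live DOM.
    var page = require('webpage').create();
    var url = 'http://example.com/'; // placeholder URL

    page.open(url, function (status) {
        var counts = page.evaluate(function () {
            var rendered = document.querySelectorAll('a[href]').length;

            var xhr = new XMLHttpRequest();
            xhr.open('GET', window.location.href, false); // same-origin
            xhr.send(null);
            var raw = (xhr.responseText.match(/<a\s[^>]*href=/gi) || []).length;

            return { raw: raw, rendered: rendered };
        });

        console.log('raw: ' + counts.raw + ', rendered: ' + counts.rendered +
                    ', delta: ' + (counts.rendered - counts.raw));
        phantom.exit();
    });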

There are two things I will agree with Matt Cutts on: the only constant is change, and we must stop chasing the algorithm. However, we must also realize that Google will continue to feed us misinformation about its capabilities, or dangle just enough to make us jump to conclusions and hold on to them. Therefore we must also hold them accountable for their technology. Simply put, if they can definitively prove they are not doing any of this stuff, then at this point they should be; after all, these are some of the most talented engineers in the universe.

Google continues to make Search Marketing more challenging and to revoke the data that allows us to build better user experiences, but the simple fact is that our relationship is symbiotic. Search Engines need SEOs and webmasters to make the web faster and easier for them to understand, and we need Search Engines to react to and reward quality content by making it visible. The issue is that Google holds all the cards, and I’m happy to have done my part to pull one.

Your move Matt.


4 Simple Steps to an Optimized Twitter Presence

Many of you probably joined Twitter to market a business. While marketing may be your primary focus, Twitter also emphasizes an element of personal branding you shouldn’t neglect. Have you optimized both your business and personal Twitter presence to enable people to learn more about you and your business?

If you haven’t, keep reading. David Meerman Scott shares a set of elements you should optimize within your Twitter profile. Let’s look at how to better optimize the four major components of a Twitter profile:

1. Your Twitter Background 

The first component you can optimize is your Twitter background. As a Twitter user, you have the opportunity to upload a custom image or pick one of Twitter’s suggested templates. Don’t use the default. David’s Twitter background, for instance, is an image of an antique typewriter. “It’s like my personality,” says David.

A customized Twitter background is great for conveying something about you or your brand’s personality. What’s more, it makes you distinctive, helping you stand out from the crowd of other Twitter users. Not sure how to go about creating a custom Twitter background? Check out our video tutorial.

2. Your Photograph

The second element of a Twitter profile that you will need to optimize is your avatar. Again, don’t use the default “egg” image. That won’t help differentiate you or your business from the rest.

Many people use photos that don’t help Twitter users recognize their identity. There is either too much going on in the photo, or it has been taken from too far away. These types of images might be great at conveying your personality, but they aren’t necessarily optimized for branding. Instead, for personal profiles, you should consider using a headshot that clearly shows your face so you can be easily recognized in the Twittersphere. For business accounts, use an image that portrays your company logo or brand.

3. Your Twitter Bio

Your bio is the third thing you should optimize on your Twitter profile. It’s easy to just put a laundry list of stuff in there to define you or your brand, says David. But why not come up with a full sentence that describes you or your business? Also, make sure you include a link to your website or blog, where visitors can go to learn more about who you are and what you do. 

4. Your Tweets

Don’t forget to also optimize each of your tweets. Always share valuable content and use action-oriented language. As we have discussed previously, verbs are the part of speech that generates the most shares on Twitter. Post regularly — even over the weekend. We have found that Saturdays and Sundays perform well in terms of engaging people through tweets.

Make sure your updates also include links to landing pages, a technique that will enable you to generate leads from Twitter. In this way, your social media efforts will directly impact lead generation.

What are some practices that you have leveraged to optimize your Twitter presence, both for personal and business use?


Building The Implicit Social Graph

Posted by Justin Briggs

Google Plus is Google's latest attempt at building an explicit social graph that it controls, but Google has been building out an implicit social graph for quite some time. This graph is still relatively naive compared to the maturity of the link graph, but search engines continue to develop it. Since it is already directly influencing rankings, and its value will increase, it’s important to understand how this type of social graph is being built. In this post, I’ll look at some of the methods for building the social graph, as well as at explicit vs. implicit social graphs.

You can be certain Google is building an implicit social graph:
 
“we studied the implicit social graph, a social network that is constructed by the interactions between users and their groups. We proposed an interaction-based metric for computing the relative importance of the contacts and groups in a user's egocentric network, that takes into account the recency, frequency, and direction of interactions” – Google
 

Building the Implicit Graph

This graph can be built by looking at Google’s link graph.
 
social graph via link graph
 
By looking at links between profiles, and reinforcing relationships based on content analysis (username, bio, etc.), search engines can confirm ownership, or at least believe with a high degree of certainty that all of these web properties are in fact owned by one person.
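
To make that concrete, here is a hypothetical scoring sketch; the weights and threshold are invented for illustration and are certainly not Google's actual model:

    // Hypothetical sketch: score the likelihood that two crawled profiles
    // belong to the same person, from reciprocal links plus matching fields.
    function sameOwnerScore(a, b) {
        var score = 0;
        if (a.linksTo.indexOf(b.url) !== -1) { score += 0.4; } // a links to b
        if (b.linksTo.indexOf(a.url) !== -1) { score += 0.4; } // b links back
        if (a.username === b.username)       { score += 0.1; }
        if (a.bio && a.bio === b.bio)        { score += 0.1; }
        return score; // treat, say, >= 0.8 as "same owner"
    }

    var twitter = { url: 'http://twitter.com/example', username: 'example',
                    bio: 'I build things.', linksTo: ['http://example.com/'] };
    var blog    = { url: 'http://example.com/', username: 'example',
                    bio: 'I build things.', linksTo: ['http://twitter.com/example'] };

    console.log(sameOwnerScore(twitter, blog)); // 1.0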
 
The implicit graph can grow from one seed explicit relationship.
 
building the implicit social relationship
 
In this example, accounts A and B have defined an explicit relationship via reciprocal following on Twitter. The degree of this relationship can be gauged by interactions, but more on that in a moment. However, A and B have not extended this relationship across all networks, such as Facebook (or maybe the relationship there is not crawlable).
 
Let’s say for example I’m user B and user A is Hello Kitty. Hello Kitty shares a link publicly on Facebook, and then later I perform the following search.
 
hello kitty in search results
 
The explicit relationship on one social network can be used to evaluate URLs based on behavior on a different social network where I have not explicitly defined that relationship. This raises all sorts of questions about privacy, and Google will tread lightly here, as you don’t want Google displaying known relationships that you haven’t made public. However, displaying and knowing are independent: they might know your relationships even if they never expose them to you.
 
In the Google paper “Suggesting (More) Friends Using the Implicit Social Graph” they clearly make a distinction here:
 
“we draw a sharp distinction between each user's egocentric network and the global or sociocentric network that is formed by combining the networks of all users. […] By showing users suggestions based only on their local data, we are able to protect user privacy and avoid exposing connections between the user's contacts that might not otherwise have been known to him”
 

Interaction Rank

Relationships can be further analyzed by computing Interaction Rank, which measures the degree of relationship between two users.
 
Interaction Rank: A metric computed by looking at the number of exchanges between users, weighting each interaction as a function of recency. The interaction weight decays exponentially over time. It also looks at the relative importance of ongoing interactions.
 
Note: In the paper, Interaction Rank is defined in terms of building an implicit social network on top of email interactions, which is a data set Google has a lot of access to, but could be applied to non-email social graphs.
 
Google may use three criteria to measure edge weights. In graph theory, edges are the connections between nodes (the blue lines in the image above).
 
1. Frequency: Users/groups that interact frequently are more important than those with infrequent interactions.
2. Recency: The change in interactions over time. Recent interactions should carry more weight than interactions in the past.
3. Direction: Interactions a user initiates are more significant than those a user does not initiate.
 
Criteria like direction, for example, can help determine spam relationships. Spam accounts send out more interactions than they receive.
 
One obvious shortcoming of the model is that Interaction Rank is higher for active social media users than for less active users. However, since Interaction Rank is used to sort relationships relative to one egocentric view of the graph, and not across a global graph, it can function as a metric for sorting the relative importance of relationships with regard to the central node/user.
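 
Here is a hypothetical sketch of an Interaction Rank-style edge weight that folds in all three criteria. The half-life and direction weights are invented for illustration; the paper defines the idea, not these constants:

    // Hypothetical Interaction Rank-style edge weight: every interaction
    // decays exponentially with age, and interactions the user initiated
    // count more than ones they merely received.
    var HALF_LIFE_DAYS = 30; // invented constant

    function edgeWeight(interactions, today) {
        return interactions.reduce(function (score, i) {
            var ageDays = today - i.day;
            var decay = Math.pow(0.5, ageDays / HALF_LIFE_DAYS);
            var direction = i.userInitiated ? 1.0 : 0.5; // invented weights
            return score + direction * decay;
        }, 0);
    }

    // Two fresh outgoing interactions outweigh two stale incoming ones,
    // capturing frequency, recency and direction in a single number.
    console.log(edgeWeight([
        { day: 98, userInitiated: true },
        { day: 99, userInitiated: true },
        { day: 10, userInitiated: false },
        { day: 12, userInitiated: false }
    ], 100));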
 

They’ve Been Doing This for a Long Time

Google is getting a lot more attention regarding social lately, but it has been doing this for quite some time. Google launched the Social Graph API back in February 2008, an API that taps into one form of an explicit graph based on XFN and FOAF. This tool has been tracking reciprocal Twitter relationships, and many other things, for years.
 
Rand's twitter relationships
 
Some of the social network building they’ve done can be seen via the social graph API.
 
Rand's social networks via social graph API
 
Rand gave examples back in July of this type of deep dive crawl.
 
Social circle in Google
 
They crawl from this seed set of explicit opt-ins to build out a wider set of related connections.
 
Implicit Social Circle in Google
 
In the example above, Google is crawling multiple hops away from a seed node to build out an implicit social graph. A relationship between Rand and Andrew can be defined, and that relationship analysis can be carried over to networks where the relationship isn’t explicitly defined. The Interaction Rank between Rand and Andrew on Twitter can set the degree to which Google pulls signals from these implicit connections.
 

And Here Comes Google Circles

This all changes with Google Plus. One of the limitations of building an implicit social graph is that you don’t have data to test against to confirm the predictions and relationships the graph discovers. It still has to depend on the data made public, and it is limited by relationships that are held private (aka Facebook). Google Plus, among other things, creates a massive set of explicit social graph data, which can be used for machine learning and accuracy checking.
 
It’s easy to imagine that Google will use the implicit social graph to predict relationships, with relative degrees of certainty about the nature and importance of each relationship. Now Google Plus data can be pushed into the algo, in the same way human reviews could be pushed into Panda. And not only that: the relationships are bucketed into contextual groups using Circles. The implications of this are huge.
 
However, an explicit social network will not replace the implicit network.
 
From the same Google paper:
 
“One survey of mobile phone users in Europe showed that only 16% of users have created custom contact groups on their mobile phones. In our user studies, users explain that group-creation is time consuming and tedious. Additionally, groups change dynamically, with new individuals being added to multi-party communication threads and others being removed. Static, custom-created groups can quickly become stale, and lose their utility”
 
They go on to say:
 
“Our algorithm is inspired by the observation that, although users are reluctant to expend the effort to create explicit contact groups, they nonetheless implicitly cluster their contacts into groups via their interactions with them”
 
This clearly shows at least some of the shortcomings of the explicit social graph.
 
Pros and Cons of Implicit and Explicit Social Graphs
 
Even with publicly and privately available explicit social data, there is still a strong incentive to build out the implicit graph. The explicit graph can be used to make improvements to the implicit one, and the implicit graph is one area where Google has a significant advantage over Facebook.
 
It’s no secret that the social graph appears to be the next evolution, with increasing use of social factors, social elements in search, and mechanisms that will lead to AgentRank/AuthorRank, which will tie directly into the implicit social graph.
 
p.s. Some great additional reading on this topic: Are You Trusted by Google? via SEO By the Sea's Bill Slawski.


7 Reasons You’re Not Generating Leads From Social Media

This is a guest post written by Pam Sahota. Pam is a marketing communications/social media manager and freelance blogger who loves Boston, photography, charity events, sushi, wine, and the Red Sox.

Social media is a great inbound marketing tool that allows businesses and marketing teams to interact with prospects, cater to customers, promote their content, and yes, generate leads. When a business uses social media right, prospective customers get access to great content and information via a platform they already populate and where they actually want to gather that content and information.

Additionally, when prospects do “bite,” many of them are willing to provide their contact information, click to obtain more valuable content, and then come back for more, beautifully illustrating the concept of effective use of social media for lead generation. And a good chunk of B2B marketers are on top of this: according to BtoB Magazine, 48.9% of B2B marketers who use social media say they use it for lead generation, making lead generation one of the top applications of social media. Unfortunately, some brands don’t realize there are real tactics for using social media effectively for lead gen, and they approach their social media presence blindly.

To make sure your business is appropriately using social media to boost its lead gen efforts, check out the following list and make sure you’re not committing any of these rookie mistakes.

7 Ineffective Ways to Generate Leads From Social Media

1. Not being where your target customers are. Maintaining a presence on just any social media network isn’t what matters; you have to be where your potential and current customers actually are. If you are posting content and updates blindly to Twitter, but members of your target market aren’t present there, what’s the point? The first step in effective use of social media for lead generation is to research and determine which social media sites your target audience is active on regularly. That way, when you do share content and information, you know you’re working to build awareness for your blog, product, service, and the other types of content you offer. Awareness is a key preliminary stepping stone for lead generation, since prospects likely go through a period of learning more about your business before deciding whether or not to research your company further.

2. Not providing valuable content. If you’re just pushing out content about your product and why it’s so “awesome,” more than likely, people will not want to share or engage with it. If someone is following your brand on Facebook, it’s probably to see what valuable content and offers you can provide them. Rather than product-focused content, focus on content rich with tips and tricks that can help relieve your target customers’ pain points. When you target the content you’re offering to the different marketing personas you have defined for your business, your prospects will be much more likely to engage with your brand and, therefore, more likely to complete a lead-capture form for a piece of your content. In short, providing targeted, useful content will help you generate more qualified leads who may be genuinely interested in what you have to offer.

3. Not using calls-to-action or sharing targeted links to landing pages. Arguably the most effective use of social media for lead gen is the ability to share links to your content, blog, and offers. Don’t just say you have a great blog or that your fans should check out your awesome new ebook; link to it, and use an enticing call-to-action to do so. You’d be surprised how commonly businesses neglect to do this. Furthermore, share targeted links. Don’t talk about how your followers should register for your upcoming webinar and then link to your website’s homepage, which has no mention of the webinar. Instead, link to a targeted landing page where visitors can register. Even better, target specific content to different platforms. Create special offers for Twitter followers that are different from your offers to your Facebook fans.

4. Not leveraging social media real estate. When people visit your pages on social sites, they probably want to learn more about your brand and its offerings. On Facebook, you can provide as many links as you would like in the Info section. On Twitter, you can use the short bio to share a link that is integral to your service. Same with LinkedIn, Google+, and other platforms you may be a part of. Use that real estate wisely; it’s there for a reason. While social media is a great platform to help you generate leads, you still want to create a connection between the educational content your prospects are downloading and the recognition that your business does more than give out free stuff. Make sure you’re effectively leveraging the real estate of your social media accounts to create that brand and product awareness, too.

5. Not integrating email and social media. Email marketing and social media are great friends, not enemies. They work together well, help each other out to promote content, and share one another’s information on a regular and consistent basis. You should promote your presence on social media sites through buttons on each email as well as share links to email opt-in forms for social followers to sign up to be a part of your email database. When you combine the power of your lead generation tools, you’ll create a more strategic effort and a better chance of reaching and nurturing your potential customers through multiple fronts, allowing them to choose how they engage with you, consume information, and decide if they would like to move further down the sales cycle.

6. Not displaying highly visible social share and follow buttons. So you have your website and blog and all this great content, but can people easily share it? Be sure to place easy-to-see and -use share and follow buttons on all your content in order to increase its reach. The more your fans share your content with their networks, the more potential new leads will see it!

7. Not analyzing the effectiveness of your social media efforts. You may be on the latest and greatest social platforms, sharing awesome content, listening to and engaging with your potential audience, and collecting that valuable lead information, but how do you determine whether it’s working as well as you want it to? You should be regularly analyzing how much traffic and how many leads you’re generating from each social platform you participate in, as well as how valuable that traffic is. This will allow your team to evaluate its efforts and make adjustments if needed. For instance, you may want to spend more time engaging the community that tends to convert into more qualified leads. Or perhaps you’ve discovered that Facebook fans prefer different types of offers than Twitter followers. Use this data to perfect your future lead gen efforts in social media and do more of what works and less of what doesn’t.

How effectively are you using social media for lead generation?

Image Credit: Widjaya Ivan


Using Emails to Build Links – Whiteboard Friday

Posted by Kenny Martin

It's not so common to think about email marketing as a potential link building opportunity, but it's actually a wonderful tactic that you can use. Leveraging those finely crafted email lists with an SEO strategy can be highly advantageous.

In this week’s Whiteboard Friday, Rand shows off some useful and creative tips on how to utilize an email marketing strategy that will help you build links.

Video Transcription

Howdy, SEOmoz fans. Welcome to another edition of Whiteboard Friday. This is actually Thanksgiving Friday in the United States, so I hope you had a wonderful turkey day with your families. And for those of you in other countries, turkey is not the most delicious of birds, but we enjoy it. It's good. We make a little cranberry sauce. We've got a little yams and some sweet potatoes. It's great. It's lots of fun. And, of course, there's football, which is my favorite part.

All right. So in this edition of Whiteboard Friday, since it is indeed the giving of thanks, we are talking about getting thanks, in the form of links, from the users you're sending great email content to.

Email marketing, email list building is actually a phenomenal way to get links. It's not something that many people in the SEO world think about. We've got a bunch of different strategies.

The first thing I want to talk about is building an email list itself. There are tons and tons of tips on this out there on the Web, and I don't want to pretend that I'm an expert. But what I will say is that having a subscription to an email list on a website, particularly if you are a content-based site or an ecommerce-based site, is absolutely huge. Even for those of you doing B2B direct marketing or doing affiliate sales of some kind, email list building is a wonderful way to capture consumers and potential customers, bring them into your ecosystem. Those email addresses are incredibly valuable if you can build up a good relationship over time.

A few recommendations. You've often got something on the side of your website that lets people subscribe with email. If you blow it up, it looks something like this. You've got your name, you've got your email, you have a subscribe button, and that's great. What I would really recommend is to ask for very, very little in these boxes. If you have a subscription that pops over, please ask for as little as possible. But do me one favor – ask for the name.

The reason the name is so important is because in email marketing and list building, as email marketers know, getting the open rate up is critical. Getting people to click on that email, open it up and click through. Having their name means that you can do much more with personalization of those emails. Not having their name means it's very frustrating. It's hard to write that first intro sentence or paragraph, whatever, if you don't have their name, and it's often hard to get them to click through as well. I'm sure all of you get email spam like this that says, "Hi, blogger from so- and-so." "Hi, dear Rand@SEOmoz." I'm sort of like, "Yeah, you have no idea who I am and you couldn't care less. You're just trying to get me to take some action." But if it says, "Hey there, Rand" or, "Rand, we've got something you might like," that is much more likely to get an open. It can be customized, etc.

Make sure that you have something of great value that you are delivering over email, and then make sure that you're not just promising it, but you are actually delivering on that promise.

Indicate the frequency that you are going to have. So in here, I might say something like "once per week." So you will get a weekly email, or you'll get a daily email, or you'll get a monthly email. Don't be coy about how often you are going to send it. And try not to be too out-of-cycle with those emails. It's the kind of thing where I get a weekly email and then suddenly I get two in two days, and I think, "It's time to unsubscribe from this list." Try not to do that.

Also, watch and manage your inactives very, very carefully. If you see people who consistently have never clicked, never opened, never taken any action, you might actually want to remove them from your email list. I know that sounds crazy, because you're thinking, "Wait, but I want a bigger email list. I want to grow it over time." I know. I want that too. But the problem is that email management services, MailChimp for example, or Bronto, which is what we use here at SEOmoz, they monitor very closely that usage rate. A lot of those people who aren't taking any action but haven't unsubscribed are reporting you for spam. They're clicking that spam button in Gmail. They're clicking the report spam button in Hotmail. They're clicking the spam button in Outlook. That's a huge problem, because the percentage of people who report you for spam is a metric that those providers use to determine deliverability rates. They might actually kick you off their service. You are going to have worse deliverability problems over the long run. So try and weed those people out. Anyone who you think might not be engaged or active or interested anymore, you've changed the focus of your business or of your email list, get them out of that funnel so you are not clouding up and murky-ing the waters. Spam is a big, big problem in email deliverability, and you don't want to end up in that group.

And finally, last but not least of the tips here, A/B and multivariate test how this piece performs. You want to be trying different things. A different headline, a different way of capturing it, different form placement, using the overlay, using something where they only see when they scroll to the bottom, whatever it is, so that you can get the maximum percent of people who are visiting your pages taking that email action, particularly if email is a big way that you drive your business.

Now, you've done all these things. You've built an extraordinary list. I'm very proud of you. Your marketing team loves you. Now you're thinking, "I want to leverage this for some SEO. I want some links. Give me some links, baby!" I'm going to give you some links.

There are some link building tactics that you can use that are going to drive value back to your site, maybe some of them direct links, good anchor text pointing to the right pages, some of them brand links that are just pointing to your home page, some of them random distribution, and some of them, of course, are going to be social shares, which might not be counted as links, might have some impact on rankings. We're not really sure. Probably as a second order effect at worst.

I'll talk about the first one here. Share embeddable content. You're very aware of the power of things like badges and infographics or tools, what have you, stuff that can sit on someone else's site and point back to you. Share that stuff over your email list. If you have a great badge for people and you want to say, "Hey, you've contributed a design to our site," "You've been a member here for a year now," "You've filled out your profile completely," "You've bought three things from us," "Here's a way to say that you like our brand. Here's something to encourage you." And the people who are passionate about your brand and about your community they're going to embed those things on their sites. Wonderful. Just great for your SEO. And, of course, you get to control the anchor text and where that points. Another great thing.

This sounds a little complicated, but it is totally brilliant. You've got this big list, and the list looks something like, here's email@domain.com. This domain.com is an absolutely incredible piece of information that so many people under-appreciate. Domain.com is often Gmail. It's often Hotmail. Scrub those. You don't care about those. What you care about are all the rest of those domains. Those are all websites where people from those sites, particularly if you're in the B2B field or serving B2B type customers, where these people own those sites, are marketers on those sites, are involved with those sites somehow, and you can reach out to them by filtering the domain names you care about, using something like the Linkscape API or the Majestic SEO API if you want to get fancy, and seeing, "Hey, do these sites already link to me?" and then ordering by, "Oh, you know what? I'm going to take these domain names and in Excel I'm going to order by domain authority, and I'm going to grab the ones that are highest domain authority that aren't linking to me, where I think I've got a chance of outreach." And by the way, I can do that outreach directly because I know somebody there. Somebody there has subscribed to my email list, so they care about my business. That makes that outreach, those business development possibilities much, much more accessible. For those of you who are looking for where should I do email outreach, where is an easy target, it's this. Come on, you can't get any easier. It's wonderful.
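
If you want to try this yourself, the scrubbing step is only a few lines of code. Here's a hypothetical sketch (the freemail list is abbreviated; ordering by domain authority would then happen in Excel or via the Linkscape API, as Rand describes):

    // Hypothetical sketch: pull the domain out of each subscriber address,
    // drop the freemail providers, and keep a deduplicated outreach list.
    var FREEMAIL = { 'gmail.com': true, 'hotmail.com': true, 'yahoo.com': true };

    function outreachDomains(emails) {
        var seen = {};
        return emails
            .map(function (email) {
                return (email.split('@')[1] || '').toLowerCase();
            })
            .filter(function (domain) {
                if (!domain || FREEMAIL[domain] || seen[domain]) { return false; }
                seen[domain] = true;
                return true;
            });
    }

    console.log(outreachDomains([
        'ann@gmail.com', 'bob@widgetco.com',
        'carl@widgetco.com', 'dana@example.org'
    ])); // ['widgetco.com', 'example.org']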

Number three. Encourage your users through email, particularly if you have something like a profile that they are creating on the site or a user page, encourage them to fill those out completely. The reason that I love filling them out completely is because when people invest effort in them, they will often link to them. If you provide any value back on those profile pages to the people who are creating stuff on there, whether it's, "This is your design portfolio," "Here's your Amazon Wish List," "Here are the things that you've customized on the site so far," whatever it is that you've done, you want those users to fill out the profiles because they will have a strong potential to link to their profiles, and once they do, you get SEO value from that "rising tide lifting all ships" phenomenon.

The next one; consider sending some individual emails to the users who get activity and engagement on your site. The simplest form of this is blog comments. Someone subscribed to your email list, they accept email privileges when they register with your site, someone replied to their comment. Someone mentioned them somewhere. They received an action on their page. 100 people visited their profile page. 100 people checked out a product they customized. 100 people looked at their wish list. Imagine if you were ThinkGeek and you get, "Hey, someone looked at your ThinkGeek wish list today." Just a little friendly notification. This is a way to bring people back to their site and for them to think, "Oh, yeah. I wish more people did this. I wish more people engaged with me on the site. I'm going to re-engage." Again, not necessarily leading to direct links, but in some cases it will.

Just three more good ideas for you here. When you capture the email address, if possible in this box here, if you don't ask for location necessarily, you might later in a profile setup or completion step, you might get it through a credit card, but you can also get it through their IP address. When you capture an email address, capture that IP and Geotag it so that you know where those users are. The reason is when I go and visit Kansas City, I can say, "Hey, SEOmozers who are in Kansas City," shoot them an email, let's tell them I'm going to do a MozCation there. We're going to do an event there. I'm going to be speaking at an event there. It's a fantastic way to bring people from your community who already care about you back into the fold. Events are just a great way to earn natural links back to yourself, because you build relationships, people see you, and they just naturally link to you. You are engaging with them, you are contributing. Even if they can't make it to the event, sometimes you are going to get a link by them sharing it and saying, "Hey, by the way, so- and-so is coming to this." They'll put it on their blog. They'll link to it on a forum. They'll put it on their About page, whatever it is. They tweet about it. Great. Just a terrific way to interact and engage.

The second thing. This might be my favorite one on here. When you do this, when you go and you filter and you grab the domain name of all the people that you care about, and you've got that ordered, then go find those people's Twitter accounts, their blogs, their websites, and go engage with them socially. I promise you, if you are naturally, positively engaging on Twitter, in blog comments, on their Facebook Brand page, on their Google+ page, whatever it is, they are going to figure out who you are and remember you. They've already signed up for your email list, so they have a positive association with you and like what you do. You are going to earn a link sooner or later. It might be a month, might be six months, might be a year, but you are going to get that link through engagement. This is a wonderful way to just build your presence in raw inbound marketing, never mind just pure SEO.

Finally, Aaron Wheeler from the Moz team had this great idea to insert in your RSS feed, especially if you're running an RSS feed that's not powered by advertising but is content-driven, insert ads. You know how you've got an RSS feed and it looks something like this. Here's a piece of content, and then sometimes there will be a little block of advertising if it's a paid RSS feed and they want to sponsor it. Instead of advertising for a third party, encourage your own advertising. Encourage sharing of content. If you have a special piece of content in the RSS feed, you might say, "Hey, we'd really appreciate your help spreading this out over the Web. We're doing a subscription drive with a nonprofit. We're having an event somewhere. We are promoting a new service that we have. We have this new infographic that collects a bunch of data that we think a lot of people will love. Help us share it." Don't do it on every post or it will be ignored. Do it on every 7 posts, every 20 posts. Then you'll get attention and intrigue, because people who are subscribing to RSS via their email and getting those posts in email will see that little ad block and go, "Oh, maybe I should share that. That seems useful and interesting."

So you can see the wonderful power of collecting email addresses, building a great list, obviously for email marketing, but also getting some value back for SEO through the links that you can drive.

I hope you've enjoyed this edition of Whiteboard Friday. I hope you had a great Thanksgiving. We will see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

