Articles on Marketing

I recently finished reading Social Media Marketing by Liana Evans. It’s a great book: comprehensive, with a high-level focus for marketers and managers new to social media. I’ve recommended it to people new to the field who just don’t know where to start. At the same time, the book has enough technical depth and unique perspective to keep Web marketing and social media professionals engaged.

I recently found some time to read three books that I’ve had on my list for a while.

  1. Socialnomics: How Social Media Transforms the Way We Live and Do Business
  2. Rework
  3. Marketing in the Age of Google: Your Online Strategy IS Your Business Strategy

If I could recommend only one of the three, it would be Marketing in the Age of Google. A lot of people have really enjoyed Socialnomics, but to me it seemed better suited for social media newbies. I found Rework an entertaining quick read, but more practical for someone launching, or working in, a small tech startup. I am a big fan of the authors’ work at 37signals – both their products and their blog.

Marketing in the Age of Google offers an interesting perspective on a variety of Web marketing topics, including analytics, advertising, and search engine optimization. Author Vanessa Fox (formerly with Google) does a great job explaining complex topics thoroughly but succinctly.

Drinking from the Twitter Firehose

The Twitter Firehose is a phrase that refers to the full stream of roughly 50 million daily tweets on everything from Wall Street reform to Justin Bieber’s favorite cereal. There are hundreds of tools on the market designed to monitor and measure Twitter data by letting users easily search and track Twitter trends. One of the big problems with most of these products is that they don’t have access to the full Twitter Firehose; typically they might only have access to 10%-20% of all Twitter posts on any given day. There are a couple of reasons for this (a small code sketch after the list shows what limited, sampled access looks like).

  • The cost and technical challenge of handling so much data
  • The cost – or privilege – of accessing the Firehose
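
To make those access tiers concrete, here’s a minimal sketch of consuming Twitter’s free, heavily sampled public stream – the kind of partial feed most monitoring tools are stuck with. The endpoint URL and basic-auth scheme are assumptions based on the public streaming API as I understand it at the time of writing, so treat the details as illustrative rather than definitive.

```python
# Minimal sketch: drink from Twitter's *sampled* public stream (a small
# fraction of all tweets), not the full Firehose. Endpoint and basic-auth
# scheme are assumptions based on the current public docs.
import json
import requests  # third-party HTTP library

SAMPLE_URL = "http://stream.twitter.com/1/statuses/sample.json"

resp = requests.get(SAMPLE_URL, auth=("your_username", "your_password"), stream=True)
for line in resp.iter_lines():
    if not line:
        continue  # skip keep-alive newlines
    status = json.loads(line)
    if "text" in status:  # ignore deletion notices and other events
        print(status["text"][:80])
```

Full-Firehose partners get the same shape of data, just all of it – which is exactly why the handful of companies with that access have such an advantage.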

Who’s Tapping It?

Giants like Microsoft, Yahoo, and Google have reportedly paid hundreds of millions of dollars for access to the full Twitter data stream. Surprisingly, Twitter has decided to grant full Firehose access to a number of startups, including Twazzup, Collecta, CrowdEye, Scoopler, Kosmix, Chainn Search, and Ellerdale. No one is quite sure – at least I’m not – how much it’s costing these startups (if anything), or why they were chosen when so many others have been denied. I should also mention that a small number of mid-size companies that fall somewhere between the giants and the startups now have access to the full Firehose – including Jive and Converseon.

A large number of social media monitoring companies – whose goal is to collect and consolidate all relevant social media data for their clients – are left out of the party. That means even great products like Radian6 and Scout Labs have a significant hole in their data inventory: full Twitter Firehose access. Both companies expect to have full access at some point, but when isn’t yet clear.

What Does the Future Hold?

Some experts expect Twitter to eventually be more generous in granting access to the Firehose. However, that would run counter to the direction the company has taken recently: switching to its own URL shortening service (rather than using bit.ly), unveiling its own iPad application, and requiring third-party Twitter applications to access user accounts through the OAuth authentication standard. The latter move caused problems for a number of high-profile tools like TweetDeck and several widely used WordPress Twitter plugins.

As use of social media has exploded over the last few years, companies and government agencies have struggled to keep pace with the constant flow of information and opinions. Not only are organizations struggling with how to monitor social information, they’re also not sure how to analyze what they are monitoring. As a result, we’ve seen a new market called social intelligence take shape.

Social Intelligence

Social intelligence is the process of monitoring, collecting, and analyzing social data to inform business decisions. In other words: “Let’s see what people are saying, see what meaning, insights, and patterns we discover, and act accordingly.” Social intelligence attempts to make sense of the endless stream of tweets, comments, posts, and other social data. The market generally involves three components: social media monitoring, social media analysis, and social media strategy.

I should note that the term social intelligence is not new – see the original definition on Wikipedia – but it is relatively new in the context we’re using it here, much the way social networking is an old term given new meaning through the Web and social media. Zach Hofer-Shall at Forrester seems to be one of the leading voices in this area. Nielsen and McKinsey are also using the term to describe their new joint venture – NMIncite.

Social Media Monitoring

Listening platforms

Listening platforms refer to software created to let organizations monitor social media information; the term “social media monitoring software” could be used interchangeably. There are dozens of companies in this space now. Some are full-featured solutions for monitoring all social media (or anything else on the Web), while others focus on specific platforms such as Twitter. Obviously the latter are much less useful for organizations that need to see the big picture.

Forrester recently put out a great paper analyzing some of the best enterprise-level solutions. The only drawback to the report is that it’s limited to products targeting the “enterprise” market, i.e. companies with over $1 billion in revenue. Unfortunately this approach leaves out great products like Scout Labs. Still, Forrester needed some criteria to narrow down the 100+ potential vendors in the market, and that was probably the most logical selection criterion to use.

Social Media Analysis

Social media analysis comes in a variety of flavors. Listening platforms usually provide dashboards with aggregate trends and statistics, and at the most basic level social media analysis involves analyzing these aggregate usage statistics. Many listening platforms promise to deliver more qualitative research in areas like user sentiment and tone by using natural language processing (NLP). However, anyone with experience using these platforms understands the limitations of NLP for social analysis.
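
To illustrate why automated sentiment analysis falls short on social data, here’s a toy keyword-counting scorer – my own deliberate oversimplification, not how any particular listening platform works. Sarcasm and context defeat it immediately:

```python
import re

# Toy sentiment scorer: count positive words minus negative words.
# A deliberate oversimplification of what listening platforms do.
POSITIVE = {"love", "great", "awesome"}
NEGATIVE = {"hate", "awful", "broken"}

def naive_sentiment(post: str) -> int:
    words = re.findall(r"[a-z]+", post.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(naive_sentiment("I love this product, it's awesome"))        # +2: correct
print(naive_sentiment("Oh I just LOVE waiting on hold for hours"))  # +1: sarcasm fools it
print(naive_sentiment("great... another awful outage"))             # 0: reads as neutral
```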

An area where listening platforms tend to do well is identifying key influencers – at least from a quantitative standpoint. Key influencers – whether they’re bloggers or Twitter or Facebook users – are usually identified by the number of times they use targeted keywords and by the size of their following. We then need to do some qualitative analysis to determine who the true influencers are.
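
Here’s a rough sketch of that quantitative screen – the handles, counts, and scoring formula are all made up for illustration – which also shows why the qualitative pass matters: a high-volume bot can easily top the list.

```python
# Quantitative influencer screen: keyword mentions weighted by audience size.
# Handles, counts, and the scoring formula are hypothetical.
users = [
    ("@health_blogger", 42, 15000),   # (handle, keyword mentions, followers)
    ("@casual_user", 3, 90),
    ("@news_bot", 200, 500000),
]

def influence_score(mentions, followers):
    return mentions * followers ** 0.5  # square root dampens raw follower counts

for handle, mentions, followers in sorted(users, key=lambda u: -influence_score(u[1], u[2])):
    print(f"{handle}: {influence_score(mentions, followers):,.0f}")

# @news_bot tops the list -- exactly the kind of "influencer" that a
# qualitative review would weed out.
```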

We can take social media analysis to another level by conducting a content analysis. Content analysis (a.k.a. media analysis) typically involves manual data coding. This is a very labor-intensive process, but its flexibility, detail, and accuracy cannot be replicated by any software. Our team conducted a content analysis of social media data for a CDC project that I currently manage. The end result of a detailed content analysis includes statistics and usage patterns that aren’t possible with automated software – e.g. the accuracy of posts and sentiment toward specific topics.

Social Media Strategy

Social media strategy should be based on organizational and project goals, but it should also be informed by what is learned through social intelligence. By conducting environmental scans and implementing ongoing social media monitoring, organizations can learn a lot about how they should be leveraging new media to meet their target audience’s needs. Additionally, information gleaned through social intelligence will often drive larger marketing and business decisions.

I recently attended a presentation by a marketing executive from the Red Cross. A great quote of hers: “Social media is our canary in the coal mine for reputation management.” I think this quote speaks well to the power of social intelligence.

Social media monitoring companies

Social media monitoring and engagement software

Anyone who works in Web consulting or communications has likely seen some variation of a Web 2.0/New Media presentation that includes the ubiquitous slide showing a montage of 2.0 companies. I thought I would do my part and create a graphic showing the next logical company collection… social media monitoring software companies. I created this slide for a recent presentation I gave at the National Conference on Health Communication, Marketing, and Media in Atlanta.

This market is blazing hot right now as companies and public sector agencies struggle to understand how they should monitor and analyze social media – which is great progress from where we were just two years ago. Now that most organizations have embraced the power of social media, the next step is making sure we understand and reach our target audiences through all the right channels. Social media monitoring and engagement is extremely powerful for doing just that.

Future of Online Advertising
Twitter, LinkedIn, and Zynga execs discuss the future of online advertising. Gotta love a CEO who can get away with sporting a beat-up t-shirt.

AOL, Display Advertising Still Dominant
AOL Chairman and CEO Tim Armstrong says display advertising will still dominate.

Sapient’s Freddie Laker on the future of social media
This is a repurposing of his earlier article, but it’s also nice to see it in video format.

Lithium’s Paul Greenberg on the state of Social CRM
This is an area that many in the industry still aren’t aware of.

Text Link Ads

I’ve been using the Text Link Ads advertising platform for several years now with good success. However, I initially (and foolishly) submitted a request to include this blog in my inventory just after I launched the site. Naturally, traffic was low at that time, so it was not accepted into their inventory. So I let the traffic build and the Alexa and Compete.com rankings catch up, then attempted to resubmit the site two weeks ago. Unfortunately Text Link Ads is set up to reject any site that was previously submitted and not approved. The system provides a message saying that someone will manually review the site for reconsideration, but it appears this process isn’t actually occurring – at least not fast enough.

LinkWorth

Based on my recent experience, I decided to search for an alternative text link advertising provider. I hadn’t previously heard of LinkWorth, but the site looked nice and the product was described intuitively. Additionally, they offer a variety of advertising options including standard text links, inline links, sponsored content, and banner advertising – I always like to see a one-stop-shop approach. The sign-up and site submission process was straightforward – so far so good.

About Text Link Advertising

In case you’re new to the topic, text link ads are essentially just that: Company A decides it wants to pay for a plain text link ad on Website B. Why a text link and not a banner ad? 1) Some studies have shown that text links have higher click-through rates. 2) Many Internet users have learned to ignore banner ads. 3) Text link ads can help pass on credibility and search engine ranking juice (i.e. Google Page Rank). Google PR as it relates to text link ads has been a topic of much debate in the past, and Google has previously excluded Text Link Ads from its ranking algorithm.

Next Steps

I’m anxious to give LinkWorth a test run and see how it performs. The big question mark is whether they will have a steady stream of advertisers to match with their partners (e.g. FatPurple). One of the things I like about Text Link Ads is that there seems to be a large enough variety of advertisers to match most niche website content. I don’t plan on giving up Text Link Ads entirely (not yet anyway), but I’m hoping that LinkWorth will soon be the vendor of choice for this blog.

I’m often asked to evaluate or analyze websites that I have no relationship with. Perhaps it’s a site developed by a competitor, or a site we want some quick insight on before speaking with a potential client. While there’s no way to get very accurate traffic statistics without access to the site’s server logs or some other analytics, there are some nice tools that give us a glimpse of traffic levels and general visitor demographics. Here’s a quick list of some of the tools I use.

Google Page Rank

Most people reading this site are probably very familiar with Google Page Rank (PR). It’s arguably the most important ranking for a website. Google ranks websites on a scale of 0-10 (10 being the highest). What determines a site’s Page Rank? A long, complex algorithm that we’re not smart enough to understand, but essentially it’s based on the number and quality of sites that link to the site in question. Most sites on the Web don’t even earn enough inbound links to rank at level 1. To put this in perspective, here are the Google PRs for some huge sites (a toy sketch of the underlying idea follows the list):

  • Wikipedia & Digg: 8
  • Twitter: 9
  • CNN and Facebook: 10
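
For the curious, here’s a toy illustration of the core idea – rank flows through links, so a page linked to by strong pages becomes strong itself. This is the textbook PageRank recipe run on a made-up four-page web, not Google’s actual (and secret) algorithm.

```python
# Toy PageRank on a hypothetical four-page web. Not Google's real algorithm,
# just the published core idea: links pass on a share of the linker's rank.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}
DAMPING = 0.85  # standard damping factor from the original PageRank paper

for _ in range(50):  # iterate until the ranks settle
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = DAMPING * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))  # "c" wins: the most (and best) inbound links
```

The 0-10 toolbar number is generally believed to be a rough logarithmic bucketing of raw scores like these.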

There are a number of free tools for checking PR on the Web. I just use the PR display built into Google Toolbar, so on any site I instantly see the associated PR. It’s a quick way to assess the size of a website and company. For example, let’s say I’m looking for an online card sorting tool. If I visit company A with a Page Rank of 2 and company B with a Page Rank of 6, I’m probably going to automatically assign a lot more credibility to site B.

Alexa Ranking

Alexa ranks sites from 1 to a lot, with 1 (Google.com) being the most visited site. I give a lot less weight to the Alexa ranking – in comparison to Google PR – but it too has its purpose. Alexa provides rankings for websites that don’t register on Google PR. I just checked out a site that has a ranking of 14,856,256 (only 14,856,255 spots to go, guys…), which obviously doesn’t fall on Google’s PR radar. Another nice thing about Alexa is that its rankings start to show up a lot faster than Google PR. As I type, FatPurple is ranking around 374,883 after only about a month in existence, but we’re still showing as a big fat 0 on Google PR. Google simply takes a lot longer to rank a site. Similar to the way I have Google PR displayed in my browser, I also have the Alexa ranking automatically displayed for each site I visit.

Alexa’s top 5 websites:

  1. Google
  2. Facebook
  3. YouTube
  4. Yahoo
  5. Live.com

www.alexa.com

Compete

Compete.com comparison chart – “Wicked Smart Websites”

Yes, you need to sign up for paid plans to get the best features, but Compete.com still offers some nice freebie tools. Sign up for their free account plan for some additional options. Compete lets you create some nice comparison graphics and analyze a variety of statistics. This is a much more robust tool than Alexa, which is probably why they can get away with charging for it.

Check out this nice comparison chart for these three college websites. Coincidentally these are three schools I overlooked in my college selection process, but I’m sure they would have loved to have me…

www.compete.com

Quantcast

FatPurple's 1st month Quantcast Snapshot

Quantcast provides traffic statistics as well as basic demographic trends. We can debate the accuracy of the demographic statistics, but Quantcast is somewhat unique (and brave) in its attempt to map website demographics. If a site is “quantified” – i.e. registered with Quantcast – the numbers should be much more accurate.

www.quantcast.com

And Some Others

  • Yahoo Site Explorer – You can easily find the number of backlinks (inlinks) to a website. For some reason I find this works better than Google’s backlink search; Yahoo tends to turn up accurate results that Google misses.
  • Technorati blog rankings

I also recently wrote about the discrepancies and inconsistencies between these tools. This may also be of interest – Your Website Numbers are Lying to You.

What other tools am I missing? Please share if you have others you like.

The upcoming release of the Radian6 Engagement Console promises to improve the way companies monitor and engage customers – or their clients’ customers – through social media. I have looked at a variety of social media monitoring systems over the past few months, including entry-level tools like Trendrr and SocialMention as well as mid- to enterprise-level applications like Vocus and Nielsen’s BuzzMetrics. There are dozens more I’m anxious to take a look at, including products from Converseon and Autonomy. Take me to Your Leader has a nice list of dozens of free tools. However, I’ve been less than impressed with most of the products I have used.

I’m currently leading a project for a large Federal Government agency to evaluate the effectiveness of social media for health communications, so these tools are of particular interest to me right now. This new Radian6 release has me excited; I’ve yet to see a product that systematically streamlines the engagement process like this. Does Autonomy offer this type of functionality? Does anybody else?

I’m also curious about what platforms Radian6 will monitor. One capability most systems lack is the ability to monitor and engage through non-social-media channels that have certain Web 2.0 components. For example, a news story on ABCNews.com may generate dozens or hundreds of user comments. These data are just as important as anything being said on Facebook or Twitter, but many products seem to ignore these more traditional Web channels.

I’ve set up a Radian6 demo, and can post a follow-up if anyone is interested.

Who can you trust on the Web these days? Certainly not your Web analytics or tracking software. At least not without a grain of salt and the ability to selectively ignore data.

Check out these confusing and contradictory numbers from a few of the sites I maintain. The sites vary in rank depending on the source or criteria used. The discrepancies between Google Analytics (and other JavaScript-dependent options) and WebTrends (and other log-file-dependent software) have been well documented for years, but mixing in these other rankings muddies the water further.

When talking Web stats with a client, it’s very important that you’re comparing apples to apples (Google vs. Google) rather than apples to melons (Google vs. server stat programs). If they’re matching previous server stats against a redesigned site’s Google Analytics numbers, you’ve got some serious explaining to do.

It’s surprising how inaccurate Alexa’s rankings seem to be, especially given that services like Text Link Ads use those numbers as a primary factor in ad link valuation. For example, GDoss.com was ranked higher than e-Lacrosse.com – despite the fact that it gets only a fraction of the traffic.

Rankings and traffic for the month of February 2010

* I haven’t figured out why yet, but when I redesigned this site about 5 months ago the Google PR dropped from 3 to 0 for the home page. Some interior pages still rank a 3. Very strange.

Why Don’t Google’s Numbers Match Up?

There are a variety of reasons why Google Analytics (GA) numbers won’t match – and will generally be much lower than – your server-based statistics. The most likely causes:

  • WebTrends and other server-based packages typically use server log files, while GA uses JavaScript tagging code in the HTML that reports back to Google’s servers. This inherently creates differences in the way numbers are reported.
  • Search engine crawlers cannot execute JavaScript (so I hear) and therefore do not show up in GA numbers. The amount of search engine spider traffic can be huge if you have a large site, especially if it’s crawled frequently. Check out the big spikes in page views for one of my sites in the graphic below – search spiders seem to be the only logical explanation there. (A toy sketch after this list shows the crawler effect on the counts.)
  • iframes and AJAX applications may result in GA under-reporting.
  • Views of document files (PDF, Word, etc.) may be counted as page views in log-file-based software, but they will not be counted as GA page views.
  • Users with JavaScript turned off will not be counted in GA, although those users usually account for less than 5% of all visitors.
  • Some web managers forget to add GA’s code to every page of the site.
  • Some claim that users often leave a page before the page (and the GA code at the bottom) has fully loaded.
  • Google doesn’t like your site (or you).
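
To make the crawler and document-file points concrete, here’s a tiny log-parsing sketch – the Apache-style log lines are invented for the example – showing how the same traffic yields three different page view counts depending on what you filter:

```python
import re

# Fabricated Apache-style log lines for illustration only.
log_lines = [
    '1.2.3.4 - - [12/Feb/2010:10:00:01] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows)"',
    '66.249.65.1 - - [12/Feb/2010:10:00:02] "GET /page2.html HTTP/1.1" 200 4211 "-" "Googlebot/2.1"',
    '72.30.0.9 - - [12/Feb/2010:10:00:03] "GET /page3.html HTTP/1.1" 200 3999 "-" "Yahoo! Slurp"',
    '1.2.3.5 - - [12/Feb/2010:10:00:04] "GET /report.pdf HTTP/1.1" 200 90210 "-" "Mozilla/5.0 (Mac)"',
]

BOTS = re.compile(r"googlebot|slurp|bingbot|spider|crawler", re.I)

total = len(log_lines)                                  # what log-based tools report
humans = [l for l in log_lines if not BOTS.search(l)]   # GA never sees the crawlers
html_only = [l for l in humans if ".pdf" not in l]      # GA also misses document downloads

print("log-file page views:", total)           # 4
print("minus crawlers:     ", len(humans))     # 2
print("roughly GA's count: ", len(html_only))  # 1
```

On a large, frequently crawled site, that crawler slice alone can explain spikes like the ones in the graphic below.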

Traffic spikes from search engines on a site with ~16,000 pages.

Side note – 1&1’s (the host for most of these sites) online Web statistics are painfully slow! As much business as they get, you would think they would have a better stats platform. Another example of their low-cost = low-quality strategy.