About Lee Jacobson

Lee is a video game and entertainment executive and currently the CEO of Apmetrix Inc., a global provider of next-generation, multi-channel analytics for video game, mobile, digital media and virtual reality companies. His experience spans more than 25 years with some of the most well-known companies in gaming, including Virgin Entertainment, Midway Games and Atari.

Tech Isn’t Enough: Analytics 2.0 and User Adoption

By: Lee Jacobson, CEO – Apmetrix

Business owners and management are often mesmerized by the potential of new software and then flummoxed when the change initiative utterly fails. Despite a robust literature on the pitfalls and best practices of change management, there often appears to be a psychological disconnect when it comes to IT. The irony is that IT initiatives frequently require the most stringent application of change management principles to succeed. Big data and analytics 2.0 serve as a case in point.

There is no question that big data can transform an organization’s bottom line from adequate to excellent, simply by informing business activities with relevant data. If you see a sudden spike in customer attrition following every phone outreach campaign, the obvious move is to drop phone campaigns in favor of something less intrusive. You lose fewer customers, and retaining customers is almost always cheaper than acquiring new ones. You also don’t waste precious time and money on a counterproductive use of a call center.
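To make that concrete, here is a minimal sketch of how that pattern could be detected, assuming you track weekly attrition counts and campaign dates. All names and numbers are illustrative, not taken from any particular analytics product.

```python
# Minimal sketch: compare average weekly attrition in the weeks
# following each phone campaign against the overall baseline.
# Data shapes, numbers and the 1.5x rule are illustrative placeholders.

weekly_attrition = {1: 12, 2: 14, 3: 31, 4: 13, 5: 11, 6: 29, 7: 12}
campaign_weeks = [2, 5]  # weeks in which a phone campaign ran

baseline = sum(weekly_attrition.values()) / len(weekly_attrition)
post_campaign = [weekly_attrition[w + 1] for w in campaign_weeks
                 if w + 1 in weekly_attrition]
post_avg = sum(post_campaign) / len(post_campaign)

if post_avg > 1.5 * baseline:  # flag a 50%+ jump over baseline
    print(f"Attrition spikes after campaigns: {post_avg:.1f} vs baseline {baseline:.1f}")
```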

Business owners and management hear examples like this and salivate, rightly, at the cost-cutting potential. They find a best-in-class or customizable analytics program and, for reasons that often remain opaque, shove it onto their entire staff with zero warning. Organizational culture, like regular culture, is obstinate. It doesn’t change on a dime and never changes without appropriate preparation. Introducing new software is always problematic, but analytics software represents a sea change from the old culture to a data-driven one. That sea change can terrify employees and generate such entrenched resistance that no amount of coaxing will ever get them to use the software.

The data-driven culture is terrifying for employees for a lot of reasons. The primary one, however, is that “data-driven” is frequently read as a code word for imminent downsizing. Employees assume that the analytics will reveal some weakness in their performance or the performance of their department. Alternatively, they fear the entire purpose of the analytics program is to find excuses to let people go.

Change management best practices can help your business assuage some of these fears. One way to avoid employee terror is to make the case directly to employees about why you want to start using the program. In most cases, the idea is to bolster business, not cull employees. For an analytics program, specifically, remind them that the majority of analytics deal with external data sources, not the internal metrics that are much more likely to result in downsizing. Finally, make it clear that the analytics program is intended to streamline the usually agonizing process of data entry and analysis. The less time people need to spend on it, the happier most employees will be.

While the value of a given technology may seem apparent in the office of a business owner or management team member, the value is often less clear to rank-and-file employees. Before investing resources on an analytics 2.0 application, no matter how intuitive it might be, deploy change management best practices. It will limit employee panic and help to ensure maximum user adoption.

Silos, KPIs and Confusion


April 24th, 2015

By: Lee Jacobson, CEO – Apmetrix

The strength of analytics 2.0 is its ability to inform actions that benefit your business, based on the behavior of your customers, social sentiment about your brand, and outreach opportunities with potential customers. Yet big data analytics don’t always live up to this promise. Results get ignored and opportunities are missed. This dismissal of analytics often stems from confusion about what the results mean. The confusion, however, is more likely a symptom. You can probably trace its causes back to silos inside the business and unclear key performance indicators (KPIs).

Silos are hard to avoid. Departments compete for budget allocations, which means departments are best served by cloaking their work in secrecy and discipline-specific jargon. They protect their data and share it only under direct orders or duress. This tendency toward reinforcing the silo is a problem in and of itself, because it means cross-departmental data sharing is often hindered by dissimilar methods of organizing and labeling the data. If your marketing team labels everything with esoteric marketing jargon, while your sales team uses its own language, the analyst from IT is going to be mystified by how those numbers align in a statistical analysis.

Even if your analytics person is able to overcome the problem of silos, or you’ve broken down those walls in your own business by force of will, it doesn’t mean you’re going to get actionable answers from your analytics. KPIs, like any other number, get assigned different meanings depending on whether they apply to a department or a business as a whole. If your marketing team assigns KPIs, those numbers probably won’t help HR, Sales or the management team. The KPIs marketing cares about, such as social sentiment about your current product, aren’t going to tell HR anything about the success of its online recruiting efforts.

While each department needs to be able to assign and access the data relevant to its own KPIs, you also need analytics that relate to the business as a whole. Just as important, you need to understand how the overall analytics relate to the department-specific analytics. It can take time and serious discussion with your individual departments or teams to establish a common language for assigning and interpreting the data. That common language provides a shorthand for the analytics person and for you, while also giving you a bridge for taking action based on the numbers you see.
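One lightweight way to encode that common language is a translation table that maps each department’s labels onto shared terms before data from different silos is combined. The labels below are invented purely for illustration.

```python
# Illustrative sketch: normalize department-specific labels to a
# shared vocabulary before combining data across silos.
CANONICAL_LABELS = {
    # marketing jargon -> shared term
    "brand_lift": "awareness",
    "mql": "qualified_lead",
    # sales jargon -> shared term
    "sql": "qualified_lead",
    "closed_won": "sale",
}

def normalize(record: dict) -> dict:
    """Relabel a record's keys using the shared vocabulary."""
    return {CANONICAL_LABELS.get(k, k): v for k, v in record.items()}

print(normalize({"mql": 42, "closed_won": 7}))
# -> {'qualified_lead': 42, 'sale': 7}
```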

This is also important in terms of the analytics software you choose. All too often, analytics software defaults to displaying results set up during installation by the manufacturer. Everyone, in every department, sees the same things, whether those results are relevant or not. A customizable dashboard provides everyone the opportunity to get at the results they care about, while keeping the information centralized. You can look at the big picture numbers or focus in on departmental numbers, while your departments can focus in on the KPIs that matter to them without the distractions of irrelevant results.

Harnessing Outliers and Microtrends


March 12th, 2015

By: Lee Jacobson, CEO – Apmetrix

Every first-year statistics student learns the term “outlier.” In short, outliers are pieces of data that fall well outside the norm for a data set. The traditional wisdom is to cull those numbers ruthlessly from your analysis, since they can badly throw off your results. When it comes to big data, though, discarding outliers isn’t always the best idea. After all, if you’re running analytics 2.0 and your data set comprises 2 million pieces of data from a social network, 500 or 1,000 outliers don’t necessarily mean faulty data. They can mean you’re seeing a microtrend you can capitalize on.

According to Mark J. Penn, who coined the term, microtrends are small groups of very passionate people with a counterintuitive mindset. It’s important to understand that by small, he means a small percentage of a total population. In the US, that can mean a few million people. In a country like India, it can mean 10 million. Most businesses get by on numbers far lower than that, so you ignore microtrends at your own peril.

One easy way to spot a microtrend that applies to you is to look for groups of users who have taken your product and applied it to a situation you either hadn’t considered or hadn’t intended it for. Those self-identified users are probably the tip of an iceberg of underserved customers. They have a need and, somehow, they figured out that your product meets it. This is an opportunity.

Rather than ignore that atypical use, you can foster the idea. You can create a forum dedicated to it and, in essence, create a home for the microtrend to burgeon. You can build a marketing campaign around it that caters to the atypical use. You can even build a dedicated version of the product that serves that microtrend even more effectively than the original product. If a microtrend consists of nothing more than 500,000 people, you only need to capture a modest percentage of them to generate a nice return on investment.

The trick, then, is not to discard every outlier automatically. You need an analytics program that will alert you when a certain threshold of outliers is reached. Maybe it’s 500 outliers or 5,000, but you need a place in your dashboard that tracks and displays that information. Once you have the information, snagging that market segment becomes a fairly straightforward exercise.
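As a rough illustration of that kind of alert, the sketch below flags points more than a few standard deviations from the mean and raises a warning once the count crosses a threshold. The z-score rule and both cutoff values are placeholder assumptions you would tune to your own data.

```python
import statistics

def count_outliers(values, z_cutoff=3.0):
    """Count points more than z_cutoff standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return sum(1 for v in values if abs(v - mean) > z_cutoff * stdev)

def check_outlier_alert(values, alert_threshold=500):
    """Flag the data set when outliers pile up past the alert threshold."""
    n = count_outliers(values)
    if n >= alert_threshold:
        print(f"{n} outliers detected -- possible microtrend, not noise")
    return n
```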

Asking the Right Questions

January 23rd, 2015

By: Lee Jacobson, CEO – Apmetrix

Like it or not, big data is here to stay. The integration of computers and the internet into everything from phones and gaming systems to environmental controls and home security virtually guarantees the continued expansion of data sets. Yet, for all the purported power of big data, it can’t tell you anything until you ask a question. And it can’t tell you anything that helps your business unless you ask the right questions.

The core of any big data analysis is drawing correlations, sometimes gross and sometimes subtle. In some cases, the question you want answered is easy. You made a major change to the search algorithm on your retail website with the hope that it would generate more sales by providing more salient results.

You take all the data on sales and analyze the trends for two months before the change and a month after. If the data suggests a sales uptick in the weeks following the algorithm change, you’ve got a solid correlation between the new algorithm and customer behavior.

Of course, correlation isn’t proof. In fact, the simple answer you got may be entirely spurious. If you roll out your algorithm change the week of Black Friday, or anytime in the weeks leading up to Christmas, the correlation may be meaningless. Nearly every retailer and e-tailer sees a sales uptick during that period. Incidentally, your sales uptick around Black Friday probably also correlates with higher natural gas sales.

The example above is, of course, overly simplistic. Any business worth its salt will run analytics not just for current and recent sales, but on corresponding sales in the same period over previous years. What it does show, however, is how simple it can be to get a false impression by asking the wrong question from big data.

Getting to the right questions is the human element in the process. A good analytics 2.0 application can streamline data processing, corral results into a single dashboard, and convert files for you, but it can’t tell you what to ask. It all comes down to what matters to your business.

You need to figure out what your key performance indicators are and craft specific questions to draw correlations out of the mountain of data. How much churn are we experiencing? Do increases in churn correlate tightly with specific changes we’ve made recently after discounting external factors? How many new customers come to us through social media? Does the amount made from those customers in total offset the total investment in social media initiatives? What are our cart abandonment percentages and how are they trending?
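Two of those questions reduce to simple arithmetic once the inputs are defined. The sketch below shows the standard churn rate and cart abandonment calculations; the input figures are invented placeholders.

```python
# Sketch of two of the questions above as concrete calculations.
# All input numbers are illustrative placeholders.

def churn_rate(customers_at_start, customers_lost):
    """Fraction of customers lost over the period."""
    return customers_lost / customers_at_start

def cart_abandonment_rate(carts_created, orders_completed):
    """Share of shopping carts that never became orders."""
    return (carts_created - orders_completed) / carts_created

print(f"Churn: {churn_rate(2000, 90):.1%}")                    # 4.5%
print(f"Abandonment: {cart_abandonment_rate(1200, 420):.1%}")  # 65.0%
```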

Big data is reactive by nature. It responds to questions. The more specific and on-point the questions, especially ones that tie directly to your key performance indicators, the better and more specific the correlation will be. Those are analytics you can use to actively improve your business.

Resource Drain: The Dark Underbelly of Big Data

December 31st, 2014

By: Lee Jacobson, CEO – Apmetrix

For all of its unprecedented potential, big data isn’t just the goose that lays the golden egg. It can also be the anchor that drags your resources under. This applies equally to your human resources and your network resources. Part of the problem, for many companies, is a lack of comprehension about what the IT department actually does for the business.

Ask people about IT and you’ll get vague answers about fixing that workstation. Savvier people might talk about hardware upgrades and making sure you can get online. For too many people, though, IT is a euphemism for white smoke and black magic contained in a computer-sized box. IT professionals are simply wizards who wave their hands, write in cryptic languages and use their occult powers to make the technology work.

In reality, those IT professionals are probably handling a substantial workload ranging from web development and server-side scripting to network security and hardware maintenance. They are already working full-time jobs. Big data analysis often taxes the human resources of IT. Some companies bring in a database manager to handle manual and ever-changing analytics queries. Other companies just add it to the list of black magic the IT department must conjure on a daily or weekly basis.

Then there is the network drain. Running queries on thousands or millions of pieces of data consumes huge amounts of processing power on your network, which makes everything else sluggish or non-responsive. Even well-known applications specifically designed to sift, sort and report analytics – such as Hadoop – can be resource killers. While adding hardware and creating partitions can mitigate some of these problems, they’re short-term solutions to a long-term problem that will only grow.

This proves especially problematic for app developers and game companies, where massive amounts of data are generated daily, largely inside the apps and games themselves. A real, scalable solution for managing that data is ultimately a survival necessity, both for improving apps and games and for spotting problems.

A real solution doesn’t aim for a quick fix; it streamlines the process for both the human and network resources. That means a single application designed to minimize the number of hours your IT people spend on manual data configuration while also limiting the network overhead. The application should also be friendly to non-experts and provide a single dashboard that lets rank-and-file employees run the searches they need.

When picking a big data solution, the question you need to ask is: Does this feel like white smoke and black magic or does this look like something anyone in my company can use? If it looks like something anyone can use and will place limited demands on your network, that is a long-term solution to the long-term, big data issue.

Analytics 2.0 – A Holistic Approach

December 9th, 2014

By: Lee Jacobson, CEO – Apmetrix

The Parable of the Blind Men and the Elephant: What It Teaches Us about Holism and Analytics 2.0

As I’ve said before, much of analytics is still trapped in a single-stream model that generalizes findings from a single data source to a business’s entire customer base or market segment. That approach is a little like the parable of the blind men and the elephant. Each blind man puts a hand on a different part of the elephant and draws an erroneous conclusion based on incomplete information.

It is only by putting all the pieces together that you can draw an accurate conclusion. At the risk of sounding a little New Age, analytics 2.0 is a much more holistic approach: it gleans broad-based insight into your entire customer base rather than insight into a single piece of it.

This holistic approach is particularly valuable when a business wants to launch a large-scale marketing initiative. Unlike targeted marketing, which consciously and intentionally limits itself to very specific channels and market sub-segments, large-scale marketing must address the entire customer base. More importantly, it must appeal to most of the customer base. If you’ve been running analytics 2.0 for a while, or at least collecting the data, you have what you need to stack the deck in your favor.

You can run a cross comparison on engagement across your entire customer base for previous campaigns across all channels; determine the social sentiment toward those campaigns; and, use that information to formulate a messaging strategy that maximizes the odds of a successful new campaign.
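In code, that cross-comparison can start as nothing more than grouping past campaign results by channel and averaging engagement and sentiment. The data and field names in the sketch below are invented purely for illustration.

```python
# Illustrative sketch: rank channels by past-campaign engagement and
# average sentiment before planning a broad campaign. Data is invented.
past_campaigns = [
    {"channel": "email",   "engagement": 0.021, "sentiment": 0.4},
    {"channel": "twitter", "engagement": 0.034, "sentiment": 0.7},
    {"channel": "email",   "engagement": 0.018, "sentiment": 0.2},
    {"channel": "twitter", "engagement": 0.029, "sentiment": 0.6},
]

by_channel = {}
for c in past_campaigns:
    by_channel.setdefault(c["channel"], []).append(c)

for channel, rows in by_channel.items():
    avg_eng = sum(r["engagement"] for r in rows) / len(rows)
    avg_sent = sum(r["sentiment"] for r in rows) / len(rows)
    print(f"{channel}: engagement={avg_eng:.3f}, sentiment={avg_sent:+.2f}")
```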

For example, say your company sells customer relationship management software using a SaaS model. In most cases, the same CRM software can be used in B2B and B2C settings and you may sell your service to business- and consumer-facing companies. Your B2B and B2C customers’ priorities for the software, however, may be very different.

For B2B companies, a robust feature set may trump other concerns, because customer volume is generally low. For B2C companies, scalability may matter more than bells and whistles because of the sheer number of potential customers out there. How do you use your data and analytics 2.0 to create marketing messages that reach companies with such apparently incompatible concerns?

You look for the crossover. What elements of previous marketing messages did both groups respond to positively? How positive was the response? Was there something about software that both groups praised on social media over and over again, even if it wasn’t a key element in your previous marketing?

That kind of analysis, across multiple communication channels, is where analytics 2.0 excels. By asking the right questions of your data and observing a holistic view, rather than a segmented or single-stream view, you can reach your entire customer base in a way that matters to them.

Are Your Marketing Campaigns Based on Faith?

November 21st, 2014

By: Lee Jacobson, CEO – Apmetrix

Marketing campaigns are often exercises in faith. Your marketing team or an external marketing firm develops a new campaign idea and pitches it to you. Maybe the campaign has some research behind it, such as a focus group, but you’re really taking the idea on faith. They’re the experts. It’s their job. Except, marketing campaigns fail as often as they deliver. Analytics 2.0 can take faith out of the equation.

The nature of analytics 2.0 is to track engagement across multiple channels and deliver real-time information. Are people responding to your marketing tweets? Has the campaign made you a trending topic among your ideal customer base? Is the social sentiment toward the campaign positive or negative, and just how intense is that sentiment?

Just as importantly, if you’ve been running analytics 2.0 for a while, you’ve got a much sounder frame of reference with which to judge a new marketing campaign idea. Since it’s the same business, probably offering similar products, there will be core similarities between marketing initiatives. If your customers or ideal customer segment responded poorly to the last campaign that took a similar approach, you know better than to go down that road again.

That wealth of customer information isn’t just a sanity check for new marketing initiatives. It can also serve as a key source for generating new ideas. Say, for example, your ideal market segment has been persistently cool toward your formal marketing efforts but has a high level of engagement with user-generated content. That positive response to user-generated content is a clarion call for future marketing efforts.

Rather than beat a dead horse with more slickly-produced marketing collateral, you can take a page from Andrew Davis’ book, “Brandscaping.” You can look for someone who is already creating user-generated content that speaks to your ideal market segment. They already have the inroad and trust built with that audience. Then, you cut a deal with the person.

Maybe you agree to underwrite a portion of their production costs in exchange for a brief introductory commercial. If your product dovetails with their content in some way, you can offer to supply it for free in exchange for your logo appearing onscreen. The goal isn’t necessarily to make your product the central focus, but to get in front of receptive eyes in the right market segment.

The crux of the issue is that analytics 2.0 arms you with the right information to sanity check new marketing campaign ideas and to generate more targeted marketing initiatives. Taking marketing ideas on faith is no longer a requirement, but an outmoded approach to business in a data-driven world.

Real-time Analytics Creates More User Engagement

October 27th, 2014

By: Lee Jacobson, CEO – Apmetrix

Nothing can substitute for genuine user engagement with your product or service. No one knows this better than GoPro, a camera company that specializes in rugged, handheld and mounted cameras. GoPro customers are some of the most engaged customers out there. They post on Facebook and Twitter and, best of all for GoPro, they upload videos to YouTube. Those videos, along with solicited videos from athletes, serve as the backbone of GoPro’s entire marketing strategy.

What separates GoPro from so many other companies is that it actively encourages its users to post videos and photos, and then actively shares that user-generated content. You better believe GoPro pays close attention to its analytics on where and when its customers engage with its products. A quick look at its Facebook page shows that shared content averages thousands of likes and hundreds of comments. User-generated videos shot with GoPro cameras often rack up millions of views on YouTube.

GoPro’s products are uniquely designed for exactly this kind of engagement, but the lesson holds true for every business. You can find a devotee of almost every product posting videos on YouTube, pictures on Pinterest and tweeting about it. The trick is becoming aware of that user-generated content and responding to it quickly. Enter real-time analytics.

Real-time, multichannel analytics reaches out into the vast digital universe of the internet and pulls back information about when and where your product is getting talked about. A good real-time analytics system will be set up to autorespond to much of this content, which not only engages the person posting the user-generated content, but everyone who follows that person as well. The autoresponse feature lets you make the connection when the topic is still hot and fresh in people’s minds.
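A bare-bones version of that loop might look like the sketch below. The `fetch_mentions` and `send_reply` functions are hypothetical stand-ins for whatever feed and reply APIs your analytics platform actually exposes; they are not real library calls.

```python
import time

def fetch_mentions():
    """Hypothetical stand-in for a real-time mentions feed; returns a
    list of {"user": ..., "text": ...} dicts from your data source."""
    return []  # wire this to your actual platform

def send_reply(user, message):
    """Hypothetical stand-in for a channel-specific reply API."""
    print(f"-> @{user}: {message}")

SEEN = set()

def autorespond_loop(poll_seconds=30):
    """Poll for fresh mentions and thank each poster once, quickly,
    while the topic is still hot."""
    while True:
        for mention in fetch_mentions():
            if mention["user"] not in SEEN:
                SEEN.add(mention["user"])
                send_reply(mention["user"], "Thanks for sharing -- love this!")
        time.sleep(poll_seconds)
```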

The next step is to distribute that user-generated content. Even if the sharing doesn’t happen until the next morning, you’re still going to get a lot of traction from it. The person who created the content – who you already engaged – will frequently advertise that you shared content with their followers, which encourages those people to visit your social media profile or website. Your followers get new content to view and respond to with negligible investment from your marketing budget. What makes this work, though, is that initial contact in real-time. Without that initial contact, which you wouldn’t make without the analytics autoresponse, the odds of engagement diminish rapidly.

Actionable Analytics

October 7th, 2014

By: Lee Jacobson, CEO – Apmetrix

The lean startup guru Eric Ries frequently discusses the difference between vanity metrics and actionable metrics. According to Ries, a vanity metric is a feel-good metric. Essentially, it serves no purpose but to sound impressive and, often, to mask an abysmal failure. An actionable metric, on the other hand, delivers information that enables you to make valuable business decisions. Of course, metrics and analytics aren’t the same thing. Metrics focus on tangibles and internal numbers, while analytics embrace intangibles and external numbers. Ries’ point, however, remains valid.

An example of vanity analytics would be calculating the raw number of tweets that reference your business or products. That number can look impressive, especially if there has been a real uptick in the pure number of mentions in the last week or month. Yet, that number doesn’t really tell you anything. It’s a blunt instrument better suited to ego massages than decisions.

An actionable analytic – one that serves to help you make a decision – would be a breakdown of the social sentiment of all those tweets. For example, the total number of mentions might have spiked because of some serious problem with your product. Maybe there was a flaw in the production of your universal remote and it won’t communicate with LG products, which led to a nasty backlash on social media. Knowing that the response is negative and, perhaps even more importantly, why it’s negative lets you take action.
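As a toy example of that breakdown, the sketch below sorts mentions into positive, negative and neutral buckets. A naive keyword check stands in for a real sentiment model, and the sample tweets are invented.

```python
from collections import Counter

# Toy sketch: break mentions into positive/negative/neutral buckets.
# A naive keyword check stands in for a real sentiment model here.
NEGATIVE = {"broken", "refund", "recall", "useless", "won't"}
POSITIVE = {"love", "great", "works", "awesome"}

def classify(text):
    words = set(text.lower().split())
    if words & NEGATIVE:
        return "negative"
    if words & POSITIVE:
        return "positive"
    return "neutral"

tweets = [
    "this remote is broken with my LG tv",
    "love the design though",
    "want a refund",
]
breakdown = Counter(classify(t) for t in tweets)
print(breakdown)  # Counter({'negative': 2, 'positive': 1})
```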

You can use that information to issue an apology and a recall on the device. You can create a form for customers to fill out to get a replacement when the problem is fixed. You can offer refunds. You can manage the problem because you understand there is a problem.

As with metrics, what constitutes an actionable analytic varies from company to company. For an online magazine, the total number of comments and posts on social media may actually serve as an actionable analytic. Media sites need to be concerned with total viewership as much as the social sentiment. If people aren’t sharing and bringing in more eyes, it means the current slant on your content or titles isn’t working and needs to be adjusted. For a business that offers products, social sentiment is a big deal.

Before you jump into crunching numbers, you need to decide what analytics actually matter to your business. There is zero benefit in generating numbers on things that don’t help you make decisions. Once you do decide which analytics matter, you’re best served by finding a software application that automates and streamlines the process of creating reports on those numbers.

Do You Work for Your Analytics, or Do They Work for You?

September 26th, 2014

By: Lee Jacobson, CEO – Apmetrix

Stop me if you’ve heard this one. Your numbers guy goes into his office to crunch the analytics numbers…and is never heard from again. For far too many businesses this is not a joke, but a cruel reality. Data sources seem to be increasing exponentially, while the time we all have to devote to them remains painfully fixed. That constant increase in data sources has led many businesses into an odd inversion of priorities. The numbers no longer work for the business, but the business increasingly works for the numbers.

The business response to this problem, all too often, is to simply stop collecting data from new sources, even when the business sets up a new social media account or starts offering products through a new e-tailer. From a time management perspective, this makes abundant sense. You already have more data than you can manage, and bringing in more will only exacerbate an existing problem. From a business perspective, it is disastrous.

Every new data source gives you a new opportunity to understand customers, old and new, in new ways. After all, how customers respond to you on the text-based Twitter is going to be radically different from how they respond to you on the photo-based Tumblr. Ignoring that data deprives you of the opportunity to convert prospects into customers. The question, then, is how do you make your analytics work for you, rather than the other way around?

One of the biggest time sucks in analytics is the lack of a standardized method of data reporting. Excel files are common, but so are CSV files. If you’re an Excel fan, and most businesses are, that means converting every single CSV file before you even start doing the work of data correlation and collation. Only then can you begin the real work of translating all that raw data into workable information. It’s no wonder businesses give up and say, “Enough is enough! No new data sources!”

Automating that conversion process is a big step in making your data work for you. The less time you spend on that part, the more time you spend on analysis. If you can get real-time analytics in the process, a once unmanageable task suddenly becomes a business boon. This isn’t just wishful thinking, but a present reality. This is exactly what Apmetrix focuses on providing you. Our scalable software is built to automatically convert CSV into Excel, provide real-time data access and even offer customized reporting in a single dashboard.
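For a sense of what that conversion step involves at its simplest, here is a generic sketch using the open-source pandas library to sweep a folder of CSV files into a single Excel workbook, one sheet per file. This illustrates the general technique, not Apmetrix’s implementation, and the directory path is a placeholder.

```python
from pathlib import Path
import pandas as pd

def csvs_to_excel(csv_dir, out_path="combined.xlsx"):
    """Convert every CSV in a folder into one Excel workbook,
    one sheet per file (sheet names capped at Excel's 31 chars)."""
    with pd.ExcelWriter(out_path) as writer:
        for csv_file in sorted(Path(csv_dir).glob("*.csv")):
            df = pd.read_csv(csv_file)
            df.to_excel(writer, sheet_name=csv_file.stem[:31], index=False)

csvs_to_excel("reports/")  # placeholder directory
```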