For video game fans of the early nineties, and Internet meme lovers ever since, the phrase “all your base are belong to us” is legendary. It first appeared in a game called Zero Wing and stands as a testament to mistranslation. For many businesses, big data and analytics are a classic case of “all your base are belong to us.” You know the data is out there. You also know you should be able to do something with it, because just about every business magazine and marketing expert alive tells you so. Yet, when you get your hands on the data, it leaves you scratching your head.
Volume is a big part of the problem. With the ever-expanding set of opportunities to gather ever more data, it’s painfully easy to lose the financially valuable trees in the nigh-infinite data forest. Culling useful knowledge from that vast expanse of information requires a combination of sufficient processing resources, a viable analytics 2.0 program, and a sense of what benefits your business. Without that combination of factors, you run a real risk of mistranslating your data.
Now, with any luck, you’ve got a good sense of what matters to your business and your staff knows what information they need from big data. Any good IT person can get you set up with the right technological infrastructure to process all that data, or you can sign on with a cloud-based analytics service that processes it for you. Where things get tricky is the software.
Just because an analytics 2.0 program can show you the data you’re interested in doesn’t mean it’s going to show you that data by default. Even with vendor-assisted customization, analytics programs often lack an intuitive interface that lets you easily move from a big-picture view to drilling down into a highly selective data set. The less intuitive the program, the less “all your data are belong to you.” The data becomes the software’s hostage and you’re left in the unenviable role of hostage negotiator with your business success on the line.
What’s worse is that unintuitive programs are likely to lead you to misread the end results of any given analytics function. Maybe you were looking for overall social sentiment about your new product, but all you’re getting is social sentiment from Instagram. If you don’t catch that error immediately, it can be disastrous. Regardless of social sentiment on Instagram, you wind up with an incomplete and undoubtedly skewed view of social sentiment regarding your new product. Since you can’t sanely avoid big data, and analytics are the inevitable end result of big data, you need to stack the deck in your favor where you can: the software. When you go to make that investment, be sure the software is intuitive to use not only for you, but for your staff as well.
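The Instagram-only failure mode above is easy to catch programmatically. Here is a minimal sketch, assuming hypothetical sentiment records with `source` and `score` fields (the field names and the list of expected channels are illustrative, not from any particular analytics product):

```python
from collections import Counter

# Hypothetical sentiment records; "source" and "score" are illustrative fields.
mentions = [
    {"source": "instagram", "score": 0.8},
    {"source": "instagram", "score": 0.6},
    {"source": "twitter", "score": -0.4},
]

# The channels you *intended* to cover (an assumption for this sketch).
EXPECTED_SOURCES = {"instagram", "twitter", "facebook", "reddit"}

def coverage_gaps(records, expected=EXPECTED_SOURCES):
    """Return the expected sources that contributed no mentions at all."""
    seen = {r["source"] for r in records}
    return expected - seen

def mention_counts(records):
    """Count mentions per source so a single dominating channel is obvious."""
    return Counter(r["source"] for r in records)

missing = coverage_gaps(mentions)
if missing:
    print(f"Warning: no data from {sorted(missing)}; sentiment view is skewed.")
```

A check like this, run before any aggregate sentiment number is reported, is what stops "overall sentiment" from quietly meaning "Instagram sentiment."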
The image of pirates on the open seas wielding cutlasses and swilling rum has certainly gotten a boost in recent years courtesy of Disney. Yet, the modern pirate sails a very different ocean: the digital ocean of the Internet. These pirates don’t steal gold; they steal your intellectual property. As most app developers know, the turnaround time from app release to hacked versions going up on pirate boards runs about 72 hours. If you’re like most app developers, you can almost hear the money being deposited in a pirate’s bank account.
Almost as bad as the inevitable financial loss is the faulty data that you get from those hacked apps. A wealth of data from users who never bothered to become paying customers provides limited value. In the end, all it tells you is what appeals most to people who don’t pay. Working from that data just means that your next app, or the improved release of the current one, will be even more appealing to your non-paying base. This isn’t what anyone would call an ideal situation.
With limited recourse, developers tend to accept these as the harsh realities of app development and hope that enough people will pay to make it financially feasible to continue. A better situation is one where your analytics suite tells you what data comes from legitimately purchased apps and what comes from pirates. In that situation, you can kill two birds with a single stone.
The identification and removal of fraudulent data means that your analytics reflect the actual users you care about, namely the ones who pay. While the features or elements these users prefer might be identical to those preferred by people using hacked apps, you won’t know for sure until you filter the data. If there is a preference variance between paying and non-paying users, you can slant your next release to favor your paying user base. That doesn’t mean the new version won’t get hacked, because it almost certainly will, but it incentivizes paying users to recommend the app to like-minded others.
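The filtering step described above is conceptually simple once each event carries a licensed/unlicensed flag. A minimal sketch, assuming a hypothetical event log where `licensed` comes from your own receipt-validation step and `feature` is whatever in-app action you track:

```python
from collections import Counter

# Illustrative event log; "licensed" would come from receipt validation,
# and "feature" is whatever in-app action you track. All values are made up.
events = [
    {"user": "a1", "licensed": True,  "feature": "level_editor"},
    {"user": "a2", "licensed": True,  "feature": "level_editor"},
    {"user": "p1", "licensed": False, "feature": "endless_mode"},
    {"user": "p2", "licensed": False, "feature": "endless_mode"},
    {"user": "a3", "licensed": True,  "feature": "endless_mode"},
]

def split_by_license(evts):
    """Separate paying-user events from pirated-copy events."""
    paying = [e for e in evts if e["licensed"]]
    pirated = [e for e in evts if not e["licensed"]]
    return paying, pirated

def top_feature(evts):
    """Most-used feature within a segment, or None if the segment is empty."""
    counts = Counter(e["feature"] for e in evts)
    return counts.most_common(1)[0][0] if counts else None

paying, pirated = split_by_license(events)
```

Comparing `top_feature(paying)` against `top_feature(pirated)` is exactly the "preference variance" check: if the two segments disagree, you know which preferences belong to the users who actually fund the next release.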
Of course, that still leaves the financial loss of all the non-paying users. If your analytics suite can identify non-paying users, it provides you a golden opportunity to help recoup your losses. You don’t need to just lie down and take it! You can feed those non-paying customers ads. You get compensated by the advertiser and get to annoy non-paying users. Win-win.
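The ad-gating policy above amounts to one conditional per session. A sketch, again assuming a hypothetical `licensed` flag from purchase verification (the function names are illustrative):

```python
# Hypothetical user records; "licensed" reflects a receipt-validation check.
def should_serve_ads(user):
    """Serve ads only to users whose purchase could not be verified."""
    return not user.get("licensed", False)

def next_screen(user):
    """Route unverified copies to an ad; paying users go straight to content."""
    return "interstitial_ad" if should_serve_ads(user) else "premium_content"
```

Defaulting the missing flag to `False` is a deliberate choice here: a user whose license status is unknown is treated as non-paying, which errs toward monetization rather than toward giving pirated copies an ad-free ride.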
Are you taking proximity beacons seriously? If not, you should be, because Apple is taking them seriously and so is Google, via the Android OS. Proximity beacons, or what they can do, provide businesses the chance to interact with customers not just on the local level, but the hyperlocal level. In other words, you can communicate with customers inside the building, department or even aisle.
The profitability in achieving that kind of personalization is, frankly, mind-boggling. A substantial portion of the smartphone-owning public, a constantly increasing number, is willing to tell you where they are in exchange for more relevant marketing or deals. Although not driven by proximity sensors, this is the same conceptual model Groupon employs by providing city-specific deals in its emails.
Take a grocery store, for example. Few retail environments are so plagued by the problem of choice paralysis. There are countless brands competing or trying to compete for consumer attention and dollars. Without some compelling reason to do otherwise — such as conscious cost-cutting or a reason to believe another product will perform better — most consumers opt for their normal brands. This decision or non-decision lets them avoid the choice paralysis of trying to figure out, on the spot, which of the dozen or so choices might best serve their needs.
Enter proximity beacons and hyperlocal marketing. If you want to move a particular product or set of products, proximity beacons can help provide your customers with a reason to choose a specific product. Instead of waiting for customers to “discover” a product on the shelf, every customer with the appropriate software on their phone can receive a coupon for a product upon entering the store. If you want to help move secondary products, messages or coupons can be delivered in the aisles. The previous tendency to grab the familiar is now offset by an opportunity for novelty at a reduced cost, two things that are likely to drive sales.
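Under the hood, aisle-level delivery is a lookup chain: detected beacon, then zone, then offer. A minimal sketch with entirely hypothetical beacon IDs, zones, and offers:

```python
# Hypothetical mapping from beacon identifiers to in-store zones.
BEACON_ZONES = {
    "beacon-entrance": "entrance",
    "beacon-aisle-7": "aisle-7",
}

# Hypothetical offers, keyed by zone rather than by beacon, so several
# beacons covering one aisle can share a single promotion.
ZONE_OFFERS = {
    "entrance": "10% off store-brand cereal",
    "aisle-7": "$1 off pasta sauce with any pasta purchase",
}

def offer_for_beacon(beacon_id):
    """Resolve a detected beacon to its zone, then to that zone's offer."""
    zone = BEACON_ZONES.get(beacon_id)
    return ZONE_OFFERS.get(zone)  # None if the beacon or zone is unknown
```

Keying offers by zone instead of by beacon is the design choice worth noting: marketing staff update one table of promotions while facilities staff add or replace physical beacons independently.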
The trick, of course, is to keep track of the information. It’s not enough to simply send the messages. How many people are receiving the messages? How many people are using them? This is where a robust analytics 2.0 package, such as Apmetrix, becomes invaluable. Not only can it import the data and give you a real-time view of what’s happening in the store, but it can also help you track social sentiment about the deals on social media.
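The two questions above, how many received versus how many used, reduce to a redemption rate. A tiny sketch, with made-up numbers purely for illustration:

```python
def redemption_rate(delivered, redeemed):
    """Fraction of delivered offers actually used; 0.0 if none were sent."""
    return redeemed / delivered if delivered else 0.0

# Illustrative figures: 1,200 coupons pushed at the entrance,
# 84 scanned at checkout.
rate = redemption_rate(1200, 84)
```

Guarding the zero-delivery case matters in practice: a brand-new campaign with no sends yet should report 0.0, not crash the dashboard with a division error.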
By: Lee Jacobson, CEO – Apmetrix
Context can help organize the sometimes chaotic world of data into coherence. Take the following statement as a case in point: One man punches another man in the face. In the world of analytics, this would simply be reported as an instance of pure data. On the other hand, this statement as written leaves us confused. Why did the man punch the other man? Should we be horrified? Disappointed? Thrilled? If this action happened between two patrons in a bar, it would mean one thing. If it happened in a mixed martial arts match, it would mean something very different. Under the old model of analytics, we would translate this scenario by counting how many times someone gets punched in the face. In other words, we would record the number of occurrences for that specific action. Unfortunately, without context this approach only multiplies the amount of data you don’t know what to do with.
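The two counting models above can be put side by side in a few lines. This sketch uses the punch example directly; the event records and context labels are invented for illustration:

```python
from collections import Counter

# The same raw action ("punch") recorded with and without context.
events = [
    {"action": "punch", "context": "mma_match"},
    {"action": "punch", "context": "mma_match"},
    {"action": "punch", "context": "bar"},
]

# Old model: a context-free tally that only tells you "3 punches happened".
raw_count = sum(1 for e in events if e["action"] == "punch")

# Analytics 2.0 model: the same tally grouped by context, which tells
# you whether to cheer or call security.
by_context = Counter(e["context"] for e in events if e["action"] == "punch")
```

The raw count is identical in both models; the grouped version simply refuses to throw away the one field that makes the number interpretable.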
To carry this metaphor just a little bit further, the function of analytics 2.0 is to provide you the context you need to understand whether to cheer or help pull a couple of guys apart. Let’s look at a different example. Say for instance that your metrics show an unusually high rate of cart abandonment in the last day. This high rate of cart abandonment also corresponds with a big uptick in chatter about your business on social media. If you aren’t looking too closely at the social sentiment (i.e. the context) of the chatter, you might read this as a win. There is social engagement about my brand!
When you finally set aside the volume of chatter to investigate the emotional tenor, you discover that your potential customers are actually vehemently complaining. Maybe you’ve got a glitch in your cart programming and customers are being overcharged on shipping? Perhaps it won’t let anyone using Firefox complete their transactions? In situations like this, context would alert you to the problem, make your data coherent, and better position you to take action.
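The cart-abandonment scenario above is a correlation check: a spike in one metric coinciding with negative tenor in another. A minimal sketch, where the threshold values are assumptions chosen for illustration rather than recommended settings:

```python
# Illustrative thresholds; real baselines would come from historical data.
ABANDONMENT_SPIKE = 1.5    # current rate at least 1.5x the baseline
NEGATIVE_SENTIMENT = -0.2  # mean sentiment score below this reads as angry

def needs_investigation(abandon_rate, baseline_rate, mean_sentiment):
    """Flag when a cart-abandonment spike coincides with negative chatter."""
    spiked = baseline_rate > 0 and abandon_rate / baseline_rate >= ABANDONMENT_SPIKE
    angry = mean_sentiment < NEGATIVE_SENTIMENT
    return spiked and angry
```

Requiring both conditions is the point of the example: volume alone (lots of chatter) or the spike alone (a one-off checkout hiccup) can each be innocent; together they are the context that turns raw numbers into an actionable alert.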
Of course, if you’ve had twenty-four hours of lag time between the problem occurring and starting to resolve it, the damage is already done. The Internet world moves quickly, and the loss of customer trust is nearly as fast when things go wrong. For your analytics to mean anything, they need to happen in real time and cue you to social sentiment. Apmetrix builds and expands its analytics solutions with these exact concerns in mind. Whether you work in mobile, digital media, gaming, television or any other entertainment space, Apmetrix has a scalable solution for your business.