Big Data: A Game of One Many Times Over

Posted by Beth Negus Viveiros

All by itself, big data doesn’t matter. What matters is the application of big data to accomplish business objectives.

Technologists point to very large numbers: the unprecedented amount of data accessible to companies today and the availability of low-cost processing to interpret it. Pundits point out that much of the data is noise, and that demand for data scientists so exceeds supply that companies need to figure out how to create data scientists out of the people they've got. And yet executives know that harnessing the strategic value of big data can deliver competitive advantages.
 
How do you reconcile the sheer volume of data with a limited budget while still delivering the personalized approach needed to attract a diverse set of individuals with specific demands? Many companies have been thinking about it in the wrong way. Some frame it as a "quantity vs. quality" issue, but all data is potentially quality data because it pertains to someone who could become a customer. My $20 for a shirt is just as valuable as my neighbor's, but the way we get there will almost certainly be different.
 
Big data strategies have evolved past a game of mass numbers. It's a game of many "ones." Effective strategies require engaging individuals in the way they want to be engaged. Yet many companies focus on how to collect the largest possible data set, when instead they should be thinking about how to sift out the noise and find the signals that drive more content engagement and brand attention. At this massive scale, however, machines are required to act, because there aren't enough resources, whether budget or data scientists, armed with advanced analytics to continuously iterate on constantly changing data.
 
With slim margins for error, and with competitors probably looking at similar data sets, the most successful big data strategies have systems at the ready to interpret intent and respond with relevant content in real time, not just when resources are available. Let's face it: no matter how much data you have and how many smart people you have to respond, you're probably too late. Ask Ken Jennings and Brad Rutter after facing IBM's Watson on "Jeopardy." You need the intuitive characteristics of artificial intelligence, such as natural language processing, automatically adjusting feedback loops and an almost infinite set of "if this, then that" approaches. That's useful big data.
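To make "automatically adjusting feedback loops" concrete, here is a minimal, purely illustrative Python sketch. The names (FeedbackLoop, the banner variants, the "shirt-intent" signal) are hypothetical and not anyone's actual system; the self-adjusting piece is a simple epsilon-greedy loop that keeps nudging the "if this, then that" rule toward whichever content has actually earned clicks.

```python
# A hypothetical "if this, then that" rule with a self-adjusting feedback loop.
import random
from collections import defaultdict

class FeedbackLoop:
    """Tracks serves and clicks per (signal, content) pair and prefers the
    variant with the best observed click rate, while still exploring."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon            # how often to explore a random variant
        self.shown = defaultdict(int)     # times a (signal, variant) pair was served
        self.clicked = defaultdict(int)   # times that pair earned a click

    def choose(self, signal, variants):
        # Occasionally explore so new content gets a chance to prove itself.
        if random.random() < self.epsilon:
            return random.choice(variants)
        # Otherwise exploit the variant with the best observed click rate.
        def rate(variant):
            shown = self.shown[(signal, variant)]
            return self.clicked[(signal, variant)] / shown if shown else 0.0
        return max(variants, key=rate)

    def record(self, signal, variant, clicked):
        self.shown[(signal, variant)] += 1
        if clicked:
            self.clicked[(signal, variant)] += 1

loop = FeedbackLoop()
variants = ["summer-sale banner", "free-shipping banner", "new-arrivals banner"]

# "If this, then that": if the visitor's behavior signals shirt-buying intent,
# then serve whichever banner the feedback loop currently favors.
signal = "shirt-intent"
choice = loop.choose(signal, variants)
loop.record(signal, choice, clicked=True)   # simulated click feedback
```

The point of the sketch is the loop, not the rule: the system re-decides every time new feedback arrives, which is exactly what a human team cannot do continuously at scale.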
                                                                                                       
Want an example? Consider traffic, a headache for most Americans twice a day. Waze is a popular mobile app not because it collects a ton of historical traffic data. People use Waze because it crunches real-time data and helps them find the fastest route from point A to point B. Many Waze users even help keep the data current by giving real-time reports of accidents, traffic jams or road hazards.
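To see why the real-time part matters more than the size of the archive, here is a small Python sketch, assuming a toy road graph with made-up travel times (this is not Waze's actual algorithm or data): Dijkstra's algorithm picks the fastest route, and a single live report changes the answer.

```python
# A toy version of real-time routing: fastest path over a graph whose edge
# weights (travel times in minutes) can be updated by live reports.
import heapq

def fastest_route(graph, start, goal):
    """Return (total_minutes, path) for the quickest route from start to goal."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        minutes, node, path = heapq.heappop(queue)
        if node == goal:
            return minutes, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, cost in graph.get(node, {}).items():
            if neighbor not in seen:
                heapq.heappush(queue, (minutes + cost, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical road network: travel time in minutes between intersections.
roads = {
    "A": {"B": 5, "C": 10},
    "B": {"D": 12},
    "C": {"D": 4},
    "D": {},
}

print(fastest_route(roads, "A", "D"))   # (14, ['A', 'C', 'D'])

# A driver reports a crash on the C -> D segment; one weight update flips the route.
roads["C"]["D"] = 30
print(fastest_route(roads, "A", "D"))   # (17, ['A', 'B', 'D'])
```

The historical data set never changed; one fresh report did, and that is what flips the recommended route.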
 
On the business side, consider a company like TellApart, which uses a big data approach for ad retargeting, or our firm BloomReach, which applies consumer intent data to deliver timely content to online shoppers via SEO.
 
Huge ROI can be attained by making things relevant to a consumer, and yes, it still takes huge amounts of data, but not in blocks. Instead, think of it as sifting through sand on a beach with a tool that adjusts to the individual shapes and sizes of the grains, not shoveling it up with a backhoe.
 
Joelle Kaufman is head of marketing for BloomReach.
