
How Big Data Can Be A Big Problem - Karthik Ramasamy

Data is considered a business’s most valuable asset, so it’s understandable that organizations often strive to collect as much of it as they can. The more data you have, the more opportunity for insight, right? That mentality has spawned the era of big data: the attempt to analyze and extract patterns, trends or associations from data sets too large to handle through traditional computation. Yet in my experience working for a company that specializes in real-time data, I’ve found that for many organizations, the fixation on endlessly collecting more and more data in the belief that it will magically generate value can be a fool’s errand. Simply accumulating every bit of data available to the business isn’t always the key to better results; it can actually create serious obstacles when the business later tries to derive value from what it has collected.

The more data businesses have, the more difficult and time-intensive it is to process and analyze it. This added complexity can delay crucial decisions and actions, delays that will ultimately hurt the business. It also puts the most predictive and important data — recent data — further out of reach as it’s stored for review at a later date.

Here’s a simple thought experiment: Am I better able to make a decision or take advantage of an opportunity if I know what’s happening right now, or what was happening at some arbitrary point weeks, months or years ago? In most instances, it’s that immediacy that offers the most valuable insight. And anything that hinders fast decisions based on “right now” data hampers success. For many organizations, this requires a 180-degree turn in thinking. Instead of accumulating more data and hoping to derive value later, businesses should focus on getting immediate value out of key information as it streams into the organization. Acting on this fast data can lead to real progress toward specific business objectives, but it requires teams to approach their data in new ways.

Determine what data needs to be processed immediately.

To uncover data that deserves immediate attention, analysts need to go beyond just looking at historical trends and start asking: What can I do if I know current conditions? Often this means processing data before it is stored, or forgoing storage altogether. For example, consider the weather. Big data trends might dictate that spring retail fashions typically generate their best returns when they enter inventory as early as March and make way for fall items in the middle of the summer. Yet by knowing the weather at the present moment where an online shopper is located, a savvy retailer can react to a sudden late-summer heatwave immediately, promoting hot-weather items (fans, A/C units, shorts, T-shirts and so on) at the front of that shopper’s search results.
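
To make that concrete, here is a minimal Python sketch of the weather example. Everything in it is hypothetical: the live temperature lookup, the product categories and the 95°F heatwave cutoff are stand-ins for whatever a real retailer’s systems provide. The point is that the ranking decision consumes a live reading rather than a stored record.

    # Hypothetical sketch: boost hot-weather items when the shopper's local
    # temperature spikes. The catalog categories, threshold and temperature
    # lookup are all assumed for illustration.
    HOT_WEATHER_ITEMS = {"fans", "a/c units", "shorts", "t-shirts"}
    HEATWAVE_THRESHOLD_F = 95  # assumed cutoff for a "sudden heatwave"

    def rank_results(results, shopper_location, get_current_temp_f):
        temp = get_current_temp_f(shopper_location)  # live reading, not storage
        if temp < HEATWAVE_THRESHOLD_F:
            return results  # normal, historically tuned ranking
        # Promote heatwave-relevant items to the front of the search results.
        hot = [r for r in results if r["category"] in HOT_WEATHER_ITEMS]
        rest = [r for r in results if r["category"] not in HOT_WEATHER_ITEMS]
        return hot + rest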

A shift in thinking about the value of “now” will also help data teams better balance complexity and speed. Very sophisticated data models can take time to run, slowing down even the fastest data. Is that delay warranted, or is speed the critical factor? The right answer will depend on the specifics of the business, but if your team is not asking the question, you could be missing opportunities to drive growth.
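
One way to make that question operational is to encode the latency budget directly: try the sophisticated model first, and fall back to a fast heuristic if it cannot answer in time. The sketch below is hypothetical; the 50-millisecond budget and both scoring functions are assumptions, not a prescription.

    # Hypothetical sketch: bound decision latency, trading model
    # sophistication for speed only when the budget is exceeded.
    import concurrent.futures

    LATENCY_BUDGET_SECONDS = 0.05  # assumed tolerance for a "right now" decision
    _pool = concurrent.futures.ThreadPoolExecutor(max_workers=4)

    def score(event, slow_model, fast_heuristic):
        future = _pool.submit(slow_model, event)
        try:
            # Use the sophisticated answer if it arrives within the budget.
            return future.result(timeout=LATENCY_BUDGET_SECONDS)
        except concurrent.futures.TimeoutError:
            # Speed is the critical factor here: answer from the heuristic now.
            return fast_heuristic(event)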

Build analytics that know how to incorporate the most recent data.

Businesses can start by reviewing the analytical models that already exist within their organization. The key is acknowledging that not every business interaction requires an immediate response. Some of the best candidates will be those where an understanding of current conditions can improve the accuracy and predictability of a subsequent action. For example, industrial IoT sensors may indicate a deviation or impending failure that requires immediate attention; knowing that condition X is likely to lead to failure Y in the short term makes that signal a candidate for fast action, not later analysis.
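
A hypothetical Python sketch of that fast path follows. The vibration limit and the maintenance hook are invented for illustration; the shape is the point, with each reading checked against the known condition-X pattern the moment it arrives, before it is archived for later analysis.

    # Hypothetical sketch: act on a known "condition X precedes failure Y"
    # pattern as each reading arrives. Threshold and hooks are assumed.
    VIBRATION_LIMIT = 7.0  # assumed deviation that historically precedes failure

    def handle_reading(reading, trigger_maintenance, archive):
        if reading["vibration"] > VIBRATION_LIMIT:
            # Fast path: current conditions predict a short-term failure.
            trigger_maintenance(reading["machine_id"], reading)
        archive(reading)  # historical analysis can still happen later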

The goal should be to identify where the payoff will be the greatest. There are many other systems related to customers, logistics, operations, supply chain and more that teams may want to optimize. Fast data can make each drastically more valuable when it leads to dynamic decisions based on current information rather than on static, predetermined actions based on historical trends.

Leverage the right technology to enable fast data processing and analytics.

The big data era recognized that scale had made previous data processing approaches inappropriate; the pursuit of fast data likewise requires new systems and approaches built to handle data velocity. The right technology underpinnings are needed to transform, process, analyze and distribute data in motion. One key consideration is adopting a modern, cloud-native approach, for two reasons. First, in today’s world much fast data is cloud data, and it is best handled in its native environment for speed and flexibility. Second, the move to fast data often entails rapid iteration, massive fluctuations in scale (both up and down), dynamic integration of multiple services and reconfiguration on the fly, all of which must happen without disruption to the overall business. Those are areas where cloud-native approaches excel, while legacy hardware-intensive solutions were simply not designed with this dynamic use case in mind.
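
In code, that shape often reduces to a small, stateless handler that the platform can replicate as volume fluctuates. The sketch below is generic by design; read_stream and publish are hypothetical stand-ins for whatever messaging layer the business runs, not any particular product’s API.

    # Hypothetical sketch: transform, process, analyze and distribute data in
    # motion. Because the handler keeps no local state, a cloud-native platform
    # can scale copies of it up and down without disrupting the business.
    def process_in_motion(read_stream, publish, transform, analyze):
        for event in read_stream():    # data arrives continuously
            record = transform(event)  # normalize/enrich on the fly
            insight = analyze(record)  # act on current conditions
            if insight is not None:
                publish(insight)       # distribute downstream immediately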

In many instances, the siren call of big data has led companies on a never-ending pursuit of more and more data as a goal in and of itself. Yet, by instead focusing on their most recent and relevant data, businesses can simplify their overall data infrastructure while making analytics more accurate and predictive. The potential benefits are myriad, from new revenue opportunities to more efficient operations to more relevant (and thus better) customer interaction. Changing how teams approach data is difficult, but leadership and analysts will be relieved to learn that in the case of fast data — less is more.

(Forbes)
