(Forbes) - I’m not a fan of the word “governance” in the context of big data and
the IT policies that “govern” how we use it. The term governance comes
across as boring and maybe even a bit oppressive, not unlike how many
people think of actual governments – especially the dystopian regimes
where citizens are stifled by excessive laws and overzealous enforcement
of all those rules.
Some data-driven organizations indeed cling to heavy-handed IT
policies that hark back to the regimented Big Brother–style oversight George
Orwell wrote about in his famous novel, 1984. I recently re-read that
book and the concept of “Newspeak” jumped out at me: the strictly
enforced, limited vocabulary that kept the novel’s citizens from
thinking new or creative thoughts by removing the very words that
enable them. It brought to
mind some enterprise environments where analytics are locked down by
stern limits on the kinds of questions users are able to ask of their
data.
On the flip side, having no rules at all in your organization can
lead to a state of anarchy and mayhem that is just as bad as having too
many rules. Most people would agree that analytics requires a certain
amount of quality assurance and reliability validation, along with
sensible privacy policies, security protocols and other measures that
all fall under the rubric of governance.
That’s why we need to start looking at governance as a necessary
evil. Striking the right balance between access and control, however,
can be tricky. By far the most burdensome governance hurdles many users
face involve data access and security, and I’ve personally found that
these hurdles are most challenging when organizations fail to
distinguish discovery processes and environments from operational ones.
Data security policies, if poorly designed, can have a far bigger
impact on the ability to discover new insights than anything else.
After all, if data is not accessible, it can’t be analyzed. So a core
group of analytics professionals tasked with discovering and exploring
innovative new processes must be highly trusted within an organization,
with broad license to mix and match data. At the same time, these
creative data artists (a term, by the way, that I prefer to “data scientist”)
need to understand corporate rules and other limits on what’s
ultimately allowed in the production environment. That way, they can
take these limits into account as they work in discovery mode and
think from the start about how their approach might need to change in
an operational context.
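To make that concrete, here is a rough sketch of one way that balance might look in practice: the production rules are written down as data, so the trusted discovery team can mix and match broadly while still seeing exactly which limits will apply once their work moves to production. Everything below is illustrative rather than a prescription; the environment names, column names and policies are hypothetical.

from dataclasses import dataclass, field

@dataclass
class AccessPolicy:
    # Which environment this policy applies to, and which columns are
    # off-limits entirely versus readable only in masked (hashed) form.
    environment: str
    blocked_columns: set = field(default_factory=set)
    masked_columns: set = field(default_factory=set)

# Hypothetical policies: discovery gets wider access but still hides raw
# identifiers; production locks things down further.
POLICIES = {
    "discovery": AccessPolicy("discovery",
                              blocked_columns={"ssn"},
                              masked_columns={"customer_id", "email"}),
    "production": AccessPolicy("production",
                               blocked_columns={"ssn", "email"},
                               masked_columns={"customer_id"}),
}

def allowed_columns(requested, policy):
    # Return the columns an analyst may read, and which of those arrive masked.
    readable = [c for c in requested if c not in policy.blocked_columns]
    masked = [c for c in readable if c in policy.masked_columns]
    return readable, masked

if __name__ == "__main__":
    requested = ["customer_id", "email", "ssn", "purchase_total"]
    for env, policy in POLICIES.items():
        readable, masked = allowed_columns(requested, policy)
        print(f"{env}: readable={readable}, masked={masked}")

The point is less the code than the habit: when the production limits live in one explicit, shared policy, the discovery team can roam widely without being surprised at deployment time.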
Good governance goes even further, however. Regardless of how you
massage corporate policies within your organization to give wiggle room
for your discovery team, you must always make sure to follow all
applicable laws. Strict legal limits exist, for example, on medical and
credit card data, and the distinction between discovery and production
modes is going to be lost on hard-nosed outside investigators. Beyond the
legal strictures, you also have to consider the “ick factor” of
prevailing norms and expectations among your customer base. No matter
how legal your analytics may be, it’s still a losing strategy if you
manage to creep out your customers through what might be perceived
as intrusive or indiscreet handling of sensitive information.
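As one small illustration of what that can look like at the data layer, here is a hypothetical sketch in which regulated card numbers are swapped for salted, irreversible tokens before the data ever lands in the discovery environment, so analysts can still count, join and de-duplicate without handling the raw values. The field names and key handling are invented for illustration, and none of this is a compliance recipe.

import hashlib
import hmac

# Assumption: in real life this key would live in a secrets manager, not in code.
SECRET_KEY = b"rotate-me-and-keep-me-out-of-source-control"

def tokenize_card_number(card_number: str) -> str:
    # Replace a card number with a stable, irreversible token so analysts can
    # still count, join and de-duplicate without ever seeing the raw value.
    digest = hmac.new(SECRET_KEY, card_number.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

def scrub_record(record: dict) -> dict:
    # Copy a transaction record, swapping the regulated field for its token
    # before the data reaches the discovery environment.
    clean = dict(record)
    if "card_number" in clean:
        clean["card_token"] = tokenize_card_number(clean.pop("card_number"))
    return clean

if __name__ == "__main__":
    raw = {"card_number": "4111111111111111", "amount": 42.50, "merchant": "coffee shop"}
    print(scrub_record(raw))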
For all these reasons, I think we need to make peace with the term
“governance” and realize that it’s an undue burden only when an
organization makes it so. The best approach is to pursue a unified
analytics environment – one well suited for the analytics revolution
– that recognizes the unique dynamics of both the discovery and
production modes and makes the transition between the two settings as
seamless and trouble-free as possible. If the discovery and deployment
environments are integrated and consistent, you’ll save a lot of
headaches and realize value faster with whatever exciting new analytics
processes you plan to develop and deploy.
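To close with a concrete, entirely hypothetical sketch of what “integrated and consistent” can mean in code: the same pipeline function runs in both settings, and only a small, declarative config differs between discovery and production. The paths, names and settings below are invented for illustration; the point is that promotion becomes a config change rather than a rewrite.

from dataclasses import dataclass

@dataclass(frozen=True)
class EnvConfig:
    # The only things that differ between settings: where data comes from,
    # how much of it we touch, and whether results are persisted.
    name: str
    input_path: str
    sample_fraction: float
    write_output: bool

CONFIGS = {
    "discovery":  EnvConfig("discovery",  "data/sandbox/orders.csv",   0.05, False),
    "production": EnvConfig("production", "data/warehouse/orders.csv", 1.00, True),
}

def run_pipeline(rows: list, cfg: EnvConfig) -> list:
    # The analytic logic itself is identical in both environments; only the
    # config changes when a discovery project is promoted to production.
    keep = max(1, int(len(rows) * cfg.sample_fraction))
    sampled = rows[:keep]
    return [{**row, "processed_in": cfg.name} for row in sampled]

if __name__ == "__main__":
    fake_rows = [{"order_id": i, "total": 10.0 * i} for i in range(1, 101)]
    for env, cfg in CONFIGS.items():
        out = run_pipeline(fake_rows, cfg)
        print(f"{env}: processed {len(out)} of {len(fake_rows)} rows")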