Research

CAO Perspectives: Ideal Analytics Organization

By Doug Hague, Nov 13, 2019

Available to Research & Advisory Network Clients Only

To set the stage, the analytics organizational structure I’m presenting below pertains to an analytics organization of between 60 and 120 people; this size seems to be the sweet spot for an effective and efficient team (large enough to have specialized skill sets, but small enough to effectively demonstrate the benefits of the team). Moreover, I’m presenting this organizational design with an established, traditional corporation in mind, not a digital native. Digital natives’ organizations break down differently, with a greater need for data science and data management. With 60 to 120 people, I prefer a centralized organization with P&L Analytics/Ad Hoc Analysis dotted-lined to their business partners.

The Analytics of Things

By Bill Franks, Oct 16, 2019

The Internet of Things (IoT) has exploded in recent years due to the dropping costs of sensors, network bandwidth, and data storage. What wasn’t economical a decade ago is now compellingly cheap. As a result, sensors are turning up in more and more places and are generating more and more data. The problem is that, as always, generating a bunch of data doesn’t by itself provide any value. What provides value are the analytics using that data. In the case of IoT, we will call these analytics the Analytics of Things (AoT). This research brief will dive into a number of important considerations when analyzing sensor data, with discussions of success stories, governance, security, technology, and the analytical methods that can be applied to IoT data.

Everything You Wanted To Know About Containers But Were Afraid To Ask

By Jesse Anderson, Oct 01, 2019

Containers are repositories that hold everything required to run a microservice, software process, application, or data analytics program: executables, binary code, libraries, and configuration files. This could include Python or Java code, plus dependencies such as Python modules, JAR files (for Java), interpreters, security software, and the Secure Sockets Layer (SSL).

A container should not access anything outside itself; it is self-sufficient and functions outside of the network. Using a software version or piece of code that doesn’t reside within the container causes leakage. The problem with leakage is that anything located on the host operating system—that is, outside the container—is subject to change.
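This self-containment can be sketched as an image definition. Below is a minimal, hypothetical Dockerfile for a Python analytics job (file names and dependencies are illustrative, not from the article); everything the program needs, interpreter, libraries, code, and configuration, ships inside the image, so nothing leaks in from the host:

```dockerfile
# Hypothetical self-contained image for a Python analytics job.
FROM python:3.11-slim          # interpreter baked into the container
WORKDIR /app
COPY requirements.txt .        # pinned dependencies (e.g. pandas, scikit-learn)
RUN pip install --no-cache-dir -r requirements.txt
COPY score.py config.yaml ./   # the program and its configuration
CMD ["python", "score.py"]     # nothing on the host is needed at run time
```

Because every layer is declared in the file, the same image runs identically on any host, which avoids the leakage problem of depending on software that lives outside the container.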

A Framework For Analytical Approaches

By Elliot Bendoly, Sep 20, 2019

Any effective journey in analytics involves multiple touch points. Multiple stages of understanding. Multiple vantage points through which data is considered, and extracted intelligence scrutinized. Each step we take is designed to help fill in the blanks. Part of this effort involves outlining the structure of the problems we face, while the rest involves identifying strong solutions to those problems.

It’s a bit like putting together a complex puzzle … of an intricate maze … with multiple possible exit points. On top of that, not all of the pieces are immediately available; some never will be, and others will be blurry at best. And in contrast to regular puzzles, there aren’t really any obvious borders.

The Ethics of Analytics

By Bill Franks, Sep 12, 2019

The ethics of analytics are receiving more and more attention today. Historically, the only aspect of ethics that received any substantive attention was the privacy of sensitive personal data. The broader aspects of ethics didn’t truly come to the forefront until late 2017 and early 2018.

What’s driving the sudden focus on ethics are the new, evolving artificial intelligence (AI) capabilities as well as the embedding and operationalizing of analytics as discussed in The Analytics Revolution. These two trends involve analytics making a huge number of automated decisions for us. Therefore, people want to understand what the algorithms are doing, how they’re doing it, and how we can know they are sufficiently ethical.

Seven Steps to Implement DataOps

By Christopher Bergh, Sep 06, 2019

The speed and flexibility achieved by Agile and DevOps, and the quality control attained by statistical process control (SPC), can be applied to data analytics. Leading-edge proponents of this approach call it DataOps. Simply stated, DataOps is Agile development and DevOps with statistical process control, applied to data analytics. DataOps applies Agile methods, DevOps practices, and manufacturing quality principles, methodologies, and tools to the data-analytics pipeline. The result is a rapid-response, flexible, and robust data-analytics capability that can keep up with the creativity of internal stakeholders and users.

Mapping an Information Economy

By Doug Mirsky, Aug 16, 2019

Information Economies in Organizations

The data warehouse revolution began in 1991 when Bill Inmon published Building the Data Warehouse. Inmon observed, early in that book, that every organization has a naturally occurring information economy, and that most naturally occurring information economies are inefficient, duplicative, and prone to produce suboptimal decisions.

This observation of Inmon’s has not gotten anywhere near the credit, or attention, it deserves. A decade’s worth of collective practice in advanced analytics should tell us that everything we know about real-world economies applies to our information economies. There is demand for information by people and functions in an organization, and there is a supply of (some of) that information. There is (some amount of) technical and procedural infrastructure, some kind of market, to bring demand and supply together in an organized way. That “market” infrastructure is often partial, fragile, and in some cases ineffective. There are competitive alternatives (like cloud service providers and SaaS vendors), over- and under-regulation (various data governance models), excessive demand-side taxation (cost allocation strategies), failure to invest in infrastructure, and all the other elements of economies.

When organizations are planning strategy-driven large-scale advanced analytics programs, they should begin their planning by characterizing their as-is information economy.

Creating a Data Engineering Culture: What It Is, Why It’s Important, and How (and How Not) to Build One

By Jesse Anderson, Jul 31, 2019

Why do some analytics projects succeed while so many fail? According to Gartner analyst Nick Heudecker, as many as 85% of big data projects fail. However, the ROI from the other 15% that succeed is incredibly promising. With such a high barrier to competency in executing big data strategies, there remains significant first-mover opportunity for enterprises that can crack the code and improve their outcomes.

So, what can organizations do to increase their chances of big data success? Part of the answer lies in creating a data engineering culture. Such a culture is the foundation that underpins big data analytics proficiency and enables companies to outperform the competition.

Mastering the Art & Science of Storytelling

By Brent Dykes, Jul 26, 2019

Analytics experts love data. But just presenting raw data, or even insights derived from data, isn’t good enough. Creating business value from data requires that analytics professionals develop skills in data storytelling. This entails telling persuasive stories, tailored to a specific audience, that combine data, narrative, and visuals effectively.

Why Storytelling?

Human beings love stories. In fact, author Philip Pullman has written, “After nourishment, shelter, and companionship, stories are the thing we need the most in the world.” And scriptwriting expert Robert McKee has said, “Storytelling is the most powerful way to put ideas into the world today.”

Graph Analytics Use Cases

By Daniel Graham, Jul 10, 2019

Introduction

In 1996, two computer science students, Larry and Sergei, were enthralled by the emerging internet. But finding anything on the undeveloped web was horribly difficult. Then came the “Aha!” discovery: academic-style web page citations (URLs) are a proxy for popularity. If many websites “like” the same web page, that page’s value to researchers is probably higher. So Larry and Sergei designed an algorithm called PageRank. It measured “link juice,” the strength of the links between web pages. Google emerged from PageRank, web URLs, and an advertising business model. This article explores the incredible value of “link juice.” Graph analysis turns the relational…
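The “link juice” idea can be sketched in a few lines: each page’s rank is the share of attention a random surfer would give it, with every page passing its rank along its outbound links. A minimal power-iteration sketch follows (the toy graph and function names are illustrative, not from the article):

```python
# Minimal PageRank sketch via power iteration on a toy link graph.
# damping=0.85 is the value used in the original PageRank paper.

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}   # random-jump share
        for p, outlinks in links.items():
            if outlinks:
                share = damping * rank[p] / len(outlinks)
                for q in outlinks:              # pass "link juice" downstream
                    new[q] += share
            else:
                for q in pages:                 # dangling page: spread evenly
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Toy web: pages A, B, and D all cite C, so C accumulates the most rank.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(graph)
```

Heavily cited pages end up with the highest scores, which is exactly the popularity proxy the teaser describes.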
