Research

The vast majority of people building analytics and data science processes have every intention of being good and ethical. As a result, most potentially unethical and evil processes arise in situations where that wasn't the intention. The problem is typically that proper focus and governance are not in place to keep analytics and data science processes on the side of good. On top of that, what is good and what is evil isn't nearly as clear-cut as we'd wish it to be.


Mapping an Information Economy

By Doug Mirsky, Aug 16, 2019

Available to Research & Advisory Network Clients Only

Information Economies in Organizations

The data warehouse revolution began in 1991 when Bill Inmon published Building the Data Warehouse. Inmon observed, early in that book, that every organization has a naturally occurring information economy, and that most naturally occurring information economies were inefficient, duplicative and prone to produce suboptimal decisions.

This observation of Inmon's has not gotten anywhere near the credit or attention it deserves. A decade's worth of collective practice in advanced analytics should tell us that everything we know about real-world economies applies to our information economies. There is demand for information by people and functions in an organization, and there is a supply of (some of) that information. There is (some amount of) technical and procedural infrastructure, some kind of market, to bring demand and supply together in an organized way. That "market" infrastructure is often partial, fragile, and in some cases ineffective. There are competitive alternatives (like cloud service providers and SaaS vendors), over- and under-regulation (various data governance models), excessive demand-side taxation (cost allocation strategies), failure to invest in infrastructure, and all the other elements of economies.

When organizations are planning strategy-driven large-scale advanced analytics programs, they should begin their planning by characterizing their as-is information economy.


We've had technical people focused on the ingestion and management of data for decades. But only recently has data engineering become a critical, widespread role. Why is that? This post outlines a somewhat contrarian view of why data engineering has become a critical function and how we might expect the role to evolve over time.


Graph Analytics Use Cases

By Daniel Graham, Jul 10, 2019

Available to Research & Advisory Network Clients Only

Introduction

In 1996, two computer science students, Larry and Sergey, were enthralled by the emerging internet. But finding anything on the undeveloped web was horribly difficult. Then came the "Aha!" discovery that links between web pages, like academic citations, are a proxy for popularity. If many websites "like" the same web page, that page's value to researchers is probably higher. So Larry and Sergey designed an algorithm called PageRank. It measured "link juice", the strength of the links between web pages. Google emerged from PageRank, web URLs, and an advertising business model. This article explores the incredible value of "link juice." Graph analysis turns the relational…
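The PageRank idea described above can be sketched in a few lines. The following is a minimal, illustrative power-iteration implementation over a tiny made-up four-page web graph (the pages A–D and the damping factor of 0.85 are assumptions for the sketch, not details from the article):

```python
import numpy as np

# Toy web graph: each page maps to the pages it links to.
# These four pages are purely illustrative.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

pages = sorted(links)
n = len(pages)
idx = {p: i for i, p in enumerate(pages)}

# Column-stochastic link matrix: M[j, i] = 1/outdegree(i) if i links to j.
M = np.zeros((n, n))
for src, outs in links.items():
    for dst in outs:
        M[idx[dst], idx[src]] = 1.0 / len(outs)

# Power iteration with the commonly used damping factor of 0.85.
d = 0.85
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1 - d) / n + d * M @ rank

for p in pages:
    print(p, round(rank[idx[p]], 3))
```

In this toy graph, page C ends up with the most "link juice" because three pages link to it; a page's rank depends not just on how many links it receives, but on the rank of the pages doing the linking.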


Modernizing Analytics for Law Enforcement

By Steve Shirley, Captain Steve Serrao, Robert Morison, May 29, 2019

Technologically, law enforcement is an exciting field these days. Vast new sources of electronic data and advanced analytical methods offer opportunities not only to resolve individual investigations in record time, but also to discover patterns of activity to exploit in crime prevention. To seize these opportunities, many agencies are modernizing their information and analytics platforms. To explore the pragmatic challenges and potential benefits of modernization, IIA spoke with Steve Shirley, Head of Customer Advisory for the SAS Justice and Public Safety Team, and Captain Steve Serrao, Senior Customer Advisor for the SAS Justice and Public Safety Team.


GE’s Path to Emerging Analytics Technologies

By Mano Mannoochahr, May 01, 2019

Available to Research & Advisory Network Clients Only

GE aspires to be an algorithmic business, but recognizes this transition will not occur overnight. It will occur in stages as the company develops new capabilities and implements multiple emerging technologies. This transition requires building solid foundational systems and encouraging broad experimentation and innovation using new analytics technologies.

Beyond getting experience with next-generation technologies, transitioning to an algorithmic business requires cultivating an enterprise-wide data culture and changing how people work throughout the company, particularly on the front line.


Artificial Intelligence – A Primer On Several Common Approaches

By Bill Franks, Apr 24, 2019

Available to Research & Advisory Network Clients Only

There is a lot of well-deserved hype for artificial intelligence algorithms, and for deep learning in particular. Self-driving vehicles are already being tested and rolled out into our communities, so the future is here. These cars are enabled in part by convolutional neural networks that perform object detection. There are certainly many other algorithms that are part of the self-driving process, but among the key algorithms that got us to where we are today are the convolutional neural networks explained in this research brief.
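The core operation of a convolutional neural network is the convolution itself: sliding a small kernel of weights over an image to produce a feature map. Below is a minimal NumPy sketch of that single operation, using a hand-set vertical-edge kernel on a toy 5×5 image (the image and kernel values are illustrative assumptions; in a real CNN the kernel weights are learned during training):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation, the core operation in a conv layer."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 5x5 "image": dark left half, bright right half.
image = np.array([
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
], dtype=float)

# A hand-set vertical-edge kernel; a trained network learns such filters.
kernel = np.array([
    [1, 0, -1],
    [1, 0, -1],
    [1, 0, -1],
], dtype=float)

feature_map = conv2d(image, kernel)
print(feature_map)
```

The feature map responds strongly only where the dark-to-bright boundary sits under the kernel, which is the sense in which early convolutional layers "detect" edges; deeper layers stack such filters to detect progressively more complex shapes and, eventually, whole objects.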


Portland 2019 Analytics Symposium Video: Michael Hoffman

By Michael Hoffman, Apr 17, 2019

Available to Research & Advisory Network Clients Only

Mixed Reality and Analytics

Mixed reality (XR) technology is providing quantifiable business value through multiple features and benefits, including shared 3D context, spatial mapping, data visualization, and much more. Companies are deploying XR across multiple use cases, and many of these use cases require analytics to gain insights from massive amounts of information.


Portland 2019 Analytics Symposium Video: Melanie Mitchell

By Melanie Mitchell, Apr 17, 2019

Available to Research & Advisory Network Clients Only

AI Hits The Barrier of Meaning

Hype about AI is not new. In 1965, experts predicted that by 1985 AI would be able to do anything humans could. Today, many are optimistic about AI while others want to put on the brakes. So how close are we to human-level AI?

Today, the most common form of AI is deep neural networks, which can do impressive things like object detection and tracking. Classification error rates have fallen to only 3%, while detection has improved. Combining vision with language enables systems to identify a picture and generate a caption, often with impressive results. Speech recognition and translation have improved, as has the ability of machines to answer questions. Machines have shown improved reading comprehension and the ability to play video games. This progress is why some feel AI is closer to human-level intelligence.


Portland 2019 Analytics Symposium Video: Mark Madsen

By Mark Madsen, Apr 17, 2019

Available to Research & Advisory Network Clients Only

The Black Box: Interpretability, Reproducibility, and Responsibility

Historically, a model produced a result that was interpreted by a person who made a decision. In recent years, as the amount of data and the number of decisions have grown, agency has been taken from humans and given to machines, which make decisions inside a black box. Black boxes raise issues of explainability (or interpretability), the ability to explain how a decision was made, and reproducibility, the ability to use the same data and model to reach an identical decision.

The reality is that being able to explain complex decisions is extremely difficult, and may not be necessary. Being able to reproduce decisions is also very challenging, as data, tools, software, models, and environments change. Any single change can have a ripple effect that changes everything. The real issues are trust, reliability, and repeatability, particularly in high-stakes decisions. Building trust starts with IT policies, governance, and infrastructure that preserve history and allow decisions to be understood and reproduced. This is the key to gaining trust and scaling analytics.
