Electric Insights
APPROACH · WORKFLOW · TRANSPARENT METHODS

The Electric Insights approach.

A practical workflow for moving from reported outcomes to interpretable explanation, scenario testing, and interactive public inspection.

What standard reporting often leaves out

Many fields report a topline result without showing what helps explain it, how large the relevant shifts are in outcome-level terms, or what changes under stated alternatives.

Topline without depth

A percentage, rate, or score can be useful, but it often arrives without a disciplined public account of what helps explain it.

Technical results, weak translation

Even when a model exists, the results are often left in specialist units rather than translated into plain-language changes in the outcome.

No way to test alternatives

Readers are often left with commentary instead of tools that let them ask, compare, and inspect “what if?” scenarios for themselves.

How the workflow works

The domain can change. The workflow stays broadly the same.

1. Define the outcome

Start with the result that matters: approval, vote choice, shot success, brand choice, conversion, or another measurable outcome.

2. Measure plausible drivers

Gather the measured factors that may help explain the outcome and structure the data so those factors can be compared and modeled clearly.

3. Build an explanatory model

Fit a defensible model of the observed outcome so the analysis is not limited to isolated descriptive slices or commentary alone.

4. Translate results into outcome-level terms

Report the implications in plain units such as percentages, percentage-point shifts, probabilities, or rates rather than relying on technical coefficients alone.

5. Let users test scenarios

Turn the model into an interactive tool so users can explore how the outcome changes under alternative conditions and inspect the assumptions more directly.
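Steps 3 and 4 can be sketched in a few lines. The example below is illustrative only: it simulates a hypothetical binary driver ("economy rated good") and a binary outcome ("approves"), fits a logistic model by plain gradient ascent, and then reports the result in outcome-level terms as a percentage-point shift rather than a raw coefficient. The variable names, data, and coefficients are assumptions for the sketch, not Electric Insights outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: one binary driver and a binary outcome.
n = 5000
economy_good = rng.integers(0, 2, n)
# Simulated "truth": log-odds = -0.4 + 1.2 * economy_good
p_true = 1 / (1 + np.exp(-(-0.4 + 1.2 * economy_good)))
approves = rng.random(n) < p_true

# Step 3: fit a logistic regression by gradient ascent (no libraries needed).
X = np.column_stack([np.ones(n), economy_good])
y = approves.astype(float)
beta = np.zeros(2)
lr = 1.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += lr * X.T @ (y - p) / n

# Step 4: translate the coefficient into outcome-level terms — the
# percentage-point shift in predicted approval when the driver flips 0 -> 1.
p0 = 1 / (1 + np.exp(-beta[0]))
p1 = 1 / (1 + np.exp(-(beta[0] + beta[1])))
shift_pp = 100 * (p1 - p0)
print(f"baseline {100 * p0:.1f}% -> alternative {100 * p1:.1f}% "
      f"({shift_pp:+.1f} points)")
```

The point of the last three lines is the workflow's core translation move: a log-odds coefficient of 1.2 means little to most readers, but "roughly a thirty-point shift in approval" is a plain-language claim that can be checked and challenged.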

Why pair models with interactive tools

A model becomes more useful when people can test it. Interactive tools make assumptions visible, allow scenario comparison, and let users see how much the outcome moves under alternative conditions.

That does not replace careful explanation. It strengthens it by turning interpretation into something more inspectable, more reproducible, and more open to challenge.
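The scenario-testing layer is, at its core, a pure function from user-chosen conditions to a predicted outcome. A minimal sketch, with illustrative coefficients and driver names that are assumptions rather than a real fitted model:

```python
import math

# Assumed coefficients for a fitted logistic model (illustrative only).
COEFS = {"intercept": -0.4, "economy_good": 1.2, "incumbent_party": 0.5}

def predict(scenario: dict) -> float:
    """Predicted outcome probability under a user-specified scenario."""
    logit = COEFS["intercept"] + sum(COEFS[k] * v for k, v in scenario.items())
    return 1 / (1 + math.exp(-logit))

# Compare two scenarios the way an interactive tool would.
baseline = predict({"economy_good": 0, "incumbent_party": 1})
alternative = predict({"economy_good": 1, "incumbent_party": 1})
print(f"shift: {100 * (alternative - baseline):+.1f} points")
```

An interactive front end is essentially this function wired to sliders and toggles: because every prediction flows through one explicit set of coefficients, the assumptions stay visible and every comparison is reproducible.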

What “show the work” means here

The goal is not only to display an outcome, but to make its explanatory basis easier to examine. In practice that can include model choices, variable definitions, source-data context, and tools that let users rerun or stress-test parts of the analysis.

The result is a more visible explanatory layer rather than a headline number followed only by unsupported commentary.
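One concrete way to expose that explanatory layer is to publish a machine-readable specification alongside the headline number. The field names and values below are hypothetical, not Electric Insights' actual schema; the sketch just shows the kind of record that lets readers see model choices and variable definitions, and rerun the analysis.

```python
import json

# A hypothetical "model card" published next to the topline result.
model_card = {
    "outcome": "approves (binary survey item)",
    "model": "logistic regression",
    "drivers": {
        "economy_good": "1 if respondent rates the economy favorably",
        "incumbent_party": "1 if respondent identifies with the incumbent party",
    },
    "source": "hypothetical survey, n = 5000",
    "coefficients": {"intercept": -0.4, "economy_good": 1.2,
                     "incumbent_party": 0.5},
}
print(json.dumps(model_card, indent=2))
```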

One approach, multiple domains

The same broad workflow can be applied across public opinion, sports analytics, market research, and other decision settings.

Public Opinion

Explain approval, vote choice, and other headline survey outcomes.

Sports Analytics

Explain outcome rates under changing play or shot conditions.

Market Research

Explain brand choice, evaluation, and simulated shifts in demand.

Strategic Decisions

Explain measurable outcomes and test plausible alternatives.

See the approach in live examples

The current site demonstrates the workflow in two domains with live pages and tools.

Public Opinion

JFK Approval

A historical public-opinion case study centered on what drove Kennedy’s approval rating in November 1963 and what might have changed it.

Sports Analytics

NBA 3-Point Shots

A sports-analytics case study centered on what drove the league’s three-point make rate in the 2014–15 season and what might have changed it.

Start with a live case study

Explore the current examples to see how explanatory modeling and interactive simulation look in practice.