Original source: SiliconANGLE theCUBE
This video from SiliconANGLE theCUBE covered a lot of ground. Four segments stood out as worth your time. Everything below links directly to the timestamp in the original video.
Most organisations treat data as a cost centre rather than a compounding asset. The casino case shows why that framing leaves measurable money on the table.
Data's Economic Value Lies in the Decisions It Supports, Not the Volume It Represents
Working with a major casino, Schmarzo found that 48.2 percent of players'-card sign-ups visited only once. Cutting that figure to 46 percent, a drop of just over two percentage points, carried a roughly $40 million impact, which in turn provided a concrete basis for deciding how much to spend acquiring third-party consumer data from sources such as Acxiom. The lesson is that data has no legible price tag in the abstract; its value emerges only when anchored to a specific decision inside a specific business initiative. The structural point compounds from there: data gathered to resolve one decision does not deplete the way capital or labour does. It can be reused across subsequent decisions in a network-effect dynamic, one plus one equalling three, then nine, a property that, Schmarzo argues, underpins the entire logic of digital business.
"I tie the value of the data back to the decisions you're trying to make, which ties back to your business initiative."
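The valuation logic above can be sketched in a few lines. The 48.2 percent one-visit rate, the 46 percent target, and the roughly $40 million impact come from the interview; the sign-up volume and per-player value below are invented for illustration (chosen so the output lands near the cited figure), not actual casino numbers.

```python
# Hypothetical sketch of decision-anchored data valuation: the worth of buying
# third-party data is bounded by the dollar impact of the decision it informs.

def retention_upside(signups, one_visit_rate, target_rate, value_per_retained):
    """Dollar impact of converting one-time visitors into repeat players."""
    converted = signups * (one_visit_rate - target_rate)  # players who now return
    return converted * value_per_retained

# Assumed inputs: 400k annual sign-ups, ~$4,545 lifetime value per retained
# player. Only the two rates are from the source; the rest is illustrative.
impact = retention_upside(
    signups=400_000,
    one_visit_rate=0.482,  # cited: 48.2% of sign-ups visit only once
    target_rate=0.46,      # cited target: 46%
    value_per_retained=4_545,
)
print(f"Decision impact: ${impact:,.0f}")  # lands near the cited ~$40M
```

Once the decision has a dollar value attached, the ceiling on what to pay for Acxiom-style data follows from it, rather than from the data's volume.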
The Biggest Barrier to Data-Driven Culture Is Hierarchy, Not Technology
Schmarzo argues that the primary obstacle blocking organisations from leveraging data is not a shortage of tools or talent but a cultural habit known as 'HiPPO', the highest-paid person's opinion overriding evidence-based deliberation. When senior figures dismiss junior ideas in open forums, creative thinking collapses and data initiatives stall. Schmarzo teaches this framework at the University of San Francisco and applies it in facilitation workshops where rank is explicitly suspended. The real question is not whether firms possess sufficient data, but whether their internal power structures allow the people closest to customers to surface decisions from that data. Small and medium-sized businesses, he observes, show the strongest big-data results precisely because they cannot afford layers of opinion-driven management to intervene.
"If you want to kill creative thinking, have some senior person go out there and say, 'That's a stupid idea' — then no one's going to volunteer anything."
Schmarzo's 'Anti-Jabberwocky' Framework Strips Big Data Back to Business Decisions
Rather than opening with technology choices — whether a company needs Apache Spark, for instance — Schmarzo's approach begins with a nine-to-twelve-month business objective that already has executive urgency behind it, then maps the three to five business functions affected, and finally identifies the specific decisions those stakeholders need to make. The book Moneyball, he notes, describes the logic better than most data-science curricula: the discipline is fundamentally about finding variables that predict performance, not about accumulating infrastructure. What this exposes is that the industry's tendency to lead with jargon — what he names the Jabberwocky strategy — actively prevents purchase decisions, because customers buy on understanding, not confusion. Anchoring the conversation in outcomes, he argues, unlocks unconventional data sources such as building permits or property listings as legitimate analytical inputs.
"There's a belief that if I can confuse my customer enough they'll actually buy from me — when in reality customers don't buy in confusion, they buy when they have understanding."
Data Science Shortage Overstated When Treated as a Team Discipline, Schmarzo Argues
Rather than chasing what Schmarzo calls the 'unicorn data scientist', a single individual expected to master modelling, engineering, domain knowledge, and communication simultaneously, he advocates assembling a team in which each role plays to its strength: data engineers manage pipelines, subject-matter experts supply the business questions, and data scientists focus on identifying which variables carry genuine predictive power. The structural advantage of a data lake over a traditional data warehouse in this model is speed of iteration: a hypothesis about whether, say, property-price data predicts student classroom performance can be tested in hours rather than the two to three months a warehouse schema build would require. The real question is not whether data-science talent is scarce, but whether organisations are structuring the work in ways that make any one person's absence a bottleneck.
"The data lake allows me to have that very rapid environment so that the business users aren't throwing an idea out and then going away for two or three months while you build a data warehouse to answer that question."
Summarised from SiliconANGLE theCUBE · 19:25. All credit belongs to the original creators. Streamed.News summarises publicly available video content.