Big Learnings on ‘Small Data’ Modeling: The Perils and Payoffs of Doing More with Less
Susan S. McDonald, Ph.D.
While many of us are awash in Big Data, some business environments face the opposite problem: data scarcity. Because they have relatively few customers and a shortage of research respondents, they also have a shortage of judgments on which to model market outcomes and predict product demand. This dilemma is common in B2B markets, and in scenarios that call for granular models of different customer groups at a regional level. Alternatively, businesses with access to larger samples may want to leverage early-stage qualitative samples to build a more disciplined understanding of customer tradeoffs.
NAXION has been doing experimental work to develop models based on small samples, mapping the latitudes and limitations of sample scarcity, and identifying procedures that can shore up models constrained by limited sample size and a limited base of judgments. Key factors that influence the viability of small-data modeling include the number of attributes and levels, the type of attributes, the kind of model chosen, and the availability of supplemental data to bolster utilities. To learn more about what we’ve done and how that learning may be relevant for your market, see excerpts from a recent talk or contact us.
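To make the interplay of attributes, levels, and sample size concrete, here is a minimal sketch using a common rule of thumb from the choice-based conjoint literature (the n·t·a/c ≥ 500 heuristic often attributed to Johnson and Orme). The design parameters below are illustrative assumptions only, not a description of NAXION's method:

```python
import math

def min_sample_size(levels_per_attribute, tasks, alternatives, target=500):
    """Rule-of-thumb minimum respondents for a choice-based conjoint study,
    per the n * t * a / c >= 500 heuristic: each level of the attribute with
    the most levels (c) should be seen roughly `target` times in total across
    all respondents (n), choice tasks (t), and alternatives per task (a)."""
    c = max(levels_per_attribute)  # attribute with the most levels drives the bound
    return math.ceil(target * c / (tasks * alternatives))

# Hypothetical design: 4 attributes with 3, 3, 4, and 2 levels;
# 10 choice tasks per respondent, 3 alternatives per task.
n = min_sample_size([3, 3, 4, 2], tasks=10, alternatives=3)
print(n)  # minimum respondents suggested by the heuristic
```

Note how the bound scales with the largest attribute's level count: trimming one four-level attribute to three levels lowers the required sample, which is one reason attribute and level counts matter so much when respondents are scarce.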