The Problem: How to Provide More Granular, Store-level Foot Traffic Insights
The right data makes all the difference. Unfortunately, the data sources Olvin first worked with when building out the company’s Almanac platform were too “noisy” and inaccurate to answer a key customer question: How many people are actually setting foot in my brick-and-mortar store? “With our previous data provider, the polygons didn’t always align to where stores really were,” explained Matthew Taaffe, Olvin’s VP of Product. “It sometimes took us a month or two to import and clean the data before we could even use it.” It became clear that better (and cleaner) data was needed to build a stronger, more accurate attribution model that could deliver the granular, store-level insights Olvin’s customers wanted.
The Problem-Solver: Olvin
Olvin is a retail analytics platform, fueled by its flagship Almanac product, with an objective to “level the playing field against e-commerce and create tools that enable brick and mortar retailers and product owners to get ahead of the curve with predictive analytics.” The company is focused on tapping into the power of artificial intelligence (AI) to predict consumer behavior and demand, so that brick and mortar businesses can make decisions with confidence around labor optimization, assortment planning, and site selection. But at the end of the day, the company’s focus is all about helping marketers, planners, and merchandisers delight their customers once again.
The Challenge: Going From Aggregated Local Area Foot Traffic Insights to Store-level Insights
When Olvin was searching for new data providers, they did a simple “visits validation” test:
- Find a store location on Google Maps
- Verify whether the data actually shows the store at the same location
With the company’s original data provider, there was all too often a disconnect. Sometimes the data placed stores in the wrong location entirely, such as in the middle of a street, or, in the worst cases, showed store polygons overlapping one another. This made it virtually impossible to provide customers with store-level foot traffic insights.
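A check like this can be scripted. The sketch below is a minimal, self-contained illustration of the idea, not Olvin’s actual tooling: take a storefront coordinate verified on Google Maps and test whether it falls inside the provider’s POI polygon, using a plain ray-casting point-in-polygon test and made-up coordinates.

```python
# Minimal point-in-polygon check (ray casting), sketching the kind of
# "visits validation" test described above. Coordinates are (lon, lat).
def point_in_polygon(point, polygon):
    """Return True if `point` falls inside `polygon` (a list of vertices)."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from `point` cross this polygon edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical storefront coordinate (as read off a map) and the
# provider's POI polygon for the same store.
storefront = (-73.9855, 40.7580)
poi_polygon = [(-73.9860, 40.7575), (-73.9850, 40.7575),
               (-73.9850, 40.7585), (-73.9860, 40.7585)]

print(point_in_polygon(storefront, poi_polygon))  # True: this polygon passes
```

In practice a geospatial library would handle the geometry, but the validation logic is the same: a polygon that does not contain the storefront it claims to represent fails the test.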
“Because of the inaccuracy of the data we were working with, we couldn’t provide insights with precision,” underscored Taaffe. “We had no choice but to aggregate data to the local area, which made it impossible for us to provide foot traffic insights with any level of granularity.” But granularity is exactly what Olvin’s customers wanted. Making informed decisions around retail site selection or day-to-day store operations requires knowing not only how many customers (may) set foot in a store, but also where they’re coming from when they do.
The Solution: SafeGraph Places and Geometry Data
The three primary criteria that Olvin cared about most when choosing a new data provider were:
- Accurate polygons
- Easy to buy and download
- Reasonable cost
SafeGraph ticked all of those boxes—and more. For starters, SafeGraph’s Geometry dataset passed the “visits validation” test and made it possible for Olvin to build an attribution model that could accurately assess foot traffic at the store level. “Not to mention, SafeGraph data adheres to industry standards, like NAICS codes,” confirmed Taaffe. “This makes it a lot easier for us to work with and join to other data sources without having to do a big cleanup effort.”
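To illustrate why standard industry codes matter, here is a minimal sketch of joining POI records to another dataset on a shared NAICS code. The field names, Placekeys, and records are illustrative, not SafeGraph’s actual schema:

```python
# Illustrative POI records tagged with standard NAICS industry codes.
pois = [
    {"placekey": "zzw-222@abc", "name": "Coffee Shop", "naics_code": "722515"},
    {"placekey": "zzw-223@abc", "name": "Shoe Store",  "naics_code": "448210"},
]

# A second (hypothetical) dataset keyed by the same NAICS codes.
industry_stats = {
    "722515": {"industry": "Snack and Nonalcoholic Beverage Bars"},
    "448210": {"industry": "Shoe Stores"},
}

# Because both sides use the same standard codes, the join is a plain
# lookup, with no cleanup or re-mapping step in between.
enriched = [{**poi, **industry_stats.get(poi["naics_code"], {})} for poi in pois]
```

The point of the sketch is the absence of a translation layer: when both datasets follow the same standard, enrichment reduces to a direct key lookup.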
The Result: Getting a Competitive Edge with Future Customers
Whether it’s for trade area analysis, retail site selection, store visit attribution, store performance, or even demographic insights, getting customers to believe in Olvin’s offering wouldn’t be possible if the company couldn’t provide store-level insights with precision and granularity. “Plus, it allowed us to simplify the user experience and make it a whole lot easier for our customers to take action on the insights we provide,” reiterated Walters.
“The ability to offer our customers store-level foot traffic insights has helped us unlock new conversations,” chimed in Taaffe. “While our competitive edge has always been based on our predictive modeling and forecasting methodologies, fueled by several data sources, it’s now so much easier for our customers to perceive the value of what we offer them.”
In other words, working with the right data sources is what enabled Olvin to deliver “predictive foot traffic insights” rather than a simple summary of foot traffic.
“Before SafeGraph, we couldn’t rely on the quality of our POI data sources, which meant we couldn’t achieve our ultimate goal of building an accurate attribution model based on visits to individual stores.”