For advertisers and marketers, creating the right mix of location signals is a crucial component of every successful campaign, and it’s critical to understand how many devices, and how many signals of what kind, it takes to drive those campaigns. In this installment of Verve 101, the focus is on signal density and how to achieve it with high-quality data.
Signal Density: Basic Benchmarks
Signal density captures how much data advertisers and marketers can gather about a device: ideally, signals for every hour of the day and every day of the month, month after month.
When we talk about signal density for targeting reach, advertisers need access to hundreds of millions of devices. For movement insights, they require location data from tens of millions of users supplied by apps with trusted software development kits (SDKs).
Note, though: only data of the highest quality yields actionable value from analysis. The next section turns to how advertisers and marketers, once they reach a density benchmark, can assure the data quality necessary for successful campaigns.
Identifying Foreground and Background Data
To start, a signal-dense approach requires both foreground and background data.
- Foreground data is data collected by an app while it is being used by the consumer. Ad impression data is one example of foreground data. It tends to be sparse, creating challenges around signal density.
- Background data is device data collected when an app is not in active use. Background data also includes gaps, but since it is not reliant on the active use of apps on the device, it can provide a more complete view of a device’s journey through space and time.
The underlying point is that while not all data is created equal, a healthy mix of data types gives marketers and their advertising partners complementary strengths. To reach the highest bar, stakeholders should draw on an assortment of sources, including cell towers, GPS, Wi-Fi, IP addresses, and beacons. It’s important to build signal density across this spectrum of data types while acknowledging their differing degrees of precision and accuracy.
- Cell-tower data, for example, is imprecise. IP-address-based locations are often inaccurate enough to place a device in a different US state from its actual position.
- Wi-Fi service set identifier (SSID) and beacon data don’t always offer coverage at scale, and can be inaccurate if the data partner does not know where a beacon is located. This often happens when the partner lacks a direct relationship with the store placing the beacons; such sources also fall back on probabilistic calculations. As always, direct partnerships work to advertisers’ and marketers’ advantage in the marketplace.
- Although the precision and accuracy of GPS signals are affected by the environment, GPS location is still the most commonly available and has the largest geographic coverage.
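As a rough illustration of the precision spectrum above, a filtering step might weight incoming signals by source type and discard the least precise ones. The weights, names, and threshold below are illustrative assumptions for the sketch, not Verve benchmarks or published figures.

```python
from dataclasses import dataclass

# Assumed, illustrative precision weights per source type (0.0-1.0);
# GPS and beacons rank highest, IP and cell-tower data lowest.
SOURCE_PRECISION = {
    "gps": 0.9,
    "beacon": 0.8,
    "wifi": 0.6,
    "ip": 0.2,
    "cell_tower": 0.1,
}

@dataclass
class LocationSignal:
    device_id: str
    source: str  # one of the SOURCE_PRECISION keys
    lat: float
    lon: float

def precision_weight(signal: LocationSignal) -> float:
    """Return the assumed precision weight for a signal's source."""
    return SOURCE_PRECISION.get(signal.source, 0.0)

signals = [
    LocationSignal("abc", "gps", 40.71, -74.00),
    LocationSignal("abc", "ip", 40.75, -73.99),
]
# Keep only signals above a minimum precision threshold (0.5 here).
usable = [s for s in signals if precision_weight(s) >= 0.5]
```

In practice a pipeline would blend rather than drop lower-precision sources, since even imprecise signals contribute to density; the threshold is only a stand-in for that trade-off.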
And so, if advertisers focus on signal-dense background GPS data, they can further leverage the advantages of signal density by making certain that their data comes from trusted SDKs and/or server-to-server sources that allow them to filter for, and access, the most precise and accurate information.
Scale, of course, is always a consideration as well, but when augmenting a healthy signal density with more expansive data sets, marketers must take extra care. Exchange-based location events, in particular, typically represent app and/or device-in-use ad calls (i.e., foreground data), and they often lack accuracy. Exchange data can be adequate for activating intended or lookalike audiences, but it will always need additional evaluation and cleaning to assure authenticity and value.
Boosting Signal Density: Cautions and the Need for a Scoring System
In the end, for a brand, all data types and signals should add up to a kind of score. Using device information such as timestamps, the location of consecutive signals, and speed (if available), several models should be applied to remove bad and fraudulent devices. These models include non-human request frequency and patterns, the relationship of device locations and IP addresses, and the regularity of signal distribution throughout the day. Also, device data should be run against a lifetime model to flag devices with a very low probability of appearing again, i.e., fictitious devices.
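One of the checks described above, relating timestamps to the locations of consecutive signals, can be sketched as a simple impossible-travel filter: if two back-to-back signals imply a speed no real device could reach, the device is flagged. The speed cutoff and function names are illustrative assumptions for this sketch, not a description of any vendor's actual model.

```python
import math
from datetime import datetime

# Assumed cutoff: faster than any commercial flight implies a fake device.
MAX_PLAUSIBLE_SPEED_KMH = 1000.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def implied_speed_kmh(sig_a, sig_b):
    """Speed implied by two (timestamp, lat, lon) signals."""
    (t1, lat1, lon1), (t2, lat2, lon2) = sig_a, sig_b
    hours = abs((t2 - t1).total_seconds()) / 3600.0
    if hours == 0:
        return float("inf")
    return haversine_km(lat1, lon1, lat2, lon2) / hours

def looks_fraudulent(signals):
    """Flag a device if any consecutive signal pair implies impossible travel."""
    for a, b in zip(signals, signals[1:]):
        if implied_speed_kmh(a, b) > MAX_PLAUSIBLE_SPEED_KMH:
            return True
    return False

# Example: New York at 12:00, then Los Angeles ten minutes later.
device = [
    (datetime(2024, 1, 1, 12, 0), 40.71, -74.00),
    (datetime(2024, 1, 1, 12, 10), 34.05, -118.24),
]
```

A production scoring system would combine this with the other models named above (request-frequency patterns, location/IP consistency, signal regularity, device lifetime) into a single score rather than a binary flag.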
Advertisers can protect both signal density and data quality by partnering with a digital platform capable of scoring incoming location data in real time, assessing the quantity, mix, and density of the data while reducing or eliminating fraudulent signals. By establishing benchmarks of data quality, advertisers gain greater insight and, ultimately, deliver meaningful experiences on the screens of the mobile-only consumer.