February 10, 2026 · Jacob Fisher, Founder of Prato Studios · 8 min read

Your Running Data Deserves Better: Why Privacy-First Architecture Matters

In July 2025, Swedish newspaper Dagens Nyheter published an investigation that should make every runner uncomfortable. Seven bodyguards assigned to protect Prime Minister Ulf Kristersson had been uploading their workouts to Strava—with public profiles. Reporters tracked over 1,400 activities and mapped the prime minister’s private residence, his travel routes, overnight trips abroad, and an unannounced meeting with the leaders of Norway and Finland. Bodyguard routes even revealed how to navigate the grounds of Drottningholm Palace, the Swedish royal family’s permanent residence.

This wasn’t a hack. It wasn’t a data breach. It was just people going for runs and letting an app do what it’s designed to do: collect and share their location data.

You’re probably not protecting a head of state. But your daily runs still reveal where you live, when you leave the house, when you’re traveling, what time you’re alone on a trail, and how your health is trending. The question isn’t whether that data exists—it’s whether you actually know who has it and what they’re doing with it.

What Your Running App Knows About You

The data that fitness apps collect goes well beyond your split times. GPS coordinates. Heart rate trends. Sleep patterns. Weight. Routes you run repeatedly. Time-of-day patterns that reveal your schedule. Some apps go further still.

A Surfshark analysis of 16 popular fitness apps found that they collect an average of 12 different data types—more than a third of the 35 categories Apple defines. Fitbit leads the pack at 24 data types, with Strava close behind at 21. According to that same research, Strava collects no data that is actually essential for the app to function—meaning every data point it gathers goes beyond core functionality.

Four of the analyzed apps collect what Apple classifies as “sensitive data”—a category that includes race or ethnic background, sexual orientation, fertility data, and biometric information. Data that has nothing to do with your 10K pace.

And collection is only half the problem. Over 90% of the analyzed fitness apps use collected data beyond what’s needed for app functionality—for tracking, advertising, analytics, or marketing. Strava shares data with third parties for tracking purposes. Nike Training Club collects sensitive data and uses it for tracking.

Data Types Collected by Fitness Apps

Fitbit: 24 of 35 data types
Strava: 21 of 35 data types
Industry average: 12 of 35 data types
Hannah: training data only

Source: Surfshark analysis of 16 popular fitness apps (Jan 2026). Apple defines 35 total data type categories.

Location data is particularly dangerous because it’s nearly impossible to truly anonymize. A widely cited MIT study found that just four location data points are enough to identify 95% of individuals in a supposedly anonymized dataset. Your “anonymous” run logs might not be as private as you’ve been told.

The Business Model Problem

There’s a structural reason fitness apps collect so much data: many of them are built on a business model that profits from it. Location data can be packaged and sold to advertisers, insurance companies, and data brokers. As the Journal of High Technology Law put it, geodata is notoriously easy to re-identify even when anonymized, and users face an uphill battle protecting themselves until stricter standards are enforced.

The pattern is familiar across tech: if the app is free, someone else is paying for access to your data. Free fitness apps subsidize their costs by monetizing what you generate—your routes, your patterns, your health metrics. That’s not a conspiracy theory. It’s the business model.

Subscription apps flip this equation. When you pay for the service, the product is the service—not you. That doesn’t automatically mean every paid app respects your privacy, but it removes the structural incentive to harvest and sell your data.

The Strava incidents—from the 2018 military base exposures to the 2025 Swedish bodyguard revelations—aren’t isolated failures. They’re symptoms of an architecture that prioritizes data collection and social sharing over user protection. When your default setting is “public” and your business model rewards more data, incidents like these aren’t bugs. They’re features.

What Privacy-First Architecture Actually Looks Like

Privacy-first isn’t a marketing label. It’s a set of engineering decisions that determine what data gets collected, where it lives, and who can access it. Here’s what it means in practice.

Data minimization means collecting only what’s needed to deliver the service. If an AI coach needs to know your weekly mileage trend to give good advice, it doesn’t also need your GPS coordinates, your full name, or your social graph. Hannah collects training data—workout summaries, fitness metrics, race goals—and nothing else. No GPS coordinates. No route maps. No full name (first name only). No social connections. No device identifiers beyond what basic app functionality requires.
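For illustration, here’s what a data-minimized schema can look like in practice. This is a sketch, not Hannah’s actual code, and the type names (WorkoutSummary, RunnerProfile) are invented. The point is that minimization can be enforced structurally: if the schema has no field for a GPS track, the app has nowhere to put one.

```typescript
// Illustrative sketch only, not Hannah's actual schema.
// Minimization is enforced at the type level: there is no field for
// GPS tracks, full names, or social connections, so that data can
// never be collected by accident.

type WorkoutSummary = {
  date: string;            // ISO date, e.g. "2026-02-08"
  type: "easy" | "tempo" | "interval" | "long" | "race";
  distanceKm: number;      // total distance only, no route geometry
  durationMin: number;
  avgHeartRate?: number;   // optional fitness metric
};

type RunnerProfile = {
  firstName: string;       // first name only, by design
  ageRange: "20s" | "30s" | "40s" | "50s" | "60s+";
  raceGoal?: { distanceKm: number; targetDate: string };
};
```

A field that doesn’t exist can’t leak, be sold, or end up in a breach dump.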

On-device storage means your data stays on your phone by default. Hannah stores workout history, conversation history, and user preferences locally using encrypted on-device storage. Your training data doesn’t live on our servers waiting to be monetized or breached.
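Here’s a rough sketch of the encrypt-at-rest pattern, not Hannah’s actual implementation: each record is sealed with an authenticated cipher before it touches disk. A real mobile app would keep the key in the platform keystore (iOS Keychain, Android Keystore); we generate one inline only to keep the example self-contained.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Sketch of encrypt-at-rest for local workout history. In production
// the key would live in a hardware-backed keystore, never in code.
const key = randomBytes(32); // AES-256 key

function encryptRecord(plaintext: string) {
  const iv = randomBytes(12); // unique nonce per record
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return {
    iv: iv.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"), // tamper detection
    data: data.toString("base64"),
  };
}

function decryptRecord(rec: { iv: string; tag: string; data: string }): string {
  const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(rec.iv, "base64"));
  decipher.setAuthTag(Buffer.from(rec.tag, "base64"));
  return Buffer.concat([
    decipher.update(Buffer.from(rec.data, "base64")),
    decipher.final(),
  ]).toString("utf8");
}

// Only the encrypted blob is ever written to disk; no server copy exists.
const stored = encryptRecord(JSON.stringify({ date: "2026-02-08", distanceKm: 12 }));
console.log(decryptRecord(stored)); // round-trips entirely on-device
```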

User-controlled sharing means you decide how much context the AI coach can access. Hannah offers three privacy tiers:

Hannah’s Privacy Tiers

Maximum: 24 months of detailed training history. Best coaching experience; most data shared with the AI.

Balanced (default): 12 months of detail plus summarized older history. Recommended; balances coaching quality with data minimization.

Minimal: 30 days of data only. Maximum privacy; still effective for straightforward training questions.

You can change your level at any time, see exactly what data Hannah has access to, export your conversation history, or hit the “Forget Everything” button to delete all AI conversations permanently.
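Under the hood, a privacy tier is essentially a window applied to your local history before any coaching context is assembled. Here’s a minimal sketch of that selection logic, assuming the tier definitions in the table above; the function and type names are ours, not Hannah’s.

```typescript
type PrivacyTier = "maximum" | "balanced" | "minimal";
type Workout = { date: string; distanceKm: number; durationMin: number };
type MonthlySummary = { month: string; totalKm: number; runCount: number };

// Collapse older workouts into monthly totals for the Balanced tier.
function summarizeByMonth(ws: Workout[]): MonthlySummary[] {
  const byMonth = new Map<string, MonthlySummary>();
  for (const w of ws) {
    const month = w.date.slice(0, 7); // "YYYY-MM"
    const s = byMonth.get(month) ?? { month, totalKm: 0, runCount: 0 };
    s.totalKm += w.distanceKm;
    s.runCount += 1;
    byMonth.set(month, s);
  }
  return [...byMonth.values()].sort((a, b) => a.month.localeCompare(b.month));
}

// Apply the selected tier as a window over local history.
function selectHistory(tier: PrivacyTier, all: Workout[], now = new Date()) {
  const cutoff = (days: number) =>
    new Date(now.getTime() - days * 24 * 60 * 60 * 1000);
  const onOrAfter = (d: Date) => all.filter((w) => new Date(w.date) >= d);

  switch (tier) {
    case "maximum": // 24 months of detail
      return { detailed: onOrAfter(cutoff(730)), summaries: [] as MonthlySummary[] };
    case "balanced": // 12 months of detail, older history as summaries
      return {
        detailed: onOrAfter(cutoff(365)),
        summaries: summarizeByMonth(all.filter((w) => new Date(w.date) < cutoff(365))),
      };
    case "minimal": // last 30 days only
      return { detailed: onOrAfter(cutoff(30)), summaries: [] as MonthlySummary[] };
  }
}
```

Because the window is applied on-device, switching tiers takes effect immediately: nothing outside the window ever leaves your phone.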

What gets sent to the AI provider matters just as much as what gets collected. When Hannah sends your training context to Anthropic’s Claude AI for coaching, she sends aggregated training metrics—weekly mileage totals, pace trends, training load ratios—not raw data streams. She sends your first name and age range (like “30s”), not your full identity. She never sends GPS data, location information beyond a general region (used for weather context), payment details, or device IDs.
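In code, that boundary might look like the sketch below: raw workouts go in, and only aggregates and coarse profile fields come out. The payload shape is our invention, and the training-load ratio shown is a simple acute-to-chronic mileage ratio (last 7 days versus the average week of the last 28), a common approximation rather than Hannah’s actual formula.

```typescript
type Workout = { date: string; distanceKm: number; durationMin: number };

type CoachingContext = {
  firstName: string;          // first name only
  ageRange: string;           // e.g. "30s", never a birthdate
  region: string;             // coarse region for weather, never coordinates
  weeklyMileageKm: number[];  // totals per week, oldest first
  avgPaceMinPerKm: number;    // trend-level metric, not raw streams
  trainingLoadRatio: number;  // acute (7-day) vs chronic (28-day) mileage
};

// Bucket distance into 7-day windows counting back from now.
function weeklyTotals(ws: Workout[], weeks = 4, now = Date.now()): number[] {
  const totals = new Array<number>(weeks).fill(0);
  for (const w of ws) {
    const ageDays = (now - new Date(w.date).getTime()) / 86_400_000;
    const bucket = Math.floor(ageDays / 7);
    if (bucket >= 0 && bucket < weeks) totals[weeks - 1 - bucket] += w.distanceKm;
  }
  return totals;
}

// Everything that crosses the API boundary is an aggregate.
function buildContext(
  firstName: string,
  ageRange: string,
  region: string,
  recent: Workout[] // already windowed by the privacy tier
): CoachingContext {
  const sumSince = (days: number, now = Date.now()) =>
    recent
      .filter((w) => now - new Date(w.date).getTime() <= days * 86_400_000)
      .reduce((s, w) => s + w.distanceKm, 0);

  const km = recent.reduce((s, w) => s + w.distanceKm, 0);
  const min = recent.reduce((s, w) => s + w.durationMin, 0);
  const acute = sumSince(7);
  const chronic = sumSince(28) / 4; // average week over the last 4 weeks

  return {
    firstName,
    ageRange,
    region,
    weeklyMileageKm: weeklyTotals(recent),
    avgPaceMinPerKm: km > 0 ? min / km : 0,
    trainingLoadRatio: chronic > 0 ? acute / chronic : 0,
  };
}
```

Note what the function cannot emit: there is no parameter for coordinates or device IDs, so redaction isn’t a filter applied at the end; the data simply never reaches the boundary.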

The AI provider’s commitments are the final layer. Anthropic, the company behind Claude, does not train its AI models on API customer data. Your coaching conversations aren’t feeding a machine learning model. Data is retained for 30 days for abuse monitoring only, then automatically deleted. No third-party sharing. No advertising. SOC 2 Type II certified. GDPR and CCPA compliant.

What Hannah Sends vs. What Others Upload

Data Type              Hannah        Typical App
GPS coordinates        Never         Full tracks
Full name              First only    Full name
Training metrics       Aggregated    Raw streams
Social graph           Never         Connections
Photos                 Never         Uploaded
Third-party sharing    None          21+ data types
AI model training      Never         Varies

Why User Control Changes Everything

Here’s what’s counterintuitive about privacy: giving users more control over their data often results in them sharing more, not less.

When people understand exactly what’s being collected and feel genuinely in control of it, trust increases. And trust leads to engagement. The alternative—burying data practices in 40-page privacy policies that nobody reads—breeds the kind of vague unease that makes users disengage or abandon apps entirely.

Hannah’s transparency features are designed around this principle. The “What AI Coach Knows About Me” view shows you exactly which data categories Hannah can access at your current privacy level. The token usage dashboard lets you see how much AI processing your conversations use. And the ability to export or delete everything at any time means you’re never locked in.

Privacy shouldn’t mean worse coaching. Even at the Minimal privacy level—just 30 days of training data—Hannah can answer training questions, provide workout recommendations, and offer pacing guidance. You give up some of the deeper pattern recognition that comes with months of historical data, but you get competent coaching with maximum data protection. At the Balanced or Maximum levels, Hannah can spot trends across weeks and months—cardiac drift, training load spikes, pacing patterns—because she has the context to do so.

The point isn’t that everyone should minimize their data sharing. The point is that you should get to choose, with full understanding of what each choice means for both your privacy and your coaching experience.

The Standard We Should Expect

The running app industry has normalized a level of data collection that would be unacceptable in most other contexts. Imagine if your physical therapist recorded your GPS location during every appointment, shared your health data with advertisers, and collected your racial background. You’d find a new PT.

But that’s roughly what many fitness apps do—and we accept it because we’re used to it, and because the alternatives haven’t existed.

We built Hannah as proof that privacy and great coaching aren’t mutually exclusive. That an AI running coach can know enough about your training to give genuinely useful advice without knowing where you live, who your friends are, or what ads to show you. That runners deserve the same data protection standards we’d expect from any other service that handles personal health information.

We’re also building Altitude, a free race planning and training calendar tool. Sign up for early access—your data stays yours there, too.

Sources

  1. Euronews — “Swedish PM’s movements leaked by bodyguards uploading workouts on Strava.” July 2025.
    euronews.com
  2. Surfshark — “Fitness Apps Privacy: How Much Data Do They Collect?” January 2026.
    surfshark.com
  3. TechRadar — “Fitbit, Strava, and Nike Training Club are the most data-hungry fitness apps.” January 2026.
    techradar.com
  4. Suffolk University Journal of High Technology Law — “Running Into Danger: How Strava and Fitness Tracking Apps Put Your Privacy at Risk.” September 2025.
    Suffolk JHTL
  5. Anthropic — “Privacy Policy.”
    anthropic.com/legal/privacy
  6. The Local (Sweden) — “Säpo bodyguards’ Strava runs ‘reveal locations of Swedish PM and royals.’” July 2025.
    thelocal.se
  7. ASIS International / Security Management — “Check Your Privacy Settings: Bodyguards’ Fitness Apps Used to Track Swedish Leaders.” July 2025.
    asisonline.org

Related reading: AI Coach vs. Human Coach: The Real Cost-Benefit Analysis · Preventing Overtraining Before It Happens: How AI Spots What You Miss