SUNDAY, APRIL 5, 2026
Consumer Tech · 3 min read

Granola notes privacy scare for AI notetakers

By Riley Hart

Photo by Luke Chesser on Unsplash

Granola’s AI-assisted note-taking looks great on paper, until you realize your meeting bullets aren’t as private as they should be. The Verge reports that while Granola’s notes are nominally “private by default,” anyone with a link can view them, and those notes can be used for internal AI training unless you opt out. In other words, your quiet reflections from a tense brainstorm could end up helping train a model you never signed up for.

Granola bills itself as an AI notepad for your calendar-dense life, listening to meeting audio and turning it into bulleted takeaways. The catch, readers will want to know, is how those notes are treated once they exist. The Verge story highlights a design quirk that could surprise even the most privacy-conscious user: a default setting that makes notes shareable via a link, even if you think you’ve kept them tucked away in a private space. That gap between “private by default” and “readable by anyone with the link” is where the real risk lives. The app’s pitch, using AI to summarize what you heard, lands hard when you realize a single shared link could expose spreadsheets, client names, or strategy comments you’d ordinarily keep off a public channel.

From a consumer standpoint, the core tension is clear: convenience versus control. The more Granola promises to streamline your workflow with AI, the more you rely on it to store and analyze sensitive conversations. The Verge notes that Granola also uses these notes to train its AI unless you actively opt out. That opt-out option is a lifeline for privacy, but it isn’t engaged automatically. In practice, users who want to protect their notes must dive into settings, locate a presumably buried opt-out toggle, and understand what “training” means for an app that’s supposed to read the room for you.

Two practical insights jump out for shoppers evaluating Granola or similar tools. First, default sharing settings matter as much as the feature set. A tool that promises private notes but exposes them via a shareable link undermines trust and complicates data governance in any team environment. If you’re in a regulated industry or simply discuss sensitive topics, you’ll want to lock down access at the source. Second, opt-out language around AI training is only useful if users can easily find and understand it. The idea that “your notes may be used to train the AI unless you opt out” sounds straightforward until you realize opt-out steps aren’t always obvious, explicit, or front-and-center during onboarding.

Looking ahead, Granola faces a typical privacy crossroads seen across AI-powered apps: will defaults shift toward stricter privacy, and will companies make opt-out clarity a feature rather than an afterthought? Regulators and consumer advocates are watching because the incentive structure around data curation and model training is increasingly scrutinized. For now, users should treat note content as potentially more shareable than they expect, and consider disabling link-based sharing and confirming AI-training preferences before storing anything truly sensitive.

What to watch next: Granola’s response to this PSA, any changes to default privacy behavior, and whether other AI-notepad players face similar questions about link sharing and training data. If you’re weighing this tool today, test both content sensitivity and the ease of tightening privacy—before your next big idea ends up in the wrong hands.

Sources

  • PSA: Anyone with a link can view your Granola notes by default
