In the Privacy Soapbox, we give privacy professionals, guest writers, and opinionated industry members the stage to share their unique points of view, stories, and insights about data privacy. Authors contribute to these articles in their personal capacity. The views expressed are their own and do not necessarily represent the views of Didomi.
Do you have something to share and want to take over the Privacy Soapbox? Get in touch at blog@didomi.io
Joining Aura, an all-in-one tool for protection from identity theft, scams, and other online threats, changed my entire perspective on how important it is to understand privacy and stay in control of my own data.
I work on growth and marketing there, helping connect people with tools that make them safer online. That means I spend a lot of time learning how Aura’s products work and how real users interact with them.
In 2025, 70% of users say they trust brands with clear consent practices. But being in control isn’t just about clicking “accept” or “deny” - it’s about understanding what happens next.
My first steps into the privacy world
If you had asked me ten years ago what data privacy meant, I would’ve answered something vague about turning off location services or keeping my Instagram account private. Like many people, I assumed privacy was about avoiding the creepy feeling that my phone was “listening” to me. As for cybersecurity? That was the IT team’s problem.
That all changed when I joined Aura. I came into the privacy space with a background in digital communications. I’d worked on messaging, product branding, and community building, but not from the perspective of user protection. My introduction to Aura was the beginning of a steep learning curve. I thought cookie banners were a “one-and-done” compliance fix for companies, a quick way to check the privacy box and move on. I believed most companies were ethical with personal data. I assumed consent mechanisms were strong safeguards in themselves, without realizing how much depends on how they’re implemented and respected.
It didn’t take long to realize how naïve I’d been.
Working in privacy forced me to reexamine what I thought I knew. I began to understand that behind every tool we use and every form we fill out, there’s a trade happening. Sometimes it’s fair. Sometimes it’s not. But the responsibility to make that call doesn’t just sit with regulators or platforms - it sits with us, too.
The moment it got real: When privacy became personal
There was one day in particular that changed everything.
I started getting phishing emails that used my name, job title, and the exact language we’d recently posted on our blog. They weren’t your average “Nigerian prince” scams - they were polished, credible, and clearly targeted. My personal inbox, LinkedIn, and even my work address were hit within 24 hours.
At first, I assumed it was random spam. Then I realized the emails were tailored to me. Someone (or something) was mining the same public data I’d helped publish. That’s when it clicked. Working in privacy had made me a target.
It turns out that cyber threats don’t just aim at big companies. Attackers look for reachable individuals with enough of a digital presence to be useful. A name, an email address, and a job title are enough to build a convincing spear-phishing campaign. And once that information’s out there - on marketing lists, in public repositories, in breached databases - it’s very hard to claw back.
In my case, it took weeks of back-and-forth with platforms to secure my accounts, remove my information from data broker sites, and tighten up my digital footprint. It wasn’t easy - but it was doable. And now I know exactly what to do if it ever happens again.
Hackers don’t just sell your email address. They stitch together your digital identity from multiple leaks, then sell entire “profiles” on the dark web. These profiles can be used to apply for credit cards, impersonate employees, or run scams that seem legitimate because they’re backed by real data. Recovering from it takes time, effort, and in some cases, legal help.
It was personal now. And it fundamentally changed the way I operate online.
The privacy habits I’ve changed in my own life
Once you’ve seen how the sausage is made, you stop eating it the same way. I didn’t become a doomsday prepper overnight, but I did start making deliberate choices to protect myself and my family.
What I do differently now
I use a password manager for everything now - not just to avoid password reuse, but to make sure my credentials are strong, unique, and not stored in the browser, where they’re easier to steal. Two-factor authentication (2FA) is non-negotiable, especially for accounts tied to finances, email, or work.
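If 2FA ever feels like magic, it isn’t. The six-digit codes in an authenticator app come from an open standard (TOTP, RFC 6238): a shared secret hashed together with the current time. Here’s a minimal Python sketch of the idea - the base32 secret below is a made-up example, not a real account key:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """Compute the current TOTP code (RFC 6238) for a base32-encoded secret."""
    key = base64.b32decode(secret_b32.upper() + "=" * (-len(secret_b32) % 8))
    counter = struct.pack(">Q", int(time.time()) // period)  # 30-second time window
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # illustrative secret, not a real account key
```

Because each code is derived from the secret plus a 30-second time window, a stolen password alone isn’t enough to log in.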
I also regularly check for data breaches using tools like Aura’s digital footprint checker. It’s unsettling how often you’ll find your information floating around in compromised databases. But knowing is better than not knowing. Once you’re aware, you can take action: reset passwords, freeze credit, or even opt out of data brokers.
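Aura’s digital footprint checker is one way in; if you’re curious about the underlying idea, the public Have I Been Pwned password API is a well-known illustration. It uses k-anonymity: you send only the first five characters of your password’s SHA-1 hash, so the password itself never leaves your machine. A quick sketch:

```python
import hashlib
import urllib.request

def password_breach_count(password: str) -> int:
    """Check a password against Have I Been Pwned's k-anonymity range API.
    Only the first 5 hex characters of the SHA-1 hash are ever sent."""
    sha1 = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        for line in resp.read().decode().splitlines():
            candidate, count = line.split(":")
            if candidate == suffix:
                return int(count)  # how many breaches contained this password
    return 0

print(password_breach_count("password123"))  # a known-bad example password
```

A count above zero means that password has shown up in a breach and should be retired everywhere it’s used.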
Another big shift? I stopped downloading sketchy apps or signing up for “free” tools that don’t clearly state what they collect. I now review app permissions at least once a month, and I’m always surprised by how much access some apps have by default. That includes apps that run in the background with remote monitoring and management capabilities, especially those with administrative privileges that can silently access system data.
Finally, I’ve become more loyal to brands that treat privacy seriously. It’s not just about compliance anymore. When I see a site that’s transparent about what data it collects, offers real choices, and explains things in plain language, I trust it more. That means I’m more likely to engage, buy, or recommend it. Privacy, it turns out, isn’t just good ethics - it’s good business.
If you’re curious what a user-first approach to privacy looks like, check out how Aura gives users control over their data. Whether it’s real-time alerts, credit monitoring, or anti-tracking tech, their tools are built with transparency in mind.
How I talk to friends and family about privacy
My new habits didn’t stay private for long.
At first, friends and family rolled their eyes. “Here comes the privacy lecture,” they'd joke when I brought up 2FA or warned them about a phishing scam. However, over time, they began to listen. Why? Because the scams got smarter.
One day, my dad got a text saying his bank account was locked. The message included the name of his actual bank and a link that looked real. He almost clicked it. These scams often mimic legitimate onboarding emails, using the brand logos, tone, and urgency tricks that sales teams also rely on, which makes them even harder to detect.
That’s why it’s crucial to verify the authenticity of any unexpected message before acting on it. Simple steps, like checking the sender’s email address and using a secure Google Workspace signature for official communications, can help distinguish real messages from phishing attempts.
That night, I showed him how phishing links work and how to verify URLs using a domain blacklist check tool. He now forwards me scam messages weekly and has started teaching his own friends what to watch for.
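For the curious, the core of that lesson fits in a few lines of Python. The bank domains below are hypothetical, but the principle is real: the only part of a link that identifies a site is the actual hostname, not the familiar words pasted onto the front of it.

```python
from urllib.parse import urlparse

# Domains the real bank actually uses (a hypothetical list, for illustration).
TRUSTED_DOMAINS = {"examplebank.com", "secure.examplebank.com"}

def looks_suspicious(link: str) -> bool:
    """Flag links whose real hostname doesn't belong to the bank.
    Phishing URLs often bury the trusted name in a subdomain or path,
    e.g. https://examplebank.com.account-verify.net/login."""
    host = urlparse(link).hostname or ""
    return host not in TRUSTED_DOMAINS and not host.endswith(".examplebank.com")

print(looks_suspicious("https://examplebank.com.account-verify.net/login"))  # True
print(looks_suspicious("https://secure.examplebank.com/login"))              # False
```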
I’ve helped cousins freeze their credit, walked friends through setting up password managers, and even helped people lock intruders out of compromised phones.
I also write posts and guides to share with others. One of my favorites: a short blog breaking down the common signs of a phishing attempt, and how Aura helps identify them early using AI-powered alerts. It’s not about scaring people. It’s about giving them the tools to feel in control.
Privacy shouldn’t be a mystery. The more we normalize talking about it, the safer we all become.
I’m not the only one having these conversations - more people are creating webinars and recording videos to help clients or colleagues understand the risks.
What most people still get wrong about data privacy
Even now, with privacy in the headlines almost weekly, many people still misunderstand what it actually means. Here are the most common misconceptions I hear - and what I wish more people understood.
Misconception 1: Privacy = not posting online
Wrong. You can stay offline and still be tracked. Everything from loyalty programs to IP addresses contributes to your digital footprint. The key is managing what gets shared, with whom, and for what purpose.
Misconception 2: It doesn’t matter what you click on a cookie banner
Not true. Cookie banners are meant to inform and give you a real choice. But let’s be honest - most people just click “Accept all” to get it over with. That’s where consent fatigue kicks in: we’re so used to seeing banners everywhere that we stop engaging with them meaningfully.
A proper Consent Management Platform (like Didomi’s) helps ensure your preferences are logged, respected, and actually followed. But not every company uses one - or uses them well. So yes, what you click does matter.
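Under the hood, the concept is straightforward, even though production CMPs (which follow standards like the IAB’s TCF) are far more involved. A consent record captures who agreed to what and when, and anything that touches personal data checks it first. A toy sketch - emphatically not Didomi’s actual data model:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# A simplified, hypothetical consent record. Real CMPs log far more detail
# (proof of consent, banner version, vendor lists, withdrawal history, ...).
@dataclass
class ConsentRecord:
    user_id: str
    purposes: dict[str, bool]  # e.g. {"analytics": True, "ads": False}
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def can_fire_tracker(record: ConsentRecord, purpose: str) -> bool:
    """A vendor tag should only run if the user opted in to that purpose."""
    return record.purposes.get(purpose, False)  # unknown purpose = no consent

record = ConsentRecord("user-123", {"analytics": True, "ads": False})
print(can_fire_tracker(record, "ads"))  # False - the "deny" click must be honored
```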
Misconception 3: If I have nothing to hide, I don’t need to worry
You wouldn’t say that about your house keys or bank PIN, would you? Privacy isn’t about secrecy - it’s about control. Having nothing to hide doesn’t mean you should give away everything.
Misconception 4: Big companies like Google or Facebook must be secure
They invest in security, yes - but they’re also high-value targets. Even the best defenses can be breached. And more often, data leaks happen through third-party partners you’ve never heard of.
Misconception 5: I read the privacy policy, so I know what’s happening with my data
Unless you’re a privacy lawyer, probably not. Most policies are vague, jargon-filled, or leave out crucial details. What matters more is how transparent a company is about your choices - and whether they honor them.
Misconception 6: My phone is private, no one’s tracking me
Unless you’ve disabled location tracking, limited ad personalization, and turned off third-party cookies (among other things), your phone is probably sharing more than you think.
Misconception 7: Using Incognito mode keeps me anonymous
Incognito mode only keeps your browsing history off your own device. It doesn’t hide your activity from websites, your ISP, or trackers. It’s useful, but not the privacy shield many assume.
Misconception 8: Data privacy is just a tech issue
It’s not. It’s a human issue. It affects how you vote, get healthcare, apply for jobs, and navigate the world. Tech may be the medium, but the impact is deeply personal.
If you’re unsure where to start, my colleagues at Aura have worked on an excellent beginner-friendly resource center on privacy tools and best practices. It’s not just for cybersecurity experts - it’s for real people.
Misconception 9: Business communication tools are always private and secure
Many people assume their messages on unified communication platforms, like team chat apps or virtual meetings, are secure by default. But unless end-to-end encryption is in place and data policies are transparent, your conversations could be stored, scanned, or even shared with third parties. It’s worth understanding the privacy settings of whatever unified communication software your company uses.
What I hope everyone starts doing today
If there’s one thing I’ve learned, it’s this: privacy isn’t a switch you flip. It’s a mindset.
Here are three things I hope more people start doing, right now:
1. Treat your data like money
You wouldn’t hand your credit card number to a stranger. Think the same way about your email, phone number, or location: they all have value, especially when combined.
2. Don’t overshare
Before you fill out that form or download that app, ask: Do they really need this information? Be intentional with what you give.
3. Use simple tools
You don’t need to be a tech genius. Password managers, 2FA apps, and services like Aura make it easy to build strong habits without getting overwhelmed.
Working in privacy changed how I live, how I talk, and how I help others. And while the threats may evolve, one thing hasn’t changed: the more control you have, the safer you are.
If you're ready to take your own first steps, Aura and Didomi are great places to start: Aura helps individuals protect their personal data and identity, while Didomi builds tools that help companies respect those choices at scale.
Why this work matters more than ever
Since joining Aura, I’ve watched scams, fraud, and data exploitation grow in both volume and sophistication. Phishing emails look like real invoices. Deepfake videos can mimic a CEO’s voice. Stolen data is bought and sold in seconds.
At the same time, users are waking up. People want more control over their data. They want transparency. They want to know that when they click a banner, their decision means something.
This is where the work becomes meaningful. We’re not just creating policies or frameworks. We’re helping reshape how digital trust is built.
That’s why having the right privacy tools and practices in place matters. They empower users to take action, stay informed, and hold companies accountable. The good news? Many of these safeguards are already accessible - you just have to know where to look.
More and more, I see organizations investing in real privacy practices beyond the bare minimum. They’re not just ticking boxes. They’re building trust at every touchpoint of the customer journey.
That’s a future I want to be part of.