How do you model privacy, security, anonymity, etc?

The way I see privacy, security and anonymity is as a sort of Venn diagram - you could have privacy and security but no anonymity; anonymity and security but no privacy (somehow); and other combinations.
But I was thinking today about how much that model misses. It doesn’t really address whom you’re hiding information from, whom you’re hiding your identity from, or whom you’re securing yourself against. And it allows combinations like 100% privacy with no security, which might not even be possible. How do you view the relationship between these three things? Feel free to add anything else that seems relevant.


I think it’s better to define what each term means and acknowledge that what they do overlaps.

Still working on these definitions, but here goes.

Security is about preventing unauthorized access from outside of the tool or service. This deals with the security of the tool itself and how it’s built as well as the practices of the user and the organization providing the tool.

Privacy is about minimizing the total number of people who have access to the data, whether they are outside (unauthorized) or inside (authorized) your tools and services. Security should keep most people away from your data, but the tool, or you yourself, can take further steps to occlude data even from folks who are trusted within the tool.

Anonymity is about separating your use of a tool or service from your personal identity regardless of whether the information is secure or private. In my mind, this is literally as simple as having something under an alias, but it also covers other breadcrumbs that can be followed to link the alias with you directly.
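One way to capture the idea that these are three separate axes rather than one scale is to score them independently. A minimal sketch (the names and 0–3 scoring here are my own invention, purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class ThreatProfile:
    # Each axis is scored independently, 0 (none) to 3 (strong).
    security: int   # resistance to unauthorized access from outside
    privacy: int    # how few people, inside or outside, can see the data
    anonymity: int  # how weakly the account links to your real identity

# A service can score well on one axis and poorly on another:
gmail = ThreatProfile(security=3, privacy=1, anonymity=0)
burner_alias = ThreatProfile(security=1, privacy=1, anonymity=3)
```

The point is just that no single axis implies the others: a well-secured account can still be tied directly to you.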

For example, I can use Proton Mail for my email. That will bring me a certain level of security because of how it’s built and commitments that Proton as a company has. Because I picked Proton Mail instead of Gmail, I get an encrypted inbox with no snooping from Proton, and that adds privacy to my use. Lastly, I can make the email account under a random alias like ZuckStan and now I have anonymity because there’s nothing tying that account to me personally (though more steps can be taken to ensure this).

Hopefully this example shows that there’s no neat way these things fit together. You can gauge how well something does on each of these axes individually, at least to some extent.

However, one maxim that I learned from Carey Parker of Firewalls Don’t Stop Dragons is this: security enables privacy. And now that I think of it, both of those enable anonymity if you want to make sure it doesn’t come back to bite you in the butt. So maybe this Techlore shirt has it right and we can just think of it as a pyramid, lol.
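The “security enables privacy, and both enable anonymity” pyramid can be read as a dependency rule: each layer is only as strong as the one beneath it. A toy sketch of that reading (my own framing, not from the shirt):

```python
def effective_levels(security: int, privacy: int, anonymity: int) -> tuple[int, int, int]:
    """Cap each layer by the one beneath it: privacy can't exceed
    security, and anonymity can't exceed privacy."""
    privacy = min(privacy, security)
    anonymity = min(anonymity, privacy)
    return security, privacy, anonymity

# A strong anonymity claim collapses if the underlying security is weak:
print(effective_levels(security=1, privacy=3, anonymity=3))  # → (1, 1, 1)
```

In other words, claiming strong anonymity on top of weak security doesn’t hold up, which is exactly the pyramid’s point.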


I think of it as: you cannot have true privacy without security. However, too much security can reduce privacy. So you need at least some basic security; FOSS is preferred, but not necessarily required, assuming there have been audits by trusted third parties.

That just covers the essentials. It’s possible to achieve maximum feasible privacy with closed-source software, but that requires more trust than FOSS.

Anonymity has a higher bar to clear IMO; it requires more security, along with FOSS and a trusted third-party audit.

I think in the end, the highest reasonable privacy can only be achieved with FOSS plus a trusted audit. It’s not too hard to find an example of FOSS where the dev simply doesn’t know enough for proper, cryptographically assured security.


The pyramid probably makes more sense, now that I think about it. The way I saw it in the first post allows for some weird cases.

How does this happen?


The first thing that comes to mind is financial services in the US under KYC laws. A lot of the measures they take are ostensibly to increase security, though not necessarily security for the end user.
