Dear Privacy Tool Builders, I Just Need To Do My Work

This is the second part of a series where we share the insights from our community outreach earlier this year. We spoke with digital trainers, journalists, and people who work in environments where privacy tools and enhanced security posture are required for personal safety. In this post we explore the challenges of using and integrating privacy and security tools into everyday life.

Behaviour, Habits and Workflow

A big challenge is behaviour change and sustaining a commitment to privacy tools and workflows. Another challenge is that even when a person learns about workflows and tools, putting them to use depends on their environment.

When a person returns to their own lifestyle, organization or workplace after training, tools like secure messaging and secure calling may be impossible to integrate. These options only work if other people use them too. So it is up to the trainee to get their colleagues onto the same tool before it makes sense. Otherwise they have all this knowledge and cannot apply it.

Integrating privacy options and lessons from training depends on other people using them as well. Not everyone will use a secure messenger, encrypt their email, or adopt anonymizing habits or software to protect their identity.

I just need to do my work

In some cases a tool stops working, or it does not provide the user experience a person expects or prefers. They let it go, do not find a replacement, and return to their usual behaviour, which may create a risk factor.

If something is not supported on mobile, it is a big challenge especially considering that most of the people I train are more mobile users than desktop users.

Here we see another challenge: an option might be a very good fit in terms of usability and familiarity, but it cannot be integrated into a person's lifestyle because their use case requires a mobile version.

Inexperience with Threat Modelling and Risk Assessment

In my experience the biggest problem is a lack of proper assessment of one’s own risks and the consequences.

Even people who are under constant pressure gradually relax and get used to it, which leads to trivial mistakes, such as "did not make a backup" or "did not use hard drive encryption". In other words, low security motivation. Even people who know what needs to be done for security simply don't do it.

User Friendliness

A common problem is that when onboarding new users, tool teams often begin with complicated, technical descriptions of, or options for, how their privacy tool works.

Unless you are talking to other developers or a technical audience, it is important to remember that your end user is going to start by simply trying to install the software or use the application. What happens if that first attempt reveals that what the team has created is not usable by the public?

This is an opportunity to take seriously the real-world gap between what tool builders and developers consider usable and what end users actually find usable.

User friendliness is the heart of what I’m doing as a trainer.

Tools are great. But if you’ve never used encryption before, it’s daunting at first.

If a tool takes a long time to install, people do not favour it, and they will not keep using it if it works more slowly than what they are familiar with.

It is important to include stress tests in overall assessments, and to identify and bridge motivational gaps with understanding and with options that better connect people to their own capacity for risk management. As we have observed in the past when testing the I2P Java desktop software, people do not want to spend, or do not have, the time to download and configure something. It is exhausting to have to configure things, especially in a new or unfamiliar environment. Motivation to keep using the software can and does fail when it stops working for some reason, or when the UX does not offer an experience that is comparable, let alone pleasurable, next to something that is not considered secure or private.

We should always consider and explore where and why these trade-offs happen. In open source teams, resources are focused on development roadmaps and on maintainers keeping the lights on. What we need are better community feedback loops, where new users, designers and digital trainers can work more closely with developers to support more adoptable outcomes. Maybe we think of it as opening open source, and reconsider how product development pipelines work with regard to free software and its sustainability. Could this improve not just retention of people using privacy tools, but also our overall relationship to privacy, security and the technology we rely on every day?

The next part of our outreach series will focus on trust modelling, where we learn more about perception and building relationships.