So, you wanna build an app?
Part III: Safety First
Minimizing safety risks for victims of abuse who use your app is a daunting but crucial process. Remember that survivors may be in crisis, in danger, or have someone monitoring their device when they’re using your app. This post discusses how you can address and minimize some of these safety risks.
Your App Could Be a Safety Risk
Victims of abuse are most at risk when they attempt to leave their abusive partner or to limit the abuser’s control. Simply having a safety app on their device could indicate that the victim is seeking information or help, and the abuser could escalate their control and abuse in response. While you can’t remove that risk entirely, you can take steps to minimize it.
Inform the User
The first step is to inform users of the risks they might face if they download your app. Some survivors may be aware that their devices are being monitored and know to be careful about what they download, but others may never have thought about that risk before and may not have considered that the abuser could see the app and discover that they are seeking help.
This reminder should take place before they download the app. It should be noted in the app store description, and in other places that describe the app. For example, the Tech Safety App provides notices about potential monitoring by abusive partners and suggests that users only access the app from a safer device. These notices appear on the app’s informational website, in the app description in both the Apple App Store and Google Play, and as part of the onboarding process after someone downloads the app. These reminders both inform potential users of the risks of downloading the app and encourage them to wait until they are on a safer device.
Other Safety Strategies That May or May Not Work
Quick Escape – Most websites for survivors of abuse have a “Quick Escape” or “Exit” button so that they can leave the site quickly if they’re worried that someone is monitoring their internet use. However, this can be a challenge for apps, since having an exit button can take up valuable screen space. It’s also unnecessary because it’s often very easy to quickly close an app. Since building an “Exit” button throughout an app isn’t practical, the best way to inform users of possible monitoring is to inform them before they download the app.
Disguised Apps – Some apps have been designed to look like something else, such as a news app or a calculator, but are actually apps to help domestic violence or sexual assault survivors. While a disguised icon might avoid raising the suspicions of an abusive partner, this strategy has significant drawbacks. The Apple App Store doesn’t allow these types of apps, or requires an explanation of what the app actually is in the app description, which may defeat the purpose of the disguise. Users also won’t be able to find the app unless they know exactly what it’s called and what the icon looks like. If the icon changes as part of an update and the survivor doesn’t notice, the app may become hard to find or may be deleted by accident. Survivors may also forget the fake name if they don’t use the app regularly, making it difficult to find in a time of crisis. And if someone happens to open the app, they’ll immediately see that it isn’t whatever it’s pretending to be.
In some cases, app developers build the disguised app and hide the domestic violence/sexual assault content within it. While this might minimize the risk of someone opening the app and immediately seeing that content, it can also make the content harder for survivors to reach quickly in a crisis.
Passwords – Some apps use a password to protect the app (or parts of it) so that only someone with the password can access it. This strategy works to a certain extent, particularly if there’s private or sensitive information the survivor wants to keep protected in case someone goes through the device. Keep in mind, though, that a password-protected app might raise the suspicions of an abusive person who is used to having full control over the device. This strategy may be best for someone whose abuser generally doesn’t have access to the device, but who wants additional privacy protection for the information they are accessing or storing. Offering the password as a security option rather than a default setting can be helpful for survivors, because it lets them tailor the app to their unique circumstances.
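To make the opt-in idea concrete, here is a minimal sketch (in Python for brevity; a real app would use the platform’s keychain/keystore and lock-screen APIs instead). It assumes the app stores only a salted, slow hash of the passcode, never the passcode itself, and that protection is off by default. The `settings` dictionary and function names are hypothetical.

```python
import hashlib
import hmac
import os

def hash_passcode(passcode, salt=None):
    """Derive a slow, salted hash of the passcode; only salt + hash are stored."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)
    return salt, digest

def verify_passcode(passcode, salt, stored_hash):
    """Check an entered passcode using a constant-time comparison."""
    _, digest = hash_passcode(passcode, salt)
    return hmac.compare_digest(digest, stored_hash)

# Passcode protection is opt-in, not a default setting.
settings = {"passcode_enabled": False, "salt": None, "hash": None}

def enable_passcode(passcode):
    """Survivor explicitly turns protection on for their circumstances."""
    salt, digest = hash_passcode(passcode)
    settings.update(passcode_enabled=True, salt=salt, hash=digest)
```

Storing a hash rather than the passcode means that even someone who extracts the app’s data can’t read the passcode back out of it.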
Be Aware of Unintentional Access to App Content
There are many ways that app content can be accessed without the knowledge of the survivor, simply by the way the device may be connected to other technologies. For example, some devices are set up to automatically connect to smart TVs, speakers, or cars via Bluetooth. If your app contains multimedia, build the app so that files don’t automatically start playing when the device connects to a speaker or other technology. Also consider naming multimedia files in a way that doesn’t reveal anything if someone happens to see the file name on a media player.
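The two precautions above can be sketched as follows. This is a hypothetical illustration (Python for brevity): the `AUTOPLAY_ON_CONNECT` flag and `on_audio_route_connected` callback stand in for whatever audio-session hooks the actual platform provides, and `neutral_filename` shows one way to strip revealing names from stored media.

```python
import pathlib
import secrets

# Never start playback just because an audio output (car, speaker) connected.
AUTOPLAY_ON_CONNECT = False

def on_audio_route_connected(player):
    """Hypothetical callback for when the device pairs with a speaker or car.

    Deliberately does nothing: playback should begin only on an
    explicit user action, never automatically on connection."""
    if AUTOPLAY_ON_CONNECT:
        player.play()

def neutral_filename(original_name):
    """Replace a revealing media filename with a random one, keeping the
    extension so the file still plays, e.g. 'evidence_recording.mp3' ->
    '3f9a….mp3'. Nothing about the content leaks through the name."""
    ext = pathlib.Path(original_name).suffix
    return secrets.token_hex(8) + ext
```

If a connected car stereo or TV ever lists the app’s files, the random names reveal nothing about what they contain.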
Safety and Privacy When Collecting Sensitive Information
Some safety apps encourage users to store personal information, either in the app itself or in the cloud via the app. This might include contact information, a journal logging the abuse, or photo/video/audio evidence of abuse. It’s critical that users of these apps are notified of the safety risks involved in storing information this way. If the information is stored on the device, users should be warned that anyone with access to the device might be able to see it.
Additionally, if your app collects and stores any private information connected to its users, you should have a privacy and security policy that clearly explains what information the app is collecting, why it is being collected, and who has access to it. If your app is using a third-party service to store the information, or if it shares the information with another company, it’s vital to let users know how to find that third-party’s privacy and security policies.
In cases where personal information is being stored on the user’s own cloud-based service, such as Dropbox, they should be notified of the related privacy and security risks. Many users don’t know how easily cloud-based services can be accessed. If the abusive person knows the victim’s password or has access to a device the account syncs with, all of the information stored could be easily accessed, manipulated, or deleted. If your app encourages users to use their personal cloud storage service, provide them with information about how they can increase their privacy and security when using these services.
This series was written by our sister project, the U.S. Safety Net Project at the National Network to End Domestic Violence. This series is based on lessons learned when developing the NNEDV Tech Safety App, and in reviewing dozens of apps created for victims of domestic violence, sexual assault, and stalking.