Elizabeth Venafro · Feb 2, 2021 · 7 min read

When Saving Lives Infringes on Personal Privacy

The use of emergency messaging systems has risen in recent years, with common use cases such as severe weather mass alerts and COVID-19 updates, along with preventative measures that leverage new artificial intelligence (AI) technologies. These innovations are extremely valuable but, as in every other industry catapulted forward by technology, a familiar concern arises: privacy.


AI and data science have the ability to infer a lot of information based on people's actions. Regulations such as CCPA and GDPR require organizations to disclose how the data they collect will be used. Even so, while these technologies are designed to help people, they can still raise privacy concerns. The question is: where does public safety cross the line and become a privacy issue?


Here are four ways organizations can walk the fine line between protecting their people and protecting personal data:

1. Implement an "Opt-In" Emergency Notification System for Mass Alerts


Whenever people offer up personal information, there is a sense of empowerment that comes from knowing they chose to opt in on their own terms. By asking users to self-register for an emergency notification system, organizations put user management in subscribers' hands, which also reduces the time and IT resources needed for setup and maintenance over the lifespan of the mass alert system.


A potential hurdle with an opt-in system is that it generally results in lower subscriber numbers, at least initially. With this in mind, organizations may need to invest time upfront to promote the emergency mass notification system to their community. That said, as time passes and the mass notification software's benefits become more apparent, this concern fades. Click here to read more about Opt-In vs. Opt-Out.
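The opt-in model described above can be sketched in a few lines: subscribers explicitly register their own contact preferences and can remove themselves at any time, so no personal data enters the system without consent. This is a minimal illustration, not Omnilert's actual implementation; all names here are hypothetical.

```python
# Minimal sketch of an opt-in subscriber registry (illustrative names only).
class OptInRegistry:
    def __init__(self):
        self._subscribers: dict[str, dict] = {}

    def opt_in(self, user_id: str, channels: list[str]) -> None:
        """Store contact preferences only when the user explicitly opts in."""
        self._subscribers[user_id] = {"channels": channels}

    def opt_out(self, user_id: str) -> None:
        """Remove the subscriber entirely -- the user stays in control."""
        self._subscribers.pop(user_id, None)

    def recipients(self, channel: str) -> list[str]:
        """List subscribers who chose to receive alerts on this channel."""
        return [uid for uid, prefs in self._subscribers.items()
                if channel in prefs["channels"]]

reg = OptInRegistry()
reg.opt_in("alice", ["sms", "email"])
reg.opt_in("bob", ["email"])
```

Because users manage their own records, the organization never maintains contact data for anyone who has not asked to receive alerts.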

2. Make Location Sharing Voluntary


Similar to implementing an opt-in emergency communication system, the sharing of location should be voluntary. Many people are understandably hesitant to share their location data, as it conjures up images of “Big Brother” surveillance, but they may be more willing to do so when they understand how useful it can be in a dangerous situation.


Emergency notification apps that ask for private information, such as location, can raise user concerns and discourage installs. In fact, estimates suggest that only 7% of users actually download safety apps, and only 10% of those enable location services. You can read more on this in our blog post, The Dirty Little Secret of Emergency Notification Apps.


Omnilert helps you overcome these obstacles and reach 100% of subscribers by making the emergency alert app experience available in both Native and Instant App formats. Omnilert's Instant App delivers the emergency app experience to every subscriber without requiring a download: users are taken to their personalized app experience through a link appended to each message sent.


Additionally, Omnilert's two-way mass engagement platform simplifies and increases subscriber location capture. Omnilert Engage provides a method to receive location information from a high percentage of your user base, while still respecting personal privacy. It is a practical, privacy-conscious approach to capturing subscriber location, with no app download required.


By sharing their location, users receive the real-time messages most relevant to them, which can make a huge difference in a crisis such as an active shooter situation.
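The "link appended to messages sent" pattern described above can be sketched as follows: each outgoing alert carries a unique, per-subscriber URL that opens a personalized, no-download experience. This is an assumption-laden illustration, not Omnilert's actual API; the domain, function names, and token scheme are all hypothetical.

```python
# Hypothetical sketch of appending a per-subscriber link to an alert message.
import secrets

BASE_URL = "https://alerts.example.org/m"  # placeholder domain, not a real service

def build_alert(body: str, subscriber_id: str) -> str:
    """Append a unique, per-subscriber link to an outgoing alert."""
    token = secrets.token_urlsafe(8)  # one-time token identifying this send
    link = f"{BASE_URL}/{subscriber_id}?t={token}"
    return f"{body} Details: {link}"

msg = build_alert("Severe weather warning for campus.", "u123")
```

Because the link resolves in the browser, the subscriber gets an app-like experience without ever installing anything, which sidesteps the low install rates cited above.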

3. Be Transparent About How Data Is Used For Mass Alerts


While there is a certain segment of people who avoid sharing any personal information at all costs, there is also a significant number who are fine with it as long as they know how their data is being used. People appreciate organizations that are upfront and honest about their data privacy policies, and that is no different when it comes to emergency notification software.


If users know and understand how providing some basic data can help in an emergency situation, they may be more willing to share it to help make the notification system more effective.

4. Shift From Facial Recognition To Weapon Detection 


The issue of bias has become a growing concern with both artificial intelligence and machine learning. For example, there could be concerns that video surveillance cameras that use AI to detect potential active shooters might unfairly flag individuals based on ethnicity or skin color. According to a recent article in Fast Company, "shifting the focus from detecting human faces to detecting the instruments of harm they carry with them might be a way to satisfy security needs while preserving civil liberties." 


An emergency notification system that forgoes facial recognition and focuses solely on whether a firearm is being drawn sidesteps this bias. In other words, it is not about recognizing and storing images of people's faces; it is about verifying whether a gun is detected so the proper safety protocols can be initiated as quickly as possible.


Omnilert Gun Detect, the emergency mass notification industry's first visual gun detector, leverages AI to reliably and rapidly recognize firearms and immediately trigger multi-channel mass alerts and automated pre-defined workflows. Designed for privacy and performance, Gun Detect is a software solution typically deployed on-premise and integrated with existing IP-based video surveillance and camera systems. 
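The detect-then-trigger architecture described above can be sketched as an event dispatcher: a high-confidence firearm detection fans out to pre-defined response workflows (alerts, lockdowns, and so on). This is an illustrative sketch under assumed names and thresholds, not Omnilert Gun Detect's implementation; note the event carries only a camera ID and an object label, never facial data.

```python
# Illustrative detection-event dispatcher: firearm detections above a
# confidence threshold trigger every registered workflow (hypothetical design).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Detection:
    camera_id: str
    label: str        # e.g. "firearm" -- no facial data is captured
    confidence: float

def make_dispatcher(threshold: float = 0.9):
    workflows: list[Callable[[Detection], str]] = []

    def register(workflow: Callable[[Detection], str]):
        """Add a pre-defined response workflow."""
        workflows.append(workflow)
        return workflow

    def handle(event: Detection) -> list[str]:
        """Run all workflows for high-confidence firearm detections only."""
        if event.label != "firearm" or event.confidence < threshold:
            return []
        return [w(event) for w in workflows]

    return register, handle

register, handle = make_dispatcher()

@register
def sms_alert(e: Detection) -> str:
    return f"SMS: firearm detected at camera {e.camera_id}"

@register
def lockdown(e: Detection) -> str:
    return f"LOCKDOWN: doors near {e.camera_id} secured"

results = handle(Detection("lobby-3", "firearm", 0.97))
```

Running the workflows on-premise, as the paragraph above describes, keeps the video stream inside the organization's own network.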


THAT is how organizations can save lives while also keeping personal data safe.


The hard reality is that, in today’s society, mass notification systems and artificial intelligence are essential to keeping communities safe. These systems are more powerful than ever, but with those advances comes concern over privacy. Organizations that follow the best practices above will get the most from their emergency messaging systems and stand a better chance of keeping their communities safe.


Click here to read our Opt-In vs. Opt-Out Whitepaper.