Recently, the ICT4D Conference and NetHope hosted a webinar on security lessons learned and good practice in data protection, featuring Joel Urbanowicz, Director for Information Security and ICT Operations, Catholic Relief Services; Stuart Campo, Team Lead – Data Policy, Centre for Humanitarian Data, United Nations Office for the Coordination of Humanitarian Affairs (OCHA); Jay L. Manns, Data Protection Officer, Director – Information Security, Mercy Ships; and Keil Hubert, Head of Security Training and Awareness, The Options Clearing Corporation (OCC). These presenters graciously lent us their expertise for an hour to discuss the need to be increasingly cognizant of information security and data protection. With the growing use of digital tools in international development and humanitarian aid, data is being created at an unprecedented pace, and data from our sector is particularly sensitive. Consider a health program that collects information from HIV patients using digital data collection, or a program helping refugees during a protracted conflict: if that data fell into the wrong hands, violence could erupt, lives could be lost, and the reputation of the organization running those programs could be irreparably damaged. Resources were shared, questions were answered, and a thoughtful discussion was had. In the following paragraphs I will summarize what each presenter discussed, briefly cover the Q&A segment, and provide answers to any questions that were not addressed.
First up was Joel Urbanowicz of CRS, who discussed the near-catastrophic and publicly reported Red Rose data breach of 2017. Some may remember that in the week of Thanksgiving that year, CRS had a near-breach event, written about in both Devex and IRIN News, that involved the potential leak of program participant information. Rather than get into the nitty-gritty of the event itself, Joel focused on the lessons it teaches about preventing incidents like it. The first lesson is how important preparation is before an event actually occurs. Around preparation, Joel mentions two things: how to deal with these events, and how to make them less likely to occur. We must all keep in mind that these events happen quite frequently and often go unreported, for obvious reasons, so it is very likely that your organization will have to deal with one. If you don't have a pre-determined plan in place (and a plan B), Joel says, you're going to have a bad time scrambling, losing time while things get away from you. The second lesson is that crisp, clear communication is incredibly important, both internally and externally, with your partners, your donors, host governments, and anybody else involved. This dovetails nicely with Joel's third point: transparency is absolutely the best policy. To drive this point home, he describes how in the private sector there are attempts to obfuscate or be less than totally honest about what happened. Don't try to hide what happened; the truth will eventually come out, and you will be much worse for it. Joel closes out his segment with two things you can do to lessen the impact of a breach: conduct privacy impact assessments and build a responsible security approach.
Next, we have Jay L. Manns, Data Protection Officer, Director – Information Security for Mercy Ships, who opens by agreeing with Joel's point about privacy impact assessments, but wants to offer a slightly more technical perspective. First, though, he reminds us just how expensive a breach event can get: this is an economic war for malicious actors on the internet, with over two trillion dollars a year at stake. Jay quotes a Verizon data breach report finding that more than four out of five publicly known breaches would have been preventable had organizations applied security patches and stayed current. The moral of the story? Patch your software, patch your servers, patch your desktops, patch your mobile devices! Speaking of mobile devices: they can be tremendously useful for health professionals and as a tool for international development, but a lost or stolen device can be devastating to an organization that doesn't enforce strict use of passwords, auto-locking, and encryption. Encryption is ultimately a get-out-of-jail-free card for people who have lost their phone, because if the device is encrypted, its loss is generally no longer considered a breach that requires reporting. I'm sure almost everyone reading this has lost a cell phone before… I know I have! Jay notes that nearly all privacy laws worldwide today treat a lost or stolen phone that isn't password protected as a breach, and under GDPR you have 72 hours by law to report it. Jay's next point connects with Joel's about having a plan B and practicing that plan, not just within your direct team but also with marketing, finance, legal, and so on, as these events can get out of control very quickly. If your donors cannot trust that you're taking proper precautions to protect their data, your reputation will suffer tragically.
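For teams building incident-response checklists or tooling, the 72-hour window Jay mentions can be made concrete. The sketch below is a hypothetical helper (not something presented in the webinar) that computes the GDPR Article 33 notification deadline from the moment a breach is discovered:

```python
from datetime import datetime, timedelta, timezone

# GDPR Art. 33: notify the supervisory authority without undue delay
# and, where feasible, within 72 hours of becoming aware of the breach.
GDPR_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(discovered_at: datetime) -> datetime:
    """Latest time a breach notification is due under the 72-hour rule."""
    return discovered_at + GDPR_NOTIFICATION_WINDOW

def hours_remaining(discovered_at: datetime, now: datetime) -> float:
    """Hours left before the window closes (negative means overdue)."""
    return (notification_deadline(discovered_at) - now).total_seconds() / 3600

# Example: breach discovered May 1 at 09:00 UTC
discovered = datetime(2024, 5, 1, 9, 0, tzinfo=timezone.utc)
print(notification_deadline(discovered))  # 2024-05-04 09:00:00+00:00
print(hours_remaining(discovered, discovered + timedelta(hours=24)))  # 48.0
```

Using timezone-aware timestamps matters here: a deadline computed in local time across offices in different countries can silently drift by hours, which is exactly the margin this rule does not give you.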
To close out his portion of the webinar, Jay makes a point worth remembering: after a severe breach, a remediator may come in and say, 'Well, if you had done a privacy impact assessment and considered this, it might not have occurred.' That may feel like punishing the victim, but it is common practice.
Thirdly, we have Stuart Campo, Team Lead – Data Policy at the Centre for Humanitarian Data, OCHA, who offers practical guidance for data incident management. First, Stuart describes a Guidance Note on Data Incident Management that seeks to establish a common understanding of data incident management among humanitarian organizations. The need for this arises, as Stuart points out, because the only way we are going to start proactively managing incidents like these is with a shared language and a clear approach to data incident management. Without this, humanitarian organizations risk exacerbating existing vulnerabilities, as well as creating new ones, which can lead to adverse effects for affected people and aid workers. Resources like the guidance note need to be invested in not only by individual organizations, but by the sector as a whole. Stuart then discusses the importance of conversations like the one taking place in the webinar, as this kind of knowledge sharing helps others avoid the same mistakes and helps us understand some of the underlying vulnerabilities within the humanitarian sector. Stuart explains that vulnerabilities are much easier to manage than threats; you cannot do much about threats beyond being aware of them and having mitigating measures in place. Stuart concludes that the best place to intervene is in identifying and addressing existing vulnerabilities and understanding which practices, tools, and policy instruments lead to good and safe data practices. Knowing these things will help us find gaps that could lead to actual risk and harm; without knowing them, how could we hope to prevent them?
Last (but certainly not least), we have Keil Hubert, Head of Security Training and Awareness, The Options Clearing Corporation (OCC), who offers practical advice on training staff throughout one's organization on cybersecurity. Keil begins by pushing back on the notion that human beings are the weakest link in the defense against cybersecurity threats: yes, they may be the weakest link, but they can also be one of the strongest assets, because of a human's ability to think rationally, recognize patterns, and genuinely know when something feels odd. He then offers three points of advice that are relevant for private businesses and humanitarian organizations alike. First, we must deal with people who think security isn't their job, or that 'my organization has a whole information security department, let them worry about it!' In fact it is everyone's responsibility, and that is something individuals and organizations need to understand and apply to their day-to-day work. Second is what Keil describes as the 'immediate manager issue': a line supervisor who makes sending a piece of sensitive information so urgent that they are willing to sacrifice security for expediency. The employee is stuck between a rock and a hard place, because even if they know they need to encrypt and send their data securely, the line manager is standing right there and can chastise them for not following orders, while the data security department might be in another country and isn't there to enforce its policies. Third is the mingling of personal and professional information and communications. We all know how this goes: sending private and business emails from the same account, working from both a work and a personal laptop, and so on.
This creates a blurred threat model: we come to think of all information as affected by the same threats, when in fact personal and professional information face different threats and should be considered and treated differently. Keil closes out his segment with a final piece of advice: all the equipment and best policies in the world will not stop an issue caused by a single person, so organizations need to reach people where they are, using analogies and metaphors that make security real and personal for them. Data can't protect itself, which is why educating people is at the heart of data protection.
After the wonderful presentations by each of the four presenters, we get to the question-and-answer portion of the webinar. While it isn't possible to describe every question the audience asked, the following were some of the more robust inquiries. First is a question for Joel about how to balance the need for transparency with ensuring that data subjects have confidence in your organization. Joel responds that, in his experience, it is best to report incidents where you may have acted irresponsibly in a way that has a direct impact on your program participants or users; the bar is much higher for humanitarian organizations like CRS, as we have a duty to do no harm. Not to mention, if you try to hide something that happened and it eventually comes out, it will be far worse for your organization's reputation than if you had just been upfront about it from the get-go. Another excellent question, this time for Keil, is about the best way to enact organization-level change. First, he explains, you need to understand what the organizational culture actually is; then you need to start finding ways to modify it, translating your intent, policies, processes, and controls into terms that people can actually understand and fit into their view of how the organization works. Next, where there is conflict between the policies you'd like to enact and the way the people actually doing the work see things, you need to compromise in a way that achieves the desired objective without forcing people to unlearn or break their concept of what they do or who they are. Jay is then asked whether Mercy Ships has employed 'data champions' throughout the organization. Jay explains that they have privacy champions placed in each of their business units across the agency, as Mercy Ships works all around the world.
He explains that you need to educate at that level in order to enact change from the front lines; Jay feels this ground-up approach is the best way to alter the data culture of an organization. Stuart then responds to a question on how organizations can better cope with, or collaborate around, data protection, particularly regarding any open-source state data. Stuart suggests it may be best to address this flexibly at the program level first, then develop a cluster system or working group that can build its own 'light touch' data incident management structure, so that individual programs aren't wholly responsible for these concerns on their own. This ties in nicely with the other responses: the best way to change the organizational culture around data protection is to work from the ground up, spreading data champions through the agency and throughout every process, integrating it into daily activities for all staff!
To close, it is worth noting that there were a number of questions that came in through the webinar chat, and you can see most of those listed out and answered by our presenters here.
Paul S. Wiedmaier
ICT4D Knowledge Management
Catholic Relief Services