Goals
- Understand whether what we are doing is making an impact / creating change
- Ask whether staff are following the practices
- What is working well already?
- What isn't working?
- What progress have we made together?
- What should we change?
Elements
- "River of Life" exercise
- Capture lessons learned
- Stress test
- Ask whether staff feel more secure and more aware of risks
- The things you can see: if there is a concrete agreement (e.g., remove X from your computer by a given day), you can walk around and check who has done it. Post-it notes with passwords disappear from computers. PGP keys are created and uploaded to a key server.
- The amount of internal communication exchanged over encrypted channels.
- Security is discussed more often in the organization.
- An increase in emails related to security (you would know the content relates to security because it is in the subject line, questions are sent to a dedicated address, there is an email list for that topic, etc.)
- Have a measurement for the adoption of each tool?
- Are the tools working?
- With indicators, you can measure the extent to which each objective has been achieved.
- You could visualize the metrics (listed above) to show the impact achieved so far (how many people are using PGP, 50% of staff are using LibreOffice, etc.).
- Use the threats the organization has prioritized to protect against, and assess how much safer it is against each threat now.
- Matrix:
- Prioritized areas from risk assessment:
- Threat 1 - highest priority
- Threat 2 - next highest priority
- You could cluster specific tasks, behaviors, and tools per objective, and then measure the adoption of each one (see the sketch after this list). For example:
- Objective: staff is safer during protest
- mobile phone KeePass - 67% of staff have implemented this
- red button implementation - 80% of staff have implemented this
- position reporting protocol - 45% of staff have implemented this
- Do you feel safer now?
- Feedback from real-time situations - did the tools/systems/behaviors make an impact during that situation?
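As a rough illustration of the clustering idea above, here is a minimal sketch in Python. The objective, task names, and staff counts are hypothetical examples taken from the list, not real data or a prescribed tool:

```python
# Minimal sketch: per-objective adoption tracking.
# Objective, task names, and counts are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    adopted: int  # staff who have implemented the task
    total: int    # staff expected to implement it

    @property
    def adoption_rate(self) -> float:
        return self.adopted / self.total if self.total else 0.0

def report(objective: str, tasks: list[Task]) -> None:
    """Print the adoption percentage for each task under an objective."""
    print(f"Objective: {objective}")
    for task in tasks:
        print(f"  {task.name}: {task.adoption_rate:.0%} of staff")

report(
    "staff is safer during protest",
    [
        Task("mobile phone KeePass", adopted=10, total=15),       # ~67%
        Task("red button implementation", adopted=12, total=15),  # 80%
        Task("position reporting protocol", adopted=7, total=15), # ~47%
    ],
)
```

The same percentages could feed the visualizations mentioned earlier, or be compared across check-ins to show movement over time.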
Can we "test" staff's understand and behavior by trying to attack them via spear phishing, etc? Is this ethical? Some trainers have a principle of "ethical hacking", which includes:
- Consent from participants
- Time-bound agreement (we will only try to hack your device within this time period)
- Agree not to use, share, or store the information that is collected
Considerations:
- This could really impact trust between the trainer and the staff. How can the staff trust that the trainer is truly ethical and responsible with the information they collect?
- Avoid alienating people by shaming them, or telling leadership who failed the test
Incentives can both motivate staff and create measurement opportunities:
- Competitions to implement specific tasks, tools, or behaviors within a set amount of time. Once staff have done it, they send a message to the team, which motivates others.
- Awards
Approach: Using Outcome Mapping to Evaluate Progress and Results in Org Sec
This is an attempt to apply Outcome Mapping (OM) to evaluating Org Sec work. Originally, OM was used to evaluate development projects, specifically during the ICT4D years. But there are concepts in the OM framework that might be useful for org sec mentors / trainers / supporters in looking at incremental progress, at actors within the organisation, and at how they all contribute to an overall goal of org sec. OM can also be used as a methodology to clarify macro-level goals with an organisation and build space for buy-in and trust. OM requires a participatory process. Before using OM, it is assumed that the mentor and organisation have already:
- undergone an initial risk assessment
- prioritised risks and mitigation strategies
- learned about the organisational culture and practice
- mapped information flows
Stage 1: Intentional Design
In this stage, usually in a workshop setting, the organisation agrees on the big (macro-level) goals of its organisational security and drills each big goal (e.g., secure communications) down into achievable Progress Markers. Progress Markers, simply put, are the incremental steps towards achieving bigger outcome challenges. In this stage, you are attempting to answer four key questions:
- Why? The vision of securing this organisation.
- How? The prioritised mitigation strategies and how they connect to the why.
- Who? Boundary partners – or the staff and the teams within the organisation and their roles within the org sec process
- What? Macro-level goals (Outcome Challenges) and incremental ways to achieve them (Progress Markers)
One way to think about Progress Markers is to divide them into: expect to see (short-term markers), like to see (medium-term markers), and love to see (long-term markers). So, for the outcome challenge "secure staff email communications" (which contributes to the bigger goal of secure communications), progress markers might include (a simple way to record these follows the list):
- Expect to see:
- All staff set up Riseup or Tutanota email accounts
- All staff using their Riseup or Tutanota email accounts
- Like to see:
- Staff using their Riseup or Tutanota email accounts over their Gmail / Yahoo accounts
- Staff using PGP
- Love to see:
- Staff able to train partners in the use of PGP
(These are simplified examples for an imaginary organisation with minimal tech capacity. This can change, depending on the tech capacity within the organisation and other factors.)
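If a mentor wants to track these markers digitally, one possible representation is sketched below. The structure and field names are assumptions for illustration; OM itself does not prescribe a format:

```python
# Sketch: recording progress markers for one outcome challenge.
# Marker texts mirror the example above; the structure is an assumption.
progress_markers = {
    "expect to see": [
        {"marker": "All staff set up Riseup or Tutanota email accounts", "done": True},
        {"marker": "All staff using their Riseup or Tutanota email accounts", "done": False},
    ],
    "like to see": [
        {"marker": "Staff using Riseup/Tutanota over Gmail/Yahoo accounts", "done": False},
        {"marker": "Staff using PGP", "done": False},
    ],
    "love to see": [
        {"marker": "Staff able to train partners in the use of PGP", "done": False},
    ],
}

for tier, markers in progress_markers.items():
    achieved = sum(m["done"] for m in markers)
    print(f"{tier}: {achieved}/{len(markers)} markers achieved")
```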
Stage 2: Outcome and Performance Monitoring
In this stage, the progress markers are turned into measurable outputs, and the team also monitors for performance issues and failures. This is where metrics come into play. Metrics have to be time-bound in order to help monitor progress. Using the "email security" progress markers above, possible metrics could be (a sketch for tracking them follows the list):
- number of staff who have migrated from Gmail / Yahoo to Riseup or Tutanota within six months
- number of staff trained in PGP compared with number of staff who use PGP for their emails
- an increase in the number of PGP emails sent by the org's staff within six months
- number of partners who are trained by staff members in the use of PGP
- number of staff members who are training others in the use of PGP
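To make these time-bound metrics concrete, here is a small sketch that checks a hypothetical six-month migration target and the gap between staff trained in PGP and staff actually using it. All numbers, dates, and targets are invented for illustration:

```python
# Sketch: time-bound metric checks for the "email security" markers.
# All numbers, dates, and targets are hypothetical examples.
from datetime import date

DEADLINE = date(2025, 6, 30)  # end of the six-month window (assumed)

def migration_progress(migrated: int, total_staff: int) -> float:
    """Share of staff who moved from Gmail/Yahoo to Riseup or Tutanota."""
    return migrated / total_staff

def pgp_usage_gap(trained: int, using: int) -> int:
    """Staff trained in PGP who are not yet using it for email."""
    return trained - using

print(f"Migration: {migration_progress(9, 15):.0%} of staff "
      f"(target: 100% by {DEADLINE})")
print(f"PGP gap: {pgp_usage_gap(trained=12, using=7)} trained staff "
      f"not yet using PGP for email")
if date.today() > DEADLINE:
    print("Six-month window has closed; time to evaluate this marker.")
```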
Stage 3: Evaluation Planning
In this stage, the team figures out the logistics of monitoring and evaluation:
- how do we document progress and failure?
- what do we use to measure the output (interviews, surveys, etc)?
- when do we pause to assess whether the org sec intervention is actually happening?
- how often do we pause to assess progress?
- when do we stop evaluating and monitoring?
It is important to have a plan in place so that the M&E doesn't take over the implementation of the work.
Key resources:
Examples of the evaluation questionnaires that we ask participants to fill out 4-6 weeks after the training: