Observation Management Procedure
Purpose
Observations, also commonly referred to as findings, exceptions, or deficiencies, are any Tier 3 information system risks identified as an output of compliance operations or through other mechanisms, such as a team member self-identifying a system-specific risk.
Scope
This procedure covers risks identified at the information system or business process level. The volume and granularity of these Tier 3 risks make it impractical to track them via the Bramble Risk Register. This observation management process guides team members on how to track Tier 3 risks.
Roles and Responsibilities
| Role | Responsibility |
|---|---|
| Security Compliance | Responsible for executing Security control tests to assess the design and operating effectiveness of Security and IT general controls. |
| Internal Audit | Responsible for executing Internal Audit control tests to assess the design and operating effectiveness of all internal controls as required by the audit plan. |
| Risk and Field Security | Responsible for executing Third Party Risk Management (TPRM) risk and security assessments to determine the risk associated with third party applications and services. |
| Observation Identifier | Responsible for being the observation DRI through the observation lifecycle including designing remediation plans in order to meet legal and regulatory requirements. |
| Remediation Owner | Validates the observation, confirms the assignee and stop date (due date), finalizes the remediation plan, and conducts remediation activity based on defined remediation SLAs. |
| Observation Program DRI | Responsible for regular reviews of program health and delivery of stakeholder reports. |
| Managers to Executive Leadership | Responsible for escalation as necessary and for allocating resources for remediation activity. |
Observation Phases Overview
{{< mermaid >}}
graph TD;
A[Identified] --> B[Assigned];
B --> C[Remediation in progress];
B --> D[Ignored or Invalid];
C --> F[Resolved];
{{< /mermaid >}}
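The lifecycle diagram above can be sketched as a simple state machine. This is an illustrative model only, not part of any Bramble tooling; the state names mirror the diagram, and the helper function name is an assumption.

```python
# Hypothetical sketch of the observation lifecycle as a state machine.
# State names mirror the diagram above; allowed transitions follow the arrows.
VALID_TRANSITIONS = {
    "Identified": {"Assigned"},
    "Assigned": {"Remediation in progress", "Ignored or Invalid"},
    "Remediation in progress": {"Resolved"},
    "Ignored or Invalid": set(),  # terminal state
    "Resolved": set(),            # terminal state
}

def can_transition(current: str, target: str) -> bool:
    """Return True if the lifecycle allows moving from current to target."""
    return target in VALID_TRANSITIONS.get(current, set())
```

For example, an observation cannot jump straight from "Identified" to "Resolved"; it must first be assigned and remediated.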
Procedure
The following phases walk through the observation lifecycle.
Identifying Observations
Observations can be identified in the following ways:
- Internal audit activities
- Security control testing
- Third Party Risk Management (TPRM) activities
- Customer Assurance activities
- External audits
- Third party application scanning (BitSight)
- Ad-hoc issues
Observation Identifier Responsibilities Based on Observation Type:
| Identified Observation | Responsible Party |
|---|---|
| Internal audit activities | Security Compliance |
| Security control testing | Security Compliance |
| Third Party Risk Management (TPRM) activities | Security Compliance |
| Customer assurance activities | Security Compliance |
| External Audits | Security Compliance |
| Third party application scanning (BitSight) | Security Compliance |
| Ad-hoc issues | Security Compliance |
Recording Observations
The Observation Identifier serves as the Observation DRI throughout the observation lifecycle. They complete all necessary observation information and remediation recommendations for the Remediation Owner, and manage the observation through its lifecycle: creating the observation, validating it with the Remediation Owner, tracking all remediation progress, and updating the associated ZenGRC issue with current information and status updates. Each observation has both a GitLab Issue (for Remediation Owners) and a mirrored ZenGRC Issue (for Observation Identifiers). Each observation is assigned a risk rating, which should drive the priority of remediation.
If multiple observation issues relate to the same root cause or are blocked by the same component of work, they will be grouped into an Epic to make the connections between these observation issues clearer.
To ensure transparency across the organization, Security Compliance documents observations in the Assurance project.
Observation Risk Ratings
Tier 3 information system risk ratings are based on the STORM risk rating methodology.
Risk Rating = Likelihood x Impact
Determining the likelihood
At Bramble, observations are rated based on the likelihood of the observation recurring and/or the frequency with which the control has produced observations. The criteria used to assess this likelihood can be found in the Likelihood Table below. Note that there are two different definitions for each likelihood rating level:
- Control Observation: This criterion is used to rate observations identified as an output of control testing (e.g. where control testing performed internally by Internal Audit and/or Security Compliance has failed). The Likelihood Table assumes observations are considered individually rather than in aggregate (i.e., if 2 similar observations occur against a single test of a sample of 25, the failure rate is 8% and would be scored a 3; the control does not need to be tested multiple times in the current year or prior 9 months with an observation each time to meet the requirement for a score of 3).
- Information System Risk (Tier 3): This criterion is used to score the likelihood of an information system being exploited (e.g. insufficient encryption mechanisms for the storage of data within [System Name] result in the unintentional exposure/leakage of this information to the public).
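The control observation arithmetic above (2 similar observations against a sample of 25 is an 8% failure rate) can be sketched as follows. This is an illustrative helper, not part of the documented methodology; the function name is an assumption.

```python
def failure_rate(observations: int, sample_size: int) -> float:
    """Return the control test failure rate as a percentage.

    Per the example in the text, 2 similar observations against a
    single test of a sample of 25 yields an 8% failure rate.
    """
    return observations / sample_size * 100
```

The resulting percentage is then mapped to a likelihood score using the Likelihood Table below.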
Likelihood Table
Determining the observation impact
In addition to applying a qualitative scoring factor for likelihood, all observations need to be evaluated for the impact they would have on Bramble at the organization level and/or the compliance impact (if applicable). The criteria and qualitative scores for assessing the impact of an observation can be found in the Impact Scoring Table below. The highest rating in any field is the final impact score of the observation, so as to approach observations in a more conservative manner (i.e., if all fields are rated at a value of 2 except Remediation Effort, which is scored a 3, the final impact score would be a 3).
Important Note: Team members who are leveraging the impact scoring criteria below may judgementally select the impact factors most relevant to them. Internal Audit and Security Compliance utilize all columns when scoring observations identified as part of controls testing because there may be specific impacts to external compliance audit requirements as a result of these findings. Any information system risk identified outside of control testing may utilize the columns that are most relevant.
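The "highest rating wins" rule can be sketched as a one-line helper. The field names in the example are placeholders, not the actual columns of the Impact Scoring Table.

```python
def impact_score(field_scores: dict[str, int]) -> int:
    """Return the most conservative (highest) score across all
    evaluated impact fields, per the rule described above."""
    return max(field_scores.values())

# Illustrative only: field names below are assumptions, not the real
# Impact Scoring Table columns. All fields scored 2 except one scored 3
# yields a final impact score of 3.
example = {"Organizational": 2, "Compliance": 2, "Remediation Effort": 3}
```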
Impact Scoring Table
Determining the Observation Risk Rating
In order to arrive at a final observation risk rating, the likelihood and impact scores of an observation are multiplied together. The resulting score determines whether the observation is a LOW, MODERATE, or HIGH risk observation per the Observation Risk Rating Table.
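The calculation above can be sketched as follows. The LOW/MODERATE/HIGH cut-off values here are placeholders for illustration only; the authoritative bands are defined in the Observation Risk Rating Table below.

```python
def risk_rating(likelihood: int, impact: int,
                moderate_at: int = 6, high_at: int = 12) -> str:
    """Multiply likelihood by impact and map the product to a risk band.

    The moderate_at/high_at thresholds are assumed placeholders, not
    the actual values from the Observation Risk Rating Table.
    """
    score = likelihood * impact
    if score >= high_at:
        return "HIGH"
    if score >= moderate_at:
        return "MODERATE"
    return "LOW"
```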
Observation Risk Rating Table
Additional Considerations Specific to Control Observations
The procedures outlined in the following sections are used specifically by Internal Audit and Security Compliance. Team members utilizing the observation management program for rating information system risks outside of control testing activities will not need to engage in these procedures.
Determining the Individual Control Health & Effectiveness Rating (CHER)
The importance of risk rating each control observation comes into play when making a final determination of a control's Control Health & Effectiveness Rating (CHER). Because CHER is rated on a sliding scale rather than the typical effective/ineffective rating used for compliance, it allows for clearer communication and prioritization with broader audiences outside of compliance functions and gives non-compliance stakeholders the ability to view how observations impact the control environment.
CHER provides a quantitative value of a control's effectiveness that is used as an input for various processes within the Risk Management Program. When reporting to management, these quantitative values are translated to qualitative terms: Fully Effective, Substantially Effective, Partially Effective, Largely Ineffective, Ineffective. Refer to the CHER Quantitative vs. Qualitative Terms and Definitions Table below for a mapping of each CHER to its definition and the related qualitative term and definition. Take the rating determined by completing the observation risk rating with likelihood and impact scores and apply that risk rating to the table below (i.e., if a control has 1 low risk observation per the Observation Risk Rating Table, the CHER for that control would be a 2 (Substantially Effective)).
CHER Quantitative vs. Qualitative Terms and Definitions (For individual controls)
System Health Rating - Quantitative vs. Qualitative Terms and Definitions
CHER is assigned on a control-by-control basis, but in instances where we want to report on system health, the ratio of high risk observations to the number of applicable controls assessed against the system is determined, and that ratio is used to determine the system health rating.
Refer to the System Health Rating Table below for a mapping of averaged CHERs to the qualitative term and definition that can be used to report on system health/effectiveness. Note that when using this table, the final average of CHER values should be rounded up to the nearest quantitative value to determine the CHER for the system (i.e., if the average of all CHERs equals 2.3, the final CHER for the system would be rounded up to a 3).
System Health Rating Table
Control Family Effectiveness Rating - Quantitative vs. Qualitative Terms and Definitions
CHER is assigned on a control-by-control basis, but in instances where we want to report on control family effectiveness, the CHER for each of the individual underlying controls in a control family can be averaged to provide a more holistic view. Refer to the Control Family Effectiveness Rating Table below for a mapping of averaged CHERs to the qualitative term and definition that can be used to report on control family health/effectiveness. Note that when using this table, the final average of CHER values should be rounded up to the nearest quantitative value to determine the CHER for the control family (i.e., if the average of all CHERs equals 2.3, the final CHER for the control family would be rounded up to a 3).
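The averaging-and-round-up rule described above can be sketched in a few lines. The function name is an assumption; the round-up behavior (e.g. an average of 2.3 becomes 3) follows the text.

```python
import math

def aggregate_cher(cher_values: list[int]) -> int:
    """Average per-control CHER values for a control family (or system)
    and round UP to the nearest whole quantitative value, per the
    conservative rounding rule described above (e.g. 2.3 -> 3)."""
    return math.ceil(sum(cher_values) / len(cher_values))
```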
Control Family Effectiveness Rating Table
CHER Override
To account for edge case scenarios or other extenuating circumstances that may not be modeled appropriately using the Bramble Observation Management methodology as outlined, the final CHER can be downgraded (i.e., moved from 2 to 3) at the discretion of the Security Compliance Director if it is determined that the observation's risk rating, and therefore CHER, does not appropriately reflect the current control or control environment health. The rating cannot be upgraded (i.e., moved from 4 to 3), to ensure a conservative approach to securing the organization and managing risk.
Remediation
Once all remediation activities have been completed, the Remediation Owner is responsible for tagging the Observation Identifier in the observation issue. If no individual Observation Identifier is assigned, tag the Security Compliance Team. The Observation Identifier will then validate the remediation activity for completeness, re-test the observation as appropriate and close the observation issue.
It is the responsibility of the Observation Identifier to track the milestones, work progress and validation of the remediation activity.
The remediation workflow by observation stage can be found here (access is available only to internal Bramble team members).
Non Remediation Owner Actions To Support Observation Closure
In cases where Internal Stakeholders (not the Remediation Owner) provide remediation documentation to support closure of the observation, please tag the Observation Identifier in the observation issue. This will trigger validation of the remediation activity for completeness, re-testing as appropriate, and closure by the Observation Identifier.
Remediation SLA
Observation remediation SLAs are determined by the risk rating of the individual observation. The following table shows the SLA for each risk rating:
| Risk Rating | Remediation SLA | Remediation Goal |
|---|---|---|
| High | 6 months, or as otherwise defined by the agreed upon remediation plan | 4 weeks |
| Moderate | 6-12 months, or as otherwise defined by the agreed upon remediation plan | 6 weeks |
| Low | > 12 months, or as otherwise defined by the agreed upon remediation plan | 8 weeks |
A more detailed SLA and Remediation Goal process can be found here (access is available only to internal Bramble team members).
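The SLA table above can be represented as a simple lookup, for example when automating issue triage. The data structure and field names here are illustrative only; the values come directly from the table.

```python
# Illustrative lookup of remediation SLA and goal by risk rating,
# taken from the table above. Structure and key names are assumptions.
REMEDIATION_SLA = {
    "High":     {"sla": "6 months",    "goal_weeks": 4},
    "Moderate": {"sla": "6-12 months", "goal_weeks": 6},
    "Low":      {"sla": "> 12 months", "goal_weeks": 8},
}

def remediation_goal_weeks(risk_rating: str) -> int:
    """Return the remediation goal in weeks for a given risk rating."""
    return REMEDIATION_SLA[risk_rating]["goal_weeks"]
```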
Opportunities for Improvement (OFI)
Throughout the course of testing or general monitoring of the Bramble ecosystem, Opportunities for Improvement (OFI) may be identified and documented so that the overall control environment and Bramble's processes can be improved.
To capture an OFI, create an issue in the Assurance project and add the RiskRating::OFI label.
OFIs do not have defined remediation SLAs, as they are process improvements or suggestions only. The Remediation Goal to communicate the OFI to the appropriate stakeholder is 10 weeks.
What is the difference between an OFI and an Observation?
- Observations are tied to specific testing attributes and/or reflect areas where a third party compliance professional would be of the opinion that a relevant risk hasn't been, or wouldn't be, mitigated.
- OFIs are not tied to specific testing attributes and are general areas of improvement that may streamline compliance or business activities.
- Observations will always impact control effectiveness ratings.
- OFIs will never impact control effectiveness ratings.
Exceptions
Exceptions will be created for observations that breach a mutually agreed upon remediation date, breach an SLA, or for which the Remediation Owner confirms the observation will not be remediated.
Exceptions to this procedure will be tracked as per the Information Security Policy Exception Management Process.
References
- Parent Policy: Information Security Policy
- BCF Control Lifecycle
Contact
If you have any questions or feedback about the security compliance observation management process, please contact the Bramble security compliance team.