In this article:
SecurityScorecard continually makes improvements and critical changes that affect how your Scorecard is scored, including scoring recalibrations every three months. Use this article to prepare for recalibrations, so that you can maintain the highest score and best security posture possible.
Tip: In addition to recalibrations, we also make other changes, such as adding domains to or removing them from your Digital Footprint, or improving our attribution process. See all of our scoring-related updates.
What is a recalibration and why is it important?
We recalibrate the baselines that we use to calculate our scores. These recalibrations reflect changes to organizations, the cybersecurity landscape, and the internet.
Note: As of August 2023, we recalibrate our Scores every three months.
By keeping the calibration up to date, we prevent scores from becoming less accurate over time as the number of findings on the internet moves away from the historical baseline.
Additionally, scoring updates allow us to add new issue types, retire old ones, and re-weight the severity of issues to make scores more predictive of negative outcomes, such as breaches and malware attacks. We cannot add new issue types without understanding the baseline to score against, which we get from recalibration.
Factor scores are based on issue severity levels, or weights, which we also recalibrate. Each issue contributes some expectation value to the factor score. If we deprecate an issue type, we need to recalibrate the factor without it.
How we recalibrate scoring
We base the score on the average number of findings for an organization relative to the size of its Digital Footprint. We recalibrate to make sure that the average numbers of findings underlying issue and factor scores stay up to date.
We update the scoring calibration using two months' worth of data. We believe this minimizes the expected score impact and allows for frequent improvements to the platform.
In our current Scoring methodology, a Scorecard’s overall score is calculated as the weighted average of the factor scores.
Note: In Scoring 3.0, a methodology that we are previewing and will fully apply in 2024, a Scorecard's overall score will directly reflect the severity levels and score impact of issue types. Learn more about Scoring 3.0.
As we add or remove issues and information with recalibration, certain factor scores may change while others stay the same. Changes in an organization's factor-level scores cause a change in its overall score.
Learn more about how we calculate scores.
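As a minimal sketch of the weighted-average calculation described above (the factor names and weights below are hypothetical examples for illustration, not SecurityScorecard's actual factors or weights):

```python
# Hypothetical factor scores and weights -- illustrative only,
# not SecurityScorecard's actual factors or weighting.
factor_scores = {
    "network_security": 92,
    "dns_health": 88,
    "patching_cadence": 75,
}
factor_weights = {
    "network_security": 0.5,
    "dns_health": 0.2,
    "patching_cadence": 0.3,
}

def weighted_average(scores, weights):
    """Overall score as the weighted average of the factor scores."""
    total_weight = sum(weights.values())
    return sum(scores[f] * weights[f] for f in scores) / total_weight

overall = weighted_average(factor_scores, factor_weights)
```

Because the overall score is an average weighted this way, a change in a heavily weighted factor moves the overall score more than the same change in a lightly weighted one.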
How Scorecards and Portfolios are affected by recalibration
When there are no changes to issues and severity levels, we expect the score changes to be less significant than when we add in new issues or re-weight the severity of issues.
Since scoring updates affect all Scorecards in the platform, you will see score changes across your Portfolios. These changes may also be reflected in reports, analytics, and other platform features.
Take steps to prepare for a recalibration
Use the scoring recalibration reports in the SecurityScorecard platform to help you anticipate, and prepare for, an update.
Step 1: Watch for update announcements, and download recalibration reports
If a scheduled scoring update will affect your Scorecard, you will see alerts in the platform ahead of time, so that you can prepare for them and possibly prevent any projected negative impact:
- A banner appears when you log into your SecurityScorecard account. Click Download reports in the banner.
Note: The displayed score impact reflects how the recalibration change impacts your score on the day it was computed. Since we run scores daily, the impact may change with new data.
- On your Scorecard, a wrench icon that represents a scoring recalibration appears under your score. Mouse over the icon to see more information, and then click Download reports.
Select a report to download the .csv file, and then open it in a spreadsheet application.
Tip: For reference and to track changes, you can see past scoring updates on your Scorecard's History page and issue-level event log.
Step 2: Review the reports
During the window preceding the recalibration, we generate these three reports on a daily basis to show the difference in scores and findings between the current calibration and the new one we will apply on the announced date.
You may find it helpful to view the reports in the following order:
- Start with the Score Differences report for a broad understanding of how your factor scores and overall Scorecard score will change as of the recalibration. Note which factor score changes are particularly significant.
- Then look at the Score Impact Changes report to see how relevant issue types within each factor will impact scores.
- Finally, look at the New Findings report to get details about specific issue findings.
The reports are in .csv format, and you can open them in a spreadsheet application.
Score Differences report
This report shows how your overall and factor scores will change as of the recalibration date because of new findings relevant to the recalibration.
See the following table for column explanations:
| Column heading | Explanation |
| --- | --- |
| date | The date that relevant findings were associated with the Scorecard database, to be applied to the Scorecard as of the recalibration date. Note: This is not the date of the report itself, the date of the scan, or the date that the finding was observed. |
| parent_domain | The Scorecard's registered domain, where these changes will occur |
| factor_name | The name of the score factor |
| current_factor_score | The factor score prior to the recalibration |
| new_factor_score | The factor score as of the recalibration |
| factor_diff | The numeric difference between the factor scores before and after recalibration |
| current_total_score | The overall Scorecard score prior to the recalibration |
| new_total_score | The overall Scorecard score as of the recalibration |
| total_diff | The numeric difference between the overall Scorecard scores before and after recalibration |
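If you prefer to inspect the report programmatically rather than in a spreadsheet, a minimal sketch using Python's standard csv module might look like the following. The sample rows are made up for illustration; in practice you would read the downloaded .csv file.

```python
import csv
import io

# Made-up sample rows mimicking the Score Differences report layout.
sample_csv = """date,parent_domain,factor_name,current_factor_score,new_factor_score,factor_diff,current_total_score,new_total_score,total_diff
2023-08-01,example.com,dns_health,90,85,-5,88,86,-2
2023-08-01,example.com,patching_cadence,80,83,3,88,86,-2
"""

# Find factors whose score is projected to drop after recalibration,
# so you know where to focus remediation first.
drops = [
    row["factor_name"]
    for row in csv.DictReader(io.StringIO(sample_csv))
    if float(row["factor_diff"]) < 0
]
```

Here `drops` would contain only `dns_health`, the factor with a negative projected change.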
Score Impact Changes report
This report shows how issue findings affected by the recalibration will impact factor scores and the overall Scorecard score as of the recalibration date.
See the following table for column explanations:
| Column heading | Explanation |
| --- | --- |
| date | The date that relevant findings were associated with the Scorecard database, to be applied to the Scorecard as of the recalibration date. Note: This is not the date of the report itself, the date of the scan, or the date that the finding was observed. |
| parent_domain | The Scorecard's registered domain, where these changes will occur |
| factor_name | The name of the score factor |
| measurement_name | The issue type that will have an impact on the score change as of the recalibration |
| current_factor_impact | The issue type's factor score impact prior to the recalibration |
| new_factor_impact | The issue type's factor score impact as of the recalibration |
| factor_impact_diff | The numeric difference in the issue type's factor score impact before and after recalibration |
| current_total_impact | The issue type's overall Scorecard score impact prior to the recalibration |
| new_total_impact | The issue type's overall Scorecard score impact as of the recalibration |
| total_impact_diff | The numeric difference in the issue type's overall Scorecard score impact before and after recalibration |
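A quick way to triage this report is to sort issue types by `total_impact_diff`, so that the issue type with the largest projected overall-score loss comes first. A sketch with made-up sample rows (illustrative only):

```python
import csv
import io

# Made-up sample rows mimicking the Score Impact Changes report layout.
sample_csv = """date,parent_domain,factor_name,measurement_name,current_factor_impact,new_factor_impact,factor_impact_diff,current_total_impact,new_total_impact,total_impact_diff
2023-08-01,example.com,network_security,ssl_weak_protocol,-2.0,-3.5,-1.5,-0.8,-1.4,-0.6
2023-08-01,example.com,dns_health,spf_record_missing,-1.0,-0.5,0.5,-0.4,-0.2,0.2
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))
# Most negative total_impact_diff first: the biggest projected score loss.
rows.sort(key=lambda r: float(r["total_impact_diff"]))
worst = rows[0]["measurement_name"]
```

In this sample, `worst` is the issue type with the `-0.6` projected overall impact change, which you would prioritize for remediation.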
New Findings report
This report shows you new issue findings that are projected to appear on your Scorecard, and those that will disappear, as of the recalibration date.
Note: Be aware of the following when viewing this report:
- The findings that appear in the report may change at any time preceding the recalibration date. For example, if you view the report on two different days, some findings that appear the first day you see the report may no longer appear the next day you see it.
- This also means that any of the new findings you see in the report may not appear on your Scorecard on the day of the recalibration.
- A recalibration may affect existing issue types as well as introduce new issue types. This means that you may see findings for existing issue types in the report.
See the following table for column explanations:
| Column heading | Explanation |
| --- | --- |
| date | The date a finding was associated with the Scorecard database, to be applied to the Scorecard as of the recalibration date. Note: This is not the date of the report itself, the date of the scan, or the date that the finding was observed. |
| parent_domain | The Scorecard's registered domain, where the finding occurred |
| measurement_type | The issue type that the finding belongs to |
| change | Whether a new finding was added to the Scorecard (measurement_arriving) or removed because of decay (measurement_departing) |
| measurement_id | The unique identifier for the specific issue finding on the given IP address and port number |
| first_seen | The first date we observed the finding |
| last_seen | The most recent date we observed the finding |
| dst_ip | The IP address on which the finding occurred |
| dst_port | The port number on the IP address where the finding occurred |
| observation_tls | Appears dynamically when the report includes findings for the issue type SSL/TLS Service Supports Weak Protocol; the data we gathered that supports the finding |
| protocol | The protocol running on the port |
| target | Appears dynamically when the report includes findings for the issue type SSL/TLS Service Supports Weak Protocol; the URL or IP address where we observed the finding |
| vulnerability_description | The description of the vulnerability as it appears in the National Vulnerability Database |
| vulnerability_id | The Common Vulnerabilities and Exposures (CVE) identifier as it appears in the National Vulnerability Database |
| vulnerability_publish_date | The date that the vulnerability was published in the National Vulnerability Database |
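To separate findings that are projected to appear from those that will be removed, you can split rows on the `change` column. A minimal sketch with made-up sample rows (only a few of the report's columns shown):

```python
import csv
import io

# Made-up sample rows mimicking a subset of the New Findings report columns.
sample_csv = """date,parent_domain,measurement_type,change,dst_ip,dst_port
2023-08-01,example.com,ssl_weak_protocol,measurement_arriving,203.0.113.10,443
2023-08-01,example.com,outdated_os,measurement_departing,203.0.113.11,80
"""

arriving, departing = [], []
for row in csv.DictReader(io.StringIO(sample_csv)):
    # measurement_arriving = new finding; measurement_departing = removed by decay
    if row["change"] == "measurement_arriving":
        arriving.append(row)
    else:
        departing.append(row)
```

The `arriving` list is the set of findings worth investigating before the recalibration date; `departing` findings will drop off on their own.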
FAQ
Why does it take two to four days, or even more, for my score to reflect a change?
Our batch processing method typically takes two to four days: at least one day to collect the data and at least one day to process it. The process can take longer when unexpected variables occur. When collecting data, we sometimes detect errors in the data; we then reprocess the data to ensure accuracy. Errors in our scoring pipeline can also delay updates to your score.
How can I stay ahead of possible scoring changes to keep my score high or stable? Specifically, how can I get to high-priority fixes quickly, given that any change takes two to four days to affect my score?
To give you as much time as possible to correct and remediate problems, and thereby keep your scores high, our platform alerts you several weeks before any scoring changes take place. In addition, we highlight any recent changes that may affect your score in our scoring update release notes. There we explain which changes are significant and how you can resolve these issues in the platform before the scoring update deadline.
How do you show me what changed, so I can start resolving issue findings more quickly?
We make your score more actionable with our issue-level event log, which shows new and removed findings each day. This insight helps you run your remediation process faster.
Get help
If you have any questions or comments, contact your Customer Success manager, or submit a Support request.