Accessibility Conformance Report Methodology
This method describes an iterative approach for performing multiple specialized assessments and producing an Accessibility Conformance Report (ACR).
The resulting ACR records multiple assessment iterations and is compatible with multiple standards, such as WCAG 2.1 Level AA, EN 301 549, or any subset of their requirements.
The method consists of the following phases:
- Analyzing and exploring the subject involves identifying inapplicable success criteria, determining the necessary expertise domain for producing the accessibility conformance report (ACR), and, if applicable, identifying different parts of the test subject.
- Producing assessments involves evaluating a test subject against a set of Accessibility Conformance Testing (ACT) rules.
- Maintaining a living accessibility conformance report (ACR) until no further testing is required to establish conformance, or until the test subject undergoes a major backward-incompatible change that invalidates some or all assessments included in the ACR.
- Issuing the ACR, indicating when the report is final and represents the state of the test subject. When issued, the ACR shall not reference any additional assessments or be edited except for archiving.
- Archiving the ACR, indicating that the report does not represent the current accessibility state of the test subject and suggesting that a new ACR report should be produced with a set of new assessments.
Types of documents
There are two major types of documents: assessments, which are considered informal, and accessibility conformance reports, which are considered formal. This means that while assessments may report various states about success criteria testing (fail, pass, not applicable), these results may be audited differently for accuracy, false positives, or false negatives when producing the accessibility conformance report.
Assessments
- Informal report.
- Outcomes: Pass, Fail, Inapplicable.
- Evaluates test cases.
- Leverages a set of Accessibility Conformance Testing (ACT) rules to establish the result.
Accessibility conformance report (ACR)
- Formal report once an issued date is set.
- Outcomes: Satisfied, Not satisfied, Further testing needed.
- Evaluates test requirements.
- Leverages a set of assessments to establish conformance.
Theoretical complete example
Assessments
- All SC WCAG 2.1 level AA
- Pre-assessment of WCAG 2.1 level AA
- SC 1.1.1 Non-text content
- Situation A, short text alternative - SC 1.1.1 Non-text content
- Situation B, long text alternative - SC 1.1.1 Non-text content
Accessibility conformance report (ACR)
- Conformance report to WCAG 2.1 Level AA
Practical working example
Assessments
- Specific to "Add to calendar" plugin
Accessibility conformance report (ACR)
- Specific to "Add to calendar" plugin
Accessibility assessment
Instructions:
- Define the evaluation scope.
- Choose an accessibility conformance testing ruleset published in the WET-BOEW ACT rules repository.
- Complete the following information in the assessment report template:
  - Subject: earl:subject
  - Your name and your organization: earl:assertedBy
  - Date: dct:date
  - Standard targeted: acr:ConformanceStandard (official URL of the standard)
  - Standard option targeted: acr:conformanceOption (instances can be found in act:standard/profiles/)
- (Optional) Identify the following when applicable:
  - Subject parts: earl:subject/dct:hasPart/
  - Expertise fields involvement: acr:involvesExpertise/
- Set the selected ruleset JSON template as the baseline in earl:result.
- For each earl:result, test, assess, and note the result according to the associated accessibility conformance testing rule in earl:test.
- Finalize the assessment by adding an assessment summary in dct:description.
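To make the steps above concrete, here is a minimal sketch of what an assessment could look like in JSON-LD. All values are illustrative placeholders (the @context URL, subject URL, rule URL, and conformance option identifier are assumptions, not taken from an actual ruleset), and the use of earl:outcome inside earl:result is an assumption based on the EARL vocabulary:

```json
{
  "@context": "https://example.org/acr-context.jsonld",
  "earl:subject": "https://example.org/add-to-calendar/",
  "earl:assertedBy": "Jane Doe, Example Organization",
  "dct:date": "2024-05-01",
  "acr:ConformanceStandard": "https://www.w3.org/TR/WCAG21/",
  "acr:conformanceOption": "wcag21-aa",
  "earl:result": [
    {
      "earl:test": "https://example.org/act-rules/sc111-situation-a",
      "earl:outcome": "earl:passed"
    }
  ],
  "dct:description": "Assessment summary describing what was tested and how."
}
```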
Accessibility conformance report production
- Report initialization
- Auditing and referencing an assessment
- Issuing and formalizing the ACR report
- Archiving
Report initialization
- Define the scope of the accessibility evaluation.
- Choose a conformance criteria ruleset from the WET-BOEW ACT rules repository.
- Fill out the assessment report template with the following information:
  - Subject: earl:subject
  - Your name and your organization: earl:assertedBy
  - Date created: dct:created
  - Conformance to standard: acr:standard
  - Conformance option: acr:conformanceOption
- Set the selected ruleset JSON template as a baseline in acr:conformity.
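Under the same assumptions as before (placeholder URLs and values), a freshly initialized ACR could start as follows; the shape of the acr:conformity entry is a guess based on the field names used later in this document:

```json
{
  "earl:subject": "https://example.org/add-to-calendar/",
  "earl:assertedBy": "Jane Doe, Example Organization",
  "dct:created": "2024-05-01",
  "acr:standard": "https://www.w3.org/TR/WCAG21/",
  "acr:conformanceOption": "wcag21-aa",
  "acr:conformity": [
    {
      "acr:requirement": "SC 1.1.1 Non-text content",
      "acr:conformance": "Further testing needed"
    }
  ]
}
```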
Auditing and referencing an assessment
- Edit the living ACR to include the new assessment that is in scope of the tested subject.
- Add the referenced assessment in acr:assessment; the key is the URL where the assessment in JSON can be retrieved:
  - Add the assessment title in dct:title.
  - Add a link to the HTML equivalent report in dct:references.
  - Define the main language used for that assessment (2-letter code) in dct:language.
  - Set the date when the assessment was added to this ACR in dct:date.
- Audit the assessment in acr:audits:
  - Your name: earl:assertedBy
  - Define the testing mode: earl:mode
  - Add a reference to the assessment: acr:assessment
  - Audit each evaluated test case and record your finding, if applicable, in acr:auditNote:
    - Define the accuracy (False negative, False positive, Accurate) in acr:accuracy.
    - Set the date of this note in dct:date.
    - Set the title of this note in dct:title.
    - Associate the applicable test case in earl:test.
    - Describe the concern expressed by this note in dct:description.
    - Optionally, describe the error message in earl:info.
    - Optionally, define a severity level in acr:severity.
    - Optionally, define a relevancy subject in acr:relevancy.
    - Optionally, include an attachment encoded as a data URL in acr:asset.
  - If applicable, create a work item in wf:task and, for each work item:
    - Set the title in dct:title.
    - Describe the work goal, with enough additional description to provide context, in wf:goalDescription.
    - Add a reference to the discussion page in dct:references.
    - Set the work item creation date in dct:created.
    - Define the test requirement this work item is about in acr:requirement.
    - Define the specific test case this work item is about in earl:test.
    - Optionally, define a severity level in acr:severity.
    - Optionally, define a relevancy subject in acr:relevancy.
- Update the conformity of the criteria affected by the assessment and, for each updated conformance requirement:
  - Update the conformance state (Satisfied, Not satisfied, Further testing needed) in acr:conformance.
  - Add a reference to the applicable assessment in acr:assessment.
  - Add a description of its latest conformance state, or the current state if applicable, in dct:description.
  - Update the modified date in dct:modified.
  - Define the cumulative testing mode for this conformance requirement in earl:mode.
- If possible, add your verifiable credential.
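The referencing and auditing steps above can be sketched as the following hypothetical fragment. The nesting (a URL-keyed acr:assessment map, and an acr:audits array containing acr:auditNote entries) is an assumption inferred from the field list, not a normative schema, and every URL and value is a placeholder:

```json
{
  "acr:assessment": {
    "https://example.org/assessments/add-to-calendar.json": {
      "dct:title": "Add to calendar plugin assessment",
      "dct:references": "https://example.org/assessments/add-to-calendar.html",
      "dct:language": "en",
      "dct:date": "2024-05-10"
    }
  },
  "acr:audits": [
    {
      "earl:assertedBy": "John Smith",
      "earl:mode": "earl:manual",
      "acr:assessment": "https://example.org/assessments/add-to-calendar.json",
      "acr:auditNote": [
        {
          "acr:accuracy": "False positive",
          "dct:date": "2024-05-12",
          "dct:title": "Decorative image flagged as failing",
          "earl:test": "https://example.org/act-rules/sc111-situation-a",
          "dct:description": "The flagged image is decorative, so the reported failure does not apply."
        }
      ]
    }
  ]
}
```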
Issuing and formalizing the ACR report
- Edit the living ACR for which the conformance status of the tested subject is to be set.
- Validate the integrity of the test subject for all related assessments and audits. If integrity is not satisfied, the ACR must be archived.
- Revise audits, work items, and all conformance requirements to ensure they are accurate.
- Validate each conformance requirement; none of them should be marked as "Further testing needed".
- Set your name and your organization's name as the main assertor for the conformance report using earl:assertedBy.
- Define the conformance report's conformance as "Satisfied" only if all conformance requirements are also "Satisfied"; otherwise it is "Not satisfied".
- Set the current date as the issue date using dct:issued.
- If possible, add your verifiable credentials.
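For illustration, once issued, the top level of the ACR would then carry fields like these (names and organization are hypothetical, and the placement of acr:conformance at the report level is an assumption based on the steps above):

```json
{
  "earl:assertedBy": "Jane Doe, Example Organization",
  "acr:conformance": "Satisfied",
  "dct:issued": "2024-06-01"
}
```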
Archiving
- When the replacement is unknown:
  - Edit the ACR report to mark it as archived.
  - Set the expiry date, the date when the ACR was officially acknowledged to be outdated, in schema:expires.
  - If possible, add your verifiable credentials.
- When it is being replaced by a new report:
  - Edit the ACR report to mark it as archived.
  - If not already defined, set the expiry date, the date when the ACR was superseded, in schema:expires.
  - Set the URL of the replacement ACR report in dct:isReplacedBy.
  - If possible, add your verifiable credentials.
- When the ACR report is reissued:
  Note: A reissued accessibility conformance report must have its own identity, distinct from that of the previously issued accessibility conformance report. In practice, this means creating a new JSON file for the reissued ACR report.
  - Issue the reissued ACR as an independent report, including a reference to its previous version in dct:isVersionOf.
  - Edit the original ACR report to archive it.
  - If the expiry date of the original ACR report is not defined, set it to the date when the ACR was reissued in schema:expires.
  - Set the URL of the reissued ACR report in dct:hasVersion.
  - If possible, add your verifiable credentials.
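As a sketch of the archival case where a report is superseded, the archived ACR could carry fields such as the following (dates and the replacement URL are hypothetical placeholders):

```json
{
  "schema:expires": "2025-01-15",
  "dct:isReplacedBy": "https://example.org/acr/add-to-calendar-v2.json"
}
```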
Resources
- Bilingual labels used as replacements for technical references
- GitHub repository for rulesets and rules
- Additional and complementary vocabulary terms
Archived instructions of the first experimental version of the ACR reporting in JSON-LD
We are actively working on producing a JSON-LD format that is machine-readable and eases the process of publishing the report on the web by leveraging additional vocabulary terms to support accessibility conformance reporting. Our intent is to update this tool to allow the creation of JSON-LD ACRs (Accessibility Conformance Reports). You can contribute to our discussion on GitHub via issue #9271 or during any of our regular meetings.
Step 1 of 3 - Instructions and sample for publishing/creating the ACR
- English instructions for publishing/creating the ACR
- French instructions for publishing/creating the ACR
- Experimental JSON-LD report sample
- English version of the report sample
- French version of the report sample
Complete theoretical example
- Experimental JSON-LD report sample
- English version of the report sample
- French version of the report sample
Intended reporting process:
- The assessor creates the report, adds the assessment's general information and logs their assessment for each Success Criterion result.
- A reviewer re-opens the assessment, identifies themselves in the review section, reviews the original assessment, and records any remarks with the review. Once the review is completed, the reviewer creates as many work items as needed. This step can be repeated as often as needed, but when there are more than three reviews, consider producing a new ACR and starting this process over.
- Closing - A contributor submits a change marking the assessment as no longer valid or as superseded by another ACR.