Accessibility Conformance Report Methodology

This methodology describes an iterative approach to performing multiple specialized assessments and producing an Accessibility Conformance Report (ACR).

The resulting report records multiple assessment iterations and is compatible with multiple standards, such as WCAG 2.1 Level AA, EN 301 549, or any subset of their requirements.

The method consists of the following phases:

  1. Analyzing and exploring the subject involves identifying inapplicable success criteria, determining the necessary expertise domain for producing the accessibility conformance report (ACR), and, if applicable, identifying different parts of the test subject.
  2. Producing assessments involves evaluating a test subject against a set of Accessibility Conformance Testing (ACT) rules.
  3. Maintaining a living accessibility conformance report (ACR) until no further testing is required to establish conformance, or until the test subject undergoes a major backward-incompatible change that invalidates some or all of the assessments included in the ACR.
  4. Issuing the ACR, indicating when the report is final and represents the state of the test subject. When issued, the ACR shall not reference any additional assessments or be edited except for archiving.
  5. Archiving the ACR, indicating that the report no longer represents the current accessibility state of the test subject and that a new ACR should be produced with a new set of assessments.

Types of documents

There are two major types of documents: assessments, which are considered informal, and accessibility conformance reports, which are considered formal. While assessments may report various result states for success criteria (pass, fail, inapplicable), those results are audited for accuracy, false positives, and false negatives when the accessibility conformance report is produced.


Informal report


  • Pass
  • Fail
  • Inapplicable

Evaluate test cases

Leverage a set of Accessibility Conformance Testing (ACT) rules to establish each result.

Accessibility conformance report (ACR)

Formal report once an issued date has been set


  • Satisfied
  • Not satisfied
  • Further testing is needed

Evaluate test requirements

Leverage a set of assessments to establish conformance.

Theoretical complete example

Accessibility conformance report (ACR)

Practical working example

Accessibility conformance report (ACR)

Accessibility assessment


  1. Define the evaluation scope.
  2. Choose an accessibility conformance testing (ACT) ruleset published in the WET-BOEW ACT rules repository.
  3. Complete the following information in the assessment report template:
    • Subject: earl:subject
    • Your name and your organization: earl:assertedBy
    • Date: dct:date
    • Standard targeted: acr:ConformanceStandard (Official URL of the standard)
    • Standard option targeted: acr:conformanceOption (Instances can be found in act:standard/profiles/)
  4. (Optional) Identify the following when applicable:
    • Subject parts: earl:subject/dct:hasPart/
    • Expertise fields involved: acr:involvesExpertise/
  5. Set the selected ruleset JSON template as the baseline in earl:result.
  6. For each earl:result, test, assess, and note the result according to the associated accessibility conformance testing rule earl:test.
  7. Finalize the assessment by adding an assessment summary in dct:description.
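
The steps above can be sketched as a minimal JSON-LD assessment. The @context URL, subject URL, assertor, and ACT rule identifier below are illustrative assumptions, and the exact nesting may differ from the official assessment report template; only the vocabulary terms are taken from the steps:

```json
{
  "@context": "https://example.org/acr-context.jsonld",
  "earl:subject": "https://example.com/app/",
  "earl:assertedBy": "Jane Doe, Example Org",
  "dct:date": "2023-05-01",
  "acr:ConformanceStandard": "https://www.w3.org/TR/WCAG21/",
  "acr:conformanceOption": "act:standard/profiles/wcag21-aa",
  "earl:result": [
    {
      "earl:test": "act:rules/example-rule",
      "earl:outcome": "earl:passed"
    }
  ],
  "dct:description": "Summary of the assessment findings."
}
```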

Accessibility conformance report production

  1. Report initialization
  2. Auditing and referencing an assessment
  3. Issuing and formalizing the ACR report
  4. Archiving

Report initialization

  1. Define the scope of the accessibility evaluation.
  2. Choose a conformance criteria ruleset from the WET-BOEW ACT rules repository.
  3. Fill out the assessment report template with the following information:
    • Subject: earl:subject.
    • Your name and your organization: earl:assertedBy.
    • Date created: dct:created.
    • Conformance to standard: acr:standard.
    • Conformance option: acr:conformanceOption.
  4. Set the selected ruleset JSON template as a baseline in acr:conformity.
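
As a sketch, an initialized living ACR might start out as follows. All identifiers and URLs are hypothetical, and acr:conformity is left empty where the selected ruleset JSON template would be copied in as the baseline:

```json
{
  "@context": "https://example.org/acr-context.jsonld",
  "earl:subject": "https://example.com/app/",
  "earl:assertedBy": "Jane Doe, Example Org",
  "dct:created": "2023-05-01",
  "acr:standard": "https://www.w3.org/TR/WCAG21/",
  "acr:conformanceOption": "act:standard/profiles/wcag21-aa",
  "acr:conformity": {}
}
```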

Auditing and referencing an assessment

  1. Edit the living ACR to include the new assessment that is within the scope of the tested subject.
  2. Add the referenced assessment in acr:assessment:
    • The key is the URL where the assessment in JSON can be retrieved.
    • Add the assessment title in dct:title.
    • Add a link to the HTML equivalent report in dct:references.
    • Define the main language used for that assessment (two-letter code) in dct:language.
    • Set the date when the assessment was added to this ACR in dct:date.
  3. Audit the assessment acr:audits:
    • Your name: earl:assertedBy.
    • Define the testing mode: earl:mode.
    • Add a reference to the assessment: acr:assessment.
    • Audit each evaluated test case and record your finding if applicable in acr:auditNote:
      • Define the accuracy (False negative, False positive, Accurate) in acr:accuracy.
      • Set the date of this note in dct:date.
      • Set the title of this note in dct:title.
      • Associate the applicable test case in earl:test.
      • Describe the concern expressed by this note in dct:description.
      • Optionally, describe the error message in earl:info.
      • Optionally, define a severity level in acr:severity.
      • Optionally, define a relevancy subject in acr:relevancy.
      • Optionally, include an attachment encoded in data URL in acr:asset.
    • If applicable, create a work item in wf:task and for each work item:
      • Set the title in dct:title.
      • Describe the work goal and additional description to provide enough context in wf:goalDescription.
      • Add a reference to the discussion page in dct:references.
      • Set the work item creation date in dct:created.
      • Define the test requirement this work item is about in acr:requirement.
      • Define the specific test case this work item is about in earl:test.
      • Optionally, define a severity level in acr:severity.
      • Optionally, define a relevancy subject in acr:relevancy.
    • Update the conformity of the criteria affected by the assessment and for each updated conformance requirement:
      • Update the conformance state (Satisfied, Not satisfied, Further test needed) in acr:conformance.
      • Add a reference to the applicable assessment in acr:assessment.
      • Add a description about its latest conformance state or the current state if applicable in dct:description.
      • Update the modified date in dct:modified.
      • Define the cumulative testing mode for this conformance requirement in earl:mode.
  4. If possible, add your verifiable credential.
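
Putting the referencing and auditing steps together, a hypothetical fragment of the living ACR might look like this. The URLs, names, rule identifiers, and the keyed-by-URL shape of acr:assessment are assumptions based on the steps above, not a definitive schema:

```json
{
  "acr:assessment": {
    "https://example.com/reports/assessment-1.json": {
      "dct:title": "Navigation assessment",
      "dct:references": "https://example.com/reports/assessment-1.html",
      "dct:language": "en",
      "dct:date": "2023-05-08"
    }
  },
  "acr:audits": [
    {
      "earl:assertedBy": "John Smith",
      "earl:mode": "earl:manual",
      "acr:assessment": "https://example.com/reports/assessment-1.json",
      "acr:auditNote": [
        {
          "acr:accuracy": "False positive",
          "dct:date": "2023-05-09",
          "dct:title": "Decorative image flagged in error",
          "earl:test": "act:rules/example-rule",
          "dct:description": "The flagged image is decorative and correctly hidden from assistive technologies."
        }
      ]
    }
  ]
}
```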

Issuing and formalizing the ACR report

  1. Edit the living ACR whose conformance status for the tested subject is to be set.
  2. Validate the integrity of the test subject for all related assessments and audits. If this is not satisfied, the ACR must be archived.
  3. Revise audits, work items and all conformance requirements to ensure they are accurate.
  4. Validate each conformance requirement; none should be marked as "Further test needed".
  5. Set your name and your organization's name as the main assertor name for the conformance report using earl:assertedBy.
  6. Set the conformance report's conformance to "satisfied" only if all conformance requirements are also "satisfied"; otherwise, it is "not satisfied."
  7. Set the current date as the issue date using dct:issued.
  8. If possible, add your verifiable credentials.
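
For illustration, issuing might add the following top-level fields to the living ACR. The property names are those used elsewhere in this methodology; the values are hypothetical:

```json
{
  "earl:assertedBy": "Jane Doe, Example Org",
  "acr:conformance": "Satisfied",
  "dct:issued": "2023-06-15"
}
```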



Archived instructions of the first experimental version of the ACR reporting in JSON-LD

We are actively working on a JSON-LD format that is machine-readable and eases the process of publishing the report on the web by leveraging additional vocabulary terms to support accessibility conformance reporting. Our intent is to update this tool to allow the creation of JSON-LD ACRs (Accessibility Conformance Reports). You can contribute to our discussion on GitHub via issue #9271 or during any of our regular meetings.

Step 1 of 3 - Instructions and sample for publishing/creating the ACR

Complete theoretical example

Intended reporting process:

  1. The assessor creates the report, adds the assessment's general information and logs their assessment for each Success Criterion result.
  2. A reviewer re-opens the assessment, identifies themselves in the review section, reviews the original assessment and adds any remarks along with the review. Once the review is completed, the reviewer creates as many work items as needed. This step can be repeated as often as needed, but when there are more than three reviews, you should consider producing a new ACR and starting this process over.
  3. Closing - A contributor submits a change marking the assessment as no longer valid or as superseded by another ACR.