13. Konflux Test Stream - API contracts

Date: 2023-01-30

Status

Deprecated by ADR 30. Tekton Results Naming Convention.

Relates to ADR 14. Let Pipelines Proceed

Context

The Konflux project aims to serve not only Red Hat teams but also partners and customers. This requires a level of adaptability to avoid recreating custom flows and Tasks for each stakeholder.

In this respect, Tasks developed by the Konflux test stream should allow swapping external systems to accommodate different environments. Such a swap should not require completely recreating pipelines.

This, together with the goal of providing a homogeneous experience that makes complex systems easier to comprehend and navigate, leads to the definition of API contracts. These contracts should be understood as guidance that may evolve with time and experience while keeping the aim of building a flexible, homogeneous system.

Tasks from the test stream may exchange information through (see the sketch after this list):

  1. Parameters passed when the PipelineRun is triggered (e.g. user, git repository)
  2. Output of upstream elements of the Pipeline that may be needed to perform the Task (e.g. build artifacts, container image)
  3. An indication of whether the Task was successful or not
  4. Output of the Task that may be needed by subsequent Tasks of the pipeline
  5. Output of the Task that needs to be stored for auditing and troubleshooting (e.g. summarized results of a validation or a scan)
  6. Communication with external systems that are needed to perform the Task (e.g. container image repository, image scanner)
  7. Connection details and credentials that may be required for the above
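
For illustration, these exchange channels map onto a Tekton Task roughly as follows. This is a minimal sketch: the Task name, parameter name, workspace name, and image are assumptions, not part of the contract; the result HACBS_TEST_OUTPUT is defined in the Decision below.

apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: example-check                      # hypothetical test Task
spec:
  params:
    - name: image-digest                   # channels 1/2: information from the trigger or upstream Tasks
      type: string
  workspaces:
    - name: workspace                      # channels 2/5: shared storage for artifacts and detailed output
  results:
    - name: HACBS_TEST_OUTPUT              # channels 3/4/5: summarized outcome for later Tasks and auditing
      description: Minimized JSON test result
  steps:
    - name: run-check
      image: registry.access.redhat.com/ubi9/ubi-minimal   # any small image with a shell works
      script: |
        #!/bin/sh
        # A real check would inspect $(params.image-digest), talk to external systems
        # (channels 6/7) using injected credentials, and write detailed output under
        # $(workspaces.workspace.path). Here only the summary result is emitted.
        printf '{"result":"SUCCESS","timestamp":"%s","successes":1,"failures":0,"warnings":0,"note":"example-check succeeded."}\n' "`date +%s`" \
          > $(results.HACBS_TEST_OUTPUT.path)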

Related to the PLNSRVCE-41 investigations regarding Tekton results. The investigations are documented here.

Decision

The output of each Tekton task will be provided in two forms: Tekton Task Results and a detailed test output JSON file.

Tekton Task Results

The output of each Tekton task will be provided in a minimized Tekton result in JSON format listing all test failures. The name of the result will be HACBS_TEST_OUTPUT.

To display the count of found vulnerabilities and make it easy to understand and evaluate the state of the scanned image, the Tekton task clair-scan will provide an additional minimized Tekton result in JSON format. The name of the result will be CLAIR_SCAN_RESULT.

The maximum size of a Task’s Results is limited by the container termination message feature of Kubernetes.

App Studio builds are structured around a shared Persistent Volume per Konflux Workspace. This allows teams to share builds, implement caching, and use other shared storage. A single persistent volume is mapped to each default build pipeline, and each build is passed a directory specific to that build.
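
A minimal sketch of how a build PipelineRun might bind such a shared volume; the claim name, pipeline name, and subPath below are illustrative assumptions:

apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  generateName: component-build-
spec:
  pipelineRef:
    name: docker-build                     # hypothetical default build pipeline
  workspaces:
    - name: workspace
      persistentVolumeClaim:
        claimName: appstudio               # shared PVC of the Konflux Workspace (illustrative name)
      subPath: component-build-0001        # directory specific to this build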

Tekton Result Format for HACBS_TEST_OUTPUT

The test output of the Tekton result HACBS_TEST_OUTPUT will be a JSON object that includes context about the test along with a summary of how many checks passed, failed, or raised warnings.

The output will provide the following information about the overall test result: the result status, the namespace of the checks, a timestamp, a human-readable note, and the counts of successes, failures, and warnings.

Example contents of the Tekton result HACBS_TEST_OUTPUT for a failed run:

{
    "result": "FAILURE",
    "namespace": "image_labels",
    "timestamp": "1649148140",
    "successes": 12,
    "note": "Task fbc-related-image-check failed: Command skopeo inspect could not inspect images. For details, check Tekton task log.",
    "failures": 2,
    "warnings": 0
}

Example for a successful run:

{
    "result": "SUCCESS",
    "timestamp": "1649843611",
    "namespace": "required_checks",
    "successes": 16,
    "note": "Task fbc-related-image-check succeeded: For details, check Tekton task result HACBS_TEST_OUTPUT.",
    "failures": 0,
    "warnings": 0
}

Example for a skipped run:

{
    "result": "SKIPPED",
    "timestamp": "1649842004",
    "successes": 0,
    "note": "Task sast-snyk-check skipped: Snyk code test found zero supported files.",
    "failures": 0,
    "warnings": 0
}

Example for a run with an error:

{
    "result": "ERROR",
    "timestamp": "1649842004",
    "successes": 0,
    "note": "Task fbc-validation failed: $(workspaces.source.path)/hacbs/inspect-image/image_inspect.json did not generate correctly. For details, check Tekton task result HACBS_TEST_OUTPUT in task inspect-image.",
    "failures": 0,
    "warnings": 0
}
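
Downstream elements of the Pipeline can consume this summary through Tekton's result references. A minimal sketch of a Pipeline entry doing so, assuming a test task named example-check and a downstream task named summarize-results (both hypothetical):

tasks:
  - name: report-test-outcome
    taskRef:
      name: summarize-results              # hypothetical downstream Task
    params:
      - name: test-output                  # receives the minimized JSON summary
        value: $(tasks.example-check.results.HACBS_TEST_OUTPUT)
        # Referencing the result also makes Tekton run this task after example-check,
        # so no explicit runAfter is needed.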

Tekton Result Schema Validation

The test output of the Tekton result HACBS_TEST_OUTPUT will be validated using the jsonschema validator package. The schema is configured as follows:

{
  "$schema": "http://json-schema.org/draft/2020-12/schema#",
  "type": "object",
  "properties": {
    "result": {
      "type": "string",
      "enum": ["SUCCESS", "FAILURE", "WARNING", "SKIPPED", "ERROR"]
    },
    "namespace": {
      "type": "string"
    },
    "timestamp": {
      "type": "string",
      "pattern": "^[0-9]{10}$"
    },
    "successes": {
      "type": "integer",
      "minimum": 0
    },
    "note": {
      "type": "string",
    },
    "failures": {
      "type": "integer",
      "minimum": 0
    },
    "warnings": {
      "type": "integer",
      "minimum": 0
    }
  },
  "required": ["result", "timestamp", "successes", "failures", "warnings"]
}

Tekton Result Format for CLAIR_SCAN_RESULT

The test output of the Tekton result CLAIR_SCAN_RESULT will be a JSON object with the counts of vulnerabilities found by Clair, grouped by severity. Refer to the Red Hat vulnerability documentation for more context on the vulnerability severity ratings.

{
  "vulnerabilities": {
    "critical": 1,
    "high": 0,
    "medium": 1,
    "low":0
  }
}

Detailed Conftest Output JSON

A test output JSON file with detailed test information is saved in the Tekton Pipeline Workspace. The name of the file will be in snake case, in the form test_name_output.json.

The test Tekton tasks will present testing information in a standardized manner, saved as a JSON file. The contents of this file will relay the results of validation by Open Policy Agent policies, as executed by the Conftest tool, in its JSON output format.

Each test will have a standardized short name in snake case, e.g. release_label_required, architecture_label_required, architecture_label_deprecated etc.

The output will provide the following information about the overall test result: the input filename, the policy namespace, the number of successful checks, and, when present, a list of failures.

Each object in the failures list will provide the failure message along with metadata about the check: its name, a description, and a documentation URL.

Example detailed JSON test output for a Tekton task that tests the container image labels:

[
    {
        "filename": "image-inspect.json",
        "namespace": "image_labels",
        "successes": 19,
        "failures": [
            {
                "msg": "The 'architecture' label is required",
                "metadata": {
                    "details": {
                        "description": "Architecture the software in the image should target.",
                        "name": "architecture_label_required",
                        "url": "https://source.redhat.com/groups/public/container-build-system/container_build_system_wiki/guide_to_layered_image_build_service_osbs#jive_content_id_Labels"
                    }
                }
            }
          ]
    }
]

Example detailed JSON test output for a Tekton task whose checks all passed:

[
    {
        "filename": "image_inspect.json",
        "namespace": "fbc_checks",
        "successes": 1
    }
]
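
For illustration, a detailed output file of this shape could be produced by a Task step that runs Conftest with JSON output. This is a sketch, assuming the Task declares a workspace named workspace; the image, policy location, and file names are assumptions:

steps:
  - name: check-image-labels
    # Any image that provides a shell and the conftest binary would do (assumption).
    image: ghcr.io/open-policy-agent/conftest:latest
    script: |
      #!/bin/sh
      # Validate the inspected image metadata against the OPA policies and save the
      # detailed Conftest JSON output in the shared workspace (paths are illustrative).
      # '|| true' keeps the step from failing outright so the pipeline can proceed
      # (see ADR 14); the outcome is summarized in HACBS_TEST_OUTPUT instead.
      conftest test --policy /project/policy --output json \
        $(workspaces.workspace.path)/image_inspect.json \
        > $(workspaces.workspace.path)/image_labels_output.json || true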

Information injection

Whenever possible, resources like ConfigMaps or Secrets will be used to inject configuration into Tasks. This is preferred to templates and patches as it fits well with Kubernetes declarative and GitOps approaches.

ConfigMaps and Secrets will be mounted into the Task pod to inject file-based information like certificates. Environment variables may be injected from ConfigMaps.

Variables directly configured in the Pod definition are discouraged.

Side note: Secrets and ConfigMaps should be made immutable
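
A minimal sketch of both injection patterns, together with the immutability side note; the ConfigMap names, keys, and paths are illustrative assumptions:

apiVersion: v1
kind: ConfigMap
metadata:
  name: test-task-config                   # illustrative name
immutable: true                            # per the side note above
data:
  REGISTRY_URL: quay.io                    # environment-level setting injected as an env var
---
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: scan-with-injected-config          # hypothetical Task
spec:
  volumes:
    - name: trusted-ca
      configMap:
        name: trusted-ca                   # file-based information such as certificates
  steps:
    - name: scan
      image: registry.access.redhat.com/ubi9/ubi-minimal
      env:
        - name: REGISTRY_URL
          valueFrom:
            configMapKeyRef:
              name: test-task-config
              key: REGISTRY_URL
      volumeMounts:
        - name: trusted-ca
          mountPath: /etc/config/ca
          readOnly: true
      script: |
        #!/bin/sh
        echo "Scanning registry $REGISTRY_URL using CA bundles from /etc/config/ca"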

Information format

Clearly define the format of input parameters and results.

Atomic information may be passed as a simple parameter. Complex, non-binary information should be exchanged between Tasks in JSON format.
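
For example, in a Pipeline task entry (a fragment; the parameter names and values are illustrative), an atomic value is passed as a plain string while a structured value is passed as JSON:

params:
  - name: git-url                          # atomic information as a simple parameter
    value: https://github.com/example/repo.git
  - name: scan-config                      # complex, non-binary information exchanged as JSON
    value: '{"severity_threshold": "high", "ignore_unfixed": true}'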

Image references

Since tags can be moved from one image to another, they should not be relied on as a reference. In order to guarantee that any scanning is performed on an image built as part of a PipelineRun, the immutable image digest reference will be used instead.
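
For instance, a scan task in the Pipeline would receive the image pinned by digest from the build task's results. This is a sketch; the task names, parameter names, and result names (IMAGE_URL, IMAGE_DIGEST) are assumptions about the build task's outputs:

tasks:
  - name: scan-image
    taskRef:
      name: clair-scan                     # hypothetical scanner Task
    params:
      - name: image-url
        value: $(tasks.build-container.results.IMAGE_URL)
      - name: image-digest                 # immutable digest reference rather than a movable tag
        value: $(tasks.build-container.results.IMAGE_DIGEST)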

Consequences

As a result of the decision here to summarize results in a HACBS_TEST_OUTPUT result and store the larger test output as a file named test_name_output.json, we should find that:

Additional Recommendations

Convention over Configuration

https://en.wikipedia.org/wiki/Convention_over_configuration

Tasks should assume default locations for locating information, external systems or output storage. This aims to reduce the amount of information that needs to be configured for running a pipeline.

See: API Contract - Create a new Build
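
For example, a Task can declare conventional defaults for where it reads input and writes output inside the workspace, so a pipeline only overrides them when an environment deviates from the convention (the parameter names and paths below are illustrative):

params:
  - name: input-path
    type: string
    default: hacbs/inspect-image/image_inspect.json     # conventional location inside the workspace
  - name: output-file
    type: string
    default: image_labels_output.json                   # conventional output file name (test_name_output.json)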

Naming convention

It is advantageous to have naming conventions for parameters, files, and locations. Considering the multiple options (flatcase, camelCase, PascalCase, dash-case, snake_case, UPPER_CASE, TRAIN-CASE) and the possibility of using domain-scoped names, the following naming conventions are proposed for Tasks.

DRY

Information that relates to an environment, such as connection details and credentials, should be configured once at the environment level and not passed as parameters for every PipelineRun. On the other hand, information that is specific to a run, such as a container image digest, may be passed as a parameter.

Appendix

References

Originally drafted in a Google document.