
HDF Schema Reference

The Heimdall Data Format (HDF) schema defines the structure for security assessment results. The schema is a JSON Schema document maintained in the heimdall2 repository.

View Schema on GitHub | HDF Examples | Download Schema

Root Object

All four fields are required.

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `platform` | object | Yes | Information about the target system |
| `version` | string | Yes | Version of the tool that generated the findings |
| `profiles` | array | Yes | Array of security baselines and their results |
| `statistics` | object | Yes | Summary statistics (inner fields are all optional) |
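
The root object above can be sketched as a minimal Python dict. This is an illustrative fragment, not output from a real scan; the platform values and version string are placeholders:

```python
import json

# Minimal HDF root object: all four top-level fields are required.
hdf = {
    "platform": {"name": "ubuntu", "release": "22.04"},  # see Platform below
    "version": "5.22.3",   # version of the generating tool (placeholder)
    "profiles": [],        # filled in with Profile objects
    "statistics": {},      # required, but its inner fields are all optional
}

assert set(hdf) == {"platform", "version", "profiles", "statistics"}
print(json.dumps(hdf, indent=2))
```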

Platform

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `name` | string | Yes | Platform name (e.g. "ubuntu", "windows") |
| `release` | string | Yes | Platform version (e.g. "22.04", "10.0.19041") |
| `target_id` | string | No | Additional identifier (hostname, IP, etc.) |

Profile

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `name` | string | Yes | Unique profile identifier |
| `sha256` | string | Yes | Profile checksum for integrity verification |
| `supports` | array | Yes | Platform targets this profile supports |
| `attributes` | array | Yes | Input parameters used during the run |
| `groups` | array | Yes | Logical groupings of controls |
| `controls` | array | Yes | The security requirements and test results |
| `title` | string | No | Human-readable profile title |
| `version` | string | No | Profile version |
| `summary` | string | No | Profile description |
| `maintainer` | string | No | Profile maintainer |
| `copyright` | string | No | Copyright holder |
| `copyright_email` | string | No | Contact email |
| `license` | string | No | License identifier (e.g. "Apache-2.0") |
| `status` | string | No | Load status ("loaded", "failed", "skipped") |
| `status_message` | string | No | Explanation when status is not "loaded" |
| `depends` | array | No | Profile dependencies |
| `parent_profile` | string | No | Parent profile name for overlays |
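
A skeletal profile with the six required fields plus one optional field. The name and checksum are placeholders, and the shape of the `supports` entry follows InSpec convention and is illustrative:

```python
profile = {
    # Required fields
    "name": "example-ubuntu-baseline",       # placeholder profile name
    "sha256": "0" * 64,                      # placeholder checksum
    "supports": [{"platform-name": "ubuntu"}],  # illustrative entry shape
    "attributes": [],
    "groups": [],
    "controls": [],                          # Control objects go here
    # Optional fields
    "status": "loaded",
}

required = {"name", "sha256", "supports", "attributes", "groups", "controls"}
assert required <= profile.keys()
```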

Control

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `id` | string | Yes | Unique control identifier (e.g. "V-75443", "C-1.1.1.1") |
| `impact` | number | Yes | Severity from 0.0 to 1.0 (0.7 = high, 0.5 = medium, 0.3 = low, 0.0 = informational) |
| `tags` | object | Yes | Metadata tags; typically includes `nist` (NIST SP 800-53 controls) and `cci` (DoD identifiers) |
| `refs` | array | Yes | External references (URLs, documents) |
| `source_location` | object | Yes | File location of the control source (inner fields optional) |
| `results` | array | Yes | Test outcomes |
| `title` | string | No | Human-readable control title |
| `desc` | string | No | Control description |
| `descriptions` | array | No | Structured descriptions (check text, fix text) as `{label, data}` pairs |
| `code` | string | No | Source code of the control |
| `waiver_data` | object | No | Waiver information if the control was waived |
| `attestation_data` | object | No | Manual attestation record |
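
A minimal control, sketched with placeholder values. The `id` and tag values are illustrative, not from a real baseline:

```python
control = {
    # Required fields
    "id": "V-75443",          # illustrative identifier
    "impact": 0.7,            # high severity
    "tags": {"nist": ["AC-6"], "cci": ["CCI-000366"]},  # illustrative tags
    "refs": [],
    "source_location": {},    # inner fields are optional
    "results": [],            # Result objects go here
    # Optional fields
    "title": "Example control title",
}

# impact must fall in the 0.0 to 1.0 severity range
assert 0.0 <= control["impact"] <= 1.0
```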

Result

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `code_desc` | string | Yes | Human-readable description of what was tested |
| `start_time` | string | Yes | ISO 8601 timestamp when the test ran |
| `status` | string | No | "passed", "failed", "skipped", or "error" |
| `run_time` | number | No | Execution duration in seconds |
| `message` | string | No | Failure explanation or additional detail |
| `skip_message` | string | No | Reason the test was skipped |
| `exception` | string | No | Exception type if status is "error" |
| `resource` | string | No | InSpec resource type used |
| `resource_id` | string | No | Resource identifier |
| `backtrace` | array | No | Stack trace if an error occurred |
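
A single result might look like the following sketch; the description, timestamp, and timing are placeholders:

```python
result = {
    # Required fields
    "code_desc": "File /etc/shadow is expected to be owned by root",
    "start_time": "2024-01-15T10:30:00+00:00",  # ISO 8601 (placeholder)
    # Optional fields
    "status": "passed",
    "run_time": 0.012,   # seconds
}

assert result["status"] in {"passed", "failed", "skipped", "error"}
```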

Statistics

The statistics object is required at the root level, but all inner fields are optional:

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `duration` | number | No | Total runtime in seconds |
| `controls` | object | No | Breakdown with `passed`, `failed`, `skipped` sub-objects, each containing a `total` count |
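
A populated statistics object might look like this sketch (all counts are placeholders):

```python
statistics = {
    "duration": 145.3,  # total runtime in seconds (placeholder)
    "controls": {
        "passed": {"total": 180},
        "failed": {"total": 12},
        "skipped": {"total": 3},
    },
}

# Sum the per-status totals to get the overall control count.
total = sum(v["total"] for v in statistics["controls"].values())
assert total == 195
```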

Attestation Data

When a control includes manual attestation (for requirements that cannot be automated), all attestation fields are required:

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `control_id` | string | Yes | The control being attested |
| `explanation` | string | Yes | Justification for the attestation |
| `frequency` | string | Yes | How often the attestation is reviewed |
| `status` | string | Yes | "passed" or "failed" |
| `updated` | string | Yes | When the attestation was last updated |
| `updated_by` | string | Yes | Who performed the attestation |
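
An attestation record with all six required fields; every value here is a placeholder:

```python
attestation = {
    "control_id": "V-75443",                             # placeholder
    "explanation": "Physical access logs reviewed by hand.",
    "frequency": "annually",
    "status": "passed",
    "updated": "2024-01-15",
    "updated_by": "Security Analyst",
}

assert attestation["status"] in {"passed", "failed"}
assert len(attestation) == 6  # all six fields are required
```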

Validating HDF Files

You can validate any HDF file against the schema using a JSON Schema validator:

```bash
# Using ajv-cli (Node.js). Download the schema first, since ajv reads
# schemas from the local filesystem; older schema drafts may also need
# the --spec flag.
curl -sLO https://raw.githubusercontent.com/mitre/heimdall2/master/libs/inspecjs/schemas/exec-json.json
npx ajv-cli validate -s exec-json.json -d my-results.json

# Using SAF CLI (validates during conversion)
saf convert nikto2hdf -i scan.json -o results.json
```