Usage

Installation

npm install --save @eyevinn/autovmaf

Generate VMAF measurements

To generate VMAF measurements, you will need to define a job, which can be created with the createJob() function.

create-job.js
const { createJob } = require('@eyevinn/autovmaf');

const vmafScores = await createJob({
  name: "MyVMAFmeasurements",
  pipeline: "pipeline.yml",
  encodingProfile: "profile.json",
  reference: "reference.mp4",
  models: ["HD", "PhoneHD"],         // optional
  resolutions: [{                    // optional
    width: 1280,
    height: 720,
    range: {                         // optional
      min: 500000,
      max: 600000
    }
  }],
  bitrates: [                        // optional
    500000,
    600000,
    800000
  ],
  method: "bruteForce"               // optional
});

When creating a job, you can specify the following parameters:

Name
Name of the job. Used to reference this particular job, for the output directory and so on.
Pipeline

Path to a YAML file that defines the pipeline. Currently only AWS is supported.

Example: pipeline.yml
aws:
  inputBucket: input-bucket-name
  outputBucket: output-bucket-name
  mediaConvertRole: arn:aws:iam::role
  mediaConvertEndpoint: https://endpoint.amazonaws.com
  ecsSubnet: subnet-1234
  ecsSecurityGroup: sg-1234
  ecsContainerName: easyvmaf-s3
  ecsCluster: easyvmaf-ecs-cluster
  ecsTaskDefinition: easyvmaf-s3:1
Encoding Profile

Path to a JSON file that defines how the reference should be encoded. When using AWS, this is a MediaConvert job configuration.

Example: mediaconvert-encoding-profile.json
{
  "Inputs": [
    {
      "TimecodeSource": "ZEROBASED",
      "VideoSelector": {},
      "FileInput": "$INPUT"
    }
  ],
  "OutputGroups": [
    {
      "Name": "File Group",
      "OutputGroupSettings": {
        "Type": "FILE_GROUP_SETTINGS",
        "FileGroupSettings": {
          "Destination": "$OUTPUT"
        }
      },
      "Outputs": [
        {
          "VideoDescription": {
            "CodecSettings": {
              "Codec": "H_264",
              "H264Settings": {
                "RateControlMode": "CBR",
                "Bitrate": "$BITRATE",
                "CodecProfile": "HIGH"
              }
            },
            "Width": "$WIDTH",
            "Height": "$HEIGHT"
          },
          "ContainerSettings": {
            "Container": "MP4",
            "Mp4Settings": {}
          }
        }
      ]
    }
  ],
  "TimecodeConfig": {
    "Source": "ZEROBASED"
  }
}
Reference
Path to the reference video to analyze. Normally a local path, but when using AWS, this can also be an S3 URI.
Models (optional)
A list of VMAF models to use in the evaluation. Valid values are HD, PhoneHD and UHD. Defaults to HD.
Resolutions (optional)
A list of resolutions to test. By default, all resolutions in the example ABR ladder from Apple's HLS Authoring Specification are tested.
Range (optional)
A minimum and maximum bitrate for testing a specific resolution. Adding a range filters out bitrates that fall outside of it. Disabled by default.
Bitrates (optional)
A list of bitrates to test. By default, a list of bitrates between 150 kbit/s and 9000 kbit/s is used.
Method (optional)
The method to use when analyzing the videos: either bruteForce or walkTheHull. Defaults to bruteForce. NOTE: walkTheHull is not implemented at the moment.
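To illustrate how a resolution's range interacts with the bitrate list: bitrates outside the min/max bounds are excluded for that resolution. A minimal sketch of that filtering behavior (the filterBitrates helper is hypothetical and for illustration only; autovmaf applies the range internally):

```javascript
// Hypothetical helper: drop bitrates outside [range.min, range.max].
// When no range is given, every bitrate is kept (range is disabled by default).
function filterBitrates(bitrates, range) {
  if (!range) return bitrates;
  return bitrates.filter((b) => b >= range.min && b <= range.max);
}

const bitrates = [500000, 600000, 800000];
console.log(filterBitrates(bitrates, { min: 500000, max: 600000 }));
// [ 500000, 600000 ]
```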

Read VMAF scores

Using getVmaf(), you can read VMAF scores from a JSON file or a directory of JSON files. This works with both local paths and S3 URIs (using the s3:// prefix).

Example:

get-results.js
const { getVmaf } = require('@eyevinn/autovmaf');

const vmafFiles = await getVmaf("s3://bucket-name/path/to/vmaf/");

vmafFiles.forEach((file) => {
  console.log(file.filename + ": " + file.vmaf);
});
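Since each entry carries a filename and a vmaf score, the results lend themselves to simple post-processing. A sketch that picks the best-scoring file, assuming only the two fields shown in the example above (the bestVmaf helper is hypothetical):

```javascript
// Hypothetical helper: reduce a non-empty list of { filename, vmaf } entries
// to the entry with the highest VMAF score.
function bestVmaf(vmafFiles) {
  return vmafFiles.reduce((best, file) => (file.vmaf > best.vmaf ? file : best));
}

// Example data in the shape returned per file above.
const results = [
  { filename: "720p_500000.json", vmaf: 88.1 },
  { filename: "720p_600000.json", vmaf: 91.4 }
];
console.log(bestVmaf(results).filename); // 720p_600000.json
```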