Automating Transcoding using AWS services (Elastic Transcoder, Lambda, S3 notifications)

This blog also covers:

Sample Lambda function

Integrating S3 event notifications with a Lambda function

Creating an Elastic Transcoder job using a Lambda function

Elastic Transcoder is an interesting AWS service that is extremely easy to use via the console. However, when it comes to automating transcoding for media files uploaded to S3, it turns out to be a slightly more complex task. I tried a simple solution using a combination of AWS services, and it worked perfectly for me.

In this blog I will explain the steps to automate transcoding. I used a Lambda function to create a transcoder job; an S3 event notification, generated on every object creation in a particular bucket, invokes the Lambda function.
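To make the flow concrete, here is a minimal sketch of pulling the bucket and object key out of an S3 notification record. The sample event below is a hypothetical, truncated payload for illustration only; real notifications carry many more fields:

```javascript
// Sketch: extract bucket and key from an S3 event notification record.
function extractSource(event) {
    var record = event.Records[0];
    return {
        bucket: record.s3.bucket.name,
        // S3 delivers keys URL-encoded, with spaces as '+'; decode before
        // handing the key to other services.
        key: decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '))
    };
}

// Hypothetical, truncated sample payload for illustration.
var sampleEvent = {
    Records: [{
        s3: {
            bucket: { name: 'my-input-bucket' },
            object: { key: 'videos/My+Clip.mov' }
        }
    }]
};

console.log(extractSource(sampleEvent));
// { bucket: 'my-input-bucket', key: 'videos/My Clip.mov' }
```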

In case you are not familiar with AWS Lambda and S3 notifications, I would suggest going through the AWS documentation to get a basic understanding before proceeding.

Below are the steps to be followed:

1> Identify source and destination directories/buckets. In my example, I used two different buckets: one where input media is received and another where transcoded files are placed.

2> Configure a pipeline under the Elastic Transcoder service. The important fields are the input and output buckets. You can select the Elastic Transcoder default role under IAM Roles.
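If you prefer, the pipeline can also be created from code rather than the console, via the SDK's createPipeline call. A hedged sketch; the bucket names and role ARN below are placeholders, not values from this setup:

```javascript
// Sketch: build the parameters for ElasticTranscoder.createPipeline.
// All names and ARNs here are placeholders.
function buildPipelineParams(name, inputBucket, outputBucket, roleArn) {
    return {
        Name: name,               // shown in the Elastic Transcoder console
        InputBucket: inputBucket, // where source media arrives
        OutputBucket: outputBucket,
        Role: roleArn             // e.g. the Elastic Transcoder default role
    };
}

// With an ElasticTranscoder client like the one in the main example:
// eltr.createPipeline(buildPipelineParams(
//     'my-pipeline', 'my-input-bucket', 'my-output-bucket',
//     'arn:aws:iam::123456789012:role/Elastic_Transcoder_Default_Role'
// ), function(err, data) {
//     if (!err) console.log(data.Pipeline.Id);
// });
```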

3> Create a Lambda function that creates an Elastic Transcoder job for the pipeline configured in step 2.

Below is the sample code I used; parameters that should be changed as per your configuration are marked with "Your-" placeholders.


var AWS = require('aws-sdk');
var s3 = new AWS.S3({
    apiVersion: '2006-03-01'
});

var eltr = new AWS.ElasticTranscoder({
    apiVersion: '2012-09-25',
    region: 'Your-Region'
});

var pipelineId = 'Your-pipeline-ID';
var webPreset = 'Your-webPreset';
exports.handler = function(event, context) {
    var bucket = event.Records[0].s3.bucket.name;
    var key = event.Records[0].s3.object.key;
    s3.getObject({
            Bucket: bucket,
            Key: key
        },
        function(err, data) {
            console.log('err::: ' + err);
            console.log('data::: ' + data);
            if (err) {
                console.log('error getting object ' + key + ' from bucket ' + bucket +
                    '. Make sure they exist and your bucket is in the same region as this function.');
                context.done('ERROR', 'error getting file: ' + err);
            } else {
                console.log('Reached B');
                /* Below section can be used if you want to put any check based on metadata

                if (data.ContentType === 'video/x-msvideo') {
                    console.log('Reached C');
                    console.log('Found new video: ' + key + ', sending to ET');
                    sendVideoToET(key);
                } else {
                    console.log('Reached D');
                    console.log('Upload ' + key + ' was not video');
                    console.log(JSON.stringify(data.Metadata));
                }
                */
                sendVideoToET(key);
            }
        }
    );
};

function sendVideoToET(key) {
    console.log('Sending ' + key + ' to ET');
    var params = {
        PipelineId: pipelineId,
        OutputKeyPrefix: 'Your-Prefix',
        Input: {
            Key: key,
            FrameRate: 'auto',
            Resolution: 'auto',
            AspectRatio: 'auto',
            Interlaced: 'auto',
            Container: 'auto'
        },

        Output: {
            Key: 'Your-output-file-key-name',
            ThumbnailPattern: 'Your-output-file-thumbnail-name',
            PresetId: webPreset,
            Rotate: 'auto'
        }
    };

    eltr.createJob(params, function(err, data) {
        if (err) {
            console.log('Failed to send new video ' + key + ' to ET');
            console.log(err);
            console.log(err.stack);
        } else {
            console.log('Job created successfully');
            console.log(data);
        }
        // context.done(null, '');
    });
}
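One practical touch-up to the snippet above: rather than hard-coding the output key and thumbnail pattern, they can be derived from the input key. A small helper along these lines (my own assumption, not part of the original sample; the .mp4 extension is just an example target format):

```javascript
// Hypothetical helper: derive the output key and thumbnail pattern from the
// input key by stripping the original file extension.
function outputNamesFor(key) {
    var base = key.replace(/\.[^/.]+$/, ''); // drop the last extension, if any
    return {
        outputKey: base + '.mp4',
        // Elastic Transcoder requires {count} in ThumbnailPattern; it is
        // replaced with the thumbnail sequence number.
        thumbnailPattern: base + '-thumb-{count}'
    };
}

console.log(outputNamesFor('videos/clip.avi'));
// { outputKey: 'videos/clip.mp4', thumbnailPattern: 'videos/clip-thumb-{count}' }
```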

Under the Lambda role, select the default lambda_exec_role.

Don't forget to grant Elastic Transcoder resource access to lambda_exec_role via the IAM module; otherwise, even with a correct configuration, the Lambda function will not be able to create jobs because of insufficient access.

4> Under the S3 source bucket (where input media files will be received), go to the Events section on the right-hand side. Create an event notification for the desired events. Under the Send To option, select Lambda function. On selecting Lambda function, you will be asked to provide two more inputs: the function ARN and the invocation role.

Provide the ARN of the Lambda function configured in step 3. For the invocation role, lambda_invoke_role should be selected.
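For reference, the resulting bucket notification configuration looks roughly like this (the function ARN is a placeholder; the field names follow the S3 NotificationConfiguration schema, and `s3:ObjectCreated:*` covers all object-creation events):

```json
{
  "LambdaFunctionConfigurations": [
    {
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:transcode-trigger",
      "Events": ["s3:ObjectCreated:*"]
    }
  ]
}
```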

Save the event notification, and you are now ready to test automated transcoding of media files using AWS.

Enjoy transcoding!

8 thoughts on “Automating Transcoding using AWS services (Elastic Transcoder, Lambda, S3 notifications)”

  1. Mike Dunn says:

    I like this, but how could you handle an input key like this:

    Video+Dec+18%2C+11+54+04+AM.mov

    I would like to upload videos directly from my iPhone to an S3 bucket via Transmit. Elastic Transcoder chokes on it because it doesn’t match the actual filename (“The specified object does not exist in the specified S3 bucket”):

    Video Dec 18, 11 54 04 AM.mov

    Thanks,
    Mike

    • vikrantsundriyal says:

Thanks Mike for liking the blog… I have extended this snippet for a live project which generates not just a transcoded file but a playlist with multiple presets. To answer your question: yes, what you describe is a practical problem to handle. In my case I converted the URL with a single out-of-the-box function, “str = decodeURIComponent(str);”, and it did the trick. I got the original name and used the same for transcoding. I am sure this will solve your problem.

      • Mike Dunn says:

        Thanks. I had been trying decodeURIComponent without much luck. Glad to hear I wasn’t way off base. For some reason, here’s the only way I’ve gotten it to work:

var key = event.Records[0].s3.object.key;
var encodedKey = key.toString('utf-8');
var decodedKey = decodeURIComponent(encodedKey).replace(/\+/g, ' ');
var params = {
    Input: {
        Key: decodedKey

  2. Martin says:

    Hi, we are creating an online music school with thousand of videos that we need to convert, so such automation is definitely key to us. I will be happy to pay you if you were interested in helping us create such automation for our S3/HLS streaming transcoding job. Let me know by email if we can discuss this any further. Thanks !!

  3. Hare says:

Could you help me? I met an issue: User: arn:aws:sts::680222901241:assumed-role/lambda_s3_basic_execution/videotranscode is not authorized to perform: elastictranscoder:CreateJob on resource: arn:aws:elastictranscoder:us-west-2:680222901241:pipeline/1490845019041-knwuo8.
    How can I fix it?

    • vikrantsundriyal says:

You need to provide two kinds of permissions to your Lambda function:
      1) to get objects from S3
      2) to create jobs on the transcoder pipeline.

For generic actions, I add the below permissions to a custom policy associated with the Lambda role. If you wish, you can modify the policy to make it bucket-specific or transcoder-pipeline-specific.

      Hope this helps….

{
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Effect": "Allow",
                  "Action": ["logs:*"],
                  "Resource": "arn:aws:logs:*:*:*"
              },
              {
                  "Effect": "Allow",
                  "Action": [
                      "s3:GetObject",
                      "s3:PutObject",
                      "s3:DeleteObject"
                  ],
                  "Resource": ["arn:aws:s3:::*"]
              },
              {
                  "Effect": "Allow",
                  "Action": "elastictranscoder:*",
                  "Resource": "*"
              }
          ]
      }
