Lambda docs (#264)

* Add aws.md and s3 example

Signed-off-by: Seif Lotfy <seif.lotfy@gmail.com>

* minor fix
This commit is contained in:
Seif Lotfy سيف لطفي
2016-11-11 19:40:22 +01:00
committed by C Cirello
parent 0cc946d937
commit ba65220127
7 changed files with 211 additions and 3 deletions

docs/lambda/aws.md Normal file

@@ -0,0 +1,114 @@
Interacting with AWS Services
=============================
The node.js and Python stacks include SDKs to interact with other AWS services.
For Java you will need to include any such SDK in the JAR file.
## Credentials
Running Lambda functions outside of AWS means that we cannot automatically get
access to other AWS resources based on Lambda subsuming the execution role
specified with the function. Instead, when using the AWS APIs inside your
Lambda function (for example, to access S3 buckets), you will need to pass
these credentials explicitly.
### Using environment variables for the credentials
The easiest way to do this is to pass the `AWS_ACCESS_KEY_ID` and
`AWS_SECRET_ACCESS_KEY` environment variables when creating or importing the Lambda function from AWS.
This can be done as follows:
```sh
export AWS_ACCESS_KEY_ID=<access-key>
export AWS_SECRET_ACCESS_KEY=<secret-key>
./fnctl lambda create-function <user>/s3 nodejs example.run examples/s3/example.js examples/s3/example-payload.json --config AWS_ACCESS_KEY_ID --config AWS_SECRET_ACCESS_KEY
```
or
```sh
./fnctl lambda create-function <user>/s3 nodejs example.run ../../lambda/examples/s3/example.js ../../lambda/examples/s3/example-payload.json --config AWS_ACCESS_KEY_ID=<access-key> --config AWS_SECRET_ACCESS_KEY=<secret-key>
```
The various AWS SDKs will automatically pick these up.
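Inside the function itself you can verify that the credentials were injected before calling any AWS API. This is a minimal sketch, assuming the `--config` values are exposed to the function as environment variables; the `credentialsPresent` helper is hypothetical, not part of any SDK:

```javascript
// Hypothetical helper: returns true only when both AWS credential
// variables are present in the given environment object.
function credentialsPresent(env) {
  return Boolean(env.AWS_ACCESS_KEY_ID && env.AWS_SECRET_ACCESS_KEY);
}

// Inside a function, check process.env before constructing SDK clients.
if (!credentialsPresent(process.env)) {
  console.log('AWS credentials are not configured for this function');
}
```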
## Example: Reading and writing to an S3 bucket
This example demonstrates modifying S3 buckets and using the included
ImageMagick tools in a node.js function. Our function will fetch an image
stored in a key specified by the event, resize it to a width of 1024px and save
it to another key.
The code for this example is located [here](../../examples/s3/example.js).
The event will look like:
```js
{
  "bucket": "iron-lambda-demo-images",
  "srcKey": "waterfall.jpg",
  "dstKey": "waterfall-1024.jpg"
}
```
First, the setup: imports and SDK initialization.
```js
var im = require('imagemagick');
var fs = require('fs');
var AWS = require('aws-sdk');

exports.run = function(event, context) {
  var bucketName = event['bucket'];
  var srcImageKey = event['srcKey'];
  var dstImageKey = event['dstKey'];
  var s3 = new AWS.S3();
  // ...
};
```
First we retrieve the source and write it to a local file so ImageMagick can
work with it.
```js
s3.getObject({
  Bucket: bucketName,
  Key: srcImageKey
}, function (err, data) {
  if (err) throw err;
  var fileSrc = '/tmp/image-src.dat';
  var fileDst = '/tmp/image-dst.dat';
  fs.writeFileSync(fileSrc, data.Body);
});
```
The actual resizing uses `im.identify` to get the current size (we only
resize if the image is wider than 1024px), then performs the conversion to
`fileDst`. Finally we upload the result to S3.
```js
im.identify(fileSrc, function(err, features) {
  resizeIfRequired(err, features, fileSrc, fileDst, function(err, resized) {
    if (err) throw err;
    if (resized) {
      s3.putObject({
        Bucket: bucketName,
        Key: dstImageKey,
        Body: fs.createReadStream(fileDst),
        ContentType: 'image/jpeg',
        ACL: 'public-read'
      }, function (err, data) {
        if (err) throw err;
        context.done();
      });
    } else {
      context.done();
    }
  });
});
```
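The `resizeIfRequired` helper called above is not shown in the example. A possible implementation, sketched here as an assumption (the 1024px threshold matches the example; `im` is the same `imagemagick` module required earlier), could look like:

```javascript
// Hypothetical sketch of resizeIfRequired: calls back with (err, resized),
// where resized is true only when a conversion was actually performed.
function resizeIfRequired(err, features, fileSrc, fileDst, callback) {
  if (err) return callback(err);
  // Only resize when the source image is wider than 1024px.
  if (!features || !(features.width > 1024)) return callback(null, false);
  // Required lazily so the guard above also works without ImageMagick.
  var im = require('imagemagick');
  im.resize({ srcPath: fileSrc, dstPath: fileDst, width: 1024 }, function (err) {
    callback(err, !err);
  });
}
```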


@@ -36,8 +36,7 @@ Assuming you have a lambda with the following arn `arn:aws:lambda:us-west-2:1231
fnctl lambda aws-import arn:aws:lambda:us-west-2:123141564251:function:my-function us-east-1 user/my-function
```
will import the function code from the region `us-east-1` to a directory called `./user/my-function`. Inside the directory you will find the `function.yml`, `Dockerfile`, and all the files needed for running the function.
Using Lambda with Docker Hub and IronFunctions requires that the Docker image be
named `<Docker Hub username>/<image name>`. This is used to uniquely identify
@@ -47,4 +46,10 @@ name>` as the image name with `aws-import` to create a correctly named image.
If you only want to download the code, pass the `--download-only` flag. The
`--profile` flag is available similar to the `aws` tool to help
you tweak the settings on a command level. Finally, you can import a different version of your lambda function than the latest one
by passing `--version <version>`.
You can then publish the imported lambda as follows:
```sh
./fnctl publish -d ./user/my-function
```
Now the function can be reached via `http://$HOSTNAME/r/user/my-function`.