Function Structure
Lifecycle
A serverless function should be composed of a simple linear flow. There should be only a single dependency at each touchpoint. If you find yourself with multiple dependencies in a single touchpoint, consider breaking the function up into multiple functions.
Touchpoints
Touchpoints are the ways that a function interacts with the world. A touchpoint can come bundled with your function, or it can be an external entity that your function needs access to. The rule of thumb is that a function owns the infrastructure it outputs to by default. This allows you to easily chain functions together: the first function in the chain owns its input infrastructure as well as its output infrastructure, and each downstream function only references the previous function's output.
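To make the ownership rule concrete, here is a minimal sketch of two chained functions. The layout and field names are illustrative, not the exact FaaS schema: the first function owns both its trigger queue and its output bucket, while the second function only references that bucket as its trigger and owns its own output.

```yaml
# Hypothetical sketch, not the real FaaS schema: two chained functions.
functions:
  - name: ingest
    touchpoints:
      trigger:
        type: sqs
        queueName: raw-events        # owned by ingest (first in the chain)
      output:
        type: s3
        bucketName: staging-bucket   # owned by ingest
  - name: transform
    touchpoints:
      trigger:
        type: s3
        bucketName: staging-bucket   # referenced only; ingest owns it
      output:
        type: s3
        bucketName: curated-bucket   # owned by transform
```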
Below are some examples of the touchpoints that are available. It is not an exhaustive list.
Examples
S3 Output
Below is an example of what you might get in your IAM policy if you declare an output of type s3. You specify a bucket name in your YAML file, and your IAM policy will grant access to the bucket, its objects, and any relevant KMS keys. Some KMS keys are detected automatically; others must be specified in your faas-lambda.yaml. In this example, you will see the KMS actions for the shared CMKs. This access is required in order to read and write data in Vanguard buckets. When creating a custom policy, developers often forget to grant KMS access, since it is fairly common in non-regulated industries to skip bucket encryption. The policy that FaaS gives you is also "pre-approved", which means you don't need to go through the IAM Pull Request process to have it added to your Lambda role.
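As a rough illustration only (the policy FaaS actually generates may differ), a statement of this shape would cover the bucket, its objects, and a shared CMK. It is shown here in CloudFormation-style YAML with placeholder bucket and key ARNs.

```yaml
# Illustrative sketch of the kind of access granted for an s3 output.
# Bucket and key ARNs are placeholders, not real resources.
Statement:
  - Sid: S3OutputAccess
    Effect: Allow
    Action:
      - s3:ListBucket
      - s3:GetObject
      - s3:PutObject
    Resource:
      - arn:aws:s3:::example-output-bucket      # bucket-level actions
      - arn:aws:s3:::example-output-bucket/*    # object-level actions
  - Sid: SharedCmkAccess
    Effect: Allow
    Action:
      - kms:Decrypt
      - kms:GenerateDataKey
      - kms:DescribeKey
    Resource:
      - arn:aws:kms:us-east-1:111122223333:key/example-shared-cmk-id
```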
faas-lambda.yaml snippet
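The exact schema is defined by FaaS itself; the snippet below is only a sketch of how an s3 output with an explicitly listed KMS key might be declared (field names are illustrative).

```yaml
# Hypothetical faas-lambda.yaml fragment; field names are illustrative.
touchpoints:
  output:
    type: s3
    bucketName: example-output-bucket
    # Keys that cannot be auto-detected must be listed explicitly.
    kmsKeyArns:
      - arn:aws:kms:us-east-1:111122223333:key/example-shared-cmk-id
```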
Frequently Asked Questions
How do I get multiple triggers?
From a business-process perspective, having multiple triggers makes sense. From a low-level technical perspective, it is better to have a single trigger that integrates with multiple sources. Specifically, if you need multiple triggers, consider using a single SQS queue as the trigger and having multiple sources write into the queue.
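A sketch of that pattern, again with illustrative field names: the function declares a single SQS trigger, and the various upstream producers (an S3 event notification, an EventBridge rule, another Lambda, and so on) all publish into that one queue.

```yaml
# Hypothetical sketch: one SQS trigger fed by multiple upstream producers.
touchpoints:
  trigger:
    type: sqs
    queueName: ingest-events   # S3 notifications, EventBridge rules, etc. all write here
```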
How do I write to multiple buckets?
Each touchpoint slot is for one "pattern", which often ties back to a single AWS service. Many of our integrations have a "multi" flavor that allows you to specify multiple ARNs. That means a single output of type s3multi allows you to write to multiple S3 buckets.
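For example, a multi-bucket output might look something like this (illustrative field names, placeholder ARNs):

```yaml
# Hypothetical sketch of an s3multi output touchpoint.
touchpoints:
  output:
    type: s3multi
    bucketArns:
      - arn:aws:s3:::raw-landing-bucket
      - arn:aws:s3:::curated-bucket
```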
How do I get multiple outputs?
FaaS has three touchpoints that support generic write access: Output, Notification, and State. If your Lambda needs to write to four or more places, then you might not be architecting your Lambda to be as small as it should be. If you really need the additional access, you can use Access Blocks to augment your Lambda's access.
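The shape of an Access Block is defined by FaaS; purely as a hedged illustration, it might look something like the following (field names and ARNs are placeholders):

```yaml
# Hypothetical sketch only; consult the FaaS documentation for the real schema.
accessBlocks:
  - actions:
      - dynamodb:PutItem
    resources:
      - arn:aws:dynamodb:us-east-1:111122223333:table/audit-log
```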
I'm using a library that wants to write to multiple AWS services. How do I make this work?
Touchpoints don't have to align one-to-one with an AWS service; they can represent higher-level patterns. If some standard library wants to write to DynamoDB, S3, and SQS all in one action, then it makes sense to have a single output that grants access to all three. If you find yourself in that scenario, submit a feature request and we can partner to see if we can add it.