A GitHub Action that packages your AWS Lambda code into a zip file and uploads it to an S3 bucket. Designed for Lambda deployment workflows with intelligent file filtering, automatic key generation, and robust error handling.
- Automatic Packaging: Creates deployment-ready zip files from your source code
- Smart File Filtering: Excludes common unnecessary files (node_modules, tests, .env) with customizable patterns
- S3 Upload with Retry: Uploads to S3 with exponential backoff retry logic for transient errors
- GitHub Context Integration: Automatically generates S3 keys using repository name and commit SHA
- Size Validation: Validates package size doesn't exceed AWS Lambda's 250MB limit
- Comprehensive Logging: Detailed progress logging and actionable error messages
- Streaming Architecture: Minimizes memory usage for large packages
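As an illustration of the automatic key generation, a rough sketch (not the action's actual code) of deriving the default S3 key from the GitHub Actions context might look like:

```javascript
// Hypothetical sketch: derive the default S3 key from the GitHub context.
// In a workflow run, the repository and commit SHA would come from the
// GITHUB_REPOSITORY and GITHUB_SHA environment variables.
function defaultS3Key(repository, sha) {
  // GITHUB_REPOSITORY has the form "owner/name"; only the name is used.
  const repoName = repository.split("/").pop();
  return `${repoName}/${sha}.zip`;
}

console.log(defaultS3Key("octocat/my-repo", "abc123def456"));
// my-repo/abc123def456.zip
```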
This action requires AWS credentials to be configured before use. The recommended approach is using OpenID Connect (OIDC) with the official AWS credentials action:
```yaml
- name: Configure AWS credentials
  uses: aws-actions/configure-aws-credentials@v5
  with:
    role-to-assume: arn:aws:iam::123456789012:role/MyGitHubActionsRole
    role-session-name: github-actions-session
    aws-region: us-east-1
```

Required IAM Permissions: The IAM role must have permission to upload objects to your target S3 bucket:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```

For more information on setting up OIDC with AWS, see the AWS documentation on configuring OpenID Connect in GitHub Actions.
| Input | Description | Required | Default |
|---|---|---|---|
| `s3-bucket` | S3 bucket name where the zip file will be uploaded. Must be a valid DNS-compliant bucket name. | Yes | - |
| `source-dir` | Source directory containing files to package. | No | `.` (current directory) |
| `s3-key` | S3 object key (path in the bucket). If not provided, automatically generated as `{repo-name}/{commit-sha}.zip`. | No | Auto-generated |
| `exclude` | Comma- or newline-separated glob patterns for files to exclude from the package (in addition to default exclusions). | No | - |
The following patterns are always excluded from the package:
- `.git/` - Git repository data
- `node_modules/` - Node.js dependencies (install these in a Lambda layer or during deployment)
- `**/*.test.*`, `**/*.spec.*` - Test files
- `**/__tests__/`, `**/tests/`, `**/__mocks__/` - Test directories
- `.env*`, `*.local` - Environment and local configuration files
- `*.md` - Documentation files (README, CHANGELOG, etc.)
- `.github/` - GitHub workflows and configurations
- `coverage/`, `.nyc_output/` - Test coverage data
- `*.log` - Log files
- `.DS_Store` - macOS system files
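For illustration, exclusion filtering of this kind can be sketched with a minimal glob-to-regex conversion. This is a simplified stand-in, not the action's actual matcher; the real implementation may support more glob features (for example, `**/` matching zero directories).

```javascript
// Simplified sketch of glob-based exclusion filtering (illustrative only).
// Patterns ending in "/" exclude everything under that directory.
function globToRegExp(pattern) {
  const source = pattern
    .replace(/[.+^${}()|[\]\\]/g, "\\$&") // escape regex metacharacters
    .replace(/\*\*/g, "\u0000")           // placeholder for "**"
    .replace(/\*/g, "[^/]*")              // "*" matches within one path segment
    .replace(/\u0000/g, ".*");            // "**" matches across segments
  return new RegExp(`^${source}`);
}

function isExcluded(filePath, patterns) {
  return patterns.some((pattern) => globToRegExp(pattern).test(filePath));
}

const defaults = [".git/", "node_modules/", "**/*.test.*", "*.md"];
console.log(isExcluded("node_modules/lodash/index.js", defaults)); // true
console.log(isExcluded("src/index.js", defaults));                 // false
```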
| Output | Description | Example |
|---|---|---|
| `s3-bucket` | S3 bucket name where the file was uploaded | `my-lambda-deployments` |
| `s3-key` | S3 object key (path) of the uploaded file | `my-repo/abc123def456.zip` |
| `s3-uri` | Full S3 URI of the uploaded file | `s3://my-lambda-deployments/my-repo/abc123def456.zip` |
| `zip-size` | Size of the uploaded zip file in bytes | `15728640` |
The simplest usage with only the required input. The action will package the current directory and generate an S3 key automatically:
```yaml
name: Deploy Lambda Function

on:
  push:
    branches: [main]

permissions:
  id-token: write
  contents: read

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v5

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v5
        with:
          role-to-assume: arn:aws:iam::123456789012:role/MyGitHubActionsRole
          role-session-name: github-actions-session
          aws-region: us-east-1

      - name: Upload Lambda package to S3
        uses: taxintt/lambda-zip-file-upload-action@v1
        with:
          s3-bucket: my-lambda-deployments
```

Example with all available options configured:
```yaml
- name: Upload Lambda package to S3
  id: upload
  uses: taxintt/lambda-zip-file-upload-action@v1
  with:
    source-dir: ./src
    s3-bucket: my-lambda-deployments
    s3-key: my-function/production/v1.2.3.zip
    exclude: |
      *.log
      *.tmp
      config/local.*

- name: Use upload outputs
  run: |
    echo "Uploaded to: ${{ steps.upload.outputs.s3-uri }}"
    echo "Package size: ${{ steps.upload.outputs.zip-size }} bytes"
```

Exclude additional files beyond the defaults:
```yaml
- name: Upload Lambda package to S3
  uses: taxintt/lambda-zip-file-upload-action@v1
  with:
    s3-bucket: my-lambda-deployments
    source-dir: ./lambda
    exclude: |
      *.key
      secrets/*
      config/development.*
```

Complete workflow that uploads to S3 and updates a Lambda function:
```yaml
name: Deploy Lambda

on:
  push:
    branches: [main]

permissions:
  id-token: write
  contents: read

env:
  AWS_REGION: us-east-1
  FUNCTION_NAME: my-lambda-function

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v5

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v5
        with:
          role-to-assume: arn:aws:iam::123456789012:role/MyGitHubActionsRole
          role-session-name: lambda-deploy
          aws-region: ${{ env.AWS_REGION }}

      - name: Upload package to S3
        id: upload
        uses: taxintt/lambda-zip-file-upload-action@v1
        with:
          s3-bucket: my-lambda-deployments
          source-dir: ./src

      - name: Update Lambda function code
        run: |
          aws lambda update-function-code \
            --function-name ${{ env.FUNCTION_NAME }} \
            --s3-bucket ${{ steps.upload.outputs.s3-bucket }} \
            --s3-key ${{ steps.upload.outputs.s3-key }}
```

Error: AWS authentication/authorization failed: AccessDenied
Solutions:
- Ensure `aws-actions/configure-aws-credentials` runs before this action
- Verify the IAM role has `s3:PutObject` permission for the target bucket
- Check that the bucket name is correct and accessible
- Verify the OIDC trust relationship is properly configured
For detailed setup instructions, see the AWS IAM documentation.
Error: Package size exceeds AWS Lambda limit (250MB)
Solutions:
- Exclude unnecessary files using the `exclude` input:

  ```yaml
  exclude: |
    docs/*
    examples/*
    *.pdf
  ```
- Move large dependencies to a Lambda Layer
- Review included files and remove development-only dependencies
- Consider using Lambda container images for packages > 250MB
Error: No files found to package after applying exclusion filters
Solutions:
- Verify the `source-dir` path is correct
- Check that your exclusion patterns aren't too broad
- Ensure files were checked out (use `actions/checkout` first)
- Review the action logs to see which files were excluded
Error: NoSuchBucket: The specified bucket does not exist
Solutions:
- Verify the bucket name in the `s3-bucket` input
- Ensure the bucket exists in the correct AWS region
- Check that your AWS credentials have access to the bucket
- S3 bucket names must be globally unique and DNS-compliant (lowercase, no underscores)
The action automatically retries on transient errors (network issues, throttling) with exponential backoff. If you see warnings about retries, this is normal behavior and the upload should eventually succeed.
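The retry behavior described above can be sketched as follows. This is an illustrative stand-in, not the action's actual implementation; the attempt count and delay values are assumptions.

```javascript
// Hypothetical sketch of retry with exponential backoff (illustrative only).
// The delay doubles after each failed attempt: 1s, 2s, 4s, ...
async function uploadWithRetry(uploadFn, maxAttempts = 3, baseDelayMs = 1000) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await uploadFn();
    } catch (err) {
      if (attempt === maxAttempts) {
        throw new Error(
          `Failed to upload to S3 after ${maxAttempts} attempts: ${err.message}`
        );
      }
      const delayMs = baseDelayMs * 2 ** (attempt - 1);
      console.warn(`Attempt ${attempt} failed, retrying in ${delayMs}ms`);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```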
Error: Failed to upload to S3 after 3 attempts
Solutions:
- Check your network connectivity
- Verify AWS service status on the AWS Service Health Dashboard
- Consider reducing package size to minimize transfer time
- Check if your AWS account has any rate limiting in effect
```shell
# Install dependencies
npm install

# Run tests
npm test

# Run tests in watch mode
npm run test:watch

# Build the action
npm run build

# Package for distribution
npm run package

# Run integration tests
npm run test:integration
```

This project is licensed under the MIT License - see the LICENSE file for details.
Contributions are welcome! Please feel free to submit a Pull Request.
If you encounter any issues or have questions:
- Check the Troubleshooting section above
- Review existing issues
- Create a new issue with detailed information about your problem