# @clowdhaus/serverless-python-requirements
A Serverless plugin to automatically bundle dependencies from `requirements.txt` and make them available in your `PYTHONPATH`.
:warning: This is a fork of UnitedIncome/serverless-python-requirements :warning:

The upstream project is currently undergoing changes; while waiting for those to land, I have decided to fork the project and carry on with some important changes. Hopefully the upstream project picks up steam again soon so this fork can be parked.
Some of the notable changes from upstream include:
- Removed Windows/WSL support
- Python 2.7 and Python 3.6 support has been dropped, as both have reached end-of-life
- Docker images have been switched from lambci to the official AWS SAM build images; at the time of writing, lambci did not support Python 3.9 and was falling behind on updates
- Added support for the Python 3.9 runtime
- CI/CD improvements to ensure all tests run identically on both Linux and macOS runners
## Requirements
- Serverless >= v2.x
- Python >= 3.7
## Install

```sh
sls plugin install -n @clowdhaus/serverless-python-requirements
```

This will automatically add the plugin to your project's `package.json` and the plugins section of its `serverless.yml`. That's all that's needed for basic use! The plugin will now bundle your python dependencies specified in your `requirements.txt` or `Pipfile` when you run `sls deploy`.
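For reference, a minimal sketch of what the install command leaves in your `serverless.yml` (your file will contain other sections as well):

```yaml
# serverless.yml — plugins entry added by `sls plugin install`
plugins:
  - '@clowdhaus/serverless-python-requirements'
```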
For a more in-depth introduction to using this plugin, check out this post on the Serverless Blog.

If you're on a Mac, check out the notes below about using Python installed by brew.
## Cross compiling

Compiling non-pure-Python modules or fetching their manylinux wheels is supported on non-linux OSs via the use of Docker and the AWS SAM build image.

To enable docker usage, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
```
In addition to booleans, the `dockerizePip` option supports the special value `non-linux`, which makes it dockerize the pip install only on non-linux environments.
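For example, to use Docker only when deploying from a non-Linux machine:

```yaml
custom:
  pythonRequirements:
    # Dockerize only when the host OS is not Linux
    dockerizePip: non-linux
```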
To utilize your own Docker container instead of the default, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerImage: <image name>:tag
```

This must be the full image name and tag to use, including the runtime-specific tag if applicable.
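An illustrative configuration (the image name here is an assumption for the example; substitute the image and tag matching your runtime):

```yaml
custom:
  pythonRequirements:
    # Illustrative image name — use the one matching your Lambda runtime
    dockerImage: public.ecr.aws/sam/build-python3.9:latest
```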
Alternatively, you can define your Docker image in your own Dockerfile and add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerFile: ./path/to/Dockerfile
```

With `dockerFile`, supply the path to a Dockerfile; it must be in the current folder (or a subfolder).

Please note that `dockerImage` and `dockerFile` are mutually exclusive.
To install requirements from private git repositories, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    dockerSsh: true
```

The `dockerSsh` option will mount your `$HOME/.ssh/id_rsa` and `$HOME/.ssh/known_hosts` as volumes in the docker container. If your SSH key is password protected, you can use `ssh-agent`, because `$SSH_AUTH_SOCK` is also mounted and the env var set.

It is important that the host of your private repositories has already been added to your `$HOME/.ssh/known_hosts` file; otherwise the install process will fail due to a host authenticity error.
You can also pass environment variables to docker by specifying them in the `dockerEnv` option:

```yaml
custom:
  pythonRequirements:
    dockerEnv:
      - https_proxy
```
## :sparkles::cake::sparkles: Pipenv support

If you include a `Pipfile` and have `pipenv` installed, instead of a `requirements.txt` the plugin will use `pipenv lock -r` to generate one. It is fully compatible with all options, such as `zip` and `dockerizePip`. If you don't want this plugin to generate it for you, set the following option:

```yaml
custom:
  pythonRequirements:
    usePipenv: false
```
## :sparkles::pencil::sparkles: Poetry support

If you include a `pyproject.toml` and have `poetry` installed, instead of a `requirements.txt` the plugin will use `poetry export --without-hashes -f requirements.txt -o requirements.txt --with-credentials` to generate one. It is fully compatible with all options, such as `zip` and `dockerizePip`. If you don't want this plugin to generate it for you, set the following option:

```yaml
custom:
  pythonRequirements:
    usePoetry: false
```
### Poetry with git dependencies

Poetry by default exports git dependencies to `requirements.txt` with the `-e` flag, which breaks pip's `-t` parameter (used to install all requirements into a specific folder). To fix that, the plugin removes every `-e` from the generated file, but for this to work you need to declare git dependencies in a specific way.

Instead of:

```toml
[tool.poetry.dependencies]
bottle = {git = "git@github.com/bottlepy/bottle.git", tag = "0.12.19"}
```

Use:

```toml
[tool.poetry.dependencies]
bottle = {git = "https://git@github.com/bottlepy/bottle.git", tag = "0.12.19"}
```

Or, if you have an SSH key configured:

```toml
[tool.poetry.dependencies]
bottle = {git = "ssh://git@github.com/bottlepy/bottle.git", tag = "0.12.19"}
```
## Dealing with Lambda's size limitations

To help deal with potentially large dependencies (for example: `numpy`, `scipy` and `scikit-learn`) there is support for compressing the libraries. This does require a minor change to your code to decompress them. To enable this, add the following to your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    zip: true
```

and add this to your handler module before any code that imports your deps:

```python
try:
    import unzip_requirements
except ImportError:
    pass
```
## Slim Package

Works on non-`win32` environments: Docker and WSL are included.

To remove the tests, information and caches from the installed packages, enable the `slim` option. This will: `strip` the `.so` files, and remove `__pycache__` and `dist-info` directories as well as `.pyc` and `.pyo` files.

```yaml
custom:
  pythonRequirements:
    slim: true
```
### Custom Removal Patterns

To specify additional directories to remove from the installed packages, define a list of patterns in the serverless config using the `slimPatterns` option and glob syntax. These patterns will be added to the default ones (`**/*.py[c|o]`, `**/__pycache__*`, `**/*.dist-info*`).

Note, the glob syntax matches against whole paths, so to match a file in any directory, start your pattern with `**/`.

```yaml
custom:
  pythonRequirements:
    slim: true
    slimPatterns:
      - '**/*.egg-info*'
```

To overwrite the default patterns, set the option `slimPatternsAppendDefaults` to `false` (`true` by default).

```yaml
custom:
  pythonRequirements:
    slim: true
    slimPatternsAppendDefaults: false
    slimPatterns:
      - '**/*.egg-info*'
```

This will remove all folders within the installed requirements that match the names in `slimPatterns`.
### Option not to strip binaries

In some cases, stripping binaries leads to problems like "ELF load command address/offset not properly aligned", even when done in the Docker environment. You can still slim down the package without stripping `*.so` files:

```yaml
custom:
  pythonRequirements:
    slim: true
    strip: false
```
## Lambda Layer

Another method for dealing with large dependencies is to put them into a Lambda Layer. Simply add the `layer` option to the configuration.

```yaml
custom:
  pythonRequirements:
    layer: true
```

The requirements will be zipped up and a layer will be created automatically. Now just add a reference to the layer in the functions that will use it.

```yaml
functions:
  hello:
    handler: handler.hello
    layers:
      - Ref: PythonRequirementsLambdaLayer
```

If the layer requires additional or custom configuration, add it onto the `layer` option.

```yaml
custom:
  pythonRequirements:
    layer:
      name: ${self:provider.stage}-layerName
      description: Python requirements lambda layer
      compatibleRuntimes:
        - python3.7
      licenseInfo: GPLv3
      allowedAccounts:
        - '*'
```
## Omitting Packages

You can omit a package from deployment with the `noDeploy` option. Note that dependencies of omitted packages must explicitly be omitted too. For example, to omit `pytest`:

```yaml
custom:
  pythonRequirements:
    noDeploy:
      - pytest
```
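Because transitive dependencies are not omitted automatically, fully excluding a package with dependencies of its own means listing those too. A hypothetical illustration for `requests` (the exact dependency set varies by version):

```yaml
custom:
  pythonRequirements:
    noDeploy:
      - requests
      # requests' own dependencies must be listed explicitly as well
      - urllib3
      - idna
      - certifi
      - charset-normalizer
```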
## Extra Config Options

### Caching

You can enable two kinds of caching with this plugin, both of which are currently ENABLED by default. First, a download cache that caches the downloads pip needs to compile the packages. Second, what we call "static caching", which caches the output of pip after compiling everything for your requirements file. Since `requirements.txt` files rarely change, you will often see large speed improvements when enabling the static cache feature. These caches will be shared between all your projects if no custom `cacheLocation` is specified (see below).

Please note: this replaces the previously recommended usage of `--cache-dir` in `pipCmdExtraArgs`.

```yaml
custom:
  pythonRequirements:
    useDownloadCache: true
    useStaticCache: true
```
### Other caching options

There are two additional options related to caching. You can specify where this plugin caches with the `cacheLocation` option. By default, the cache location is determined automatically from your username and OS via the `appdirectory` module. Additionally, you can specify the maximum number of static caches to keep with `staticCacheMaxVersions`, as a simple attempt to limit the disk space used for caching. This is DISABLED (set to 0) by default. Example:

```yaml
custom:
  pythonRequirements:
    useStaticCache: true
    useDownloadCache: true
    cacheLocation: '/home/user/.my_cache_goes_here'
    staticCacheMaxVersions: 10
```
### Extra pip arguments

You can specify extra arguments to be passed to pip like this:

```yaml
custom:
  pythonRequirements:
    pipCmdExtraArgs:
      - --compile
```

### Extra Docker arguments

You can specify extra arguments to be passed to `docker build` during the build step, and to `docker run` during the dockerized pip install step:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    dockerBuildCmdExtraArgs: ['--build-arg', 'MY_GREAT_ARG=123']
    dockerRunCmdExtraArgs: ['-v', '${env:PWD}:/my-app']
```
### Customize requirements file name

Some `pip` workflows involve using requirements files not named `requirements.txt`. To support these, this plugin has the following option:

```yaml
custom:
  pythonRequirements:
    fileName: requirements-prod.txt
```
### Per-function requirements

If you have different Python functions with different sets of requirements, you can avoid including all the unnecessary dependencies of your functions by using the following structure:

```
├── serverless.yml
├── function1
│   ├── requirements.txt
│   └── index.py
└── function2
    ├── requirements.txt
    └── index.py
```

With the content of your `serverless.yml` containing:

```yaml
package:
  individually: true

functions:
  func1:
    handler: index.handler
    module: function1
  func2:
    handler: index.handler
    module: function2
```

The result is two zip archives: one containing only the requirements for function1, and the other only the requirements for function2.

Quick notes on the config file:

- The `module` field must be used to tell the plugin where to find the `requirements.txt` file for each function.
- The `handler` field must not be prefixed by the folder name (already known through `module`) as the root of the zip artifact is already the path to your function.
### Customize Python executable

Sometimes your Python executable isn't available on your `$PATH` as `python3.7`, for example when using pyenv. To support this, this plugin has the following option:

```yaml
custom:
  pythonRequirements:
    pythonBin: /opt/python3.7/bin/python
```
### Vendor library directory

For certain libraries, default packaging produces too large an installation, even when zipping. In those cases it may be necessary to tailor-make a version of the module. You can store such modules in a directory and use the `vendor` option; the plugin will copy them along with all the other dependencies to install:

```yaml
custom:
  pythonRequirements:
    vendor: ./vendored-libraries

functions:
  hello:
    handler: hello.handler
    vendor: ./hello-vendor # The option is also available at the function level
```
## Manual invocations

The `.requirements` and `requirements.zip` (if using zip support) files are left behind to speed things up on subsequent deploys. To clean them up, run `sls requirements clean`. You can also create them (and `unzip_requirements` if using zip support) manually with `sls requirements install`.
## Invalidate requirements caches on package

If you are using your own Python library, you have to clean up `.requirements` on any update. You can use the following option to clean up `.requirements` every time you package:

```yaml
custom:
  pythonRequirements:
    invalidateCaches: true
```
## :apple::beer::snake: Mac Brew installed Python notes

Brew wilfully breaks the `--target` option with no apparent intention to fix it, which causes issues since this plugin uses that option. There are a few easy workarounds:

- Install Python from python.org and specify it with the `pythonBin` option (see the sketch after this list), OR
- Create a virtualenv and activate it while using serverless, OR
- Install Docker and use the `dockerizePip` option.
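A minimal sketch of the first workaround, assuming a python.org installer placed Python 3.9 at its default macOS framework location (adjust the version and path to match your install):

```yaml
custom:
  pythonRequirements:
    # Assumed default path for a python.org installer on macOS
    pythonBin: /Library/Frameworks/Python.framework/Versions/3.9/bin/python3
```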
Also, brew seems to cause issues with pipenv, so make sure you install pipenv using pip.
## Native Code Dependencies During Build

Some Python packages require extra OS dependencies to build successfully. To deal with this, replace the default image (`public.ecr.aws/sam/build-python3.7`) with a `Dockerfile` like:

```dockerfile
FROM public.ecr.aws/sam/build-python3.7

# Install your dependencies
RUN yum -y install mysql-devel
```

Then update your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerFile: Dockerfile
```
## Native Code Dependencies During Runtime

Some Python packages require extra OS libraries (`*.so` files) at runtime. You need to manually include these files in the root directory of your Serverless package. The simplest way to do this is to use the `dockerExtraFiles` option.

For instance, the `mysqlclient` package requires `libmysqlclient.so.1020`. If you use the Dockerfile from the previous section, add an item to the `dockerExtraFiles` option in your `serverless.yml`:

```yaml
custom:
  pythonRequirements:
    dockerExtraFiles:
      - /usr/lib64/mysql57/libmysqlclient.so.1020
```

Then verify the library gets included in your package:

```sh
sls package
zipinfo .serverless/xxx.zip
```

If you can't see the library, you might need to adjust your package include/exclude configuration in `serverless.yml`.
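A minimal sketch of such an adjustment, assuming the shared library is copied to the package root and your service uses the v2-style `include`/`exclude` package syntax:

```yaml
# A sketch — adjust the pattern to wherever the .so file lands in your package
package:
  include:
    - libmysqlclient.so.1020
```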