
serverless-python-common-requirements v4.1.2

Serverless Python Requirements Plugin

Downloads: 23

Serverless Python Requirements


A Serverless v1.x plugin to automatically bundle dependencies from requirements.txt and make them available in your PYTHONPATH.

Requires Serverless >= v1.12

Install

sls plugin install -n serverless-python-requirements
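The command above should also register the plugin in your serverless.yml; if you add it by hand instead, the entry would look roughly like this (a minimal sketch):

plugins:
  - serverless-python-requirements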

If you installed Python via Homebrew on macOS, see the Mac Brew installed Python notes below.

Cross compiling!

Compiling non-pure-Python modules or fetching their manylinux wheels is supported on non-linux OSs via the use of Docker and the docker-lambda image. To enable docker usage, add the following to your serverless.yml:

custom:
  pythonRequirements:
    dockerizePip: true

In addition to booleans, the dockerizePip option supports the special value 'non-linux', which makes it dockerize only in non-Linux environments.
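For example, a sketch of the non-linux case, so pip only runs inside Docker when you are not already on Linux:

custom:
  pythonRequirements:
    dockerizePip: non-linux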

To utilize your own Docker container instead of the default, add the following to your serverless.yml:

custom:
  pythonRequirements:
    dockerImage: <image name>:tag

This must be the full image name and tag to use, including the runtime specific tag if applicable.

Alternatively, you can define your Docker image in your own Dockerfile and add the following to your serverless.yml:

custom:
  pythonRequirements:
    dockerFile: ./path/to/Dockerfile

dockerFile is the path to the Dockerfile, which must be in the current folder (or a subfolder). Please note that dockerImage and dockerFile are mutually exclusive.

To install requirements from private git repositories, add the following to your serverless.yml:

custom:
  pythonRequirements:
    dockerizePip: true
    dockerSsh: true

The dockerSsh option will mount your $HOME/.ssh/id_rsa and $HOME/.ssh/known_hosts as a volume in the docker container. If your SSH key is password protected, you can use ssh-agent, because $SSH_AUTH_SOCK is also mounted and the env var set. It is important that the host of your private repositories has already been added to your $HOME/.ssh/known_hosts file; otherwise the install process will fail due to a host authenticity failure.

If you are running on Windows, see the Windows dockerizePip notes below.

Pipenv support :sparkles::cake::sparkles:

If you include a Pipfile and have pipenv installed, instead of a requirements.txt this plugin will use pipenv lock -r to generate one. It is fully compatible with all options such as zip and dockerizePip. If you don't want this plugin to generate it for you, set the following option:

custom:
  pythonRequirements:
    usePipenv: false

Dealing with Lambda's size limitations

To help deal with potentially large dependencies (for example: numpy, scipy and scikit-learn) there is support for compressing the libraries. This does require a minor change to your code to decompress them. To enable this add the following to your serverless.yml:

custom:
  pythonRequirements:
    zip: true

and add this to your handler module before any code that imports your deps:

try:
  # unzip_requirements is generated by the plugin when zip is enabled;
  # importing it decompresses the zipped requirements before other imports run.
  import unzip_requirements
except ImportError:
  # Not running with zipped requirements (e.g. local execution); nothing to do.
  pass

Slim Package

Works on non-'win32' environments (Docker and WSL are included).

To remove tests, metadata and caches from the installed packages, enable the slim option. This will strip the .so files and remove __pycache__ and dist-info directories.

custom:
  pythonRequirements:
    slim: true

Custom Removal Patterns

To specify additional directories to remove from the installed packages, define the patterns using regex as a slimPatterns option in serverless config:

custom:
  pythonRequirements:
    slim: true
    slimPatterns:
      - "*.egg-info*"

This will remove all folders within the installed requirements that match the patterns in slimPatterns.

Omitting Packages

You can omit a package from deployment with the noDeploy option. Note that dependencies of omitted packages must be explicitly omitted too. By default, this will not install the AWS SDKs that are already installed on Lambda. This example makes it omit pytest instead:

custom:
  pythonRequirements:
    noDeploy:
      - pytest
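Because transitive dependencies are not excluded automatically, an omitted package's own dependencies have to be listed as well. A hypothetical sketch (somepkg and somepkg-core are illustrative names, not packages from this README):

custom:
  pythonRequirements:
    noDeploy:
      - somepkg        # hypothetical package you want to omit
      - somepkg-core   # its (hypothetical) dependency, which must be listed explicitly too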

Extra Config Options

extra pip arguments

You can specify extra arguments to be passed to pip like this:

custom:
  pythonRequirements:
    dockerizePip: true
    pipCmdExtraArgs:
      - --cache-dir
      - .requirements-cache

When using --cache-dir don't forget to also exclude it from the package.

package:
  exclude:
    - .requirements-cache/**

Customize requirements file name

Some pip workflows involve using requirements files not named requirements.txt. To support these, this plugin has the following option:

custom:
  pythonRequirements:
    fileName: requirements-prod.txt

Per-function requirements

If you have different Python functions with different sets of requirements, you can avoid including all the unnecessary dependencies of your functions by using the following structure:

├── serverless.yml
├── function1
│      ├── requirements.txt
│      └── index.py
└── function2
       ├── requirements.txt
       └── index.py

With the content of your serverless.yml containing:

package:
  individually: true

functions:
  func1:
    handler: index.handler
    module: function1
  func2:
    handler: index.handler
    module: function2

The result is 2 zip archives, with only the requirements for function1 in the first one, and only the requirements for function2 in the second one.

Quick notes on the config file:

  • The module field must be used to tell the plugin where to find the requirements.txt file for each function.
  • The handler field must not be prefixed by the folder name (already known through module) as the root of the zip artifact is already the path to your function.

Customize Python executable

Sometimes your Python executable isn't available on your $PATH as python2.7 or python3.6 (for example, on Windows or when using pyenv). To support this, this plugin has the following option:

custom:
  pythonRequirements:
    pythonBin: /opt/python3.6/bin/python

Vendor library directory

For certain libraries, default packaging produces too large an installation, even when zipping. In those cases it may be necessary to create a tailored version of the module. You can store such modules in a directory and use the vendor option, and the plugin will copy them along with all the other dependencies to install:

custom:
  pythonRequirements:
    vendor: ./vendored-libraries
functions:
  hello:
    handler: hello.handler
    vendor: ./hello-vendor # The option is also available at the function level

Manual invocations

The .requirements and requirements.zip (if using zip support) files are left behind to speed things up on subsequent deploys. To clean them up, run sls requirements clean. You can also create them (and unzip_requirements if using zip support) manually with sls requirements install.

Invalidate requirements caches on package

If you are using your own Python library, you have to clean up .requirements on any update. You can use the following option to clean up .requirements every time you package.

custom:
  pythonRequirements:
    invalidateCaches: true

:apple::beer::snake: Mac Brew installed Python notes

Brew wilfully breaks the --target option with no apparent intention of fixing it, which causes issues since this plugin relies on that option. There are a few easy workarounds for this:

  • Create a virtualenv and activate it while using serverless.
  • Point the pythonBin option at a non-Brew Python, or enable dockerizePip (both options are described above).

Also, brew seems to cause issues with pipenv, so make sure you install pipenv using pip.

:checkered_flag: Windows dockerizePip notes

To use dockerizePip on Windows, do step 1 only if running serverless on Windows, or do both steps 1 and 2 if running serverless inside WSL.

  1. Enabling shared volume in Windows Docker Taskbar settings
  2. Installing the Docker client on Windows Subsystem for Linux (Ubuntu)

Native Code Dependencies During Build

Some Python packages require extra OS dependencies to build successfully. To deal with this, replace the default image (lambci/lambda:python3.6) with a Dockerfile like:

# AWS Lambda execution environment is based on Amazon Linux 1
FROM amazonlinux:1

# Install Python 3.6
RUN yum -y install python36 python36-pip

# Install your dependencies
RUN curl -s https://bootstrap.pypa.io/get-pip.py | python3
RUN yum -y install python3-devel mysql-devel gcc

# Set the same WORKDIR as default image
RUN mkdir /var/task
WORKDIR /var/task

Then update your serverless.yml:

custom:
  pythonRequirements:
    dockerFile: Dockerfile
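Presumably dockerizePip must also be enabled for the custom Dockerfile to be used, as described in the Cross compiling section above; in that case the config would look roughly like:

custom:
  pythonRequirements:
    dockerizePip: true      # assumed prerequisite, per the Cross compiling section
    dockerFile: Dockerfile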

Native Code Dependencies During Runtime

Some Python packages require extra OS libraries (*.so files) at runtime. You need to manually include these files in the root directory of your Serverless package. The simplest way to do this is to commit the files to your repository:

For instance, the mysqlclient package requires libmysqlclient.so.1020. If you use the Dockerfile from the previous section, you can extract this file from the Docker image built from it:

  1. Extract the library:

docker run --rm -v "$(pwd):/var/task" sls-py-reqs-custom cp -v /usr/lib64/mysql57/libmysqlclient.so.1020 .

(If you get the error Unable to find image 'sls-py-reqs-custom:latest' locally, run sls package to build the image.)

  2. Commit it to your repo:

git add libmysqlclient.so.1020
git commit -m "Add libmysqlclient.so.1020"

  3. Verify the library gets included in your package:

sls package
zipinfo .serverless/xxx.zip

(If you can't see the library, you might need to adjust your package include/exclude configuration in serverless.yml.)
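If the library is missing from the archive, an include rule along these lines may help (a sketch reusing the libmysqlclient example above; adjust the filename for your library):

package:
  include:
    - libmysqlclient.so.1020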

Contributors