

WagContent

Django application for running the content site behind wagwalking.com. It maintains its own Postgres database along with a CMS and an admin site, and is built to host on Heroku with assets deployed to S3.


Quickstart

Build the environment

$ cp settings.example .env
$ docker-compose up --build -d
$ docker-compose exec app python3 manage.py migrate
$ docker-compose exec app python3 manage.py createcachetable
$ docker-compose exec app python3 manage.py createsuperuser
$ docker-compose exec app python3 manage.py loadfixtures wagcontent/*/fixtures/* -v3 -i
$ nvm use 10
$ npm i
$ npm run webpack:build-dev

Navigate to http://localhost:8000/admin/ and log in with your superuser!
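
Before moving on, it's worth sanity-checking that the stack actually came up. A minimal check, using the app service and port from the steps above:

$ docker-compose ps                      # every service should show State "Up"
$ curl -I http://localhost:8000/admin/   # expect a 302 redirect to the login page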

Useful commands:

Django

  • $ docker-compose exec app python3 manage.py migrate - run any outstanding database migrations
  • $ docker-compose exec app python3 manage.py createcachetable - create db-backed caches
  • $ docker-compose exec app python3 manage.py createsuperuser - create a new super user
  • $ docker-compose exec app python3 manage.py loadfixtures wagcontent/*/fixtures/* -v3 -i - seed the database with test data
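
If you need to produce new seed data rather than just load it, Django's stock dumpdata command writes fixture files in the same per-app layout that the loadfixtures command above reads from. This is a sketch; the app label and fixture name are placeholders:

$ docker-compose exec -T app python3 manage.py dumpdata <app_label> --indent 2 > wagcontent/<app_label>/fixtures/<fixture_name>.json

The -T flag stops docker-compose from allocating a TTY, which would otherwise mangle the redirected output.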

Webpack

  • $ npm run webpack:watch - run webpack in local dev mode and reload when watched files change
  • $ npm run webpack:build - build webpack bundles for production
  • $ npm run webpack:build-dev - build webpack for local dev mode and exit

DB Management

Creating a PostgreSQL db dump

To quickly return to a stable database state, from which you can run new model changes or migrations without repeating the entire re-seeding process, create a local snapshot to use as a restore point:

$ cd <project_dir>
$ docker-compose exec db /bin/bash
# pg_dump -h db -U postgres dev > /bkp/<DUMP_FILENAME>

The dump file will be available in <project_dir>/data/bkp/<DUMP_FILENAME>.
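
If you'd rather not open a shell inside the container, the same snapshot can be taken in one line from the host; here the redirect happens on the host side, so the file lands in data/bkp/ directly (assuming the stock postgres image's default local auth):

$ cd <project_dir>
$ docker-compose exec -T db pg_dump -U postgres dev > data/bkp/<DUMP_FILENAME>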

Restoring database from PostgreSQL dump

After completing the quickstart tutorial:

  1. Move the dump file you want to restore into <project_dir>/data/bkp/<DUMP_FILENAME>
  2. Execute:
$ cd <project_dir>
$ docker-compose run db bash
# psql -h db -U postgres
<specify db password>
postgres=# DROP DATABASE dev;
postgres=# CREATE DATABASE dev;
postgres=# \q
# psql -h db -U postgres -f /bkp/<DUMP_FILENAME> dev
<specify db password>

Note: you may need to stop the app and celery containers if the DROP action fails because of open connections. You can do this with: docker stop wagcontent_app_1 wagcontent_celery_1
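
Alternatively (not from the original instructions, but standard Postgres), you can terminate the open connections from psql instead of stopping the containers:

$ docker-compose exec db psql -U postgres -c "SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'dev' AND pid <> pg_backend_pid();"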

Building Environment Images

There are two images that are built internally and used for the local dev and CI runtime environments. They are not rebuilt on every deployment, since their dependencies rarely change. If you do update dependencies, make sure to tag a build with build-images in the name; this will rebuild the images described below:

| image | description |
| --- | --- |
| wagcontent/dpl:latest | Alpine Linux-based image which comes pre-installed with Ruby 2.4. Used with the dpl gem, a multi-host deployment tool with support for Heroku deployment. (TODO: is this now Python 3.8?) |
| gdal-python-runtime:latest | Alpine Linux-based image which comes pre-installed with Python 2.7. Includes the GDAL geospatial library, which is built into the image. |
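
The exact tag format isn't documented beyond containing build-images, so the name below is illustrative:

$ git tag build-images-2020-06-01
$ git push origin build-images-2020-06-01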

Making changes to the .env, docker-compose.yml or other build files

For changes you have made to these build files to propagate, you need to refresh the stale containers and restart the container that you changed:

  1. Make your changes
  2. docker-compose pull to refresh the stale containers
  3. docker-compose up --build -d <mutated-container> (usually app) to rebuild the stale container

Notes:

  • Some environment variables can be taken from the pubdev environment on Heroku. For example, AWS_S3_ACCESS_KEY_ID and AWS_S3_SECRET_ACCESS_KEY should be taken from Heroku pubdev to ensure that S3 direct uploads work locally (see the example after this list)
  • If you choose to enable livereload, you will need to ensure that your default site points to localhost:
$ docker-compose run app python3 manage.py shell
>>> from django.contrib.sites.models import Site
>>> s = Site.objects.first()
>>> s.domain = 'localhost'
>>> s.save()
>>> exit()
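
For the S3 credentials mentioned in the first note above, the Heroku CLI can read them directly from pubdev; the app name here is a placeholder:

$ heroku config:get AWS_S3_ACCESS_KEY_ID --app <pubdev-app-name>
$ heroku config:get AWS_S3_SECRET_ACCESS_KEY --app <pubdev-app-name>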

Migration to ECS Notes:

  • Environment variables now live in two places: the settings.$env file for public values, and the AWS Parameter Store for secrets. Secrets are accessed via chamber. Anything needed by the CI pipeline can be fetched from the Parameter Store via chamber read (see the example at the end of these notes). E.g.: aws ssm get-parameters --region us-west-1 --names /wagcontent/django_secret_key --with-decryption --query 'Parameters[].Value' --output text
  • Python runtime environments now use the dotenv package to load the contents of the .env file into Python's environment. This file contains non-secrets and gets baked into the Docker image that's pushed to ECR.
  • The Python runtime environment is built using the Dockerfile in docker/ci.
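
For reference, a chamber read for the secret shown above would look something like this, assuming chamber's usual /<service>/<key> convention (so /wagcontent/django_secret_key maps to service wagcontent):

$ chamber read wagcontent django_secret_key
$ chamber exec wagcontent -- python3 manage.py check   # inject all wagcontent secrets into a command's environment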