Part II. Dockerize SPA based on Nginx + VueJs + Python-Sanic. Python API.

In this article we build a simple API with the asynchronous Python framework Sanic and package it into a container.

In Part I, we built the "skeleton" of our future SPA. We already set up the PostgreSQL, Redis, Nginx, and Adminer (a DB management tool) containers. In this part, we will develop a container with the Python code for the /api/* routes and cover it with tests.

TL;DR

If you have no time to read, just run the following bash commands:
cd ~
git clone git@github.com:v-kolesov/vue-sanic-spa.git
cd vue-sanic-spa
docker-compose up -d
and open http://localhost in a browser.

Let's briefly go over what we will do in this part.

  • Our back end should accept a POST request on the /api/v1.0/user/auth route with data containing an email and a password. If the user exists in the database, it should create a unique token, save the user's data to Redis under that token, and return the token in a JSON response.
  • If the request data is invalid, the back end should return the proper error(s) in JSON format. (An illustrative exchange is sketched below.)
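For illustration, a successful exchange might look roughly like this (the email, password, and token values are placeholders; the exact response shape comes from the view code shown in Stage 5):

POST /api/v1.0/user/auth
{"email": "admin@example.com", "password": "secret"}

HTTP 200
{"valid": true, "data": {"access_token": "0c6f3e9f4e4b7f9d3a5c1b2e8d7f6a41"}}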
  Stage 1: Before we start…

    During development, there are a lot of things that can make you "bored". For instance, when dealing with Docker you have to type a lot of commands that are a bit "talkative". I have found it useful to use the make utility to create short aliases for long commands. In the root directory of the project I placed the following Makefile:
    include .env
    up:
    	docker-compose up -d
    upb:
    	docker-compose up -d --force-recreate --build
    stop:
    	docker-compose stop
    db:
    	export PGPASSWORD=${POSTGRES_PASSWORD}; docker exec -it test_db psql -U ${POSTGRES_USER} ${POSTGRES_DB}
    r:
    	docker exec -it test_redis  /usr/local/bin/redis-cli
    test:
    	docker exec -it test_api pytest 
    b:
    	docker exec -it $(c) /bin/bash
    These commands save time:
    • make up — just starts all containers, whereas make upb — rebuilds and starts them;
    • make stop — obviously stops the containers;
    • make db — connects to the running "db" container and opens the database with the psql utility. We include all the variables from the .env file, so we can use them in the recipes (a sketch of the expected .env file follows this list);
    • make r — the quickest way to access the Redis storage with redis-cli;
    • make test — runs all unit/integration tests directly inside the container;
    • make b — quickly connects to the bash console of a specific container (for example, make b c=test_api connects to the bash console of the container named "test_api").
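    For reference, the Makefile and the code below rely on the .env file in the project root created in Part I. A minimal sketch with placeholder values, assuming only the variables actually used in this article:

    POSTGRES_USER=admin
    POSTGRES_PASSWORD=change-me
    POSTGRES_DB=spa
    API_MODE=dev
    API_ADMIN_EMAIL=admin@example.com
    API_ADMIN_PASSWORD=change-me-too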
    Stage 2: Creating a minimal API with the Sanic framework

      So, create the api/requirements.txt file:
      sanic
      asyncio_redis
      asyncpg
      passlib
      pytest
      marshmallow
      aiohttp
      Before we start writing the API logic, we should create a minimal working Sanic application, put it into a container, and set up the route proxying in Nginx. I took Sanic's "hello world" example, made some minor changes, and saved it to api/run.py:
      import os
      from sanic.response import json
      
      import application
      
      app = application.create('api')
      
      @app.route("/")
      async def test(request):
          return json({"hello": "world!"})
      
      if __name__ == "__main__":                
          debug_mode =  os.getenv('API_MODE', '') == 'dev'   
      
          app.run(
              host='0.0.0.0',
              port=8000,
              debug=debug_mode, 
              access_log=debug_mode
          )
      Also, we need a container for our API, so place a Dockerfile with the following content
      FROM python:3.6.7-slim-stretch
      WORKDIR /app
      RUN apt-get update
      RUN apt-get -y install gcc
      COPY requirements.txt /tmp
      RUN pip install -r /tmp/requirements.txt
      VOLUME [ "/app" ]
      EXPOSE 8000
      CMD ["python", "run.py"]
      into the "api" directory.

      Clarification of the Dockerfile

      In the Dockerfile we take the official stable Python 3.6 image and add some software (gcc) that is needed to build certain Python packages. Inside the container, we define the "/app" directory as the working directory. We also instruct Docker to execute the command "python run.py" when the container starts.

      Stage 3: Add the new API container to docker-compose.yml

      We should add the following lines to the "services" section of our main docker-compose.yml file:
      api: 
          container_name: test_api
          build: 
            context: ./api
          tty: true
          restart: always
          volumes: 
            - "./api:/app"    
          networks:      
            - internal
          env_file:
            - .env
          ports:
            - "8000:8000"
      In the instructions above:
      • tty: true makes it possible to attach to and detach from (with CTRL+C) the main Docker process (python run.py) for debugging purposes;
      • the volumes section "shares" the content of the api directory on the host machine with the /app directory in the container;
      • env_file makes the environment variables from the .env file available to the application inside the container.

      Stage 4: Tuning Nginx for the API

      Add the following block of Nginx configuration to the nginx/server.conf file:
      location /api {            
              rewrite /api$     /    break;  
              rewrite /api/(.*) /$1  break;  
              proxy_redirect     off;
              proxy_set_header   Host                 $host;
              proxy_set_header   X-Real-IP            $remote_addr;
              proxy_set_header   X-Forwarded-For      $proxy_add_x_forwarded_for;
              proxy_set_header   X-Forwarded-Proto    $scheme;
              proxy_pass http://api:8000;
          }
      The two rewrite rules strip the /api prefix, so the Sanic application receives / and /v1.0/... paths. Now rebuild and restart the containers with the "make upb" command and open http://localhost/api in a browser. You should see: {"hello": "world!"}.

      Stage 5: Developing the API and a database migration hook

      In the api directory, place application.py:
      import os
      import asyncpg
      import asyncio_redis
      from sanic import Sanic, Blueprint
      from views import user
      from migration import SCHEMA, LATEST_VERSION
      
      
      def create(*args, **kwargs):
          app = Sanic(*args, **kwargs)
          before_server_start(app)
          after_server_stop(app)
          init_blueprints(app)
          return app
      
      
      def before_server_start(app):
          @app.listener('before_server_start')
          async def run(app, loop):
              dsn='postgres://{user}:{pwd}@db/{database}'.format(
                  user=os.getenv('POSTGRES_USER'),
                  pwd=os.getenv('POSTGRES_PASSWORD'),
                  database=os.getenv('POSTGRES_DB')
              )
              app.db = await asyncpg.create_pool(dsn, loop=loop, max_size=80)
              app.redis = await asyncio_redis.Pool.create(host='redis', poolsize=10)
              await db_migrate(app, SCHEMA, LATEST_VERSION)
      
      
      def after_server_stop(app):
          @app.listener('after_server_stop')
          async def run(app, loop):
              await app.db.close()
              app.redis.close()
      
      def init_blueprints(app):
          v1_0 = Blueprint('api', url_prefix='/v1.0')
          v1_0.add_route(user.Auth.as_view(), '/user/auth')
          app.blueprint(v1_0)
      
      
      async def db_migrate(app, schema, version):
          async with app.db.acquire() as conn:
              result = await conn.fetchval(
                  """
                  select table_name from information_schema.tables
                  where table_schema='public' and table_name = $1
                  """, 'versions'
              )
              if result is None:
                  num = 0
              else:
                  num = await conn.fetchval("select id from versions limit 1")
      
              async with conn.transaction():
                  while version > num:
                      num += 1
                      for sql in schema[num]['up']:
                          await conn.execute(*sql)
      
                      await conn.execute('update versions set id=$1', num)
      
                  while version < num:
                      for sql in schema[num]['down']:
                          await conn.execute(*sql)
                      num -= 1
                      if num > 0:
                          await conn.execute('update versions set id=$1', num)
      application.py defines the create function (the factory pattern), which returns an instance of a Sanic application. Pay attention to the lines where the Data Source Name (DSN) string is built from the container's environment variables. As you can see, "db" is hard-coded as the database host. It must match the name of the PostgreSQL service in the services section of the docker-compose.yml file. Note that the host for the Redis connection is hard-coded in the same way (host='redis').

      If Docker containers belong to the same network (in our case, both are in the "internal" network), Docker automatically resolves these service names to the proper IP addresses.

      Pay attention to the db_migrate function. Of course, there are plenty of database migration solutions out there; the most popular one in the Python world is Alembic. But when dealing with a microservice architecture, I think simpler solutions are better and faster. That is why I did not use SQLAlchemy in this case. So, we have migration.py:

      import os
      from passlib.handlers.pbkdf2 import pbkdf2_sha256
      
      LATEST_VERSION = 2
      
      SCHEMA = {
          2: {
              'up': [
                  ["""
                      CREATE TABLE users(
                       id BIGSERIAL PRIMARY KEY,
                       email VARCHAR (64) UNIQUE NOT NULL,
                       password TEXT NOT NULL,
                       created_on TIMESTAMP default (now() at time zone 'utc'),
                       confirmed BOOLEAN default false
                      );"""
                   ],
                  ["INSERT INTO users (email, password) values ($1, $2)",
                      os.getenv('API_ADMIN_EMAIL'),
                      pbkdf2_sha256.hash(os.getenv('API_ADMIN_PASSWORD'))
                   ],
                  ["CREATE INDEX idx_users_email ON users(email);"],
              ],
              'down': [
                  ["DROP INDEX idx_users_email"],
                  ["DROP TABLE users;"],
              ]
          },
          1: {
              'up': [
                  ["CREATE TABLE versions( id integer NOT NULL);"],
                  ["INSERT INTO versions VALUES ($1);", 1]
              ],
              'down': [
                  ["DROP TABLE versions;"],
              ]
          }
      }
      A key feature of the migration mechanism is the LATEST_VERSION value. When our application starts, LATEST_VERSION is compared to the version stored in the versions table of the database. If it is greater, the application runs all SQL instructions under the "up" keys of the newer migrations; if it is lower, it runs the instructions under the "down" keys instead.
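      For example, adding a hypothetical third migration (say, a last_login column on the users table) would mean bumping LATEST_VERSION and appending one more entry to SCHEMA; a sketch:

      LATEST_VERSION = 3

      SCHEMA = {
          3: {
              'up': [
                  # illustrative only: add a column when migrating up from version 2
                  ["ALTER TABLE users ADD COLUMN last_login TIMESTAMP;"],
              ],
              'down': [
                  # and remove it again when migrating back down
                  ["ALTER TABLE users DROP COLUMN last_login;"],
              ]
          },
          # ... entries 2 and 1 stay exactly as shown above
      }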

      The logic of the authorization view (views/user.py, imported in application.py) is plain:

      import ujson
      import uuid
      from marshmallow.validate import Length
      from passlib.handlers.pbkdf2 import pbkdf2_sha256
      from sanic.views import HTTPMethodView
      from marshmallow import Schema, fields
      from sanic.response import json
      
      class UserSchema(Schema):
          id = fields.Integer(required=True)
          email = fields.Email(required=True)
          password = fields.String(required=True, validate=[Length(min=4)])
      
      
      class Auth(HTTPMethodView):
          async def get(self, request):
              return json({"hello": "world"})
      
          async def post(self, request):
              res, errs = UserSchema(exclude=['id']).load(request.json)
              if errs:
                  return json({"valid": False, "data": errs}, status=400)
      
      
              async with request.app.db.acquire() as conn:
                  _user = await conn.fetchrow('''
                  SELECT * FROM users WHERE email=$1
                  ''', res['email'])
      
              if not (
                      _user and
                      pbkdf2_sha256.verify(res['password'], _user['password'])
              ):
                  return json({
                      "valid": False,
                      "data": 'Wrong email or password'
                  }, status=401)
      
      
              data = UserSchema(exclude=['password']).dump(_user).data
      
              token = uuid.uuid4().hex
      
              await request.app.redis.set(token, ujson.dumps(data))
      
              return json({"valid": True, "data": {"access_token": token}})

      Here are some of the integration tests:

      import os
      
      def test_user_auth_successfully(app):
          data = {
              'email': os.getenv('API_ADMIN_EMAIL'),
              'password': os.getenv('API_ADMIN_PASSWORD')
          }
          req, res = app.test_client.post('/v1.0/user/auth', json=data)
      
          assert res.status == 200
          assert res.json['valid']
          assert 'access_token' in res.json['data']
          assert res.json['data']['access_token'] is not None
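      The test relies on an app fixture that is not shown in the listing above. Assuming it lives in a conftest.py next to the tests, a minimal sketch could look like this:

      # conftest.py — a minimal sketch of the assumed fixture
      import pytest

      import application


      @pytest.fixture
      def app():
          # build the application the same way run.py does;
          # app.test_client then starts and stops the server around each request
          return application.create('test_api')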
      It is hard to show every piece of code here, so I have created a pull request with all the changes made in this article compared to the previous one.

      To be continued…