In Article I we built the "skeleton" of our future SPA: we set up the PostgreSQL, Redis, Nginx, and Adminer (a DB management tool) containers. In this part we will develop the container with the Python code for the /api/* routes and cover it with tests. If you want to catch up, clone the project and start the containers:
cd ~
git clone git@github.com:v-kolesov/vue-sanic-spa.git
cd vue-sanic-spa
docker-compose up -d
then open http://localhost in a browser. To save typing on everyday operations, the project's Makefile defines a few shortcuts:
include .env

up:
	docker-compose up -d
upb:
	docker-compose up -d --force-recreate --build
stop:
	docker-compose stop
db:
	export PGPASSWORD=${POSTGRES_PASSWORD}; docker exec -it test_db psql -U ${POSTGRES_USER} ${POSTGRES_DB}
r:
	docker exec -it test_redis /usr/local/bin/redis-cli
test:
	docker exec -it test_api pytest
b:
	docker exec -it $(c) /bin/bash
With these commands it is possible to save a lot of typing; see the example below.
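For instance, a typical edit-rebuild-test cycle with these targets looks like this (the c variable is how a container name is passed to the b target):

make upb            # rebuild the images and restart all containers
make test           # run pytest inside the test_api container
make b c=test_api   # open a bash shell in the container named by c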
Our API needs the following Python packages; list them in api/requirements.txt:

sanic
asyncio_redis
asyncpg
passlib
pytest
marshmallow
aiohttp
Before we start to implement the API logic, we should create a minimal working Sanic application, put it into a container, and set up a route proxy in Nginx. I took a standard "hello world" example, made some minor changes, and saved it as api/run.py:
import os

from sanic.response import json

import application

app = application.create('api')


@app.route("/")
async def test(request):
    return json({"hello": "world!"})


if __name__ == "__main__":
    debug_mode = os.getenv('API_MODE', '') == 'dev'
    app.run(
        host='0.0.0.0',
        port=8000,
        debug=debug_mode,
        access_log=debug_mode
    )
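Note that run.py enables debug mode and access logs only when API_MODE is set to dev. All of these variables live in the .env file from Article I. For reference, a hypothetical .env covering every variable this article reads could look like this (the names are the real ones used in the code, the values are placeholders):

POSTGRES_USER=app
POSTGRES_PASSWORD=change-me
POSTGRES_DB=app
API_ADMIN_EMAIL=admin@example.com
API_ADMIN_PASSWORD=change-me-too
API_MODE=dev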
Also, we need a container for our API, so place a Dockerfile with the following content
# A slim Python base image keeps the container small
FROM python:3.6.7-slim-stretch
WORKDIR /app
# gcc is needed to compile C extensions for some of the dependencies
RUN apt-get update && apt-get -y install gcc
COPY requirements.txt /tmp
RUN pip install -r /tmp/requirements.txt
# The source code is mounted here by docker-compose
VOLUME [ "/app" ]
EXPOSE 8000
CMD ["python", "run.py"]
into the api directory. Then describe the new service in docker-compose.yml:
api:
  container_name: test_api
  build:
    context: ./api
  tty: true
  restart: always
  volumes:
    - "./api:/app"
  networks:
    - internal
  env_file:
    - .env
  ports:
    - "8000:8000"
In the instructions above we name the container test_api, build its image from the ./api directory, mount the host's api folder into /app (so code changes are picked up without rebuilding), attach the container to the internal network, load variables from .env, and publish port 8000. Now Nginx has to proxy API requests to this container; add the following location block to its configuration (note how a request to, say, /api/v1.0/user/auth is rewritten and passed upstream as /v1.0/user/auth):
location /api {
    rewrite /api$ / break;
    rewrite /api/(.*) /$1 break;
    proxy_redirect off;
    proxy_set_header Host $http_host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_pass http://api:8000;
}
Now rebuild and restart the containers with the "make upb" command and open http://localhost/api in a browser. You should see: {"hello": "world!"}.
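The same check can be made from the command line:

curl http://localhost/api
# {"hello":"world!"}

Now let's look at application.py, the factory module that run.py imports: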
import os

import asyncpg
import asyncio_redis
from sanic import Sanic, Blueprint

from views import user
from migration import SCHEMA, LATEST_VERSION


def create(*args, **kwargs):
    app = Sanic(*args, **kwargs)
    before_server_start(app)
    after_server_stop(app)
    init_blueprints(app)
    return app


def before_server_start(app):
    @app.listener('before_server_start')
    async def run(app, loop):
        dsn = 'postgres://{user}:{pwd}@db/{database}'.format(
            user=os.getenv('POSTGRES_USER'),
            pwd=os.getenv('POSTGRES_PASSWORD'),
            database=os.getenv('POSTGRES_DB')
        )
        app.db = await asyncpg.create_pool(dsn, loop=loop, max_size=80)
        app.redis = await asyncio_redis.Pool.create(host='redis', poolsize=10)
        await db_migrate(app, SCHEMA, LATEST_VERSION)


def after_server_stop(app):
    @app.listener('after_server_stop')
    async def run(app, loop):
        await app.db.close()
        app.redis.close()


def init_blueprints(app):
    v1_0 = Blueprint('api', url_prefix='/v1.0')
    v1_0.add_route(user.Auth.as_view(), '/user/auth')
    app.blueprint(v1_0)


async def db_migrate(app, schema, version):
    async with app.db.acquire() as conn:
        result = await conn.fetchval(
            """
            select table_name from information_schema.tables
            where table_schema='public' and table_name = $1
            """, 'versions'
        )
        if result is None:
            num = 0
        else:
            num = await conn.fetchval("select id from versions limit 1")
        async with conn.transaction():
            while version > num:
                num += 1
                for sql in schema[num]['up']:
                    await conn.execute(*sql)
                await conn.execute('update versions set id=$1', num)
            while version < num:
                for sql in schema[num]['down']:
                    await conn.execute(*sql)
                num -= 1
                if num > 0:
                    await conn.execute('update versions set id=$1', num)
application.py defines the create function (the factory pattern), which returns an instance of a Sanic application. Pay attention to how the Data Source Name (DSN) string is assembled from the container's environment variables. As you can see, "db" is hard-coded as the database host: it must match the name given to the PostgreSQL container in the services section of docker-compose.yml. Note that the host for the Redis connection ("redis") is hard-coded in the same way.
If Docker containers belong to the same network (in our case both are in the "internal" network), Docker automatically resolves the service name in the DSN to the proper IP address.
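You can see this resolution from inside the running API container (the names below are the container and service names from our docker-compose.yml):

docker exec test_api python -c "import socket; print(socket.gethostbyname('db'))"
# prints the internal address of the db service, e.g. 172.18.0.2 (yours will differ)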
Pay attention to the db_migrate function. Of course, there are plenty of database migration solutions out there; the most popular in the Python world is Alembic. But when dealing with a microservice architecture, I think simpler solutions are better and faster, which is why I did not use SQLAlchemy in this case. So we have migration.py:
import os

from passlib.handlers.pbkdf2 import pbkdf2_sha256

LATEST_VERSION = 2

SCHEMA = {
    2: {
        'up': [
            ["""
            CREATE TABLE users(
                id BIGSERIAL PRIMARY KEY,
                email VARCHAR (64) UNIQUE NOT NULL,
                password TEXT NOT NULL,
                created_on TIMESTAMP default (now() at time zone 'utc'),
                confirmed BOOLEAN default false
            );"""],
            ["INSERT INTO users (email, password) values ($1, $2)",
             os.getenv('API_ADMIN_EMAIL'),
             pbkdf2_sha256.hash(os.getenv('API_ADMIN_PASSWORD'))],
            ["CREATE INDEX idx_users_email ON users(email);"],
        ],
        'down': [
            ["DROP INDEX idx_users_email"],
            ["DROP TABLE users;"],
        ]
    },
    1: {
        'up': [
            ["CREATE TABLE versions( id integer NOT NULL);"],
            ["INSERT INTO versions VALUES ($1);", 1]
        ],
        'down': [
            ["DROP TABLE versions;"],
        ]
    }
}
A key feature of the migration mechanism is the value of LATEST_VERSION. When the application starts, LATEST_VERSION is compared with the previous version stored in the versions table. If it is greater, the application runs all SQL instructions under the "up" keys; otherwise it runs the instructions under the "down" keys.
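To make that walk concrete, here is a small self-contained sketch (illustrative only, not part of the project code) that replays db_migrate's loop on plain Python lists instead of a database connection:

# migrate_sketch.py: illustrative only, mimics db_migrate's up/down walk in memory.
SCHEMA = {
    1: {'up': ['create versions'], 'down': ['drop versions']},
    2: {'up': ['create users', 'seed admin'], 'down': ['drop users']},
}

def walk(current, target, schema):
    # Return the statements db_migrate would execute for this version change.
    executed, num = [], current
    while target > num:  # upgrade path: apply 'up' steps in ascending order
        num += 1
        executed.extend(schema[num]['up'])
    while target < num:  # downgrade path: apply 'down' steps in descending order
        executed.extend(schema[num]['down'])
        num -= 1
    return executed

print(walk(0, 2, SCHEMA))  # ['create versions', 'create users', 'seed admin']
print(walk(2, 1, SCHEMA))  # ['drop users']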
The logic of authorization is plain (this is views/user.py, which application.py imports):
import ujson
import uuid

from marshmallow import Schema, fields
from marshmallow.validate import Length
from passlib.handlers.pbkdf2 import pbkdf2_sha256
from sanic.response import json
from sanic.views import HTTPMethodView


class UserSchema(Schema):
    id = fields.Integer(required=True)
    email = fields.Email(required=True)
    password = fields.String(required=True, validate=[Length(min=4)])


class Auth(HTTPMethodView):

    async def get(self, request):
        return json({"hello": "world"})

    async def post(self, request):
        res, errs = UserSchema(exclude=['id']).load(request.json)
        if errs:
            return json({"valid": False, "data": errs}, status=400)
        async with request.app.db.acquire() as conn:
            _user = await conn.fetchrow('''
                SELECT * FROM users WHERE email=$1
            ''', res['email'])
        if not (
            _user and
            pbkdf2_sha256.verify(res['password'], _user['password'])
        ):
            return json({
                "valid": False,
                "data": 'Wrong email or password'
            }, status=401)
        data = UserSchema(exclude=['password']).dump(_user).data
        token = uuid.uuid4().hex
        await request.app.redis.set(token, ujson.dumps(data))
        return json({"valid": True, "data": {"access_token": token}})
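Before moving to the tests, here is a sketch of how a client could obtain a token with aiohttp (which is already in our requirements.txt). The localhost URL assumes the Nginx proxy from above is up, and the credentials are the API_ADMIN_* values from .env; the script itself is illustrative, not project code:

# auth_client.py: an illustrative client for the auth endpoint.
import asyncio
import os

import aiohttp


async def get_token():
    payload = {
        'email': os.getenv('API_ADMIN_EMAIL'),
        'password': os.getenv('API_ADMIN_PASSWORD'),
    }
    async with aiohttp.ClientSession() as session:
        # Nginx rewrites /api/v1.0/user/auth to /v1.0/user/auth for the api container
        async with session.post('http://localhost/api/v1.0/user/auth',
                                json=payload) as resp:
            body = await resp.json()
            return body['data']['access_token']


print(asyncio.get_event_loop().run_until_complete(get_token()))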
Here are some of the integration tests:
import os


def test_user_auth_successfully(app):
    data = {
        'email': os.getenv('API_ADMIN_EMAIL'),
        'password': os.getenv('API_ADMIN_PASSWORD')
    }
    req, res = app.test_client.post('/v1.0/user/auth', json=data)
    assert res.status == 200
    assert res.json['valid']
    assert 'access_token' in res.json['data']
    assert res.json['data']['access_token'] is not None
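The app fixture used above is not shown in the article itself; a minimal conftest.py that could provide it, assuming the factory from application.py, might look like this (the real project may define it differently):

# conftest.py: a minimal sketch of the fixture the test relies on.
import pytest

import application


@pytest.fixture
def app():
    # A fresh Sanic instance per test; test_client is built into Sanic.
    return application.create('test_api')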
It is hard to include every piece of code here, so I have created a pull request with all the changes made in this article compared to the previous one.
To be continued...