Restoring a SQL file to a dockerized PostgreSQL server

If you are working with a dockerized PostgreSQL instance, you may need to restore a SQL file to a database. It is a simple task; here are a few suggested steps.

Create a container

docker run -d --name <container> -e POSTGRES_PASSWORD=<your_passwd> postgres

Inspect the container

docker inspect <container>

In the output produced by docker inspect, look for the information about the container’s volume (the “Mounts” section). Here is an example:

    {
        "Type": "volume",
        "Name": "your_container",
        "Source": "/var/lib/docker/volumes/your_container/_data",
        "Destination": "/data",
        "Driver": "local",
        "Mode": "rw",
        "RW": true,
        "Propagation": ""
    },

Copy the SQL file

Using the Source path obtained above, copy the SQL file into the volume (you may need root privileges to write under /var/lib/docker):

cp database_file.sql /var/lib/docker/volumes/<your_container>/_data

Alternatively, docker cp database_file.sql <container>:/tmp/ copies the file into the container without touching the volume path.

Restore the database

Now, in order to restore the database, you need to access the container’s shell:

docker exec -it <container> bash

Inside the container, you can use the appropriate tool to restore your database, e.g. psql (for plain SQL dumps) or pg_restore (for custom-format dumps created with pg_dump -Fc):

psql -U <user> -W -f /path/to/your_file.sql <database>

That’s it! :-) Database restored.

Looking for the SQL file

If you have trouble finding the SQL file inside the container, look for it at:

/var/lib/postgresql/data

References:

More details at: https://simkimsia.com/how-to-restore-database-dumps-for-postgres-in-docker-container/

Running several services in a single container.


Docker was designed to run a single service per container. The main advantage of that is isolation: you can keep several applications isolated from one another.
However, it is possible to run several services in a single container with Supervisor, a client/server system to control processes on *nix-like systems.
The following example creates a single container with two services (running the same code) controlled by Supervisor.
You can download the code at: https://github.com/t3rcio/docker-supervisor

Dockerfile

#Dockerfile
FROM python:3.9

ENV PYTHONUNBUFFERED=1

RUN apt-get update && apt-get install -y supervisor
RUN mkdir -p /var/log/supervisor

WORKDIR /code
COPY . /code
COPY supervisord.conf /etc/supervisor/conf.d/supervisord.conf
RUN mkdir logs

CMD ["/usr/bin/supervisord", "-c", "/etc/supervisor/conf.d/supervisord.conf"]
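
The supervisord.conf copied above is not shown in the post; a minimal sketch could look like this (the program names, commands and log paths are assumptions, chosen to match the tail commands used later):

```ini
; supervisord.conf -- sketch; program names and commands are assumptions
[supervisord]
nodaemon=true
logfile=/var/log/supervisor/supervisord.log

[program:service1]
command=python /code/service.py /code/out1.log
autorestart=true
redirect_stderr=true
stdout_logfile=/code/logs/service1.log

[program:service2]
command=python /code/service.py /code/out2.log
autorestart=true
redirect_stderr=true
stdout_logfile=/code/logs/service2.log
```

nodaemon=true keeps supervisord in the foreground, which is what Docker expects from the container’s main process.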

docker-compose.yml

#docker-compose.yml
version: "3.9"

services:
   supervisor:
      build: .
      restart: always
      command: /usr/bin/supervisord -n
      volumes:
         - .:/code

Put the code above in a directory, and execute:

user@user-pc:~$ docker-compose -f docker-compose.yml up

With that, the container is built and running.
To verify whether the services are running, execute:

user@user-pc:~$ tail -f out1.log
(...)
user@user-pc:~$ tail -f out2.log

That’s it :-) With this technique, you can run several services in a single container; you can, for example, have a task scheduler and a task consumer.
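
For completeness, here is a guess at what each supervised service could be (the real code lives in the repo linked above; this heartbeat loop is our own illustration, with the log path passed as an argument):

```python
import sys
import time


def heartbeat_line():
    # One log line per interval, so `tail -f` shows the service is alive.
    return f"alive at {time.strftime('%Y-%m-%d %H:%M:%S')}"


def run(logfile, interval=5):
    while True:
        with open(logfile, "a") as f:
            f.write(heartbeat_line() + "\n")
        time.sleep(interval)


if __name__ == "__main__":
    # supervisord would start this twice, e.g. with out1.log and out2.log.
    run(sys.argv[1] if len(sys.argv) > 1 else "out1.log")
```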

References

https://docs.docker.com/config/containers/multi-service_container/

https://github.com/t3rcio/docker-supervisor (Dockerfile + supervisord.conf)

Serving static files with WSGI

Deploying Python/Django apps is not a hard task. However, one problem sometimes shows up: deploying static files.

Django’s manual offers a detailed set of instructions for collecting and deploying static files in production environments, using the STATIC_URL and STATIC_ROOT variables in settings.py. More details here: https://docs.djangoproject.com/en/3.0/howto/static-files/
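
For reference, the two variables look like this in settings.py (the values below are illustrative assumptions, not defaults):

```python
# settings.py (fragment) -- illustrative values only

# URL prefix the browser uses to request static files:
STATIC_URL = "/static/"

# Filesystem path where `manage.py collectstatic` gathers the files:
STATIC_ROOT = "/var/www/example.com/static/"
```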

You will then configure the web server to serve files from the paths set in settings.py; the link above covers that as well.

Deploy on shared hosting environments

However, the scenario gets harder when you don’t have access to the web server config files, which is the reality on shared hosts. To solve this problem, consider using the WSGI file: the config file that sets the interface between the web server and the Django app.

It is true that Django’s manual states that Django itself should not serve static files in production. But many developers use the WSGI file to serve them anyway. We can do this in two ways:
1 – dj-static: https://pypi.org/project/dj-static/
2 – whitenoise: http://whitenoise.evans.io/en/stable/

Using dj-static

To install the dj-static package, use the following command:

pip install dj-static

Perhaps on your hosting server you lack permission to install packages system-wide. To solve this, we can use:

pip install dj-static --user

This command installs the package in your home directory.
After the installation, it is time to configure the WSGI file. Open the WSGI file and add the following lines:

from django.core.wsgi import get_wsgi_application
from dj_static import Cling
(…)
application = Cling(get_wsgi_application())

Finally, test your site by accessing it. If your static files do not load, verify the STATIC_URL and STATIC_ROOT variables in settings.py.

Have you ever run into this problem on shared hosting? What solution did you apply?

References

WSGI: https://wsgi.readthedocs.io/en/latest/what.html
PEP333 : https://www.python.org/dev/peps/pep-3333/
WSGI in Django: https://docs.djangoproject.com/pt-br/3.0/howto/deployment/wsgi/

Activating a virtualenv “without hands”

Virtualenv is an awesome tool for Python development. To ease the development process there is virtualenvwrapper: a wrapper that gives you shell commands to manage your virtualenvs.

However, even with all these conveniences, a problem arises when we deploy our code:

How to activate the virtualenv automatically?

The answer is: using the activate_this.py script.
It can be found at:

~/.virtualenvs/<your virtualenv>/bin/activate_this.py

The script’s aim is to activate the virtualenv’s Python from the “outside”, so your environment has access to the libs needed to run your application.

Look ma: no hands…!

To use the script, put the following lines in the file that starts your app.

If your app runs on the command line (like a *nix daemon), put the lines in the file that starts your app. If your app is a web application, put them in the “*wsgi.py” file used by your web server.

activate_this = '/path/to/env/bin/activate_this.py'
exec(open(activate_this).read(), dict(__file__=activate_this))  # Python 3; on Python 2, use execfile(activate_this, dict(__file__=activate_this))

Note that the lines above should be among the first lines of the file.

After this, restart your service or your webserver.

Done! Your virtualenv will be activated automatically ;-)
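
The same trick can be wrapped in a tiny helper for Python 3, where execfile() no longer exists (the helper name is ours):

```python
def activate_virtualenv(activate_this):
    """Run activate_this.py in-process so sys.path points into the venv."""
    with open(activate_this) as f:
        code = compile(f.read(), activate_this, "exec")
    # activate_this.py expects __file__ to point at itself.
    exec(code, {"__file__": activate_this})
```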

Read more:

Virtualenvwrapper – Read the docs: https://virtualenvwrapper.readthedocs.io/en/latest/

Playing with notify command – Postgresql vol. 1

What the NOTIFY command does

The NOTIFY command sends a notification to a named channel in the database. It is a very simple way to communicate between several processes that need to access a single database.

Working together with triggers, the developer can ensure that any process knows when a specific operation has succeeded.

A very simple “get started”

For this first step we can use an existing database. After accessing your database shell, run the following command:

postgres=# LISTEN test;
LISTEN

Let’s see what happened here:

The LISTEN command registers a “listener” on the current session; we named ours “test”. If a listener called “test” already exists in this session, nothing happens.

After that, it is time to send our notification:

postgres=# NOTIFY test;
NOTIFY
Asynchronous notification "test" received from server process with PID 1234. #The PID number will be different on your system

Cool! We just sent and received our first notification, straight from the database! Note that the server reports the notification with the following message:

Asynchronous notification "test" received from server process with PID 1234.

This is very useful.

Example: you can receive an email notification when a table in your database receives a new item, or when an existing item is updated. To do that, we just create a trigger that runs on those events; the trigger, in turn, sends the notification.
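
As a preview of what the listening side can look like in application code, here is a minimal Python sketch using the psycopg2 driver (the driver choice, channel name and connection string are our assumptions):

```python
import select


def channel_name(table, op):
    # Channels here follow the <table>_<op> pattern; unquoted identifiers
    # in a NOTIFY command are folded to lowercase by Postgres.
    return f"{table}_{op}".lower()


def listen_forever(dsn="dbname=playground user=postgres"):
    import psycopg2  # pip install psycopg2-binary (assumption)

    conn = psycopg2.connect(dsn)
    conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
    conn.cursor().execute(f"LISTEN {channel_name('table_a', 'INSERT')};")
    while True:
        # Block until the connection's socket has data, then drain
        # every pending notification.
        select.select([conn], [], [])
        conn.poll()
        while conn.notifies:
            note = conn.notifies.pop(0)
            print(f"channel={note.channel} pid={note.pid}")
```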

Something more complex

Let’s increase the complexity of our experiments:

  1. Create a database with only one table;
  2. Create a trigger that will run after inserts on the table;
  3. Register the listener;
  4. Let the trigger send the notifications.

The code below requires a bit more knowledge of Postgres; however, it is simple and easy to follow.

1 – Creating database:

createdb -U postgres -h localhost -W playground
psql playground

p.s.: you can use any name for your database. Choose yours ;-)

2 – Creating the table. We’ll call it “table_a”:

playground=# create table table_a (username varchar(512), email varchar(512));

2.1 – Creating a stored procedure to notify the events:

playground=# create function notify_trigger() returns trigger as $$
playground$# declare
playground$# begin
playground$# execute 'NOTIFY ' || TG_TABLE_NAME || '_' || TG_OP;
playground$# return new;
playground$# end;
playground$# $$ language plpgsql;
CREATE FUNCTION

Once more, let’s break down what happened.

1 – The first line contains the code:

playground=# create function notify_trigger() returns trigger as $$

This piece of code creates a function: notify_trigger().

The following lines perform the variable declarations (when needed) and indicate where the function’s body really begins.

2 – When it runs, the function performs the NOTIFY command:

execute 'NOTIFY ' || TG_TABLE_NAME || '_' || TG_OP;

3 – Note the variables TG_TABLE_NAME and TG_OP: these are set automatically when the database runs the function. TG_TABLE_NAME stores the name of the table that received the operation, and TG_OP stores the name of the operation that fired the function. The resulting string will be “table_a_insert” or “table_a_update” (unquoted identifiers in a NOTIFY command are folded to lowercase). This will be the channel for our listener.

3 – Creating the triggers:

playground=# create trigger table_a_trigger before insert or update on table_a execute procedure notify_trigger();

4 – Let’s register the listener (remember item 3 of section 2.1 above?):

playground=# LISTEN table_a_insert;

Finally, let’s run a test:

playground=# insert into table_a (username, email) values ('some_thing_cool', 'somethingcool@myemailserver.com');

The server returns:

INSERT 0 1
Asynchronous notification "table_a_insert" received from server process with PID 12345. #The PID number will be different on your system

We now have a notification for each insert that runs on table_a.

You can also test whether a notification is sent when an update runs on table_a. Remember that you must first register a listener (LISTEN table_a_update) to receive the update notifications.

This was a little introduction to NOTIFY on Postgres. In a future post (vol. 2) we’ll see a Python script to receive the database notifications.


Sending emails from PostgreSQL using triggers

Sending emails is a need that often appears in web app development; and sometimes we need to send an email when an event occurs on a DBMS table. This can be done with triggers: SQL code that runs when an insert, update or delete event occurs on a specific table.

However, a lot can go wrong with the email server and, consequently, lock the transaction running on the DBMS.

(Some considerations about the kinds of things that can happen can be read here: http://stackoverflow.com/…/how-can-i-send-email-from-postgr…)

To solve this problem, PostgreSQL has a great feature: NOTIFY, which sends notifications on channels set by the developer. A trigger can be used to notify a channel, and a Python script, in turn, can send the email.
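
Putting those pieces together, the Python side can be sketched like this: listen for the NOTIFY and react by sending an email with the standard library’s smtplib (the channel, connection string, addresses and SMTP host are all illustrative assumptions):

```python
import select
import smtplib
from email.message import EmailMessage


def build_alert(channel, pid):
    # One email per notification; the addresses are placeholders.
    msg = EmailMessage()
    msg["Subject"] = f"Postgres event on channel {channel}"
    msg["From"] = "db-alerts@example.com"
    msg["To"] = "admin@example.com"
    msg.set_content(f"Notification received from backend PID {pid}.")
    return msg


def notify_to_email(dsn="dbname=playground user=postgres"):
    import psycopg2  # pip install psycopg2-binary (assumption)

    conn = psycopg2.connect(dsn)
    conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
    conn.cursor().execute("LISTEN table_a_insert;")
    while True:
        select.select([conn], [], [])  # wait for socket activity
        conn.poll()
        while conn.notifies:
            note = conn.notifies.pop(0)
            # The email is sent outside the database transaction, so a slow
            # or broken SMTP host no longer blocks the DBMS.
            with smtplib.SMTP("localhost") as smtp:
                smtp.send_message(build_alert(note.channel, note.pid))
```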

The site http://www.divillo.com describes a great example of this. But nothing beats the psycopg2 driver documentation: http://initd.org/psycopg/docs/advanced.html

It is a great way to send emails from web applications (at the DBMS level), and of course, it is very fun to code.