Restoring a SQL file to a dockerized Postgresql server

If you are working with a dockerized PostgreSQL, you may need to restore a SQL file to a database. It is a simple task; here are a few suggested steps.

Create a container

docker run -d -e POSTGRES_PASSWORD=<your_passwd> postgres

Inspect the container

docker inspect <container>

In the output produced by docker inspect, look for the information about the container’s volume. Here is an example:

        "Type": "volume",
        "Name": "your_container",
        "Source": "/var/lib/docker/volumes/your_container/_data",
        "Destination": "/data",
        "Driver": "local",
        "Mode": "rw",
        "RW": true,
        "Propagation": ""

Copy the SQL file

Using the Source path obtained above, copy the SQL file into the container’s volume:

cp database_file.sql /var/lib/docker/volumes/<your_container>/_data

Restore the database

Now, in order to restore the database, you need to access the container’s shell:

docker exec -it <container> bash

Inside the container, you can use the appropriate tool to restore your database, e.g. psql or pg_restore:

psql -U <user> -W -f /data/database_file.sql <database>

That’s it! :-) Database restored.

Looking for the SQL file

If you have trouble finding the SQL file, consider looking for it at:
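One way to hunt for a dump file is plain find. This sketch demonstrates the idea on a scratch directory; on a real host you would point it at likely roots such as your home directory or /var/lib/docker/volumes (the file name here is just an example):

```shell
# Create a scratch tree standing in for the real filesystem
tmp=$(mktemp -d)
mkdir -p "$tmp/backups"
touch "$tmp/backups/database_file.sql"

# Search the tree for SQL dumps by extension
found=$(find "$tmp" -name '*.sql')
echo "$found"
```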



More details at:

Running several services in a single container.


Docker was designed to run a single service per container. The main advantage of that is isolation: you can keep several applications isolated from one another.
But it is possible to run several services in a single container with Supervisor, a client/server system to control processes on *nix-like systems.
The following example creates a single container running two instances of the same code, controlled by Supervisor.
You can download the code at:


FROM python:3.9


RUN apt-get update && apt-get install -y supervisor
RUN mkdir -p /var/log/supervisor

COPY . /code
WORKDIR /code
COPY supervisord.conf /etc/supervisor/conf.d/supervisord.conf
RUN mkdir logs

CMD ["/usr/bin/supervisord", "-c", "/etc/supervisor/conf.d/supervisord.conf"]
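The Dockerfile copies a supervisord.conf into the image. A minimal sketch of that file might look like this, assuming a hypothetical app.py entry point; nodaemon=true keeps supervisord in the foreground so the container stays alive:

```ini
[supervisord]
nodaemon=true
logfile=/var/log/supervisor/supervisord.log

; Two instances of the same code, each with its own log file
[program:service1]
command=python /code/app.py
stdout_logfile=/code/logs/out1.log
redirect_stderr=true

[program:service2]
command=python /code/app.py
stdout_logfile=/code/logs/out2.log
redirect_stderr=true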


version: "3.9"

services:
  app:
    build: .
    restart: always
    command: /usr/bin/supervisord
    volumes:
      - .:/code

Put the code above in a directory, and execute:

user@user-pc:~$ docker-compose -f docker-compose.yml up

With that, the container is built and started.
To verify that the services are running, execute:

user@user-pc:~$ tail -f out1.log
user@user-pc:~$ tail -f out2.log

That’s it :-) With this technique, you can run several services in a single container; for example, a task scheduler and a task consumer.


Serving static files with WSGI

Deploying Python/Django apps is not a hard task. However, one problem sometimes shows up: deploying static files.

Django’s manual offers a detailed set of instructions to collect and deploy static files on production environments, using the STATIC_URL and STATIC_ROOT settings. More details here:

You will also need to configure the web server to serve files from those paths. Follow the instructions here.

Deploy on shared hosting environments

However, the scenario can be more difficult when you don’t have access to the web server config files, which is the reality on shared hosts. To solve this problem you can consider using the WSGI file: the file that defines the interface between the web server and the Django app.

It is true that Django’s manual states that Django should not serve static files itself. But many developers use the WSGI file to serve these files. We can do this in two ways:
1 – dj-static:
2 – whitenoise:
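To make the idea concrete, here is a toy sketch of what such libraries do: a WSGI middleware that answers requests under a static URL prefix itself and delegates everything else to the wrapped application. The class name and paths are illustrative, not dj-static’s or whitenoise’s real internals:

```python
import mimetypes
import os


class StaticSketch:
    """Toy WSGI middleware in the spirit of dj-static / whitenoise."""

    def __init__(self, app, static_root, static_url='/static/'):
        self.app = app
        self.static_root = os.path.abspath(static_root)
        self.static_url = static_url

    def __call__(self, environ, start_response):
        path = environ.get('PATH_INFO', '')
        if not path.startswith(self.static_url):
            # Not a static URL: delegate to the wrapped application
            return self.app(environ, start_response)
        rel = path[len(self.static_url):]
        full = os.path.normpath(os.path.join(self.static_root, rel))
        # Refuse paths that escape static_root, then serve the file
        if full.startswith(self.static_root + os.sep) and os.path.isfile(full):
            ctype = mimetypes.guess_type(full)[0] or 'application/octet-stream'
            with open(full, 'rb') as f:
                body = f.read()
            start_response('200 OK', [('Content-Type', ctype)])
            return [body]
        start_response('404 Not Found', [('Content-Type', 'text/plain')])
        return [b'Not Found']
```

In real deployments you would reach for dj-static or whitenoise instead; this sketch only shows why wrapping the WSGI application is enough to serve static files without touching the web server config.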

Using dj-static

To install the dj-static package, use the following command:

pip install dj-static

Perhaps on your hosting server you do not have permission to install packages system-wide. To solve this, we can use:

pip install dj-static --user

This command will install the package in your home directory.
After the installation, it is time to configure the WSGI file. Open the WSGI file and add the following lines:

from django.core.wsgi import get_wsgi_application
from dj_static import Cling
application = Cling(get_wsgi_application())

Finally, you can test your site by accessing it. If your static files do not load, verify the STATIC_URL and STATIC_ROOT settings.

Have you ever run into this problem on shared hosting? What solution did you apply?


PEP333 :
WSGI in Django:

Activating a virtualenv “without hands”

Virtualenv is an awesome tool for Python development. To ease the development process there is virtualenvwrapper: a wrapper that gives you shell commands to manage your virtualenvs.

However, even with all these conveniences, a problem arises when we deploy our code:

How to activate the virtualenv automatically?

The answer is: using the activate_this.py script.
We can find it at:

~/.virtualenvs/<your virtualenv>/bin/

The script’s aim is to activate the virtualenv’s Python from the “outside”, so your environment gains access to the libraries needed to run your application.

Look ma: no hands…!

To use the script, put the following lines in the file that starts your app.

If your app runs on the command line, like a *nix daemon, put the lines in the script that starts your app. If your app is a web application, put them in the WSGI file used by your web server.

activate_this = '/path/to/env/bin/activate_this.py'
with open(activate_this) as f:
    exec(f.read(), {'__file__': activate_this})

Note that the lines above should be placed at the very top of the file.
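The pattern can be demonstrated end to end with a stand-in script. The real activate_this.py ships with virtualenv; the temp file below only mimics its “find my own bin directory and adjust sys.path” behaviour, so the paths are illustrative:

```python
import os
import sys
import tempfile

# Stand-in for activate_this.py: locate its own directory via
# __file__ and prepend a (fake) site-packages dir to sys.path
stand_in = (
    "import os, sys\n"
    "bin_dir = os.path.dirname(os.path.abspath(__file__))\n"
    "sys.path.insert(0, os.path.join(bin_dir, 'site-packages'))\n"
)

tmp = tempfile.mkdtemp()
activate_this = os.path.join(tmp, 'activate_this.py')
with open(activate_this, 'w') as f:
    f.write(stand_in)

# The exact invocation you would use with a real virtualenv
with open(activate_this) as f:
    exec(f.read(), {'__file__': activate_this})
```

Passing `{'__file__': activate_this}` as the globals dict is what lets the script discover which virtualenv it belongs to, which is why exec is used instead of a plain import.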

After this, restart your service or your webserver.

Done! Your virtualenv will be activated automatically ;-)

Read more:

Virtualenvwrapper – Read the docs: