nginx – Failed to load resource: the server responded with a status of 502 (Bad Gateway)

I recently added a reverse proxy to my website to make WebSocket connections over SSL, but now I am getting 502 Bad Gateway errors. Whenever I open the website, the CSS, JS, and image files fail to load, and the browser console shows 502 Bad Gateway errors for them. This seems to be a problem with my reverse-proxy configuration. How can I resolve it?


net::ERR_ABORTED 502 (Bad Gateway): appears because the page cannot load the
JS or CSS files

Failed to load resource: the server responded with a status of 502
(Bad Gateway): appears because the website does not show the
images; it repeats in a loop in the browser console

Nginx Configuration:

upstream websocketserver {
    server localhost:8080;
}

server {
    listen 443 ssl;
    ssl_certificate /etc/letsencrypt/live/;
    ssl_certificate_key /etc/letsencrypt/live/;
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot

    root /var/www/;
    index index.html index.php index.htm index.nginx-debian.html;

    access_log /var/log/wss-access-ssl.log;
    error_log /var/log/wss-error-ssl.log;

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php7.4-fpm.sock;
    }

    location / {
        proxy_pass http://websocketserver;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
        proxy_read_timeout 86400; # necessary to avoid websocket timeout disconnect
        proxy_redirect off;
    }
}

server {
    if ($host = {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    if ($host = {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen 80;
    listen [::]:80;

    return 404;  # managed by Certbot
}

What can I do to solve it? Is something wrong in my Nginx Configuration?
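A hedged sketch of one common cause of exactly this symptom: because `location /` proxies every request to the backend, static assets (CSS, JS, images) are also forwarded to the WebSocket server, which cannot serve them and so produces 502s. If the WebSocket application only needs one endpoint, proxying just that path lets nginx serve the assets from `root` itself (`/wss/` below is a placeholder path, not taken from the config above):

```nginx
# Sketch only: "/wss/" is a hypothetical WebSocket endpoint.
location /wss/ {
    proxy_pass http://websocketserver;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
}

# Everything else (HTML, CSS, JS, images) is served from the document root.
location / {
    try_files $uri $uri/ =404;
}
```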

php – Is it bad to let a file be downloaded using its direct path on WordPress?

I have a scenario where I want to let visitors to my WordPress website download some PDFs. The PDFs are added to the site as products of the WooCommerce plugin, so they are stored in the uploads/woocommerce_uploads folder.
At the moment, I don’t want people to go through the whole process of buying a free product. If the user is already logged in and has the necessary privileges, I show a custom download button that points directly to the PDF’s path in the uploads folder. Something like this:

Does linking directly to the PDF files as a way to let people download them open any vulnerability or otherwise harm my WordPress website?

jpeg – Why are thumbnails bad when the JPGs are good, for raw files processed with darktable?

Your image contains an embedded color profile, “PQ Rec2020 RGB”. According to the metadata, darktable appears to be the source of the profile. (See the ExifTool output below.)

Some programs ignore embedded color profiles, which results in the washed-out appearance of the thumbnails. For better results, select “sRGB (web-safe)” as the output profile for export. Be aware that there are two places where the output profile may be set. One is set on a per-image basis; the other is in the export options. Make sure they are both set correctly.

In the export settings, if you select “image settings”, it should use whatever is set for each image. If you select something else, it overrides the image settings. Based on the information you’ve provided at the darktable GitHub site, it appears the images were set for sRGB output, but the export was set for Rec2020, so Rec2020 overrode the sRGB setting.

If you are meticulous about checking the per-image settings, it is fine to export with “image settings”. If you’re not sure, it’s safer to set it to “sRGB”. The only time I would use something else is for 16-bit TIFF output that I’d want to edit in another program. Then, when done editing, I would still export to sRGB JPGs.

Another scenario in which it might make sense to export to a different profile is if you are absolutely certain all of your intended output devices are compatible with it, and you either don’t care or are certain that everything else is properly color-managed. If I were using 100% Apple devices and wide-gamut monitors, I would consider using an Apple Display P3 profile, since that is what the iPhone already uses by default. But it could cause problems similar to what you’ve experienced for non-Apple users (e.g., images posted to social media).

If changing the output profile does not correct the issue, consider using a different raw-processing program, such as RawTherapee.

darktable sidebar
darktable export

Here is the output from ExifTool regarding the embedded profile of your image.

(ICC_Profile)   Profile CMM Type                : Little CMS
(ICC_Profile)   Profile Version                 : 2.1.0
(ICC_Profile)   Profile Class                   : Display Device Profile
(ICC_Profile)   Color Space Data                : RGB
(ICC_Profile)   Profile Connection Space        : XYZ
(ICC_Profile)   Profile Date Time               : 2020:09:23 19:05:34
(ICC_Profile)   Profile File Signature          : acsp
(ICC_Profile)   Primary Platform                : Apple Computer Inc.
(ICC_Profile)   CMM Flags                       : Embedded, Independent
(ICC_Profile)   Device Manufacturer             : 
(ICC_Profile)   Device Model                    : 
(ICC_Profile)   Device Attributes               : Reflective, Glossy, Positive, Color
(ICC_Profile)   Rendering Intent                : Perceptual
(ICC_Profile)   Connection Space Illuminant     : 0.9642 1 0.82491
(ICC_Profile)   Profile Creator                 : Little CMS
(ICC_Profile)   Profile ID                      : 0
(ICC_Profile)   Profile Description             : PQ Rec2020 RGB
(ICC_Profile)   Profile Copyright               : Public Domain
(ICC_Profile)   Media White Point               : 0.3127 0.32899 1
(ICC_Profile)   Chromatic Adaptation            : 1.04788 0.02292 -0.05022 0.02959 0.99048 -0.01707 -0.00925 0.01508 0.75168
(ICC_Profile)   Red Matrix Column               : 0.67348 0.27904 -0.00194
(ICC_Profile)   Blue Matrix Column              : 0.12505 0.04561 0.79684
(ICC_Profile)   Green Matrix Column             : 0.16566 0.67534 0.02998
(ICC_Profile)   Red Tone Reproduction Curve     : (Binary data 8204 bytes, use -b option to extract)
(ICC_Profile)   Green Tone Reproduction Curve   : (Binary data 8204 bytes, use -b option to extract)
(ICC_Profile)   Blue Tone Reproduction Curve    : (Binary data 8204 bytes, use -b option to extract)
(ICC_Profile)   Chromaticity Channels           : 3
(ICC_Profile)   Chromaticity Colorant           : Unknown (0)
(ICC_Profile)   Chromaticity Channel 1          : 0.70799 0.29201
(ICC_Profile)   Chromaticity Channel 2          : 0.17 0.797
(ICC_Profile)   Chromaticity Channel 3          : 0.131 0.04601
(ICC_Profile)   Media Black Point               : 0 0 0
(ICC_Profile)   Device Model Desc               : PQ Rec2020 RGB
(ICC_Profile)   Device Mfg Desc                 : Darktable

Is charging a MacBook Pro at a lower wattage from a display bad for battery health?

I have a 16″ MBP and an external display with Thunderbolt 3 capabilities. The display can charge devices at 85 W, but the MBP needs roughly 97 W. The question is: is it bad for the battery to charge the computer and use it at the same time if it is consuming more energy than the charger can deliver?

jpeg – good JPG, bad thumbnails generated from it

I’m using darktable to process my raw photos and save the final JPGs. I’m on Ubuntu Linux.

For some of the JPGs, gthumb generates thumbnails that are far lighter than the original JPG, and the same problem happens with nautilus and with the Pillow Python library.

Here is one of the photos, in the hope of understanding what the problem with it is. This JPG appears to be good, but the thumbnails generated from it are lighter than the original.

Here is what the thumbnail looks like:

lighter thumbnail

code quality – Methods that receive buffer objects AND return another Object – is that bad design?

The Argument

Some say that if you write a method that receives a buffer, it must return void: the buffer is your exit point.
Do not abuse methods by both receiving a buffer AND returning another object.

Example (BAD):
Object myMethod(String param1, String param2, Map<K,V> bufferMap)

Example (Better):
void myMethod(String param1, String param2, Map<K,V> bufferMap, Object obj)

The Counter Argument

Some say that this is fine and should not be a problem.

The Question

  • Is this a design problem?
  • (If yes) What issues or impacts could this kind of design have?
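For concreteness, a minimal Java sketch of the two styles the argument contrasts (the method and class names are invented for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class BufferStyles {

    // "Bad" style per the argument: mutates the buffer AND returns a result,
    // so the method has two output channels the caller must keep in sync.
    static String fillAndReturn(String key, String value, Map<String, String> buffer) {
        buffer.put(key, value);
        return key + "=" + value;
    }

    // "Better" style per the argument: the buffer is the only exit point.
    static void fillOnly(String key, String value, Map<String, String> buffer) {
        buffer.put(key, value);
    }

    public static void main(String[] args) {
        Map<String, String> buf = new HashMap<>();
        String summary = fillAndReturn("a", "1", buf); // caller must track two outputs
        fillOnly("b", "2", buf);                       // one obvious output
        System.out.println(summary + ", entries: " + buf.size()); // prints "a=1, entries: 2"
    }
}
```

The usual complaint about the first style is that it hides a side effect behind a return value; the usual defense is that returning a status or derived value alongside an out-parameter is a long-established idiom.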

docker – GitLab fails to run behind Traefik: 502 Bad Gateway

I’m posting here because I want to self-host my personal website (a WordPress site) and the source code of my other projects (a GitLab instance), behind the Traefik reverse proxy.

Currently, when I visit the different services, I get the following:

  • (https://) : the Traefik dashboard — OK
  • (https://) : my WordPress website — OK.

But when I try to visit:

  • (https://) : the GitLab web UI — I get the error: 502 Bad Gateway.

Even after many attempts, if I visit the GitLab web UI again once the installation has finished, or after several minutes (about 15 on average), I get the same error: 502 Bad Gateway.

If anybody has an idea, can you help me?

This is my configuration file (docker-compose.yml) :

    version: "3.7"

    services:
      traefik:
        image: "traefik:latest"
        container_name: "traefik"
        restart: always
        networks:
          - webgateway
        ports:
          - "80:80"
          - "443:443"
        volumes:
          - /var/run/docker.sock:/var/run/docker.sock:ro
          - /srv/labs/traefik/traefik.toml:/etc/traefik/traefik.toml:ro
          - /srv/labs/traefik/acme.json:/acme.json
          - /srv/labs/traefik/traefik_dynamic.toml:/etc/traefik/traefik_dynamic.toml:ro
        labels:
          # http
          traefik.enable: "true"
          traefik.http.routers.traefik.rule: "Host(``)"
          traefik.http.routers.traefik.entrypoints: "web"
          traefik.http.routers.traefik.service: "api@internal"
          # https
          traefik.http.middlewares.https-redirect.redirectscheme.scheme: "https"
          traefik.http.middlewares.https-redirect.redirectscheme.permanent: "true"
          # https redirect plus traefik dashboard auth
          traefik.http.routers.traefik.middlewares: "https-redirect@docker,dashboard-auth"

          traefik.http.routers.traefik-https.entrypoints: "websecure"
          traefik.http.routers.traefik-https.rule: "Host(``)"
          traefik.http.routers.traefik-https.tls: "true"
          traefik.http.routers.traefik-https.tls.certresolver: "letsencrypt"
          traefik.http.routers.traefik-https.middlewares: "dashboard-auth,security@file,compression@file"
          # traefik dashboard credentials
          traefik.http.middlewares.dashboard-auth.basicauth.users: "login:$$apr1$$XFLC8oLD$$tufQCjkmmNkXfL.cm96E90"

      db:
        container_name: mariadb
        image: mariadb:latest
        restart: always
        networks:
          - wp
        volumes:
          - wp_db:/var/lib/mysql/
        environment:
          MYSQL_DATABASE: wordpress
          MYSQL_USER: wordpress
          MYSQL_PASSWORD: oC1rieph

      wordpress:
        depends_on:
          - db
        container_name: "wordpress"
        image: wordpress:latest
        restart: always
        networks:
          - wp
          - webgateway
        ports:
          - 8000:80
        volumes:
          - wp_statics:/var/www/html/
        environment:
          WORDPRESS_DB_HOST: db:3306
          WORDPRESS_DB_USER: wordpress
        labels:
          traefik.enable: "true"
          traefik.http.routers.wordpress.rule: "Host(``)"
          traefik.http.routers.wordpress.entrypoints: "web"
          traefik.http.middlewares.https-redirect.redirectscheme.scheme: "https"
          traefik.http.middlewares.https-redirect.redirectscheme.permanent: "true"
          traefik.http.routers.wordpress.middlewares: "https-redirect@docker"
          traefik.http.routers.wordpress-https.entrypoints: "websecure"
          traefik.http.routers.wordpress-https.rule: "Host(``)"
          traefik.http.routers.wordpress-https.tls: "true"
          traefik.http.routers.wordpress-https.tls.certresolver: "letsencrypt"
          traefik.http.routers.wordpress-https.middlewares: "security@file,compression@file"

      gitlab:
        container_name: "gitlab"
        hostname: ''
        image: 'gitlab/gitlab-ce:latest'
        restart: always
        networks:
          - webgateway
        ports:
          - '2200:22'
        environment:
          GITLAB_OMNIBUS_CONFIG: |
            external_url ''
            gitlab_rails['gitlab_shell_ssh_port'] = 2200
        volumes:
          - '/srv/gitlab/config:/etc/gitlab:Z'
          - '/srv/gitlab/logs:/var/log/gitlab:Z'
          - '/srv/gitlab/data:/var/opt/gitlab:Z'
          - '/etc/localtime:/etc/localtime:ro'
        labels:
          traefik.enable: "true"
          traefik.http.routers.gitlab-https.entrypoints: "websecure"
          traefik.http.routers.gitlab-https.rule: "Host(``)"
          traefik.http.routers.gitlab-https.tls: "true"
          traefik.http.routers.gitlab-https.tls.certresolver: "letsencrypt"
          traefik.http.routers.gitlab-https.middlewares: "security@file,compression@file"
          traefik.http.routers.gitlab.rule: "Host(``)"
          traefik.http.routers.gitlab.entrypoints: "websecure"
          traefik.http.routers.gitlab.tls.certresolver: "letsencrypt"
          traefik.http.routers.gitlab.middlewares: "gitlab-headers"
          traefik.http.routers.gitlab.service: "gitlab"
          traefik.http.middlewares.gitlab-headers.headers.customrequestheaders.X-Forwarded-Proto: "https"
          traefik.http.middlewares.gitlab-headers.headers.customrequestheaders.X-Forwarded-Ssl: "on"
          traefik.http.middlewares.gitlab-headers.headers.customresponseheaders.X-Forwarded-Proto: "https"
          traefik.http.middlewares.gitlab-headers.headers.customresponseheaders.X-Forwarded-Ssl: "on"
          traefik.http.services.gitlab.loadbalancer.server.port: "80"
          traefik.http.routers.gitlab-registry.rule: "Host(``)"
          traefik.http.routers.gitlab-registry.entrypoints: "websecure"
          traefik.http.routers.gitlab-registry.tls.certresolver: "letsencrypt"
          traefik.http.routers.gitlab-registry.service: "gitlab-registry"
          traefik.http.services.gitlab-registry.loadbalancer.server.port: "5000"
        cap_add:
          - SYS_ADMIN

    networks:
      webgateway:
        driver: bridge
      wp:
        driver: bridge

    volumes:
      wp_db:
        driver: local
        driver_opts:
          o: bind
          type: none
          device: /srv/mysql
      wp_statics:
        driver: local
        driver_opts:
          o: bind
          type: none
          device: /srv/wordpress/www
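As a hedged aside, not taken from the files above: with omnibus GitLab behind a TLS-terminating proxy such as Traefik, a frequent cause of persistent 502s is `external_url` and the bundled nginx disagreeing about HTTPS. The usual shape of the fix in `/etc/gitlab/gitlab.rb` (or in `GITLAB_OMNIBUS_CONFIG`) looks like this; `gitlab.example.com` is a placeholder hostname:

```ruby
# /etc/gitlab/gitlab.rb -- sketch; assumes Traefik terminates TLS.
# "gitlab.example.com" is a placeholder hostname.
external_url 'https://gitlab.example.com'

# The bundled nginx listens for plain HTTP from the proxy.
nginx['listen_port'] = 80
nginx['listen_https'] = false

# Trust the forwarding headers the proxy sets.
nginx['proxy_set_headers'] = {
  'X-Forwarded-Proto' => 'https',
  'X-Forwarded-Ssl'   => 'on'
}
```

After changing it, run `gitlab-ctl reconfigure` inside the container, or recreate the container so the settings are re-applied.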

This is my hardware configuration:

  • OS: Debian 10 x64
  • CPU: Intel Celeron N3450
  • Memory: 4 GB DDR3
  • Storage: 128 GB SSD
  • Network: avg. 300 Mbps

encryption – Is storing passwords in an encrypted container still bad practice?

Suppose I need to store a bunch of passwords: is storing them in a txt or odt file inside an encrypted container, such as the ones produced by VeraCrypt, bad practice? Would I be putting myself at risk?

If so, what are my other free options? (So no password managers.) Do I also need to hash my passwords before storing them in the encrypted container? Would a physical copy of the list be more secure? Or is there some other option that I haven’t considered? (Of course, just remembering them all is not feasible.)