Tagged with nginx

deluge-web behind an nginx reverse proxy

My nginx configuration for reverse proxying deluge-web needed a few subtle tweaks before it worked correctly; see below.

~/.config/deluge/web.conf

{
  "file": 1,  
  "format": 1
}{
  "port": 8112, 
  "https": false, 
  "base": "/", 
  ... 
}

/etc/nginx/sites-enabled/deluge.conf

...
  location /deluge/ {
    allow                         127.0.0.1;
    allow                         192.168.1.0/24;
    deny                          all;
    proxy_pass                    http://127.0.0.1:8112/;
    proxy_redirect                off;
    proxy_set_header              Host            $host;
    proxy_set_header              X-Real-IP       $remote_addr;
    proxy_set_header              X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header              X-Deluge-Base   "/deluge/";
    proxy_pass_header             Set-Cookie;
    proxy_pass_header             P3P;
  }
...

The trailing slashes are very important for the location, proxy_pass and X-Deluge-Base values. Also notice that base in ~/.config/deluge/web.conf stays at its default of "/"; the X-Deluge-Base header takes care of the /deluge/ prefix.
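
A quick sanity check from the nginx host itself (a sketch, assuming nginx listens for plain HTTP on port 80 on the same machine):

$ curl -I http://127.0.0.1/deluge/

If the location, proxy_pass and X-Deluge-Base pieces line up, this should come back with a 200 from deluge-web.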

With this configuration you will be able to access the Deluge web interface via http://www.example.com/deluge (no trailing slash required). Keep in mind that it will also remain reachable directly on port 8112, because deluge-web cannot yet be told to listen on localhost only, unless you patch that in yourself. So don't forget to add a firewall rule.
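
For example, with iptables (a sketch; adapt to your firewall of choice, and add an exception for the LAN if it should keep direct access):

# Refuse connections to deluge-web's port from anything but localhost,
# so only the nginx reverse proxy can reach it.
iptables -A INPUT -p tcp --dport 8112 ! -s 127.0.0.1 -j REJECT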


nginx User-Agent spoofing to circumvent LinkedIn's blacklisting

I have a few virtual hosts that redirect to my LinkedIn profile thanks to a rewrite rule in nginx:

server {
  server_name  example.com;
  listen       80;

  location / {
    rewrite    ^  http://be.linkedin.com/in/tristanterpelle/;
  }
}
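
The redirect itself is easy to verify (a sketch, with example.com standing in for one of the real virtual hosts):

$ curl -I http://example.com/

nginx should answer with a 302 whose Location header points at the LinkedIn profile.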

This worked beautifully when I tested it, but today I noticed it was broken. LinkedIn returned a 999 error. After looking around a bit, it turns out that LinkedIn actively blocks HTTP requests from clients with certain User-Agent strings.

$ curl -I --url https://be.linkedin.com/in/tristanterpelle
HTTP/1.1 999 Request denied
$ curl -A "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36" -I --url https://be.linkedin.com/in/tristanterpelle
HTTP/1.1 200 OK

OK, so curl's User-Agent is blacklisted, and apparently so is nginx's (wget also fails). Luckily, nginx can spoof the User-Agent thanks to the HttpHeadersMoreModule.

server {
  server_name  example.com;
  listen       80;

  location / {
    rewrite    ^  http://be.linkedin.com/in/tristanterpelle/;
    more_set_input_headers 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36';
  }
}

On Debian, you need to install the nginx-extras package, or you will get an [emerg] unknown directive "more_set_input_headers" error.
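
Something along these lines should sort it out (a sketch; the reload command may differ depending on your init system):

# Pull in the nginx build that bundles the headers-more module,
# then verify the configuration and reload.
apt-get install nginx-extras
nginx -t && service nginx reload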
