Bill Havanki desu at deszaras.xyz
Sat May 15 15:16:01 BST 2021
- - - - - - - - - - - - - - - - - - -
I agree that simply blocking URLs with “..” won’t solve the problem. The string doesn't always show up in a segment on its own. Also, there are legitimate uses for a “..” segment when URLs are formed relative to a resource deeper down in a hierarchy.
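For example (a quick Python sketch; I'm using an https URL because
urllib's urljoin may not resolve relative references for the gemini
scheme, but the RFC 3986 rules are the same):

  from urllib.parse import urljoin

  # A ".." segment in a relative reference legitimately climbs one level
  # from the base resource before the reference is resolved.
  urljoin("https://example.com/docs/guide/page.gmi", "../images/map.png")
  # -> 'https://example.com/docs/images/map.png'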
A server should sanitize / normalize every incoming request URL to its simplest, unencoded form. Then it can more easily detect attempts to escape the server’s document root or other shenanigans.
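Something along these lines (a rough Python sketch, not taken from any
particular server; normalize_request_path is a made-up name):

  from urllib.parse import unquote

  def normalize_request_path(raw_path):
      """Percent-decode the path and resolve "." / ".." segments.
      Returns the simplified path, or None if it escapes the root."""
      decoded = unquote(raw_path)            # "%2e%2e" becomes ".."
      stack = []
      for segment in decoded.split("/"):
          if segment in ("", "."):
              continue                       # skip empty and same-directory segments
          if segment == "..":
              if not stack:
                  return None                # attempt to climb above the document root
              stack.pop()
          else:
              stack.append(segment)
      return "/" + "/".join(stack)

  normalize_request_path("/docs/%2e%2e/%2e%2e/etc/passwd")  # -> None (rejected)
  normalize_request_path("/docs/a/../index.gmi")            # -> "/docs/index.gmi"

Decoding before resolving the dot segments is the important part; doing
it in the other order is exactly the hole described below.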
On May 15, 2021, at 9:18 AM, Remco <me at rwv.io> wrote:
2021/05/15 13:09, Almaember:
A question to everybody reading the list: how badly would it break the
spec to simply block any request whose URL contains ".." as a
standalone path element?
Simply blocking ".." won't catch all problems. For instance, dezhemini
actually blocks every request containing ".." in the URL and returns a 59
(bad request). The tricky case is a problem in the Racket standard
library used to parse URLs: it splits a path into parts (strings and
symbols), using 'up (a symbol) for ".." but not when the dots are
percent-escaped, in which case it yields the string "..". Dezhemini
only blocked on 'up, ouch.
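The same pitfall, sketched in Python rather than Racket (naive_block is
just a made-up name for illustration):

  from urllib.parse import unquote

  def naive_block(path):
      # Naive check: reject only when a *raw* segment is exactly "..".
      return ".." in path.split("/")

  naive_block("/a/../secret")        # True  -> blocked
  naive_block("/a/%2e%2e/secret")    # False -> slips through
  unquote("/a/%2e%2e/secret")        # "/a/../secret" once decoded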
Also, blocking ".." will break my lang=morse site! ;-)
..///.-../---/...-/.///--././--/../-./..
Cheers,
Remco