
Re: [Chicken-hackers] CHICKEN in production


From: Florian Zumbiehl
Subject: Re: [Chicken-hackers] CHICKEN in production
Date: Tue, 14 Oct 2014 04:02:48 +0200
User-agent: Mutt/1.5.21 (2010-09-15)

Hi,

> > > I am frankly sick of tools bending over backwards to support NUL.
> > 
> > I am frankly sick of people making up their own variants of standards,
> > creating all kinds of interoperability and security problems, and even more
> > of environments that make it unnecessarily difficult to implement
> > conforming implementations.
> 
> Profiling a standard is hardly making up your own variant of it.

I guess I would disagree, but in any case this smells like a purely
semantic argument, which is pointless. If your parser does not accept all
words of the JSON language (modulo resource limits), then it's not a JSON
parser, whether you call it a variant or not.

> The Unicode Standard does not in fact require for conformance that a
> system be able to process every character in it, and it is  in fact quite
> unusual for a system to be able to handle every character end to end.

Just because the Unicode standard doesn't require it doesn't mean that
applications of Unicode don't. JSON is one example that does, as far as
strings are concerned. Requiring every format that uses Unicode to allow any
character in any position would obviously be silly ...
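The point about JSON strings is easy to check against a stock parser. A
minimal sketch (using Python's standard `json` module for illustration):
RFC 8259 allows any Unicode code point, including U+0000, inside a string
via the `\u0000` escape, and a conforming parser accepts it and preserves it.

```python
import json

# The document text contains the six characters \u0000 (an escape),
# which a conforming JSON parser decodes to the code point U+0000.
doc = '{"name": "a\\u0000b"}'
parsed = json.loads(doc)

assert parsed["name"] == "a\x00b"   # NUL preserved, not truncated
assert len(parsed["name"]) == 3     # 'a', U+0000, 'b'
```

A parser that raises on this document rejects a word of the JSON language,
which is exactly the conformance gap being discussed.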

> > some creative person submits a JSON document with NULs to your frontend
> > system, which validates it, passes it to your JSON-but-without-NULs
> > parser, and voilà, you have a DoS, congrats!
> 
> Where does the DoS come in?  Your back end quite legitimately rejects
> such a bogus document, which is far better than having it accept it with
> a truncated string.  It's not, after all, a DoS to deny service to a
> malicious actor.

The DoS comes in with the part of the quote that you deleted:

> As you are guaranteed
> to receive syntactically valid JSON documents, you obviously don't need to
> worry about parsing failures.

In that case, the result is not rejected valid(!) input (it's valid JSON,
so there is nothing bogus about it, even if you happen to dislike NUL
characters), but a crashed system.
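The failure mode can be sketched concretely. Assume a hypothetical backend
built on a "JSON-but-without-NULs" parser (here `restricted_loads`, a name
invented for illustration); because the frontend already validated the
document as JSON, the backend has no error handling around parsing, so a
valid document containing NUL turns into an unhandled exception:

```python
import json

def restricted_loads(text):
    # Hypothetical "JSON-but-without-NULs" parser: parses real JSON,
    # then rejects any string value or key containing U+0000.
    value = json.loads(text)

    def check(v):
        if isinstance(v, str) and "\x00" in v:
            raise ValueError("NUL not supported")
        elif isinstance(v, list):
            for item in v:
                check(item)
        elif isinstance(v, dict):
            for k, item in v.items():
                check(k)
                check(item)

    check(value)
    return value

# The frontend validated this as JSON (it is valid JSON), so the
# backend "knows" parsing cannot fail -- and then it does.
try:
    restricted_loads('{"name": "a\\u0000b"}')
except ValueError as exc:
    print("backend crashed:", exc)
```

In a real system that `ValueError` would propagate out of a code path with
no error handling, taking the request (or the process) down with it.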

Rejecting the input would be legitimate system behaviour if NUL characters
indeed are not acceptable in the specific application, but it's not sane
behaviour for a JSON parser component, because a component that rejects
words of the JSON language is, by that very fact, not a JSON parser.

Also, I don't find arguing for one defect by comparing it to an even worse
defect particularly convincing. SQL injection is bad; simply dropping all
possible SQL metacharacters from all untrusted inputs is less of a security
risk. So far, so good. Now, the argument analogous to yours would be:
therefore, dropping metacharacters is better than using prepared statements
or escaping. I must say, I am not quite convinced ...

The correct way to handle data is by preserving its meaning, which is
achieved neither by truncating it at NULs nor by rejecting NULs, but by
preserving NULs. The only place where rejection is an acceptable solution
is when you are asked to convert information into a representation that
necessarily cannot represent said information. String containing slash to
filename? Exception! String containing colon to hostname? Exception! String
containing NUL to C string? Exception! That's perfectly fine; you don't
really have any other sane options.
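CPython itself already follows this rule at the C-string boundary: a path
containing NUL cannot be represented as a NUL-terminated C string, so the
conversion raises instead of silently truncating at the NUL:

```python
# A filename with an embedded NUL cannot survive conversion to a
# C string, so CPython raises ValueError before touching the
# filesystem, rather than opening a truncated "data" path.
try:
    open("data\x00.json")
except ValueError as exc:
    print(exc)  # e.g. "embedded null byte" (message varies by version)
```

Rejection here is the sane behaviour precisely because it happens at the
conversion boundary, not inside a component that claims to handle the
full value space.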

Florian


