Re: json-stream-read should validate json too Amirouche Boubekki 21 Jan 2020 10:47 UTC

On Tue, 21 Jan 2020 at 10:15, Duy Nguyen <> wrote:
> I feel that without validation it's more like a tokenizer than an
> actual parser. Anybody who wants to use it will have to implement
> duplicated validation logic (and could get it wrong if they're not
> careful). Is it possible to adjust the interface to move validation
> back inside json-stream-read (or at least an option to include
> validation)?

Good idea. (That procedure used to be called json-tokenizer).

> Something like srfi-171 "transducers" could help make a
> container-independent json stream parser. The exact data type for
> arrays and objects are controlled by transducers (though I haven't
> really worked it out yet). Transducers seem a good fit for this to
> potentially create large data structures with minimum intermediate
> copies.

As I said in the other thread, I am willing to use a generator. One
could add a generator->transducer. I am not well versed in the art of
transducers either. It seems to me that generators can do what
transducers do (with a little more work to represent intermediate
results).
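
For example (a sketch assuming SRFI 158 generator combinators), what
SRFI 171 would express as composed tfilter and tmap transducers can be
written with gfilter and gmap:

```scheme
(import (scheme base) (srfi 158))

;; Same shape as transducing with
;; (compose (tfilter even?) (tmap (lambda (x) (* 2 x)))),
;; but built from generator combinators instead.
(define evens-doubled
  (gmap (lambda (x) (* 2 x))
        (gfilter even? (generator 1 2 3 4 5))))

(generator->list evens-doubled)  ; => (4 8)
```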

> Alternatively maybe we can wrap user-provided 'proc' in our own proc
> that does validation on top, something like a stripped down version of
> %json-read that does nothing but validate? For example,
> make-json-validator takes a proc and returns a new proc that also
> performs validation.

I will look into it; it seems to me that if one can validate inside
json-stream-read, it will be more useful.
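
Here is a rough sketch of the wrapper idea, assuming the reader emits
events as plain symbols (array-start, array-end, object-start,
object-end); those event names and the use of `error` are placeholders,
not a settled API:

```scheme
;; Sketch: wrap the user-provided PROC in a proc that checks
;; structural events before handing them over.
(define (make-json-validator proc)
  (let ((stack '()))               ; currently open containers
    (lambda (event)
      (case event
        ((array-start)  (set! stack (cons 'array stack)))
        ((object-start) (set! stack (cons 'object stack)))
        ((array-end)
         (unless (and (pair? stack) (eq? (car stack) 'array))
           (error "json: unexpected array end"))
         (set! stack (cdr stack)))
        ((object-end)
         (unless (and (pair? stack) (eq? (car stack) 'object))
           (error "json: unexpected object end"))
         (set! stack (cdr stack))))
      (proc event))))
```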

Also, I was thinking about adding a parameter like
`json-maximum-nesting-level` that would default to 501.  It would
control the reader: when there are 501 or more nested JSON arrays or
objects, json-stream-read would raise a json-error.  What do you
think?
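
Sketched with an R7RS parameter (the name and default are the ones
proposed above; the error raised is a placeholder for a proper
json-error):

```scheme
(define json-maximum-nesting-level (make-parameter 501))

;; Called by the reader each time an array or object is opened;
;; DEPTH counts the containers already open, including the new one.
(define (check-nesting-level! depth)
  (when (>= depth (json-maximum-nesting-level))
    (error "json: maximum nesting level exceeded")))
```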

Amirouche ~