Re: json-stream-read should validate json too Duy Nguyen 23 Jan 2020 09:11 UTC

On Tue, Jan 21, 2020 at 8:46 PM Amirouche Boubekki
<xxxxxx@gmail.com> wrote:
> > > Also, I was thinking about adding a parameter like
> > > `json-maximum-nesting-level` that would be 501 by default, and that
> > > would control the reader: in case there are 501 or more nested JSON
> > > arrays or objects, json-stream-read would raise a json-error. What do
> > > you think?
> >
> > Do we really have any problem with nesting level, though? I think the
> > streaming code itself does not, and the way 'proc' is currently
> > implemented, we don't call it recursively either. This reminds me of a
> > Hacker News thread [1]. Anyway, because it's quite easy to count depth
> > from user code (and if 'proc' composes well), and (I assume) we don't
> > have any limits regarding nesting level, I think it's best to leave it
> > out.
> > [1] https://news.ycombinator.com/item?id=21483256
>
> Thanks for the link.
>
> I have no proof as of yet, but I think it would be faster to parse
> JSON text without streaming; to stay safe, though, it must have a
> nesting level limit. So maybe there is a place for a `json-read-fast`
> procedure?

Maybe. Maybe not. For small JSON structures, speed does not really
matter because parsing won't take long either way (unless you have to
process zillions of small JSON structures). For large JSON, the stream
parser rules the world. So the use case for an optimized parser seems
very small.
--
Duy
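
[Editor's note: the user-side depth counting discussed in this thread can be sketched as below. This is an illustrative Python sketch, not the library's actual API: the event names ('array-start', 'object-start', etc.), the callback shape, and the default limit of 501 are assumptions drawn from the discussion above.]

```python
def with_depth_limit(proc, limit=501):
    """Wrap a streaming-event callback ('proc') so that nesting beyond
    `limit` raises an error, without any change to the reader itself."""
    depth = 0

    def wrapped(event):
        nonlocal depth
        # Hypothetical event vocabulary; a real streaming reader would
        # define its own set of events handed to 'proc'.
        if event in ("array-start", "object-start"):
            depth += 1
            if depth > limit:
                raise ValueError(f"JSON nested deeper than {limit} levels")
        elif event in ("array-end", "object-end"):
            depth -= 1
        return proc(event)

    return wrapped
```

Because the limit lives in the wrapper rather than in the reader, each caller can pick its own bound (or impose none), which is the composability argument made in the thread.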