On Sat, Apr 22, 2017 at 7:53 PM, Shiro Kawai <xxxxxx@gmail.com> wrote:
Would it be too much to also require verifying with the test suite on multiple Scheme implementations, or even with multiple independent implementations of the SRFI itself, at least when that's relevant?  I think SRFI's spirit is to allow exploring idea spaces rather freely, so raising the bar too much could be counterproductive; but there have been cases where defects in a spec were found only later, once independent implementations were done.

That's a good idea, but I'm not sure I can commit to doing that myself.  I would like to beg our loyal contributors to try the implementations and run the tests in as many places as possible.  I've put a request to that effect in my standard "last call" message boilerplate, just to remind everyone.
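For anyone willing to try it, here's a minimal sketch of what such a portable check could look like, using SRFI 64 (which is widely, though not universally, available).  The file name, suite name, and assertions are illustrative assumptions, not taken from any actual SRFI's test suite:

    ;; tests.scm -- run the same file under each implementation, e.g.
    ;;   gosh tests.scm
    ;;   chibi-scheme tests.scm
    ;; (the exact invocation varies per implementation)
    (import (scheme base)
            (srfi 64))

    (test-begin "example")
    ;; Identical assertions run on every implementation, so any
    ;; divergent behavior shows up as a failure on one of them.
    (test-equal "string->list" '(#\a #\b) (string->list "ab"))
    (test-equal "floor-quotient" 3 (floor-quotient 7 2))
    (test-end "example")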
 
Well, easier said than done.  The best thing would be for me to go ahead and implement SRFIs during the draft period and find any shortcomings.  The reality is that I tend to put it off until a SRFI is finalized so that I won't need to go back and fix things every time the draft is updated.  I hope that there's a hack to remedy this dilemma without complicating the SRFI process too much.

How about adopting a two-stage model, something like RFC and STD in the IETF?  Or maybe we add one more status ("mature"?) after "final" to indicate that the SRFI has been implemented in multiple ways and/or adopted by a number of implementations.  Whenever we talk about lifting a SRFI into RnRS, for example, we could refer to its maturity level.

I like to think of the new "last call" period as the best chance for this kind of work.

The two-stage model is already implemented, but in a different manner than RFC and STD: the second stage is that Scheme implementers incorporate the SRFI into their implementations.  But checking things during last call would be a great way to catch problems.  And I'm happy to give reviewers more time if it's needed.

On Sat, Apr 22, 2017 at 4:26 PM, John Cowan <xxxxxx@ccil.org> wrote:

On Sat, Apr 22, 2017 at 1:46 PM, Arthur A. Gleckler <xxxxxx@speechcode.com> wrote:

We should probably also do a quick "five whys" discussion to figure out how to avoid similar errors in the future.

The Wikipedia article on "five whys" says you should keep asking "Why?" until you get to a broken process.  I think in this case we need to change the SRFI acceptance process to *require* a test suite.  I submitted SRFI 142 without one; if I had been required, instead of just encouraged, to produce one, this would almost certainly have been caught before SRFI 142 was finalized.

I see that the SRFI FAQ says that a reference implementation "should also include automated tests. Having them will help implementors, and that will increase the likelihood that your SRFI will be incorporated in Scheme implementations. It will also help users understand how your SRFI is to be used."  I think this needs to be moved to the process document, and the "should" taken seriously: that is, it is a "must" unless a specific justification for not having tests is provided.  There was no such justification in SRFI 142, just laziness on my part.  The SRFI ought not to have been accepted without them.  (Note that even a SRFI without an implementation can have tests.)
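To make that last point concrete, here is a rough sketch of the kind of implementation-independent tests SRFI 142 could have shipped with.  The particular cases are just illustrative, derived from the obvious corners of the spec rather than taken from any actual suite, and the (srfi 142) import assumes an implementation exposes the library under that name:

    (import (scheme base)
            (srfi 64)
            (srfi 142))  ; bitwise operations under test

    (test-begin "srfi-142-spec")
    ;; Expected values come straight from the specification, so these
    ;; are meaningful even before a reference implementation exists.
    (test-equal "and"    10 (bitwise-and 11 26))
    (test-equal "ior"    27 (bitwise-ior 11 26))
    (test-equal "xor"    17 (bitwise-xor 11 26))
    (test-equal "not"    -1 (bitwise-not 0))
    (test-equal "shl"     8 (arithmetic-shift 1 3))
    (test-equal "shr"     4 (arithmetic-shift 16 -2))
    (test-equal "count"   3 (bit-count 7))
    (test-equal "length"  4 (integer-length 10))
    (test-end "srfi-142-spec")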

(The .sig below was chosen at random by my .sig generator, but it is extraordinarily apropos.)

-- 
John Cowan          http://vrici.lojban.org/~cowan        xxxxxx@ccil.org
They do not preach that their God will rouse them
A little before the nuts work loose.
They do not teach that His Pity allows them
to drop their job when they damn-well choose.
                --Rudyard Kipling, "The Sons of Martha"