Would it be too much to also require verifying the test suite on multiple Scheme implementations, or even against multiple independent implementations of the SRFI itself, at least when that's relevant? I think SRFI's spirit is to allow exploring idea spaces rather freely, so raising the bar too much could be counterproductive; but there have been cases where defects in a spec were found only later, once independent implementations were written.
Well, easier said than done. The best thing would be for me to go ahead and implement SRFIs during the draft period and find any shortcomings then. The reality is that I tend to put it off until finalization so that I won't need to go back and fix my implementation every time the draft is updated. I hope there's a way to resolve this dilemma without complicating the SRFI process too much.

How about adopting a two-stage model, something like RFC and STD in the IETF? Or maybe we could add one more status ("mature"?) after "final" to indicate that the SRFI has been implemented in multiple ways and/or adopted by a number of implementations. Whenever we talk about lifting a SRFI into RnRS, for example, we could refer to its maturity level.
On Sat, Apr 22, 2017 at 4:26 PM, John Cowan <xxxxxx@ccil.org> wrote:

On Sat, Apr 22, 2017 at 1:46 PM, Arthur A. Gleckler <xxxxxx@speechcode.com> wrote:

We should probably also do a quick "five whys" discussion to figure out how to avoid similar errors in the future.
The WP article on "five whys" says you should keep asking "Why?" until you get to a broken process. I think in this case we need to change the SRFI acceptance process to *require* a test suite. I submitted SRFI 142 without one; if I had been required, instead of just encouraged, to produce one, this would almost certainly have been caught before SRFI 142 was finalized.

I see that the SRFI FAQ says that a reference implementation "should also include automated tests. Having them will help implementors, and that will increase the likelihood that your SRFI will be incorporated in Scheme implementations. It will also help users understand how your SRFI is to be used." I think this needs to be moved to the process document, and the "should" taken seriously: that is, it is a "must" unless a specific justification for not having tests is provided. There was no such justification in SRFI 142, just laziness on my part. The SRFI ought not to have been accepted without them. (Note that even a SRFI without an implementation can have tests.)

(The .sig below was chosen at random by my .sig generator, but it is extraordinarily apropos.)

--
John Cowan          http://vrici.lojban.org/~cowan        xxxxxx@ccil.org
They do not preach that their God will rouse them
A little before the nuts work loose.
They do not teach that His Pity allows them
to drop their job when they damn-well choose.
        --Rudyard Kipling, "The Sons of Martha"
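[For illustration, a minimal sketch of the kind of automated test file being asked for, using SRFI 64, the testing SRFI. The procedures exercised here are the bitwise operations from SRFI 142; the specific values are examples chosen for this sketch, not taken from any actual SRFI 142 test suite.]

```scheme
;; Sketch of an automated test suite for a SRFI, using SRFI 64.
;; The procedures under test (from SRFI 142) are illustrative.
(import (scheme base)
        (srfi 64)
        (srfi 142))

(test-begin "srfi-142")
(test-equal 8  (bitwise-and 10 12))   ; #b1010 AND #b1100 = #b1000
(test-equal 14 (bitwise-ior 10 12))   ; #b1010 IOR #b1100 = #b1110
(test-equal 6  (bitwise-xor 10 12))   ; #b1010 XOR #b1100 = #b0110
(test-end "srfi-142")
```

Even a SRFI without a reference implementation can ship a file like this: it pins down the specified behavior and can be run unchanged on any implementation that provides SRFI 64 and the SRFI under test.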