These are success problems!
We have demonstrated considerable ability in developing. The addition of the audio meters, for instance, has, to me, demonstrated how effective the open source approach to development has been. No hierarchy of managers pronouncing, no hastily written justifications to be ignored outside the overview. And look at the speed with which it has come together.
Just competence…
Excellent!!
At the same time we are moving to a CI approach to release, which once again is progressing very effectively. There are issues, though. It would appear that an unplayed zynth is going to have a jack crash at some point, and I imagine we would all agree that this is an issue we need to address. If a freeze on new features aids that process (not that I’ve EVER been guilty of feature suggestion, oh no…), then I for one am willing to offer nothing more than testing to get back to the reliability I have come to expect.
But I think the idea of testing is where we do have issues. Retrofitting a testing harness is a difficult process, and agreeing on quite how we DO test is an area where clarity would really help our situation.
I have two main zynths and I would have to say my testing procedure is pretty haphazard, partly because they both have idiosyncrasies that have just appeared and then resolutely refused to go away. I haven’t got TTY MIDI IN anymore on one machine and I’ve lost the ability to use a mouse on the other. Perhaps this makes me a bit slapdash in my testing process, but I would like to think that we could have some form of operational testing to declare some baseline functionality (it loads a snapshot, it plays a snapshot, it loads an engine, it plays the engine, it removes the engine, it’s not there…) sort of stuff.
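That lifecycle could be pinned down as a short sequence of ordered assertions against whatever control surface we settle on. This is only a sketch of the shape, not a real implementation: the `zynth` object and every method on it are hypothetical placeholders, not the actual Zynthian API.

```python
def baseline_check(zynth) -> None:
    """Baseline smoke test: snapshot in, engine in, sound out, engine gone.

    All method names here are hypothetical; the point is the ordered
    assert-after-each-step structure, not the exact API.
    """
    zynth.load_snapshot("baseline.zss")
    assert zynth.snapshot_loaded(), "snapshot did not load"

    zynth.add_engine("ZynAddSubFX")
    assert "ZynAddSubFX" in zynth.engines(), "engine did not load"

    zynth.play_note(60)  # middle C
    assert zynth.audio_detected(), "no audio while playing"

    zynth.remove_engine("ZynAddSubFX")
    assert "ZynAddSubFX" not in zynth.engines(), "engine still present"
```

Each step fails loudly on its own line, so a run that dies tells you exactly which part of the baseline broke.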
We can now drive the interface from MIDI messages, and perhaps a sequence of MIDI plus some examination of the recorded or listened output might provide a quick method for confirming the basic functionality, before we refine the examination towards long-running issues like the passive jack failure.
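A minimal sketch of that idea, assuming nothing about Zynthian itself: build a raw MIDI note-on by hand, hand it to whatever sends MIDI into the box, then check that the recorded buffer rises above the noise floor. The `send_midi` and `record_audio` callables and the threshold value are all assumptions for illustration.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a raw MIDI note-on message (status byte 0x90 | channel)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def audio_present(samples, threshold: float = 0.01) -> bool:
    """True if the recorded buffer contains signal above the noise floor."""
    return max(abs(s) for s in samples) > threshold

def midi_smoke_test(send_midi, record_audio) -> bool:
    """Send one note, capture a short recording, confirm we heard something.

    send_midi    -- hypothetical callable taking raw MIDI bytes
    record_audio -- hypothetical callable returning float samples in [-1, 1]
    """
    send_midi(note_on(channel=0, note=60, velocity=100))  # middle C
    return audio_present(record_audio())
```

In practice `send_midi` might write to an ALSA sequencer port and `record_audio` might pull a couple of seconds from a jack capture client; the harness above stays the same either way.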
So you say, wyleu, my sweet, why don’t you get on and prepare such a thing? Well, that’s where the hierarchy of the commercial organisation CAN apply some pressure that produces results, because with the best will in the world the tedious (and I say that from experience) process of constructing and refining such tests is not something that the discipline of the open source community addresses particularly well. What is a priority on Monday can disappear under the simple process of getting through the week, to be replaced by next week’s great enthusiasm.
So perhaps we need to adopt a community consensus to make the process of testing a zynth as competent, successful and considered as the effort we put into developing new features.
As I said, I am as guilty as any, but should I be charged by the community to provide a SUBSET of the testing area, then I would, in the same vein as the ‘Where’s your sound sample?’ confrontation, feel a pressure to produce such a SUBSET. It really only requires a couple of components:
1/ An overall agreed approach to HOW we test.
2/ Volunteers to accept the subsets of testing.
3/ A draconian allocation of subsets to those volunteers.
4/ Much congratulation and partying when tests are provided.
I’m up for it, hopefully others are as well…
Once we have this nailed, THEN we can get back to the much more enjoyable process of sentences that start: How about…, or what if…, or could we…?