= Development/testing issues =

== [https://fedorahosted.org/beaker Beaker test framework] ==

Should we start configuring a dedicated testing framework for OpenVPN? This idea will be presented by ''mattock''.

'''Rationale'''

Would allow testing that OpenVPN works on a large number of predefined platforms and configurations with ''minimal manual effort'' (after initial setup). Would allow spotting bugs early.

'''Limitations'''

Tests only the external behavior of the application. Does not replace human testing when problems are complex, e.g. dependent on complex interactions of configuration parameters or components.

== Continuous integration server ==

Should we build a continuous integration / automated release management server? This idea will be presented by ''mattock''.

'''Rationale'''
Would allow spotting build problems early on by building "allmerged" and notifying developers of build failures. Would allow automated packaging of openvpn-testing for various platforms and publishing the packages on a web server. This in turn would increase use of openvpn-testing, leading to earlier bug reports, and would thus make the release process (new code -> acceptance to testing -> acceptance to stable -> release) faster. Could also allow centralized automated searching for code problems.
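
As a sketch of what the nightly job on such a server could do, the snippet below builds a working tree and condenses the log into a short failure report. This is plain Python; the `make` build command, the directory name, and the error-filtering heuristic are assumptions for illustration, not an existing setup:

```python
import subprocess

def build_allmerged(workdir="openvpn-allmerged"):
    """Run the build and return (ok, log) so failures can be reported."""
    result = subprocess.run(["make"], cwd=workdir,
                            capture_output=True, text=True)
    return result.returncode == 0, result.stdout + result.stderr

def report(ok, log):
    """Summarize one build for the notification mail."""
    if ok:
        return "allmerged: build OK"
    # Keep only lines mentioning errors so the mail stays short
    errors = [line for line in log.splitlines() if "error" in line.lower()]
    return "allmerged: build FAILED\n" + "\n".join(errors[:20])
```

A cron entry could then run `report(*build_allmerged())` after each merge and mail the result to the developer list.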

== Developer bounties ==

Should we have a bounty system for writing missing features? This idea will be presented by ''ecrist''.