Testing Howto

The Plan

  1. Create a large number of test documents
  2. Note down the various (real or perceived) quirks of the ODF spec itself on the OIC Wiki
  3. Send the list of these remarks to the main ODF TC (while minding the IPR requirements according to this note)

  4. In parallel, generate a test set for a given ODF version (using some basic scripting that runs over all the test metadata files; see the sketch after this list)
  5. Encourage vendors and/or researchers to send in the results of specific implementations (they can use "result" elements in the metadata file)
  6. Generate a general report (using some basic scripting), removing the names of specific vendors/implementations
  7. Publish this report
  8. Use this report to make recommendations on interoperability, profiles, etc.
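
A minimal sketch, in Python, of the "basic scripting" mentioned in steps 4-6. It assumes each test document is accompanied by a small metadata file ending in "-meta.xml", whose root element carries an "odf-version" attribute and whose "result" elements look like <result implementation="..." status="pass|fail"/>; the actual element and attribute names are defined by the metadata proposal, not by this page, so treat them as placeholders.

  #!/usr/bin/env python
  # Sketch of steps 4-6: collect the test set for one ODF version and
  # produce a report with vendor names replaced by neutral labels.
  # The metadata layout assumed here is hypothetical (see note above).

  import os
  import sys
  import xml.etree.ElementTree as ET
  from collections import defaultdict


  def collect_tests(root_dir, odf_version):
      """Walk the document repository and gather metadata for one ODF version."""
      tests = []
      for dirpath, _dirnames, filenames in os.walk(root_dir):
          for name in filenames:
              if not name.endswith("-meta.xml"):
                  continue
              path = os.path.join(dirpath, name)
              meta = ET.parse(path).getroot()
              if meta.get("odf-version") != odf_version:
                  continue
              results = [(r.get("implementation", "unknown"), r.get("status", ""))
                         for r in meta.iter("result")]
              tests.append({"path": path, "results": results})
      return tests


  def anonymized_report(tests):
      """Aggregate pass/fail counts per implementation, replacing vendor
      names with neutral labels (Implementation A, B, ...) as in step 6."""
      per_impl = defaultdict(lambda: {"pass": 0, "fail": 0})
      for test in tests:
          for impl, status in test["results"]:
              if status in per_impl[impl]:
                  per_impl[impl][status] += 1
      lines = []
      for index, (impl, counts) in enumerate(sorted(per_impl.items())):
          label = "Implementation %c" % (ord("A") + index)
          lines.append("%s: %d pass, %d fail" % (label, counts["pass"], counts["fail"]))
      return "\n".join(lines)


  if __name__ == "__main__":
      # e.g.  python generate_report.py ./test-documents 1.2
      root_dir, odf_version = sys.argv[1], sys.argv[2]
      tests = collect_tests(root_dir, odf_version)
      print("Test set for ODF %s: %d documents" % (odf_version, len(tests)))
      print(anonymized_report(tests))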

Contributors

The suggested approach is:

  1. Read about "atomic" testing and the related methods and tools

  2. Take a look at the proposal for writing scenarios and handling metadata

  3. If you want to create test documents, please add your name to the list of who's doing what

  4. Upload test documents and scenarios to the document repository
  5. If you have issues, suggestions, or remarks about the spec, add them to the list of remarks

Developers

Publishing developers' notes or adding your thoughts to the list of remarks would be very helpful.

Vendors / researchers

Submitting test results and/or test documents would be very helpful.
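
As a rough illustration of how a submitted result might be recorded, the hypothetical helper below appends a "result" element to an existing test metadata file. It relies on the same assumed "-meta.xml" layout as the sketch under "The Plan"; the real element and attribute names come from the metadata proposal.

  # Hypothetical helper: record a test result in a metadata file before
  # submitting it to the document repository. The metadata layout is an
  # assumption, not the actual OIC schema.

  import xml.etree.ElementTree as ET


  def record_result(meta_path, implementation, status):
      """Append a <result> element to an existing metadata file."""
      tree = ET.parse(meta_path)
      result = ET.SubElement(tree.getroot(), "result")
      result.set("implementation", implementation)
      result.set("status", status)  # e.g. "pass" or "fail"
      tree.write(meta_path, encoding="utf-8", xml_declaration=True)


  # Example: record_result("table-borders-meta.xml", "ExampleOffice 3.1", "pass")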

Note: Various vendors do have test documents but cannot contribute them, since these were sent to them by customers (the vendors do not own the copyright of those documents, and/or the documents may contain sensitive information).
