Invitation to "adopt pytest month"

Brianna Laugher brianna.laugher at gmail.com
Fri Mar 27 19:13:45 EDT 2015


On 27 March 2015 at 20:27, Thomas De Schampheleire
<patrickdepinguin at gmail.com> wrote:
> That sounds great, thanks!
> In practice, how will this work?

Well, I'm kind of making it up as I go, but I envisage something like:
· initial familiarisation with project, discussion of aims (which we
have done a bit here and now on the wiki page)
· basics: pytest can run all tests, tests run without errors/failures,
CI server set up (I haven't seen one for Kallithea yet - is there
one?), probably with coverage - pytest helpers submit PRs, project
authors review and discuss

Then you have a choice of focusing on existing tests or exploring
needs for new tests.
Existing tests: Progressively rewrite to be more pytesthonic. You
mentioned improving the speed and that is another definite area to
look at!
New tests: Add unit tests for low coverage areas. Figure out what
fixtures might be needed, especially for functional tests.

I would suggest the latter could be especially useful for Kallithea as
you already mentioned fixtures for different vcs repos - sounds
perfect.
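
To make that concrete, here is a minimal sketch of what such a
fixture might look like (the names - vcs_repo and so on - are
invented for illustration, not Kallithea's actual helpers):

    import subprocess
    import pytest

    # hypothetical parametrised fixture: every test that uses it
    # runs once against a git repo and once against an hg repo
    @pytest.fixture(params=["git", "hg"])
    def vcs_repo(request, tmpdir):
        repo_dir = tmpdir.mkdir(request.param + "_repo")
        # both "git init <dir>" and "hg init <dir>" create a fresh repo
        subprocess.check_call([request.param, "init", str(repo_dir)])
        return repo_dir

    def test_scan_finds_repo(vcs_repo):
        # runs twice, once per vcs type
        assert vcs_repo.check(dir=True)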

The 3 helpers I've put with Kallithea have a variety of pytest
experience, so something like converting or writing new unit tests is
a good task for someone with less experience, and analysing the speed
and writing fixtures is a good task for someone with more.

Like that is actually heaps for a month's effort, right? I get really
excited and I'm really bad at keeping the scope down. :) But that's
pretty much what I think - then if you/Kallithea would like a
different focus, or have other ideas, point the way!

Below are some notes that I wrote on the web page and in an email to
the helpers. They are written with the assumption that the project
contributors are not very familiar with pytest, which doesn't quite
seem to be the case with Kallithea.

HTH,
Brianna


What does it mean to “adopt pytest”?

There can be many different definitions of “success”. Pytest can run
many nose and unittest tests by default, so using pytest as your
test runner may be possible from day 1. Job done, right?
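
Often that first step is no more than something like:

    pip install pytest
    py.test    # collects and runs the existing test suite by default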

Progressive success might look like:

· tests can be run (by pytest) without errors (there may be failures)
· tests can be run (by pytest) without failures
· test runner is integrated into CI server
· existing tests are rewritten to take advantage of pytest features -
this can happen in several iterations, for example:
·· changing to native assert statements (pycmd has a script to help
with that, pyconvert_unittest.py)
·· changing setUp/tearDown methods to fixtures (see the sketch after
this list)
·· adding markers
·· other changes to reduce boilerplate
· assess needs for future tests to be written, e.g. new fixtures,
distributed testing tweaks
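
To illustrate the kind of rewriting meant above, here is a rough
before/after sketch (the Thing class is invented for the example):

    # before: unittest style
    import unittest

    class TestThing(unittest.TestCase):
        def setUp(self):
            self.thing = Thing()

        def test_frobnicate(self):
            self.assertEqual(self.thing.frobnicate(), 42)

    # after: pytest style - a fixture plus a plain assert
    import pytest

    @pytest.fixture
    def thing():
        return Thing()

    def test_frobnicate(thing):
        assert thing.frobnicate() == 42

A nice side effect is that when the plain assert fails, pytest shows
the compared values directly in the failure report.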

“Success” should also include that the development team feels
comfortable with their knowledge of how to use pytest. In fact this is
probably more important than anything else. So spending a lot of time
on communication, giving examples, etc. will probably be important -
both in running the tests, and in writing them.

It may be that, after the month is up, the partner project decides
that
pytest is not right for it. That’s okay - hopefully the pytest team
will also learn something about its weaknesses or deficiencies.


--------------------
Notes sent to pytest helpers:


If you are happy with the project I have put you with, please make
contact with the project person and start having a look at the project
itself. Install it, build it, play with it. Join the mailing list and
IRC channel if they exist. Read the docs and any existing tests. Run
the tests and see if they all pass! (Also in the existing test
runner.) Take note of any test files which are noticeably slower than
the others. Try running them distributed and see if any extra failures
pop up; this is a notoriously good way of bringing out test
interdependencies or reliance on globals.
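
(The distributed run just needs the pytest-xdist plugin, roughly:

    pip install pytest-xdist
    py.test -n 4    # split the tests across 4 worker processes

Tests that quietly depend on running in a particular order, or on
state left behind by other tests, tend to fall over here.)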

Before you start writing any tests, be sure to discuss with the
project person what the project is, what their goals are, what
their knowledge of testing and pytest is, what they see as areas that
need attention, any specific things they want to achieve. Try to make
a rough plan which has lots of steps, each step being an achievement
in its own right. A month is not very long, so for a large project
don't expect to get everything 100% perfect.

Some projects have not yet made a PyPI release - see if they want
help with that. Some projects don't have CI set up yet - I would
suggest that is a good early goal to set, along with getting into the
practice of having all existing tests passing. Also record the test
coverage, although keep in mind it can be misleading.
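
For the coverage recording, the pytest-cov plugin is the usual route;
assuming the package is called kallithea, something like:

    pip install pytest-cov
    py.test --cov=kallithea --cov-report=term-missing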

Projects that are closer to being libraries may be happy with just
unit tests. Projects closer to applications will probably benefit
from some functional/system tests. Find out which areas of the code
are bug-prone, or are already planned to be rewritten/refactored -
these are excellent candidates for functional tests, NOT unit tests.

If you find yourself rewriting tests several times, this is not a bad
thing at all. Tests should evolve as everyone's knowledge evolves and
also as the code evolves.

When you submit PRs, be extra verbose at the start to explain what you
are doing and why. Encourage them to ask you about anything they don't
understand. Suggest useful command line options for them to use when
running the tests.
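
Some options that are handy to pass on (all built into pytest itself):

    py.test -x            # stop after the first failure
    py.test -k repo       # only run tests with 'repo' in the name
    py.test -v            # show each test name as it runs
    py.test --tb=short    # shorter tracebacks
    py.test --pdb         # drop into pdb on failures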

Also, keep in mind that this is an opportunity to see pytest from the
eyes of a newcomer. Take notes of what seems confusing to them, or
where you notice some detail missing in the pytest docs - or some
functionality in pytest itself. I will ask you all about this at the
end and it's much easier to record these details as you go rather than
try to remember everything at the end of the month!



-- 
They've just been waiting in a mountain for the right moment:
http://modernthings.org/

