Invitation to "adopt pytest month"

Thomas De Schampheleire patrickdepinguin at
Sun Mar 29 15:43:59 EDT 2015

On Sun, Mar 29, 2015 at 3:51 PM, Mads Kiilerich <mads at> wrote:
> On 03/28/2015 08:54 PM, Thomas De Schampheleire wrote:
>> On Sat, Mar 28, 2015 at 12:13 AM, Brianna Laugher
>> <brianna.laugher at> wrote:
>>> Well, I'm kind of making it up as I go, but I envisage something like
>>> · initial familiarisation with project, discussion of aims (which we
>>> have done a bit here and now on the wiki page)
>>> · basics: pytest can run all tests, tests run without errors/failures,
>>> CI server set up (I haven't seen one for Kallithea yet - is there
>>> one?), probably with coverage - pytest helpers submit PRs, project
>>> authors review and discuss
>> I don't think there is a CI server currently.
>> Mads: are you running one privately?
> No, nothing continuous. I run the tests locally before deploying internally
> or pushing upstream. That works ok. Right now I don't see a big need for CI;
> I don't think it would solve a "real" problem.

The recent failures of two tests, caused by the Dulwich bump but only
visible when you actually deploy this new Dulwich, could have been
detected by a CI server (if it ran 'python setup.py develop' each
time, that is).

But it's a corner case, that's true.

> Long term it would be nice for Kallithea development to have test run
> integrated with PRs, as one automatic and mandatory "review" criteria.


> I think CI is more valuable for projects where multiple people are pushing
> to the same branch frequently and where it is not feasible to test
> everything before pushing (perhaps because it must be tested on multiple
> platforms or use special hardware or just is too slow).
>>> Then you have a choice of focusing on existing tests or exploring
>>> needs for new tests.
>>> Existing tests: Progressively rewrite to be more pytesthonic. You
>>> mentioned improving the speed and that is another definite area to
>>> look at!
>>> New tests: Add unit tests for low coverage areas. Figure out what
>>> fixtures might be needed, especially for functional tests.
>>> I would suggest the latter could be especially useful for Kallithea as
>>> you already mentioned fixtures for different vcs repos - sounds
>>> perfect.
>>> As I've put 3 helpers with Kallithea, they are people with a variety
>>> of pytest experience, so something like converting or writing new unit
>>> tests is a good task for someone with less experience, and analysing
>>> the speed and writing fixtures is a good task for someone with more.
>>> Like that is actually heaps for a month's effort, right? I get really
>>> excited and I'm really bad at keeping the scope down. :) But that's
>>> pretty much what I think - then if you/Kallithea would like a
>>> different focus, or have other ideas, point the way!
>> What you describe sounds good to me, but I definitely welcome more
>> feedback from other Kallithea contributors/maintainers in this
>> matter...
> I can imagine some valuable milestones / deliverables:
> * be able to run the existing tests with pytest instead of nosetests
> * be able to run individual tests - that is currently tricky because of
> nosetests and/or the way our tests are structured

Can you give some examples here?
When you know the name of a test, for example from a previous run,
you can simply run something like:

nosetests kallithea/tests/functional/
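With pytest this kind of selection becomes more precise: you can address a
single test by its node id (file::class::function). A small self-contained
sketch, using a throwaway test file rather than a real Kallithea test (so
the path here is purely illustrative):

```python
import os
import tempfile
import textwrap

import pytest

# Write a throwaway test file so we can demonstrate selecting one test.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "test_demo.py")
with open(path, "w") as f:
    f.write(textwrap.dedent("""
        def test_passes():
            assert 1 + 1 == 2

        def test_other():
            assert True
    """))

# Select exactly one test by node id, just as on the command line:
#   pytest test_demo.py::test_passes
exit_code = pytest.main([path + "::test_passes", "-q"])
assert exit_code == 0  # pytest.ExitCode.OK
```

The same node-id syntax works directly on the command line, which is what
makes running individual tests straightforward once we're on pytest.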

> * test coverage profiling ... and eventually tests for the areas without
> coverage
> * pull requests are one area with bad coverage ... and it is tricky because
> we want tests for many different repo scenarios with common and special
> cases

I would really like more tests in this area too, as we've already seen
two bugs related to our different use cases: you use pull requests in
one repo with different branches, and I use them with different repos
but the same branch in each repo.
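One way to cover both use cases systematically would be a parametrized
pytest fixture that enumerates the repo/branch layouts. A rough sketch,
assuming nothing about the current test code (the scenario names and the
fixture are hypothetical, not existing Kallithea helpers):

```python
import pytest

# Hypothetical scenario table: (name, same repo?, same branch?).
# The first entry is your use case, the second is mine.
PR_SCENARIOS = [
    ("same_repo_different_branches", True, False),
    ("different_repos_same_branch", False, True),
]

@pytest.fixture(params=PR_SCENARIOS, ids=lambda s: s[0])
def pr_scenario(request):
    name, same_repo, same_branch = request.param
    # A real fixture would create the repo(s) and branch(es) here and
    # clean them up afterwards; this sketch only returns the description.
    return {"name": name, "same_repo": same_repo, "same_branch": same_branch}

def test_open_pull_request(pr_scenario):
    # Placeholder: a real test would open a pull request for this
    # scenario and check the resulting changesets.
    assert pr_scenario["name"]
```

Each test using the fixture then runs once per scenario, and adding a new
common or special case is just another row in the table.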

Best regards,

More information about the kallithea-general mailing list