Flaky test coverage #1975

Closed
blueyed opened this issue Aug 11, 2017 · 8 comments · Fixed by #2214

blueyed (Member) commented Aug 11, 2017

There are currently 2 places where the test coverage is flaky:

  1. event_handle_reparentnotify: https://codecov.io/gh/awesomeWM/awesome/pull/1974/changes
  2. shape related (?!): https://codecov.io/gh/awesomeWM/awesome/pull/1963/changes
  • objects/client.c (client_get_client_shape_bounding)
  • xwindow.c (xwindow_get_shape)
  • lib/awful/client/shape.lua (shape.get_transformed)

@psychon already mentioned some details regarding 2., but I could not find that comment again now.

blueyed added the bug label Aug 11, 2017

psychon (Member) commented Aug 11, 2017

For (1): Hmm... no idea. It might be a race after a test finishes, where awesome shutting down races with some test clients exiting, but I don't really know why "a test client exits" would give us a ReparentNotify. They should be unmapping...

For (2): Querying the shape fails, i.e. xcb_shape_query_extents_reply in xwindow.c returns NULL, which means that the X11 request failed. The most likely reason I can see for this is "no such window", which would mean that the window was already destroyed. ("Most likely" means: no idea what any other reason would be.)

So my theory is that there is a race: windows are closing at roughly the same time that awesome tries to read their shape. When awesome wins this race, it can try to handle the shape; otherwise it cannot.

My fix would be (in case "just ignore it" is not a viable option): add a test that runs xeyes and makes sure that its shape was handled correctly. I have no idea how to test whether the shape is handled correctly, but a test that just starts xeyes and waits a bit doesn't feel like a good test to me.
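For reference, the "just ignore it" option would amount to treating a NULL reply as "window already gone, nothing to do". A minimal sketch of that idea (the function and parameter names are illustrative, not the actual xwindow.c code):

```c
#include <stdbool.h>
#include <stdlib.h>
#include <xcb/xcb.h>
#include <xcb/shape.h>

/* Illustrative helper, not awesome's xwindow_get_shape(): query the shape
 * extents of a window and silently give up when the reply is NULL, which
 * usually means the window was destroyed while the request was in flight. */
static bool query_shape_extents(xcb_connection_t *conn, xcb_window_t win,
                                xcb_shape_query_extents_reply_t *out)
{
    xcb_shape_query_extents_cookie_t cookie = xcb_shape_query_extents(conn, win);
    xcb_shape_query_extents_reply_t *reply =
        xcb_shape_query_extents_reply(conn, cookie, NULL);

    if (!reply) {
        /* Most likely "no such window": the client closed while we were
         * asking.  Ignore the failure instead of treating it as an error. */
        return false;
    }

    *out = *reply;
    free(reply);
    return true;
}
```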

blueyed (Member, Author) commented Aug 11, 2017

in case "just ignore it" is not a viable option

It makes the coverage reporting unrealiable, that's why I would like to see it being fixed.

a test that just starts xeyes and waits a bit doesn't feel like a good test to me.

It's kind of some integration test after all: we ensure that this code path does not cause a SIGSEGV etc. (apart from stabilizing coverage)
Or do you mean that the "waits a bit" is the really bad part about it?

psychon (Member) commented Aug 11, 2017

"Does not crash" is quite a low bar to pass. I'd really hope that our tests have better quality than that.

Different question: which of the things we start in our tests are shaped? Our test runner should not be opening a shaped window, and neither does xterm, right?

blueyed (Member, Author) commented Oct 23, 2017

Regarding shaped clients I have no idea. @Elv13 maybe?

Here is a new set of changes FWIW: https://codecov.io/gh/awesomeWM/awesome/pull/2079/changes

psychon added a commit to psychon/awesome that referenced this issue Mar 11, 2018
This is not really a test. It just starts xeyes and sees if anything
explodes.

However, the reason for this test is to stabilise code coverage.
Apparently, Gtk(?) sometimes creates shaped windows. This had the effect
that codecov always reports random changes to code coverage, depending
on if this specific test run saw any shaped client windows or not.

Thus, by explicitly adding a test that runs a shaped client, hopefully
these random fluctuations disappear.

Hopefully-fixes: awesomeWM#1975
Signed-off-by: Uli Schlachter <[email protected]>
psychon added a commit to psychon/awesome that referenced this issue Mar 11, 2018 (same commit message as above)

Elv13 pushed a commit that referenced this issue Mar 13, 2018 (same commit message as above)
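As an aside, for anyone wondering what "a shaped client" means here: a minimal shaped X11 client (purely illustrative, not the xeyes-based test that was merged) can be built with the XCB shape extension, roughly like this:

```c
#include <stdlib.h>
#include <unistd.h>
#include <xcb/xcb.h>
#include <xcb/shape.h>

int main(void)
{
    xcb_connection_t *conn = xcb_connect(NULL, NULL);
    if (xcb_connection_has_error(conn))
        return EXIT_FAILURE;

    const xcb_setup_t *setup = xcb_get_setup(conn);
    xcb_screen_t *screen = xcb_setup_roots_iterator(setup).data;

    /* Create a plain 100x100 window. */
    xcb_window_t win = xcb_generate_id(conn);
    uint32_t value = screen->white_pixel;
    xcb_create_window(conn, XCB_COPY_FROM_PARENT, win, screen->root,
                      0, 0, 100, 100, 0, XCB_WINDOW_CLASS_INPUT_OUTPUT,
                      screen->root_visual, XCB_CW_BACK_PIXEL, &value);

    /* Give it a non-trivial bounding shape: keep only the top-left quarter,
     * so the window manager has something to query and transform. */
    xcb_rectangle_t rect = { 0, 0, 50, 50 };
    xcb_shape_rectangles(conn, XCB_SHAPE_SO_SET, XCB_SHAPE_SK_BOUNDING,
                         XCB_CLIP_ORDERING_UNSORTED, win, 0, 0, 1, &rect);

    xcb_map_window(conn, win);
    xcb_flush(conn);

    /* Stay alive until killed, so the WM can see and shape-query the window. */
    pause();

    xcb_disconnect(conn);
    return EXIT_SUCCESS;
}
```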
blueyed (Member, Author) commented Jan 11, 2019

Re-opening.
#2574 improved things (probably), but there is still some flakiness (via #2564 (comment)):

| Impacted Files | Coverage Δ |
| --- | --- |
| event.c | 71.2% <0%> (-0.9%) |
| property.c | 80% <0%> (+0.46%) |

Unfortunately, Codecov's compare link currently times out: https://codecov.io/gh/awesomeWM/awesome/compare/3804040dbe56de80c003bba962c30bf719253300...9954bbea9444dc7231b536d7fc5d5c4c0ef58bdf

blueyed reopened this Jan 11, 2019

blueyed (Member, Author) commented Jan 11, 2019

The coverage diff in property.c is:

`HANDLE_PROPERTY(wm_protocols)`

(https://codecov.io/gh/awesomeWM/awesome/src/9954bbea9444dc7231b536d7fc5d5c4c0ef58bdf/property.c#L86)
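For context: HANDLE_PROPERTY(...) is a macro that generates one property-change handler per property, so coverage of that single generated line flips depending on whether any client happens to touch WM_PROTOCOLS during the run. A self-contained sketch of that kind of macro pattern (the names and the handler body are hypothetical, not the actual property.c definition):

```c
#include <stdio.h>
#include <xcb/xcb.h>

/* Hypothetical stand-in for awesome's macro: each HANDLE_PROPERTY(x) expands
 * into one handler function.  A given handler only runs when that property
 * actually changes during a test run, which is what makes its coverage flaky. */
#define HANDLE_PROPERTY(name)                                               \
    static void property_handle_##name(xcb_window_t window)                \
    {                                                                       \
        printf("property " #name " changed on window %u\n",                \
               (unsigned) window);                                          \
    }

HANDLE_PROPERTY(wm_protocols)

int main(void)
{
    /* Exercise the generated handler for an arbitrary window id. */
    property_handle_wm_protocols(42);
    return 0;
}
```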

blueyed (Member, Author) commented Jun 15, 2019

Closing in favor of #2677 (newer).

blueyed closed this as completed Jun 15, 2019