
Warning integration breaks warnings.filterwarnings #2430

Closed
zzzeek opened this issue May 23, 2017 · 30 comments · Fixed by psf/requests#4056 or #2445
Labels
plugin: warnings (related to the warnings builtin plugin)
type: bug (problem that needs to be addressed)
type: regression (indicates a problem that was introduced in a release which was working previously)

Comments

@zzzeek

zzzeek commented May 23, 2017

SQLAlchemy CI is now broken due to the unconditional inclusion of pytest-warnings in py.test 3.1.0. I need, at the very least, to know how to disable this plugin entirely.

In our test suite we make use of the warnings filter internally in order to escalate our own warnings to errors. It seems that warnings.filterwarnings() is now non-functional when this plugin is installed.

Demo test case:

import warnings

class MyWarning(Warning):
    pass

warnings.filterwarnings("error", category=MyWarning)

class TestWarnings(object):
    def test_my_warning(self):
        try:
            warnings.warn(MyWarning("warn!"))
            assert False
        except MyWarning:
            assert True


with py.test 3.0.7:


$ py.test test_warning.py 
======================================================== test session starts ========================================================
platform linux2 -- Python 2.7.13, pytest-3.0.7, py-1.4.33, pluggy-0.4.0
rootdir: /home/classic/Desktop/tmp, inifile:
plugins: xdist-1.15.0, cov-2.4.0
collected 1 items 

test_warning.py .

===================================================== 1 passed in 0.01 seconds ======================================================

with py.test 3.1.0:

$ py.test test_warning.py 
======================================================== test session starts ========================================================
platform linux2 -- Python 2.7.13, pytest-3.1.0, py-1.4.33, pluggy-0.4.0
rootdir: /home/classic/Desktop/tmp, inifile:
plugins: xdist-1.15.0, cov-2.4.0
collected 1 items 

test_warning.py F

============================================================= FAILURES ==============================================================
___________________________________________________ TestWarnings.test_my_warning ____________________________________________________

self = <test_warning.TestWarnings object at 0x7fecb08a5290>

    def test_my_warning(self):
        try:
            warnings.warn(MyWarning("warn!"))
>           assert False
E           assert False

test_warning.py:12: AssertionError
========================================================= warnings summary ==========================================================
test_warning.py::TestWarnings::()::test_my_warning
  /home/classic/Desktop/tmp/test_warning.py:11: MyWarning: warn!
    warnings.warn(MyWarning("warn!"))

-- Docs: http://doc.pytest.org/en/latest/warnings.html
=============================================== 1 failed, 1 warnings in 0.03 seconds ================================================

I have tried everything I can think of with the new -W flag and --disable-warnings, with no luck.

Please advise on the correct options to allow the Python stdlib warnings.filterwarnings() to work again, thanks!

zzzeek added a commit to zzzeek/sqlalchemy that referenced this issue May 23, 2017
Change-Id: I587282da141aa6ea92f944eeb4c9e5782d0b5f29
(cherry picked from commit eed7888)
zzzeek added a commit to zzzeek/sqlalchemy that referenced this issue May 23, 2017
Change-Id: I587282da141aa6ea92f944eeb4c9e5782d0b5f29
@The-Compiler The-Compiler changed the title forcing pytest-warnings breaks the world Warning integration breaks warnings.filterwarnings May 23, 2017
@zzzeek
Author

zzzeek commented May 23, 2017

how do I get a helper to help triage my bugs and change nasty headlines to nice ones? :(

@nicoddemus
Member

> SQLAlchemy CI is now broken due to the unconditional inclusion of pytest-warnings in py.test 3.1.0. I need, at the very least, to know how to disable this plugin entirely.

@zzzeek you can disable the warnings plugin by passing -p no:warnings; this will disable the plugin completely and you should get green builds again. You can choose to add this to your pytest.ini file:

[pytest]
addopts = -p no:warnings

As for the problem itself, I will take a more careful look later.

@nicoddemus nicoddemus added type: bug problem that needs to be addressed type: regression indicates a problem that was introduced in a release which was working previously labels May 23, 2017
@zzzeek
Author

zzzeek commented May 23, 2017

great, thanks!

zzzeek added a commit to zzzeek/sqlalchemy that referenced this issue May 23, 2017
lift cap on py.test.
references: pytest-dev/pytest#2430

Change-Id: Ieb8a6258ba1d15efa570d9cda2b51cf021499a23
zzzeek added a commit to zzzeek/sqlalchemy that referenced this issue May 23, 2017
lift cap on py.test.
references: pytest-dev/pytest#2430

Change-Id: Ieb8a6258ba1d15efa570d9cda2b51cf021499a23
(cherry picked from commit a987942)
@RonnyPfannschmidt
Member

I believe https://github.com/pytest-dev/pytest/blob/master/_pytest/warnings.py#L54 is gutting all prior setups.

We should just call reset_warning instead when we do something like that, and give people an opt-out so that in-code filter specifications are respected.

@nicoddemus
Member

@zzzeek thanks for posting the example, it helped us understand the issue.

Can I ask why you add that warnings filter? If the objective is to turn certain warnings into errors, you can now configure your pytest.ini to the same effect. Raising more awareness about warnings was the motivation for this new feature.
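For illustration, such a pytest.ini configuration might look something like the following. This is a sketch, not SQLAlchemy's actual setup: it assumes the MyWarning class from the demo at the top of the thread lives in a module named test_warning, and uses the error::<category> syntax of the filterwarnings ini option described in the warnings docs.

```ini
[pytest]
filterwarnings =
    error::test_warning.MyWarning
```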

Also, do you think we should add a blurb to the warnings docs making it easy for users to find how to disable it in case they run into trouble?

@RonnyPfannschmidt

> I believe https://github.com/pytest-dev/pytest/blob/master/_pytest/warnings.py#L54 is gutting all prior setups

Actually the implementation uses catch_warnings, which saves the previous filters and restores them after the call (see https://github.com/python/cpython/blob/master/Lib/warnings.py#L454). Unfortunately this has the effect of silencing errors configured manually by other warnings.simplefilter calls, which is the heart of this issue. Also, by default Python installs a number of filters silencing DeprecationWarning, ImportWarning and others, which are precisely the kind of warnings we believe should be displayed in test suites.

@RonnyPfannschmidt

> we should just call reset_warning instead when we do something like that

resetwarnings() will actually throw out all filters (see https://docs.python.org/3/library/warnings.html#warnings.resetwarnings):

> This discards the effect of all previous calls to filterwarnings(), including that of the -W command line options and calls to simplefilter().

Which is not what we want, I think.

I'm not sure how to proceed here, so I would like to hear some opinions.

@RonnyPfannschmidt
Member

@nicoddemus pushing a new "once" simple filter invalidates all other filters, because filters are matched first to last; it is literally pushing a new filter in front of everything else.

I believe that pushing an unfiltered "once" in front of all warning setups literally breaks all expectations; people should opt in to breaking filter changes, not have to opt out.

In retrospect, the warnings addition is implemented in a way that warrants a major release, and the way we should fix it warrants a major release as well, because fixing this issue requires a breaking change, and we never adopted a process for publishing experiments we may change our mind on.

shoyer added a commit to shoyer/xarray that referenced this issue May 24, 2017
This is causing CI failures for some builds.

This is really a pytest bug (pytest-dev/pytest#2430),
but working around this on our end is easy enough (and cleans things up a
little).
@nicoddemus
Member

> pushing a new once simple filter is invalidating all other filters due to matching first
> so it's literally pushing a new filter to be there

True... it does have an append option though, so perhaps we should change it to append=True to add our filter to the end of the list. The drawback is that the default warnings filters that hide DeprecationWarning, ImportWarning, etc. will remain active and hide those warnings by default.

Perhaps we should add two filters: a once filter at the end of the list, and a filter that shows selected warnings (DeprecationWarning, ImportWarning) at the beginning of the list? After writing this I just realized, though: someone might have configured a filter for DeprecationWarning (for example) to error out, in which case we might again break a test suite which relies on that.
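The first-match-wins semantics and the append=True alternative discussed above can be sketched directly with the stdlib (the catch-all filter here stands in for the one the plugin would install):

```python
import warnings

# A user's pre-existing filter escalating DeprecationWarning to an error.
warnings.resetwarnings()
warnings.filterwarnings("error", category=DeprecationWarning)

# Prepending a catch-all filter shadows the user's filter, because the
# filter list is matched front to back.
warnings.filterwarnings("always")
with warnings.catch_warnings(record=True) as caught:
    warnings.warn("deprecated API", DeprecationWarning)
assert len(caught) == 1  # merely shown and recorded, not raised

# With append=True the catch-all goes to the *end* of the list, so the
# user's "error" filter still matches first and the warning raises.
warnings.resetwarnings()
warnings.filterwarnings("error", category=DeprecationWarning)
warnings.filterwarnings("always", append=True)
try:
    warnings.warn("deprecated API", DeprecationWarning)
    raised = False
except DeprecationWarning:
    raised = True
assert raised
```

The trade-off nicoddemus describes follows directly: appending preserves user filters, but it also preserves Python's default filters that hide DeprecationWarning and friends.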

@RonnyPfannschmidt
Member

we simply should not do anything by default

@RonnyPfannschmidt
Member

Any choice we make breaks some people's expectations; we should just let people decide.

@The-Compiler
Member

I don't think hiding warnings by default is a good behavior for a test runner - don't all other test runners show them by default too? How do they solve that problem?

@RonnyPfannschmidt
Member

@The-Compiler this is not about showing vs. hiding, but about not altering the warnings subsystem unless requested.

Python warnings are inherently stateful, and people register different actions for different warnings; any test runner that changes anything by default is broken.

Note that warnings are not hidden by default, but library authors may choose to hide or error on certain warnings, or decide to show some warnings only once.

Changing those behaviours without warrant breaks code.

@The-Compiler
Member

Python does hide DeprecationWarnings (and some others?) by default, because it doesn't want to confront users with those when they're simply using a library.

Test runners should turn them on (like unittest.py does too), so they aren't simply ignored.

shoyer added a commit to pydata/xarray that referenced this issue May 24, 2017
* Fix errors in the test suite due to pytest warning changes

This is causing CI failures for some builds.

This is really a pytest bug (pytest-dev/pytest#2430),
but working around this on our end is easy enough (and cleans things up a
little).

* deprecated_call() -> warns(FutureWarning)

* another attempt
@zzzeek
Author

zzzeek commented May 24, 2017

So, the first reason we add the warnings filter is that any SAWarning, e.g. those generated by SQLAlchemy, should never occur without us adding directives saying that we expect them; these all go to "error" by default, so they need to raise.

The next reason we use "error" is that a lot of the older tests ensure that particular code emits warnings by doing an assertRaises() style of check, taking advantage of the fact that the SAWarnings all raise.

In order to accommodate code that expects to emit SAWarning that we don't care about, there are function decorators like @testing.emits_warning("foo") which alter the warnings filter per test to allow certain warnings not to throw an error; the filter is then restored after the test is complete.

Building on that approach, the new style of writing tests that are expected to emit warnings is a context manager, "with expect_warnings(*strings)", that manipulates the warnings filter within the block to allow warnings matching our strings through, to assert that they were actually seen (else fail the test), and to have all other SAWarnings continue to raise exceptions.
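A rough, hypothetical sketch of such a context manager follows; the name expect_warnings is taken from the description above, but the matching logic here is invented for illustration (SQLAlchemy's real helper is more elaborate and, per a later comment, ended up being implemented by patching warnings.warn):

```python
import re
import warnings
from contextlib import contextmanager

@contextmanager
def expect_warnings(*patterns):
    """Swallow warnings matching the given regex patterns inside the
    block, assert each pattern fired at least once, and let every other
    warning take the normal path (so "error" filters still raise)."""
    seen = set()
    real_warn = warnings.warn

    def intercepted(message, *args, **kwargs):
        text = str(message)
        for pattern in patterns:
            if re.search(pattern, text):
                seen.add(pattern)
                return  # expected warning: swallow it
        real_warn(message, *args, **kwargs)  # unexpected: normal handling

    warnings.warn = intercepted
    try:
        yield
    finally:
        warnings.warn = real_warn
    missing = set(patterns) - seen
    assert not missing, "expected warnings not emitted: %s" % sorted(missing)

# Example: the matching warning is swallowed and asserted to have fired.
with expect_warnings("deprecated"):
    warnings.warn("this usage is deprecated")
```

The key property is the one zzzeek describes: expected warnings are consumed and asserted, while everything else still flows through whatever ambient filters the suite has configured.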

These last use cases require a highly specific set of filter operations that could not be sufficiently addressed by flags in pytest.ini / setup.cfg.

I also agree with the argument that py.test should try not to make changes that interfere with existing approaches. While we could omit our global warnings filter under a py.test run and rely on py.test-specific settings instead, this makes our test suite more complicated, as the suite is constructed to have a layer of indirection between it and whichever testing framework is in use.

While SQLAlchemy makes extensive use of py.test features, the dependencies on these features are contained entirely within a plugin-oriented module that is invoked only when the "py.test" command is used. You can still run our tests with nose. The rationale is not that anyone uses nose anymore, but rather that nobody uses nose anymore: a reminder that after I and other developers spent many months converting our test suite to use nose, nose within a couple of years went unmaintained and I had to pick up and start all over again with py.test. Therefore, while I love py.test, I keep its feature set at arm's length, and nothing within the tests themselves refers to py.test. Our test suite is always ready to accommodate other test runners should that need ever arise.

@zzzeek
Author

zzzeek commented May 24, 2017

To be fair, while older SQLAlchemy versions used ad-hoc filterwarnings() calls to get the "per-test warnings filter" behavior, currently we just mock.patch warnings.warn() to keep it simple.

Lukasa added a commit to Lukasa/requests that referenced this issue May 24, 2017
We may be able to remove this if pytest resolves pytest-dev/pytest#2430
Lukasa added a commit to Lukasa/requests that referenced this issue May 24, 2017
We may be able to remove this if pytest resolves pytest-dev/pytest#2430
@nicoddemus
Member

@zzzeek

> I agree that displaying the deprecation warnings is a great idea but an approach that doesn't break code that wants to use the warnings filter would be best

Definitely agree. But looking at how warnings is implemented, we are still trying to figure out whether that's even possible, unfortunately.

> in my own projects, very often there are API changes that would vastly improve everything but they would break the world for a lot of people, so unfortunately, you can't just turn them on - you just provide the options, document them, blog how great they are, and allow people to opt in.

We didn't expect the introduction to break existing suites, and for that we apologize. We try our best to ensure we don't break existing suites in minor releases, but this went out in 3.1, and I'm not sure whether rolling the change back (or defaulting it to opt-in) in a new 3.2 release would help, because by this point I suspect most people who had trouble with warnings have already disabled the internal plugin.

@nicoddemus
Member

@Themanwithoutaplan

> If so, how should tests for warnings look without pytest-warnings disabled? If this is not possible, then 3.1 broke a pretty fundamental test feature.

Does your test suite deal with the warnings module directly, or is it that one specific test which uses pytest.warns() or one of the other warnings-related facilities is not working as expected? If the former, it might be that your customization conflicts with the internal warnings plugin, in which case disabling it might be the best solution. If the latter, it sounds like a bug and we should fix it (preferably in a separate issue).

@nicoddemus
Member

@zzzeek just curious: in your opinion, if you were to embrace the new pytest warnings feature in 3.1 by throwing away your customization, would that improve the test suite? Would it be relatively simple to do? Just as a thought exercise, disregarding your point about having a layer over the test runner.

I'm under the impression that the new plugin will be a problem in cases where the test suite implements or customizes its own handling of warnings, creating clashes. In those cases it might be better to just disable the plugin to keep things working, but users might consider throwing away their own customizations and using the built-in features.

@Themanwithoutaplan

The test that randomly fails is at https://bitbucket.org/openpyxl/openpyxl/src/44212297981845d4cfad77e879b812ebd2d7171e/openpyxl/reader/tests/test_worksheet.py?at=2.4&fileviewer=file-view-default#test_worksheet.py-470
Note I had to play around with the configuration a few years ago for similar reasons.

I don't have many tests that check for warnings so it's not too much work to change them but this change did trip me up. The new context manager looks nicer than recwarn but I couldn't get it to work for much the same reason.

@zzzeek
Author

zzzeek commented May 25, 2017

@nicoddemus if I weren't using py.test, and I had a simple test suite where we just want our own warnings to raise an error and all Python deprecation warnings to be emitted, then configuration-level setup would be more palatable. Although even then, a filter that wants to ensure a certain class of warnings is turned into an error is still, IMO, a little better expressed in code rather than config, since it refers to a class that's part of the library being tested.

Next, if my test suite needs to make per-test alterations to the warnings filter so that we can assert or ignore various warnings, that has to be done in code with decorators and context managers. So if the feature here only allows config-level flags, that's not sufficient in any case.

@nicoddemus
Member

@Themanwithoutaplan

> I don't have many tests that check for warnings so it's not too much work to change them but this change did trip me up.

Indeed, we should make that more obvious in the docs.

@nicoddemus
Member

@zzzeek

> a filter that wants to ensure a certain class of warnings is errored is still IMO a little better expressed in code rather than config since it refers to a class that's part of the library being tested.

This is a very valid point, thanks. This seems like a feature request, though; would you like to create a new issue for it so we can track it?

@okken
Contributor

okken commented May 26, 2017

Having a warning system is hugely useful.
Defaulting to all warnings on is not.

Even if everyone decides to leave this feature in, it's broken.
The only reasonable course of action is to back it out ASAP until it's more thoroughly tested.

1. This code shows a warning that should not be on by default:

test_something.py

class TestHelper():
  def __init__(self):
    pass

  def someNonTestStuff(self):
    pass

(venv) $ pytest test_something.py 
========================== test session starts ==========================
platform darwin -- Python 3.6.1, pytest-3.1.0, py-1.4.33, pluggy-0.4.0
rootdir: /Users/okken/foo, inifile:
collected 0 items 

=========================== warnings summary ============================
test_something.py::TestHelper
  cannot collect test class 'TestHelper' because it has a __init__ constructor

-- Docs: http://doc.pytest.org/en/latest/warnings.html
====================== 1 warnings in 0.00 seconds =======================

2. --strict doesn't work

(venv) $ pytest --help
...
  --strict              run pytest in strict mode, warnings become errors
...

(venv) $ pytest --strict test_something.py 
========================== test session starts ==========================
platform darwin -- Python 3.6.1, pytest-3.1.0, py-1.4.33, pluggy-0.4.0
rootdir: /Users/okken/projects/book/bopytest/Book/code/foo, inifile:
collected 0 items 

=========================== warnings summary ============================
test_something.py::TestHelper
  cannot collect test class 'TestHelper' because it has a __init__ constructor

-- Docs: http://doc.pytest.org/en/latest/warnings.html
====================== 1 warnings in 0.01 seconds =======================

@nicoddemus nicoddemus added the plugin: warnings related to the warnings builtin plugin label May 29, 2017
nicoddemus added a commit to nicoddemus/pytest that referenced this issue May 29, 2017
@nicoddemus
Member

Hi everyone,

After a discussion on the mailing list the team decided to release 3.2.0 with the new plugin being opt-in instead of being active by default (#2443).

Again we apologize for the problems many of you had; it was not our intention in the least. We simply did not foresee that the new plugin could break some test suites; otherwise we would never have released it active by default.

If all goes well, we should see a new release later today or tomorrow.

Thanks, everyone, for your feedback and understanding! 👍

@nicoddemus
Member

@okken

> This code shows a warning that should not be on by default:

Hi Brian, just saw this now, sorry. Actually, that warning should appear by default, because it helps users realize why that test class is not being collected; in fact, if I'm not mistaken, that warning already appears in pytest 3.0.

> --strict doesn't work

Definitely, opened a new issue for it: #2444.

nicoddemus added a commit to nicoddemus/pytest that referenced this issue May 29, 2017
myroslav pushed a commit to quintagroup/openpyxl that referenced this issue Sep 17, 2024
6 participants