
RLS: 1.2 #37784

Closed
simonjayhawkins opened this issue Nov 12, 2020 · 81 comments

@simonjayhawkins
Member

Tracking issue for the 1.2 release. https://github.com/pandas-dev/pandas/milestone/73

Planning a release candidate for sometime towards the end of the month, 1-2 weeks before the final release.

List of open regressions: https://github.com/pandas-dev/pandas/issues?q=is%3Aopen+is%3Aissue+label%3ARegression (includes regressions that may be included in 1.1.5)

List of open blockers: https://github.com/pandas-dev/pandas/issues?q=is%3Aopen+is%3Aissue+label%3Ablocker+

@simonjayhawkins added this to the 1.2 milestone Nov 12, 2020
@simonjayhawkins
Member Author

Nightly builds on master are currently failing: https://dev.azure.com/pandas-dev/pandas-wheels/_build/results?buildId=48433&view=results

The failures for 3.7 (for Linux and macOS) are related to -Werror (not applicable to Windows):

 #warning "Using deprecated NumPy API, disable it by " \
  ^
cc1: all warnings being treated as errors

These are not failing on 1.1.x. In setup.py on 1.1.x we have:

# in numpy>=1.16.0, silence build warnings about deprecated API usage
#  we can't do anything about these warnings because they stem from
#  cython+numpy version mismatches.
macros.append(("NPY_NO_DEPRECATED_API", "0"))
if "-Werror" in extra_compile_args:
    try:
        import numpy as np
    except ImportError:
        pass
    else:
        # LooseVersion comes from `from distutils.version import LooseVersion`
        # near the top of setup.py
        if np.__version__ < LooseVersion("1.16.0"):
            extra_compile_args.remove("-Werror")

This has been removed on master since the minimum numpy version for pandas 1.2 is now 1.16.5.

In MacPython we have NP_BUILD_DEP: "1.14.5" for 3.7.

We need to get the nightly builds to pass (before the 1.2.0rc0 release) while also allowing 1.1.5 to build.

In MacPython/pandas-wheels#111 I have changed to NP_BUILD_DEP: "1.16.5" for 3.7 and indeed the py37 wheels are building.

In MacPython/pandas-wheels#109 I have done the same to check that this change will not affect the 1.1.5 release.

Will be opening a PR shortly on MacPython with this change (if MacPython/pandas-wheels#109 passes).
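
For illustration only (this is not the pandas-wheels configuration, and the helper below is invented for the example): the idea behind the NP_BUILD_DEP pin is to build the wheels against the oldest NumPy that the pandas release supports, so the compiled extensions work with that version and anything newer at runtime. A minimal sketch, with the 3.7 value taken from this comment:

# Sketch only - not the pandas-wheels config. NP_BUILD_DEP maps a Python
# version to the oldest NumPy the release supports; wheels are built against
# that pin so they stay compatible with it and with any newer NumPy.
OLDEST_SUPPORTED_NUMPY = {"3.7": "1.16.5"}  # value from the comment above


def build_pin(python_version: str) -> str:
    """Return the NumPy requirement to build wheels against for this Python."""
    return f"numpy=={OLDEST_SUPPORTED_NUMPY[python_version]}"


print(build_pin("3.7"))  # numpy==1.16.5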

@simonjayhawkins
Member Author

the pip-test and conda-test stages of the release process are currently failing. https://github.com/simonjayhawkins/pandas-release/runs/1447063807?check_suite_focus=true

have reproduced this locally, so will investigate further.

@simonjayhawkins
Member Author

the pip-test and conda-test stages of the release process are currently failing. https://github.com/simonjayhawkins/pandas-release/runs/1447063807?check_suite_focus=true

#35895 added the use of the worker_id fixture, which is only available if pytest-xdist is installed. see #35895 (comment)

IIUC pytest-xdist is an optional dependency and pd.test is part of the public api https://pandas.pydata.org/docs/reference/api/pandas.test.html
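
For illustration, one way a worker_id-dependent helper can degrade gracefully when pytest-xdist is absent is a local fallback fixture along these lines (a sketch of the general pattern only, not the fix that went into pandas):

# conftest.py - minimal sketch, assuming pytest-xdist may not be installed.
# pytest-xdist normally provides `worker_id`; this stand-in reports "master"
# when running in a single process without the plugin.
import pytest


@pytest.fixture
def worker_id(request):
    """Return the xdist worker id, or "master" when pytest-xdist is not in use."""
    workerinput = getattr(request.config, "workerinput", None)
    if workerinput is not None:
        return workerinput.get("workerid", "master")
    return "master"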

cc @TomAugspurger

@jreback
Contributor

jreback commented Nov 24, 2020

@pandas-dev/pandas-core don't tag any more issues/PRs for 1.2 unless they are actually ready to merge; better to just ping and say they are ready for 1.2.

@simonjayhawkins
Member Author

wrt documentation

@jorisvandenbossche
Member

imo the tracebacks are a distraction https://pandas.pydata.org/pandas-docs/dev/whatsnew/v1.2.0.html#optionally-disallow-duplicate-labels - any alternative to this?

Yes, I also noticed that and wanted to do something about it. I think the easiest alternative for now is a plain code-block with the output we want (a nicer general solution would be #10715).

@simonjayhawkins
Member Author

1.1.5 is in a good state for release. Not sure whether the preference is to do 1.1.5 before/after 1.2.0rc0

I'll open an issue for tracking the 1.1.5 release shortly. I'm thinking, looking at the open issues, that I will just remove milestones from all the open issues, with a message about an imminent release and stating that community contributions are welcome.

Some of the issues have now had the milestone changed several times, some from 1.0.x. So that's why I think we could now just remove the milestone altogether. If I do this today, that will give interested parties time to comment prior to the release.

Scheduled for Monday, but could do, say, 1.2.0rc0 Monday, 1.1.5 the following Monday, and 1.2 the Monday after that.

thoughts?

@simonjayhawkins
Member Author

simonjayhawkins commented Nov 25, 2020

Yes, I also noticed that and wanted to do something about it.

I think, unlike code changes, it's ok to tidy release notes between 1.2.0rc0 and 1.2, so not a blocker imo.

I'm going to look at the tag script later, so that the v1.2.0rc0 tag, the 1.2.x branch and the v1.3.0.dev0 tag are created atomically, to try and deter any changes after tagging the release candidate other than fixes for bugs/regressions reported against the rc. Any older bugs/regressions could perhaps wait for v1.2.1.
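
For illustration, the gist of creating those refs together could look like this sketch (a sketch under assumptions, not the actual pandas-release tag script; the remote name and tag messages are made up for the example):

# Sketch only - not the pandas-release tooling. Create the RC tag, the
# maintenance branch and the next dev tag in one pass, then push them together
# so nothing else lands in between.
import subprocess


def run(*cmd: str) -> None:
    subprocess.run(cmd, check=True)


def cut_rc(rc_tag: str = "v1.2.0rc0", branch: str = "1.2.x",
           dev_tag: str = "v1.3.0.dev0", remote: str = "upstream") -> None:
    run("git", "tag", "-a", rc_tag, "-m", f"RLS: {rc_tag}")      # tag the release candidate at HEAD
    run("git", "branch", branch, rc_tag)                         # start the 1.2.x branch at that tag
    run("git", "tag", "-a", dev_tag, "-m", f"START: {dev_tag}")  # mark master as 1.3 development
    run("git", "push", remote, rc_tag, branch, dev_tag)          # publish all three refs in one push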

@jorisvandenbossche
Member

I don't know if an rc for 1.2 on Monday is realistic, given the number of open issues?

Regarding the issues tagged for 1.1.5: I agree that several of them can be removed from that milestone (e.g. some performance regressions that are a year old and were already partly solved). But for the actual behaviour regressions, if they are removed from the 1.1.5 milestone they should then be added to the 1.2 (or 1.2.1) milestone, IMO, so we don't lose track of them.

@jorisvandenbossche
Member

BTW, I have found it easier in the past to create a temporary milestone for "1.2.0rc", so we can distinguish between issues that only need to be fixed before the final 1.2 release, and issues that are blockers for the RC. After the RC we can put the issues labeled with 1.2.0rc back to the 1.2 milestone with a batch change, and remove the milestone.
(just a suggestion, up to you)

@simonjayhawkins
Member Author

BTW, I have found it easier in the past to create a temporary milestone for "1.2.0rc", so we can distinguish between issues that only need to be fixed before the final 1.2 release, and issues that are blockers for the RC. After the RC we can put the issues labeled with 1.2.0rc back to the 1.2 milestone with a batch change, and remove the milestone.
(just a suggestion, up to you)

That sort of goes against my thinking that a release candidate is more or less final, subject to any issues with the rc.

I'm not keen on doing an rc if there are more changes planned before the final release.

@simonjayhawkins
Member Author

But for the actual behaviour regressions, if they are removed from the 1.1.5 milestone they should then be added to the 1.2 (or 1.2.1) milestone, IMO, so we don't lose track of them.

In a similar vein to my above response: I wouldn't be keen on moving them to 1.2, OK with 1.2.1 though.

@jorisvandenbossche
Member

That sort of goes against my thinking that a release candidate is more or less final, subject to any issues with the rc.

Yep, I am certainly fine with that. In the past we have been more liberal with changes during the RC, but I think it would be good to change to what you propose and be more strict. In that case another label might not be needed (there could still be issues only for the final release, but it should be only a very few).

I wouldn't be keen on moving them to 1.2, ok with 1.2.1 though.

Unless our goal is to actually fix them. IMO we should take the time to go through the remaining issues, identify the important regressions, and actually try to fix them before the release. Even if that delays the release a little bit, IMO that's still a good outcome. We have some support for maintenance now, and fixing release blockers can be part of that IMO.

@simonjayhawkins
Member Author

I don't know if an rc for 1.2 on Monday is realistic, given the number of open issues?

Only 3 are labelled as blockers. Once/if MacPython/pandas-wheels#113 is merged, the blocker label can be removed from #36429.

If a non-blocking issue isn't fixed before the rc, then it'll have to wait for 1.2.1 or 1.3 (maybe that sounds a bit harsh).

IIUC our release schedule is more calendar-based than feature-based.

@jorisvandenbossche
Member

IIUC our release schedule is more calendar-based than feature-based.

I was not really talking about features, but about regressions, though

@jreback
Contributor

jreback commented Nov 25, 2020

We need to finish off the PRs (about 10) and issues and either move them off or merge them. I think by Monday is doable.

@simonjayhawkins
Member Author

the pip-test and conda-test stages of the release process are currently failing. https://github.com/simonjayhawkins/pandas-release/runs/1447063807?check_suite_focus=true

#35895 added the use of the worker_id fixture, which is only available if pytest-xdist is installed. see #35895 (comment)

IIUC pytest-xdist is an optional dependency and pd.test is part of the public api https://pandas.pydata.org/docs/reference/api/pandas.test.html

cc @TomAugspurger

For now, if this is not blocking, I will add pytest-xdist to the pip test and conda recipe in pandas-release, to get passing tests. Tested at https://github.com/simonjayhawkins/pandas-release/actions/runs/388724204

@simonjayhawkins
Member Author

We need to finish off the PRs (about 10) and issues and either move them off or merge them. I think by Monday is doable.

All the nightly wheels built on Azure overnight: https://dev.azure.com/pandas-dev/pandas-wheels/_build/results?buildId=48896&view=results and https://anaconda.org/scipy-wheels-nightly/pandas/files

(no nightly wheels for ARM, MacPython/pandas-wheels#102)

So from the wheel-building perspective, a 1.2.0rc0 release is doable.

@jreback
Contributor

jreback commented Nov 29, 2020

the only remaining PRs which we should do before the RC are:

Others can be done during the RC period; that's ok.

@simonjayhawkins
Member Author

the only remaining PRs which we should do before the RC are:

I've added the blocker tag to these.

@jorisvandenbossche you can add the blocker tag to the PRs you've created today, if they need to be done before rc.

@jorisvandenbossche
Member

jorisvandenbossche commented Nov 30, 2020

I would like to get in the fixes I am doing for FloatingArray: #38178 (so that PR would need to get a review).

For the rest, they are not blockers for the RC, but they still are for 1.2, IMO. So if you are fine with doing some more fixes after the RC, that's fine.

BTW, it might make more sense to first do a 1.1.5 release, and afterwards cut the 1.2.0rc. Because any fix going into 1.1.5 also needs to be in 1.2.0 (I think it wouldn't make sense to fix a regression in 1.1.5, but have it fail again on 1.2.0). But again, that depends on how liberal we want to be with more fixes after the RC.

@jreback
Contributor

jreback commented Nov 30, 2020

I don't think it makes a big difference whether 1.1.5 or 1.2.0rc comes out first.

@simonjayhawkins
Member Author

BTW, it might make more sense to first do a 1.1.5 release, and afterwards cut the 1.2.0rc. Because any fix going into 1.1.5 also needs to be in 1.2.0 (I think it wouldn't make sense to fix a regression in 1.1.5, but have it fail again on 1.2.0). But again, that depends on how liberal we want to be with more fixes after the RC.

In #37784 (comment), I suggested:

Scheduled for Monday, but could do, say, 1.2.0rc0 Monday, 1.1.5 the following Monday, and 1.2 the Monday after that.

I don't think it makes a big difference whether 1.1.5 or 1.2.0rc comes out first.

It just means two backports, but since it's OK to do regression fixes between 1.2.0rc0 and 1.2, it gives another week to look into the outstanding regressions from 1.0.5.

@jorisvandenbossche
Member

If we are doing more fixes after the 1.2 RC, then that's certainly fine for me as well (just wanted to make sure we are on the same page, as you noted earlier that you proposed to be more strict about what gets included after the 1.2 RC by branching directly. But of course, for regressions, even ones not found by the RC itself but already known now, it makes sense to still add those to 1.2 after the RC anyway).

@simonjayhawkins
Member Author

Not had a response from @mroeschke on #38523, which should maybe be fixed for 1.2.

@simonjayhawkins
Member Author

Wheels are building on the nightly. Will open a PR to test 1.2.x shortly, after the recent backports are merged to 1.2.x.

@simonjayhawkins
Member Author

Have spent some time on #38451, unsuccessfully, but I see that has been moved off 1.2 (not a blocker, will publish the pdf docs with known issues).

@jorisvandenbossche
Member

It has only been a good week. If we want people to test the RC, we need to give it a bit more time IMO.

@jorisvandenbossche
Member

BTW, I checked some upstream projects: statsmodels has one failure (statsmodels/statsmodels#7215), but I didn't check in detail if it's caused by a regression or was a bug in statsmodels. Seaborn is passing (mwaskom/seaborn#2392). For dask, I opened a PR to test the RC (dask/dask#6996) which has a bunch of failures, but I didn't investigate yet.

@simonjayhawkins
Member Author

The most important outstanding regressions all seem to be problematic to resolve:

  1. Indexing a DatetimeIndex with a date object: #35830 ("API: indexing DatetimeIndex with date object") and #35466 ("BUG: Date objects cannot be compared against a DatetimeIndex") - old behaviour should probably be restored and deprecated (a small reproduction sketch follows below)
  2. Behaviour change in DataFrame.apply(): #35518 ("BUG: Pandas 1.0.5 → 1.1.0 behavior change on DataFrame.apply() where func returns tuple") and #35517 ("BUG: Pandas 1.0.5 → 1.1.0 behavior change on DataFrame.apply() where fn returns np.ndarray") - old behaviour should probably be restored
  3. Different behaviour of df.isin() when df contains None (we are inconsistent in many places, e.g. with df.eq)
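
For illustration, a minimal reproduction sketch for item 1 (the exact per-version behaviour is as reported in #35830 / #35466 and is not re-verified here):

# Sketch of the report in #35830 / #35466: indexing (and comparing) a
# DatetimeIndex with a plain datetime.date behaves differently on 1.1.x than
# it did on 1.0.x; on some versions these lines may raise, which is the point.
import datetime

import pandas as pd

ser = pd.Series([1, 2, 3], index=pd.date_range("2020-01-01", periods=3))
key = datetime.date(2020, 1, 2)

print(ser[key])          # lookup with a date object (the indexing report)
print(ser.index == key)  # comparison against a date object (the comparison report)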

@simonjayhawkins
Member Author

It has only been a good week. If we want people to test the RC, we need to give it a bit more time IMO.

It's 2 weeks tomorrow (or the next day for conda-forge). The conda-forge download numbers are disappointing... https://anaconda.org/conda-forge/pandas/files

@jreback
Contributor

jreback commented Dec 21, 2020

Sure, we are going to have a 1.2.1 - let's not hold anything up (except for the xlrd patch).

@jreback
Contributor

jreback commented Dec 21, 2020

Let's just release; we are planning a 1.2.1. People really do not test the rc0s, so if we release we will have actual reports. Waiting only begets waiting.

@jorisvandenbossche
Member

Let's give it at least a few more days. As mentioned above, I started running the dask test suite, but the failures still need to be checked. I already identified one regression.

@jreback
Contributor

jreback commented Dec 21, 2020

Let's give it at least a few more days. As mentioned above, I started running the dask test suite, but the failures still need to be checked. I already identified one regression.

Sure, but let's set a release date. How about Wed/Thurs (@simonjayhawkins, according to your schedule)? Anything else can certainly wait till 1.2.1.

@simonjayhawkins
Member Author

The date for me is not critical, since Christmas is cancelled. So anytime.

@mrocklin
Contributor

The date for me is not critical, since Christmas is cancelled. So anytime.

Hi folks! We were just discussing this on the Dask weekly call. We're scrambling to address the breakages that this introduces in Dask, but are somewhat short-staffed this week due to the holidays. If it's possible to delay releasing a bit while we get this sorted that would be welcome.

More broadly, this may be the first joint pandas+dask release since Tom left Anaconda and moved to Microsoft. He historically smoothed over co-releases, and we need to build a process to fill that gap on our end going forward.

@jreback
Contributor

jreback commented Dec 22, 2020

@mrocklin what exactly is failing in dask? I think @jorisvandenbossche pointed to a pandas regression, but AFAIK dask is not failing on anything? (Or better, cross-link.)

Happy to try to accommodate, but we are already way behind on this release. The .0 releases are typically not heavily tested anyhow, so delaying them doesn't actually do anything.

@mrocklin
Contributor

It looks like @jorisvandenbossche and @jsignell are working on it here: dask/dask#6996

@simonjayhawkins
Member Author

It looks like @jorisvandenbossche and @jsignell are working on it here: dask/dask#6996

@jorisvandenbossche is it just #38649 that would need to be included in 1.2 to cover the dask failures?

@jorisvandenbossche
Member

Yes, there are some more failures as well, but from a quick look those are either things that are known and need to be updated in dask or are already fixed in pandas. So #38649 is the main one that would be good to have.

@simonjayhawkins
Member Author

Once #38678 is backported, we will be ready for the final pre-release checks.

@simonjayhawkins
Member Author

Release notes checked - formatting issue (a doubled IPython prompt) in https://pandas.pydata.org/pandas-docs/dev/whatsnew/v1.2.0.html#consistency-of-dataframe-reductions

In [26]: In [5]: df.all(bool_only=True)

Not a blocker.

@simonjayhawkins
Member Author

simonjayhawkins commented Dec 26, 2020

CI checks

Known failure:

=========================== short test summary info ============================
FAILED pandas/tests/io/test_html.py::TestReadHtml::test_banklist_url_positional_match[bs4]
= 1 failed, 146190 passed, 3165 skipped, 1091 xfailed, 59 warnings in 2716.21s (0:45:16) =

Not affecting wheel builds, xref #38703:

=========================== short test summary info ============================
FAILED pandas/tests/arrays/categorical/test_warnings.py::TestCategoricalWarnings::test_tab_complete_warning
FAILED pandas/tests/frame/test_api.py::TestDataFrameMisc::test_tab_complete_warning[DataFrame]
FAILED pandas/tests/frame/test_api.py::TestDataFrameMisc::test_tab_complete_warning[Series]
FAILED pandas/tests/indexes/test_base.py::TestIndex::test_tab_complete_warning
FAILED pandas/tests/resample/test_resampler_grouper.py::test_tab_complete_ipython6_warning
= 5 failed, 145318 passed, 4090 skipped, 1090 xfailed, 6 warnings in 1050.35s (0:17:30) =

@simonjayhawkins
Member Author

simonjayhawkins commented Dec 26, 2020

starting release now.

@simonjayhawkins
Member Author

The 3.7 ARM build failed. Moving forward with the release without it.

@jorisvandenbossche
Member

@simonjayhawkins thanks for the release!

You didn't send an announcement email yet?

(Also, if you do, could you use the "1.2" link instead of "1.2.0"? E.g. https://pandas.pydata.org/pandas-docs/version/1.2/whatsnew/v1.2.0.html instead of https://pandas.pydata.org/pandas-docs/version/1.2.0/whatsnew/v1.2.0.html)

@simonjayhawkins
Member Author

simonjayhawkins commented Jan 19, 2021

The 3.7 ARM build failed. Moving forward with the release without it.

Closing, as the discussion on official aarch64 support is still open: #33971
