BUG: Segfault on np.maximum(series, ...)
#60611
Labels
Bug, Needs Info, Needs Triage, Regression, ufuncs (__array_ufunc__ and __array_function__)
Pandas version checks
- I have checked that this issue has not already been reported.
- I have confirmed this bug exists on the latest version of pandas.
- I have confirmed this bug exists on the main branch of pandas.
Reproducible Example
Issue Description
Executing the code above crashes with

Segmentation fault (core dumped)

`np.maximum(...)` enters an infinite call cycle that eventually exceeds the maximum stack size. Call stack (bottom up):

- `__array_ufunc__`, generic.py:2171 (core/generic.py)
- `array_ufunc`, arraylike.py:399 (core/arraylike.py)
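The cycle described above can be illustrated with a toy wrapper class (not pandas code — the names `BadWrapper`/`GoodWrapper` are hypothetical): if `__array_ufunc__` re-invokes the ufunc while the wrapper object is still among the inputs, NumPy dispatches straight back to `__array_ufunc__`, and the two call each other until the stack is exhausted. Unwrapping the inputs to plain ndarrays first breaks the cycle, which is roughly what pandas' dispatch in `core/arraylike.py` is supposed to do.

```python
import sys
import numpy as np

class BadWrapper:
    """Toy sketch of the failure mode: re-dispatches to itself forever."""
    def __init__(self, values):
        self.values = np.asarray(values)

    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        # Bug: `inputs` still contains wrapper objects, so this call
        # dispatches right back into __array_ufunc__ -> infinite cycle.
        return getattr(ufunc, method)(*inputs, **kwargs)

class GoodWrapper(BadWrapper):
    """Unwraps inputs first, so NumPy never dispatches back here."""
    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        unwrapped = [x.values if isinstance(x, BadWrapper) else x
                     for x in inputs]
        return GoodWrapper(getattr(ufunc, method)(*unwrapped, **kwargs))

# Keep the demo cheap: hit Python's RecursionError long before the
# C stack overflows into an actual segfault.
sys.setrecursionlimit(200)
try:
    np.maximum(BadWrapper([1, 5]), BadWrapper([3, 2]))
    bad_result = "no recursion"
except RecursionError:
    bad_result = "infinite __array_ufunc__ cycle"

good_result = np.maximum(GoodWrapper([1, 5]), GoodWrapper([3, 2])).values.tolist()
print(bad_result, good_result)
```

In the reported crash the recursion runs through enough C frames that the process segfaults before Python's recursion limit can fire, which is why the failure surfaces as a core dump rather than a `RecursionError`.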
Expected Behavior
No recursion and successful execution of the code. This used to work fine in pandas==2.1.1 (and possibly in some later versions as well).

Installed Versions
INSTALLED VERSIONS
commit : 0691c5c
python : 3.13.1
python-bits : 64
OS : Linux
OS-release : 6.12.5-200.fc41.x86_64
Version : #1 SMP PREEMPT_DYNAMIC Sun Dec 15 16:48:23 UTC 2024
machine : x86_64
processor :
byteorder : little
LC_ALL : None
LANG : en_AU.UTF-8
LOCALE : en_AU.UTF-8
pandas : 2.2.3
numpy : 2.2.1
pytz : 2020.4
dateutil : 2.9.0.post0
pip : 24.3.1
Cython : 3.0.11
sphinx : None
IPython : None
adbc-driver-postgresql: None
adbc-driver-sqlite : None
bs4 : None
blosc : None
bottleneck : 1.4.2
dataframe-api-compat : None
fastparquet : None
fsspec : None
html5lib : None
hypothesis : None
gcsfs : None
jinja2 : None
lxml.etree : None
matplotlib : None
numba : None
numexpr : 2.10.2
odfpy : None
openpyxl : 3.1.2
pandas_gbq : None
psycopg2 : 2.9.10
pymysql : None
pyarrow : 18.1.0
pyreadstat : None
pytest : 8.3.4
python-calamine : None
pyxlsb : None
s3fs : None
scipy : 1.14.1
sqlalchemy : None
tables : 3.10.1
tabulate : None
xarray : None
xlrd : 2.0.1
xlsxwriter : None
zstandard : None
tzdata : 2024.2
qtpy : None
pyqt5 : None