BUG: Memory leak when creating a df inside a loop #60897
Comments
Thanks for the report; I cannot reproduce this on Linux. Can you include the stdout from your reproducer? Further investigations are welcome!
I use Windows, so I tried running the reproducer. Here's an excerpt of what I got from stdout, since GitHub isn't letting me post the whole thing. (Note: I don't have Pandas installed in the core Python path, so I manually redacted my user directory out of the result.)
@rhshadrach What Linux OS and architecture are you using? I'm using an aarch64 Raspberry Pi on Ubuntu and I still get the memory leak.
@narukaze132 - thanks, I neglected to see how many times the reproducer was looping. I've cut your output down to the first 10 iterations; this is more than enough already.
Hi! I also ran the script above in my dev environment and I see the same type of behaviour. If there's anything more you need, please let me know :) The production environment versions:
My dev environment versions:
My dev run script output:
@jacobus-herman - can you post the output of this from your Linux env? It is modified from the OP.

```python
import pandas as pd
import tracemalloc
import numpy as np
import time
import gc

# Start memory tracking
tracemalloc.start()

iteration = 0
Row_Number = 20000
prev_snapshot = tracemalloc.take_snapshot()
while iteration < 500:
    test_lst = [*range(12)]
    for i in range(12):
        # Create a DataFrame with Row_Number rows
        df = pd.DataFrame({
            "A": np.arange(Row_Number),  # Sequential integers from 0 to Row_Number - 1
            "B": np.random.rand(Row_Number),  # Random floats between 0 and 1
            "C": np.random.randint(0, 100, size=Row_Number),  # Random integers between 0 and 99
            "D": np.random.choice(["apple", "banana", "cherry"], size=Row_Number),  # Random categories
            "E": np.random.randn(Row_Number)  # Normally distributed random values
        })
        test_lst[i] = df  # The bug also appears without appending to the list
        del df  # Deleting df at the end of the loop doesn't affect the leak
    del test_lst  # Deleting the list at the end of the loop doesn't affect the leak
    iteration += 1
    # Report memory growth attributed to third-party packages
    if iteration % 50 == 0:
        snapshot = tracemalloc.take_snapshot()
        top_stats = snapshot.compare_to(prev_snapshot, 'lineno')
        print(f"{iteration=}")
        for k, stat in enumerate(top_stats[:10]):
            if "site-packages/pandas" in str(stat.traceback):
                print("  ", k, stat)
        prev_snapshot = snapshot
tracemalloc.stop()
```
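For anyone digging further: tracemalloc can also group statistics by full traceback rather than a single line number, which shows the whole allocation path inside pandas instead of just `blocks.py:228`. A minimal sketch of that technique (not code from this thread; the column names and sizes are illustrative):

```python
import tracemalloc

import numpy as np
import pandas as pd

# Keep up to 25 stack frames per allocation so the traceback is useful
tracemalloc.start(25)

df = pd.DataFrame({"A": np.arange(1000), "B": np.random.rand(1000)})
snapshot = tracemalloc.take_snapshot()
tracemalloc.stop()

# Group by full traceback; each stat's traceback shows the complete
# call chain that performed the allocation.
top_stats = snapshot.statistics("traceback")
for stat in top_stats[:3]:
    print(f"{stat.size / 1024:.1f} KiB in {stat.count} blocks")
    for line in stat.traceback.format():
        print(line)
```

This is slower than the default single-frame tracking, but it makes it much easier to see which pandas internal is holding the allocation.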
@rhshadrach, sure, the Linux environment output is below.
I am expecting to see the same without
Thanks @Chuck321123 - LLMs can be helpful assistants, but I think at this time their responses need to go through a human filter to discern whether they are accurate / helpful. In this particular case, do you think ChatGPT's response was helpful? If you aren't able to tell, I would highly recommend not posting it into issues. Doing so can add noise without signal and makes issues harder to understand.
@rhshadrach Sorry, just wanted to help. |
Hey Richard! I ran it without the
So while
Thanks @jacobus-herman! @Chuck321123 - can you run the code in #60897 (comment) and post the output you get?
@rhshadrach I tried, and I get similar results to @jacobus-herman. It seems to stabilize after 400-500 iterations. However, the memory usage is more than it should be; it stabilizes at around 8.5 KB if you iterate for longer.
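One way to check whether the stabilization reported above really is a plateau (caches warming up) rather than a slow leak is to force a `gc.collect()` before each measurement and compare `tracemalloc.get_traced_memory()` across rounds. A minimal sketch, assuming the same DataFrame shape as the reproducer (not code from this thread):

```python
import gc
import tracemalloc

import numpy as np
import pandas as pd

tracemalloc.start()
readings = []
for round_no in range(5):
    for _ in range(100):
        df = pd.DataFrame({"A": np.arange(20_000), "B": np.random.rand(20_000)})
        del df
    gc.collect()  # rule out objects kept alive only by reference cycles
    current, peak = tracemalloc.get_traced_memory()
    readings.append(current)
    print(f"round {round_no}: {current / 1024:.1f} KiB currently traced")
tracemalloc.stop()

# If the later readings are roughly flat, the early growth was caches
# filling up; if they keep climbing round after round, it looks like
# a genuine leak.
```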
Pandas version checks
I have checked that this issue has not already been reported.
I have confirmed this bug exists on the latest version of pandas.
I have confirmed this bug exists on the main branch of pandas.
Reproducible Example
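The code under this heading was not captured in this copy of the page. Based on the modified version quoted later in the thread, the original reproducer was along these lines (a sketch, not the verbatim original):

```python
# Sketch reconstructed from the modified version quoted later in the
# thread; the original snippet was not captured in this copy.
import tracemalloc

import numpy as np
import pandas as pd

tracemalloc.start()
Row_Number = 20_000
for iteration in range(1, 201):
    df = pd.DataFrame({
        "A": np.arange(Row_Number),
        "B": np.random.rand(Row_Number),
        "C": np.random.randint(0, 100, size=Row_Number),
        "D": np.random.choice(["apple", "banana", "cherry"], size=Row_Number),
        "E": np.random.randn(Row_Number),
    })
    del df
    if iteration % 50 == 0:
        snapshot = tracemalloc.take_snapshot()
        top_stats = snapshot.statistics("lineno")
        print(f"{iteration=}")
        for stat in top_stats[:5]:
            print("  ", stat)
tracemalloc.stop()
```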
Issue Description
Using tracemalloc (a tool for tracking memory allocations), I can see that pandas doesn't release memory when creating DataFrames inside a loop. The problem seems to come from pandas\core\internals\blocks around line 228. It would be nice if anyone could find a fix for this.
Expected Behavior
That the memory doesn't leak.
Installed Versions
INSTALLED VERSIONS
commit : 0691c5c
python : 3.13.1
python-bits : 64
OS : Windows
OS-release : 11
Version : 10.0.22631
machine : AMD64
processor : Intel64 Family 6 Model 186 Stepping 2, GenuineIntel
byteorder : little
LC_ALL : None
LANG : en
LOCALE : Norwegian Bokmål_Norway.1252
pandas : 2.2.3
numpy : 2.2.2
pytz : 2024.2
dateutil : 2.9.0.post0
pip : 24.2
Cython : None
sphinx : 8.1.3
IPython : 8.31.0
adbc-driver-postgresql: None
adbc-driver-sqlite : None
bs4 : 4.12.3
blosc : None
bottleneck : None
dataframe-api-compat : None
fastparquet : None
fsspec : None
html5lib : None
hypothesis : None
gcsfs : None
jinja2 : 3.1.5
lxml.etree : None
matplotlib : 3.10.0
numba : None
numexpr : None
odfpy : None
openpyxl : 3.1.5
pandas_gbq : None
psycopg2 : None
pymysql : None
pyarrow : 19.0.0
pyreadstat : None
pytest : None
python-calamine : None
pyxlsb : None
s3fs : None
scipy : 1.15.1
sqlalchemy : None
tables : None
tabulate : 0.9.0
xarray : None
xlrd : None
xlsxwriter : None
zstandard : None
tzdata : 2024.2
qtpy : 2.4.2
pyqt5 : None