Been loving django-pandas so far!
One issue I've been having is Python eating up way too much memory for some of my larger tables. Once read in, one of my dataframes is only 2.1 MB, built from around 150k rows, yet I'm getting memory quota errors on Heroku because of it.
A more memory-efficient iteration for io.read_frame would be great, e.g. something like the sketch below.
Something like this: http://www.poeschko.com/2012/02/memory-efficient-django-queries/
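As a rough illustration (not an existing django-pandas API), here is a minimal sketch that adapts the chunked, primary-key-based iteration from that post to read_frame: each chunk is loaded as its own small DataFrame and the pieces are concatenated at the end, so only chunk_size model instances are in memory at once. The function name read_frame_chunked, the chunk_size default, and the usage with MyModel are all hypothetical.

```python
import gc

import pandas as pd
from django_pandas.io import read_frame


def read_frame_chunked(queryset, chunk_size=10000, **kwargs):
    """Build a DataFrame from a queryset in primary-key-ordered chunks."""
    pk_field = queryset.model._meta.pk.name
    ordered = queryset.order_by(pk_field)
    frames = []
    last_pk = None
    while True:
        chunk = ordered if last_pk is None else ordered.filter(**{pk_field + "__gt": last_pk})
        # Fetch only the primary keys for this chunk first, so we know where it ends.
        pks = list(chunk.values_list(pk_field, flat=True)[:chunk_size])
        if not pks:
            break
        # read_frame only ever sees chunk_size rows at a time.
        frames.append(read_frame(ordered.filter(**{pk_field + "__in": pks}), **kwargs))
        last_pk = pks[-1]
        gc.collect()  # the linked post suggests forcing collection between chunks
    return pd.concat(frames, ignore_index=True) if frames else pd.DataFrame()


# Hypothetical usage:
# df = read_frame_chunked(MyModel.objects.all(), chunk_size=5000)
```

This trades a couple of extra queries per chunk for a bounded working set, which is the same idea the blog post uses to keep Django from materializing the whole queryset at once.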