My idea would be to use Pandas for the copy of the DataFrame (which is the biggest object in memory for the instance).

Looking at the docs (https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.copy.html), they say:

> When `deep=True`, data is copied but actual Python objects will not be copied recursively, only the reference to the object. This is in contrast to `copy.deepcopy` in the Standard Library, which recursively copies object data (see examples below).
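The behavior described in the quoted docs can be shown with a small example (the DataFrame contents here are purely illustrative): with `deep=True` the copy gets its own data, but references to Python objects stored inside object columns are shared rather than recursively copied.

```python
import pandas as pd

# A DataFrame whose cells hold mutable Python objects (lists).
df = pd.DataFrame({"col": [[1, 2], [3, 4]]})

pd_copy = df.copy(deep=True)

# The copy is an independent container...
assert pd_copy is not df

# ...but with deep=True the *references* inside object columns are shared,
# so mutating a list in the original is visible in the copy:
df.loc[0, "col"].append(99)
assert pd_copy.loc[0, "col"] == [1, 2, 99]
```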
So it could be more memory-efficient.

According to https://stackoverflow.com/questions/9058305/getting-attributes-of-a-class, a possible implementation could be to gather the attributes except for the `_df` attribute (which will be copied with `DataFrame.copy()`), and then create a new instance with a `copy.deepcopy()` of those attributes (not `_df`).

What do you think? Does it make sense?
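A minimal sketch of that idea, assuming a hypothetical `Dataset` class that stores the frame in `_df` (the class name and extra attributes are illustrative, not from the actual codebase): the DataFrame goes through pandas' own copy, everything else through the stdlib `copy.deepcopy()`.

```python
import copy

import pandas as pd


class Dataset:
    """Hypothetical class holding a DataFrame in ``_df`` plus other attributes."""

    def __init__(self, df: pd.DataFrame, name: str = "dataset"):
        self._df = df
        self.name = name

    def copy(self) -> "Dataset":
        # Create a bare instance without re-running __init__.
        new = self.__class__.__new__(self.__class__)
        for attr, value in self.__dict__.items():
            if attr == "_df":
                # Let pandas copy the DataFrame: data is copied, but Python
                # objects inside object columns are not recursively copied.
                new.__dict__[attr] = value.copy(deep=True)
            else:
                # All remaining attributes get a full recursive deep copy.
                new.__dict__[attr] = copy.deepcopy(value)
        return new
```

Iterating over `self.__dict__` keeps the method robust if attributes are added later, since anything that is not `_df` automatically falls through to `copy.deepcopy()`.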
Originally posted by @lorenz-gorini in #67 (comment)