jbrockmendel
Going through `gc.get_objects()` I don't see any big objects left behind
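For anyone reproducing this, a rough way to do that sweep (the 1 MB threshold here is arbitrary, and `sys.getsizeof` only counts the object itself, not what it references):

```python
import gc
import sys

# Coarse filter: scan all gc-tracked objects for anything individually large.
big = [obj for obj in gc.get_objects() if sys.getsizeof(obj) > 1_000_000]
sizes = sorted((sys.getsizeof(obj) for obj in big), reverse=True)
```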
Is there a viable non-pickle alternative? When I change `pickle.dumps(group)` to `pickle.dumps(group.values)` to pickle the underlying ndarrays, I end up with 50-60 MB less (and the gc.collect no longer gets...
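As a rough illustration of where the difference comes from (toy data, exact sizes will vary):

```python
import pickle

import numpy as np
import pandas as pd

df = pd.DataFrame({"a": np.arange(100_000), "b": np.arange(100_000)})

full = len(pickle.dumps(df))                # frame + index + block metadata
arrays_only = len(pickle.dumps(df.values))  # just the underlying 2D ndarray

# The raw ndarray payload is smaller because the DataFrame wrapper objects
# (index, column labels, block-manager structure) are skipped entirely.
```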
IIRC np.where uses NEP 18, which means getting this working would require implementing `__array_function__`. I've tried that a couple of times with little luck. A PR would be welcome.
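For context, NEP 18 dispatch means `np.where` hands the call to any argument whose type defines `__array_function__`. A minimal sketch of the mechanism (the `Wrapped` container here is hypothetical, not pandas' actual code):

```python
import numpy as np

class Wrapped:
    """Toy array container illustrating NEP 18 dispatch."""

    def __init__(self, values):
        self._values = np.asarray(values)

    def __array_function__(self, func, types, args, kwargs):
        # Unwrap our own type, defer to the NumPy implementation,
        # then re-wrap any ndarray result.
        unwrapped = tuple(a._values if isinstance(a, Wrapped) else a for a in args)
        result = func(*unwrapped, **kwargs)
        return Wrapped(result) if isinstance(result, np.ndarray) else result

mask = Wrapped([True, False, True])
out = np.where(mask, Wrapped([1, 2, 3]), 0)  # dispatches to __array_function__
```

The real work in pandas would be routing each dispatched `func` to the right EA/DataFrame logic rather than just unwrapping, which is where the previous attempts got stuck.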
```
_________________________________________________________ TestTableSchemaRepr.test_publishes __________________________________________________________

self =
ip =

    def test_publishes(self, ip):
        ipython = ip.instance(config=ip.config)
        df = DataFrame({"A": [1, 2]})
        objects = [df["A"], df]  # dataframe / series
        expected_keys = [...
```
That fixes it for me too
This came up in the last dev call. I think Irv's solution was to make a non-cdef class with the docstring. The perf impact was about 100 ns on construction, which...
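If I understand the workaround, it's roughly this shape (names hypothetical, not the actual pandas classes):

```python
# The fast implementation stays in the cython cdef extension class
# (a plain class stands in for it here); a thin Python-level subclass
# exists only to carry the docstring that sphinx/cython currently
# mishandle on the cdef class itself.
class _FastThing:  # stands in for the cdef class
    def __init__(self, value):
        self.value = value

class FastThing(_FastThing):
    """Public docstring, introspectable by Sphinx."""
```

The extra Python-level subclass hop is presumably where the ~100 ns construction cost comes from.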
Is this something that can be reverted once sphinx/cython fixes something upstream? If so, can there be a `# TODO(sphinx3.14.159): ...` attached to these?
closable?
Can you post an example that is copy-pastable without having to load an unknown zip file? See https://matthewrocklin.com/minimal-bug-reports.html
What kind of tzinfo object are you getting back? It might be fixable by passing an appropriate pydatetime object to `utcoffset`, but we wouldn't want to pay the cost of constructing...
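For reference, `tzinfo.utcoffset` takes a datetime argument; a fixed-offset example just to show the call shape (not the actual fix):

```python
from datetime import datetime, timedelta, timezone

tz = timezone(timedelta(hours=-5))
# utcoffset needs a concrete datetime; fixed-offset zones ignore it,
# but DST-aware tzinfos use it to decide which offset applies.
off = tz.utcoffset(datetime(2021, 6, 1))
```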