pystore
Fast data store for Pandas time-series data
If I read back an item with no _metadata files but a metadata.json, I end up with the error below when fastparquet attempts to read the metadata.json as a parquet...
Hi there, I was following the pystore tutorial to check its performance. It seems that when it tries to load data, the dataframe is empty and I get this error:  Python...
https://github.com/ranaroussi/pystore/blob/f3c864ca9d9321b3d5a83793e3f325708aa5aef3/pystore/collection.py#L170 this is causing most of the exceptions that prevent append from working; maybe print something instead of returning empty.
the latest stock bar data may change because its timeframe is not finished yet. For example:
```python
>>> datetime.now()
datetime.datetime(2019, 12, 5, 15, 48, 46, 595878)
>>> exchange.fetch_ohlcv(symbol='BTC/USDT', limit=3)
[[1575531960000,...
```
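One common workaround for the issue above is to drop the still-forming last bar before storing, so that only closed candles are appended. A minimal sketch, assuming the list-of-lists row layout that ccxt's `fetch_ohlcv` returns; `drop_unfinished_bar` is a hypothetical helper, not part of pystore or ccxt:

```python
def drop_unfinished_bar(bars, timeframe_ms, now_ms):
    """Drop the last OHLCV bar if its timeframe has not closed yet.

    `bars` is a list of [timestamp_ms, open, high, low, close, volume]
    rows; a bar that starts at t is only final once now >= t + timeframe.
    """
    if bars and bars[-1][0] + timeframe_ms > now_ms:
        return bars[:-1]
    return bars

# Two 1-minute bars; "now" falls inside the second one, so it is still forming.
bars = [
    [1575531960000, 1, 1, 1, 1, 1],
    [1575532020000, 2, 2, 2, 2, 2],
]
finished = drop_unfinished_bar(bars, 60_000, now_ms=1575532050000)
print(len(finished))  # 1 -- only the closed bar survives
```

This keeps the stored series append-only: the unfinished bar is re-fetched and written on the next poll, once it has closed.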
It seems that when using collection.append, some data is not written. If I do a `collection.write()`, then one or more `collection.append()` calls, and then read it back, I get fewer rows...
Based on this code, on each append we load all the data into memory to check for duplicates, then write all the data back to rewrite the parquet....
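The read-everything-then-rewrite pattern described in this issue can be sketched in plain pandas. This is an illustration of the behavior being criticized, not pystore's actual code; `append_no_duplicates` is a hypothetical helper:

```python
import pandas as pd

def append_no_duplicates(existing: pd.DataFrame, new: pd.DataFrame) -> pd.DataFrame:
    """Load all existing rows, concatenate the new ones, and drop
    duplicate index entries before rewriting the whole dataset --
    the O(total data) work this issue points out happens on every append."""
    combined = pd.concat([existing, new])
    # keep="first" keeps the already-stored row when an index value repeats
    combined = combined[~combined.index.duplicated(keep="first")]
    return combined.sort_index()

old = pd.DataFrame({"close": [1.0, 2.0]},
                   index=pd.to_datetime(["2019-12-05 15:46", "2019-12-05 15:47"]))
new = pd.DataFrame({"close": [2.5, 3.0]},
                   index=pd.to_datetime(["2019-12-05 15:47", "2019-12-05 15:48"]))
merged = append_no_duplicates(old, new)
print(len(merged))  # 3 -- the duplicate 15:47 bar from `new` is dropped
```

The cost is proportional to the full dataset on every call, which is why frequent small appends get slow as the item grows.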
It seems I cannot have MultiIndexes in my frame when storing data ... how do you handle minute bars ... especially for futures, when a session crosses days......
This issue has been posted before: I can't get "append" to update the data using the demo example. I changed the Python version, Dask, pystore, numba, and pandas, and I tested the same code on Linux...
Has anyone been able to make append() work in a recent release? Could anyone share a set of deps that allow collection.append() to work? I've been through the...
```python
import pandas as pd
import pystore

pystore.set_path("./pystore")
store = pystore.store('test')
collection = store.collection('demo collection')
df_path_and_hash = pd.DataFrame({'path': ['path1', 'path2'], 'hash': [0, 1]})
d_container = {'idx': [1], 'dfs': [df_path_and_hash]}
df_container = pd.DataFrame(d_container)
collection.write('test item', df_container)
item...
```