
Tickstore Number of Rows

Open sreenumalae opened this issue 4 years ago • 8 comments


# TickStore

Hey, currently I'm using the Arctic TickStore to store historical data.

It's storing only 5032 rows per symbol in a TickStore. For example, I have a TickStore named 'nutella' containing a symbol 'EUR_USD', and under it I'm able to store only 5032 rows.

I would like to know: is there any way to store more than that, or is it limited to 5032 rows?

sreenumalae avatar Apr 15 '20 13:04 sreenumalae

Hi @sreenumalae, there is no limit here. What exception do you get when you attempt to store more than 5032 rows?

rob256 avatar Apr 15 '20 14:04 rob256

> Hi @sreenumalae, there is no limit here. What exception do you get when you attempt to store more than 5032 rows?

There is no exception. When we try to update the tickers to the latest date, it removes the last row to include the new row. Also, it's more like a 5030-5035 row limit.

rohankumar10 avatar Apr 15 '20 14:04 rohankumar10

Or, if we add new tickers now (20 years of data for each ticker), the data starts in March 2000, whereas when we added tickers in February this year the ticker started from February 2000.

rohankumar10 avatar Apr 15 '20 14:04 rohankumar10

> Or, if we add new tickers now (20 years of data for each ticker), the data starts in March 2000, whereas when we added tickers in February this year the ticker started from February 2000.

See, when I added data for AAPL on exactly 24 Feb 2020, it had data from Feb 2000 to Feb 2020, for a total of 5032 rows. But when I added new tickers on 14 March 2020, the data in Arctic ran from March 2000 to March 2020, even though data for Feb 2000 exists. The common factor was 5032 rows.

rohankumar10 avatar Apr 15 '20 15:04 rohankumar10

It sounds like you are using tickstore incorrectly. It has some requirements about the intervals at which data is written.

bmoscon avatar Apr 15 '20 15:04 bmoscon

> It sounds like you are using tickstore incorrectly. It has some requirements about the intervals at which data is written.

Could you please elaborate on this comment?

rohankumar10 avatar Apr 15 '20 15:04 rohankumar10
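Not a maintainer, but for anyone else hitting this: TickStore generally expects ticks to arrive in chronological order, with each new write starting after the last stored timestamp. A minimal pre-write sanity check along those lines, assuming you track the last written timestamp yourself (`validate_ticks` is a hypothetical helper, not part of the arctic API):

```python
from datetime import datetime
from typing import Optional

import pandas as pd


def validate_ticks(df: pd.DataFrame, last_written: Optional[datetime] = None) -> None:
    """Hypothetical pre-write check: require a tz-aware, ascending index
    that starts strictly after the last written timestamp (if any)."""
    if df.index.tz is None:
        raise ValueError("index must be timezone-aware")
    if not df.index.is_monotonic_increasing:
        raise ValueError("index must be sorted in ascending time order")
    if last_written is not None and df.index[0] <= last_written:
        raise ValueError("new ticks must start after the last written timestamp")


# Example: two batches of minute bars, the second starting after the first ends.
idx1 = pd.date_range("2020-06-15 10:00", periods=3, freq="min", tz="UTC")
batch1 = pd.DataFrame({"price": [1.0, 1.1, 1.2]}, index=idx1)
validate_ticks(batch1)  # ok: sorted, tz-aware, no previous write

idx2 = pd.date_range("2020-06-15 10:03", periods=3, freq="min", tz="UTC")
batch2 = pd.DataFrame({"price": [1.3, 1.4, 1.5]}, index=idx2)
validate_ticks(batch2, last_written=batch1.index[-1])  # ok: starts after batch1
```

Running a check like this before every `library.write` makes it obvious when a batch overlaps or precedes already-stored data, which could otherwise surface as rows silently going missing.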

@rohankumar10 , could you provide us with a code snippet so we can reproduce the issue locally?

rob256 avatar Apr 15 '20 17:04 rob256

```python
import pytz
import pandas as pd
from datetime import datetime, timedelta
from arctic import Arctic, TICK_STORE
from pymongo import MongoClient

client = MongoClient('localhost', username='cooluser', password='coolpassword')
store = Arctic(client)
store.initialize_library('Events', lib_type=TICK_STORE)
library = store['Events']

# 1,234,567 one-minute events
events = [{'Server': 'AWS-001', 'Process': 'CoolProcess', 'event': f'alert-{x}'} for x in range(1234567)]
datetimes = [datetime(2020, 6, 15, 10, tzinfo=pytz.UTC) + timedelta(minutes=x) for x in range(1234567)]
d = pd.DataFrame(events, index=datetimes)
library.write('alerts', d)

readData = library.read('alerts')
print(f'length of readData: {len(readData)}')
print(readData)
```

Result:

```text
length of readData: 100000
                            Server        event      Process
2020-06-15 12:00:00+02:00  AWS-001      alert-0  CoolProcess
2020-06-15 12:01:00+02:00  AWS-001      alert-1  CoolProcess
2020-06-15 12:02:00+02:00  AWS-001      alert-2  CoolProcess
2020-06-15 12:03:00+02:00  AWS-001      alert-3  CoolProcess
2020-06-15 12:04:00+02:00  AWS-001      alert-4  CoolProcess
...                            ...          ...          ...
2020-08-23 22:35:00+02:00  AWS-001  alert-99995  CoolProcess
2020-08-23 22:36:00+02:00  AWS-001  alert-99996  CoolProcess
2020-08-23 22:37:00+02:00  AWS-001  alert-99997  CoolProcess
2020-08-23 22:38:00+02:00  AWS-001  alert-99998  CoolProcess
2020-08-23 22:39:00+02:00  AWS-001  alert-99999  CoolProcess

[100000 rows x 3 columns]
```

Expected: 1234567 rows.

FWIW, Mongo seems to have all the bookkeeping for all the data (12 documents where c=10,000 and a 13th where c=34567); it's just reading the data back that returns the truncated amount.
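For what it's worth, those document counts line up exactly with a 100,000-row write chunk size (which is an assumption here, and would suggest the c=10,000 above is a typo for c=100,000). The arithmetic is easy to check:

```python
# Sketch: how 1,234,567 rows would split under an assumed 100,000-row
# chunk size. 12 full chunks plus one partial chunk of 34,567 rows
# matches the 13 documents observed in Mongo.
total_rows = 1234567
chunk_size = 100000  # assumption, not confirmed from the arctic source

full_chunks, remainder = divmod(total_rows, chunk_size)
print(full_chunks, remainder)  # 12 34567
```

If that chunking holds, the 100,000-row read result would mean only the first chunk is being returned, not that the remaining data was lost.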

ehiggs avatar Jun 29 '20 12:06 ehiggs