alpaca-backtrader-api
DataFactory compression not affecting when next() is called
I'm currently trying to paper trade an algo that works well on 1H bars. I've used the samples to get paper trading to stream data in. I add compression=60 to my DataFactory, which has timeframe=bt.TimeFrame.Minutes.
When I set historical=True, everything works great: my indicators run on the 1H bars. However, when I switch to historical=False and connect to paper trading, it seems next() is getting called for every tick, and judging by the rate at which trades are being made, the indicators seem to be calculated not on 1H bars but on the ticks, since trades are constantly being triggered.
Do you have any samples showing how to use compression on bars in live/paper trading?
So, this project at its core is event driven, or tick driven, so you cannot change the call rate. The compression changes the data rate, not the call rate. What you could do is check in your code whether it's the right time, and act upon the event only when the time matches your desired rate.
That can of course change the meaning of an algorithm, though. Say I have something that works with daily data (a daily SMA or something like that). My solution was to use resampledata to resample the ticks to days. This works well, except it is broken with the latest updates: https://github.com/alpacahq/alpaca-backtrader-api/issues/60
A Filter might be another way to skip all data except hourly or daily ticks; however, I've noticed that the ticks don't always land on minute or hour boundaries. Still doable, but painful.
I want to chime in on this. It is very important that backtest data and live data behave the same. This is an important feature of a backtest framework such as backtrader.
I did some testing, and it actually looks like it's working correctly with the latest release. @tylercollins590 note that live data is like backtest data with second granularity, so if you want hourly granularity you need to resample, like this:
```python
DataFactory = store.getdata
datakwargs = dict(
    historical=False,
    backfill_start=False,
)
data0 = DataFactory(dataname='SPY', **datakwargs)
cerebro.resampledata(data0, timeframe=bt.TimeFrame.Minutes, compression=60)
```
I would like to trade on an hourly frequency using Alpaca. I successfully set up the backtest of my strategy, and now I would like to migrate it to paper trading. If I use @kr011's approach (for testing, I set the Minutes timeframe and compression to 2), nothing happens. If backfill_start equals False and my indicator has a period of 600, doesn't that mean the algo has to wait 600 * 2 minutes to start working (for hourly data I would need to wait about 75 days)? Should I set backfill_start to True instead?
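For what it's worth, the warm-up arithmetic in the question can be sketched like this. The 6.5-hour session length is an assumption (the regular US session); with an assumed 8-hour day the ~75-day figure above comes out instead.

```python
# Rough warm-up estimate: an indicator of period N emits no values until
# N bars have arrived, so in live mode with no backfill you wait
# period * bar_minutes minutes of market time before the first signal.
def warmup_days(period, bar_minutes, trading_hours_per_day=6.5):
    minutes_needed = period * bar_minutes
    return minutes_needed / (trading_hours_per_day * 60)

print(round(warmup_days(600, 2)))    # 600 two-minute bars: ~3 trading days
print(round(warmup_days(600, 60)))   # 600 hourly bars: ~92 trading days
```

This is exactly the situation backfill_start is meant to avoid: the REST backfill pre-loads those bars so the indicators are warm from the start.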
Hi, when you set backfill_start to True, you initiate a REST API call to fill in past data; for paper/live trading, the WebSocket API is used to get the data.
The purpose of this is to get past data to calculate indicators. The WebSocket API is live streaming only (meaning you can't, e.g., get yesterday's data).
As I mentioned, this project in live mode is event driven, meaning when we get a new event (no matter whether it's a quote or a minute bar), we pass it to the user's code.
@shlomikushchi, what then is the recommended approach for using hourly data in paper trading, assuming indicators are computed from the beginning? For example, if I want a 600-period SMA on an hourly basis in paper trading, should I use:
```python
data = DataFactory(dataname='SPY',
                   historical=False,
                   backfill_start=False)
cerebro.resampledata(data, timeframe=timeframe, compression=compression)
```
as @kr011 recommended? Or should I set backfill_start to True?
When you use the backfill with a compression, we get the requested data with a REST call (so far nothing new) and then resample it to the requested granularity and compression (e.g. 15-minute frames, or 1 hour, 5 hours, ...).
After that, the calls to your code are quote driven; you could check in your code what time it is and act accordingly.
> when you use the backfill with a compression we get the requested data with a rest call (so far nothing new), and then resample it to the requested granularity and compression (e.g. 15 minute frames or 1 hour, 5 hours ...)

Do you resample it only if I use cerebro.resampledata, or do you resample it in any case (if I use the DataFactory only)?

> after that, the calls to your code are quote driven

What does that mean?

> you could check in your code what time it is, and act accordingly

I don't understand this part either. Why do I need to check the time?
We resample the data according to what you specify in the DataFactory, so the data you get from the API will match your configuration.
At that point the data is inside backtrader. cerebro.resampledata is backtrader functionality; you don't need to use it if you specify the desired data in the DataFactory.
There are 2 types of triggers: clock (every X time units) and events.
This project is an integration between backtrader and the Alpaca API (so it's not pure backtrader), and the events that were selected to trigger the next() function are quotes (events), not a clock.
That means you are called on every quote; you can choose whether or not to act upon that quote.
In backtesting mode, the backtrader engine has all the data (meaning it doesn't wait for events) and can trigger the next() function in any way you define.
In paper/live mode, it works a bit differently (as described). It's also the way it works with the IB integration (if I remember correctly; I haven't read that code in a while).
If you "resample the data according to what you specify in the DataFactory", then the following code:
```python
data0 = DataFactory(dataname=symbol,
                    historical=False,
                    timeframe=bt.TimeFrame.Days,
                    backfill_start=True,
                    )
```
should work as expected. I added a log call in next() to see what's happening, and I can see execution every few seconds. Example:
```
2020-09-22T22:33:53, Close, 330.03
2020-09-22T22:33:54, Close, 330.03
2020-09-22T22:33:56, Close, 330.08
2020-09-22T22:33:59, Close, 330.11
2020-09-22T22:34:00, Close, 330.13
2020-09-22T22:34:01, Close, 330.13
2020-09-22T22:34:04, Close, 330.13
2020-09-22T22:34:07, Close, 330.13
2020-09-22T22:34:12, Close, 330.16
```
I can see buy and sell orders, but they can be executed at second granularity too:
```
2020-09-22T22:34:12, Close, 330.16
2020-09-22T22:34:13, Close, 330.18
2020-09-22T22:34:14, Close, 330.23
Sell at 1.0642
2020-09-22T22:34:14, SELL CREATE, 330.23
2020-09-22T22:34:15, Close, 330.23
Buy at 0.8915
2020-09-22T22:34:15, BUY CREATE, 330.23
2020-09-22T22:34:16, Close, 330.26
2020-09-22T22:34:19, Close, 330.19
Buy at -0.4643
2020-09-22T22:34:19, BUY CREATE, 330.19
2020-09-22T22:34:21, Close, 330.17
2020-09-22T22:34:22, Close, 330.20
```
I can't see any of the orders in my Alpaca account.
I have changed the code, following the examples in this repo. My cerebro setup looks like this:
```python
def run(args=None):
    # Create a cerebro entity
    cerebro = bt.Cerebro()

    # parse args
    args = parse_args(args)

    # Define trade option
    """
    You have 3 options:
    - backtest (IS_BACKTEST=True, IS_LIVE=False)
    - paper trade (IS_BACKTEST=False, IS_LIVE=False)
    - live trade (IS_BACKTEST=False, IS_LIVE=True)
    """
    IS_BACKTEST = args.isbacktest
    IS_LIVE = args.islive
    symbol = "SPY"

    # Add a strategy
    cerebro.addstrategy(TestStrategy)

    # Alpaca API store
    store = alpaca_backtrader_api.AlpacaStore(
        key_id=ALPACA_API_KEY,
        secret_key=ALPACA_SECRET_KEY,
        paper=IS_LIVE,
        usePolygon=USE_POLYGON
    )

    # Set data factory: Alpaca or pandas
    DataFactory = store.getdata  # or use alpaca_backtrader_api.AlpacaData

    # extract args
    from_date = args.fromdate
    from_date = datetime.datetime.strptime(from_date, '%Y-%m-%d')
    timeframe = bt.TimeFrame.TFrame(args.timeframealpaca)
    compression = args.compression

    # Data feed
    if IS_BACKTEST:
        data0 = DataFactory(dataname=symbol,
                            historical=True,
                            fromdate=from_date,
                            timeframe=timeframe,
                            compression=compression)
    else:
        data0 = DataFactory(dataname=symbol,
                            historical=False,
                            timeframe=timeframe)
        # or just alpaca_backtrader_api.AlpacaBroker()
        broker = store.getbroker()
        cerebro.setbroker(broker)

    # Add data to cerebro
    cerebro.adddata(data0)

    # set cash if backtest
    if IS_BACKTEST:
        # backtrader broker: set initial simulated cash
        cerebro.broker.setcash(100000.0)

    # Set sizer
    cerebro.addsizer(bt.sizers.AllInSizer)
    # cerebro.broker.set_checksubmit(checksubmit=False)

    # check returns and benchmarks
    if args.timereturn:
        cerebro.addobserver(bt.observers.TimeReturn,
                            timeframe=TIMEFRAMES[args.timeframe])
    else:
        benchdata = data0
        if args.benchdata1:
            data1 = DataFactory(dataname=symbol, historical=True,
                                fromdate=from_date,
                                timeframe=bt.TimeFrame.Minutes, compression=60)
            # cerebro.adddata(data1, name='Data1')
            # benchdata = data1
        cerebro.addobserver(bt.observers.Benchmark,
                            data=benchdata,
                            timeframe=TIMEFRAMES[args.timeframe])

    # Run over everything
    cerebro.run(maxcpus=8)

    # Plot
    cerebro.plot()
```
Backtesting works as expected, but when I try to run paper trading it returns:
```
ERROR:root:error while consuming ws messages: Invalid Alpaca API credentials, Failed to authenticate: {'stream': 'authorization', 'data': {'action': 'authenticate', 'message': 'access key verification failed', 'status': 'unauthorized'}}
```
Can I use paper trading even if I am from the EU?
Anyone can use paper trading.
What could be the reason for this error? I tried the sample code from this GitHub repo and get the same error. I copy-pasted the key and id from the Alpaca paper trading site. Backtest works.
Well, basically backtest and paper trading use the same APIs. Make sure you:
- don't use the Polygon stream
- pass the credentials to the AlpacaStore object
- are using the paper URL
- don't override the credentials with environment variables (or, if you do use them, make sure they are correct)
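On the environment-variable point in particular: the underlying alpaca-trade-api client falls back to the APCA_API_KEY_ID / APCA_API_SECRET_KEY environment variables, so a stale value there can silently beat what you hard-code. A quick pre-flight check could look like this (the `credential_source` helper is hypothetical, for illustration only):

```python
import os

# Sketch of a pre-flight check: report whether each credential would come
# from the script or from an environment variable that may override it.
def credential_source(env_name, hardcoded):
    env_value = os.environ.get(env_name)
    if env_value and env_value != hardcoded:
        return "environment (differs from the script value!)"
    return "script"

for name, value in [("APCA_API_KEY_ID", "<key_id>"),
                    ("APCA_API_SECRET_KEY", "<secret_key>")]:
    print(name, "->", credential_source(name, value))
```

If the environment value wins and is stale, that alone reproduces the "access key verification failed" stream error.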
To be sure it's not a problem in my code, I have used this example: https://github.com/alpacahq/alpaca-backtrader-api/blob/master/sample/strategy_multiple_indicators.py
In this example:
- Polygon is set to false (USE_POLYGON = False), so this is not the problem.
- I have set the credentials like ALPACA_API_KEY = "<key_id>" and ALPACA_SECRET_KEY = "<secret_key>", both taken from https://app.alpaca.markets/paper/dashboard/overview
- If you mean the URL from which I copy-paste the keys, it is: https://app.alpaca.markets/paper/dashboard/overview
- I set the keys directly in the script, but to be sure I set them in an env file too (though I don't use this env file in the script)
I paste the error message here:
```
(base) PS C:\Users\Mislav\Documents\GitHub\trademl\trademl\algos> python .\ib_test.py
C:\Users\Mislav\AppData\Roaming\Python\Python38\site-packages\requests\__init__.py:78: RequestsDependencyWarning: urllib3 (1.25.10) or chardet (3.0.4) doesn't match a supported version!
  warnings.warn("urllib3 ({0}) or chardet ({1}) doesn't match a supported "
2020-09-23 20:36:55,521 error while consuming ws messages: Invalid Alpaca API credentials, Failed to authenticate: {'stream': 'authorization', 'data': {'action': 'authenticate', 'message': 'access key verification failed', 'status': 'unauthorized'}}
[... the same error repeats roughly once per second from 20:36:56 through 20:37:06 ...]
2020-09-23 20:37:06,811 connected to: wss://data.alpaca.markets/stream
2020-09-23 20:37:06,950 error while consuming ws messages: Invalid Alpaca API credentials, Failed to authenticate: {'stream': 'authorization', 'data': {'action': 'authenticate', 'message': 'access key verification failed', 'status': 'unauthorized'}}
[... two more repeats of the same error ...]
Traceback (most recent call last):
  File ".\ib_test.py", line 110, in <module>
    cerebro.run()
  File "C:\ProgramData\Anaconda3\lib\site-packages\backtrader\cerebro.py", line 1127, in run
    runstrat = self.runstrategies(iterstrat)
  File "C:\ProgramData\Anaconda3\lib\site-packages\backtrader\cerebro.py", line 1298, in runstrategies
    self._runnext(runstrats)
  File "C:\ProgramData\Anaconda3\lib\site-packages\backtrader\cerebro.py", line 1630, in _runnext
    strat._next()
  File "C:\ProgramData\Anaconda3\lib\site-packages\backtrader\strategy.py", line 347, in _next
    super(Strategy, self)._next()
  File "C:\ProgramData\Anaconda3\lib\site-packages\backtrader\lineiterator.py", line 265, in _next
    self._notify()
  File "C:\ProgramData\Anaconda3\lib\site-packages\backtrader\strategy.py", line 605, in _notify
    value = self.broker.getvalue()
  File "C:\ProgramData\Anaconda3\lib\site-packages\alpaca_backtrader_api\alpacabroker.py", line 159, in getvalue
    self.value = float(self.o.oapi.get_account().portfolio_value)
  File "C:\ProgramData\Anaconda3\lib\site-packages\alpaca_trade_api\entity.py", line 29, in __getattr__
    return super().__getattribute__(key)
AttributeError: 'Account' object has no attribute 'portfolio_value'
```
You are still getting this message: "Invalid Alpaca API credentials". Try using the example code as-is, and just put in your credentials.
I am using the code as it is; that's why I am confused, since it should work. I have restarted paper trading (the button to the right of Equity), regenerated the keys, and entered them as credentials. I get the same error again. Do I have to enter the endpoint somewhere?
I forgot to say that I get this error too:
```
(base) PS C:\Users\Mislav\Documents\GitHub\trademl\trademl\algos> python .\ib_test.py
C:\Users\Mislav\AppData\Roaming\Python\Python38\site-packages\requests\__init__.py:78: RequestsDependencyWarning: urllib3 (1.25.10) or chardet (3.0.4) doesn't match a supported version!
  warnings.warn("urllib3 ({0}) or chardet ({1}) doesn't match a supported "
Traceback (most recent call last):
  File ".\ib_test.py", line 109, in <module>
    print('Starting Portfolio Value: {}'.format(cerebro.broker.getvalue()))
  File "C:\ProgramData\Anaconda3\lib\site-packages\alpaca_backtrader_api\alpacabroker.py", line 159, in getvalue
    self.value = float(self.o.oapi.get_account().portfolio_value)
  File "C:\ProgramData\Anaconda3\lib\site-packages\alpaca_trade_api\entity.py", line 29, in __getattr__
    return super().__getattribute__(key)
AttributeError: 'Account' object has no attribute 'portfolio_value'
```
but this error disappeared when I commented out these 2 lines:
```python
...
if IS_BACKTEST:
    # backtrader broker: set initial simulated cash
    cerebro.broker.setcash(100000.0)
# print('Starting Portfolio Value: {}'.format(cerebro.broker.getvalue()))
cerebro.run()
# print('Final Portfolio Value: {}'.format(cerebro.broker.getvalue()))
cerebro.plot()
```
EDIT: I am on Windows 10, if that is important info.
How is it possible that backtesting works and paper trading doesn't? If it were a problem with the keys, then neither should work, right?
In backtesting mode you are using the REST client only (without live stream updates).
In live/paper mode you are using the WebSocket together with the REST API, so there are 2 clients, each initialized with the API keys.
I added example code to demonstrate how to convert the quote data to minute bars using backtrader's resample functionality: https://github.com/alpacahq/alpaca-backtrader-api/blob/master/sample/strategy_resampling_example.py
Hope it helps.