
extended time series support

densadak opened this issue 3 years ago

I could be doing something wrong, but I'm getting a JSONDecodeError when trying to use the extended time series API:

envs\DataStore\lib\json\decoder.py in raw_decode(self, s, idx)
    353             obj, end = self.scan_once(s, idx)
    354         except StopIteration as err:
--> 355             raise JSONDecodeError("Expecting value", s, err.value) from None
    356         return obj, end

JSONDecodeError: Expecting value: line 1 column 1 (char 0)

Sample code:

import alpha_vantage as av
from alpha_vantage.timeseries import TimeSeries
import os

source_key = os.getenv('ALPHAVANTAGE_API_KEY')
ts = TimeSeries(source_key)
ts.get_intraday_extended('SPY', interval='15min', slice='year1month1')

densadak avatar Jan 31 '22 03:01 densadak

I am surprised no one responded in 2 months, but I face the same problem.

After debugging the sources and reading the documentation, it is clear that either this endpoint never worked or Alpha Vantage made a breaking change, because per the docs:

...this endpoint uses the CSV format which is more memory-efficient than JSON...

This wrapper naively appends &datatype=json to the end of the URI, which works for the regular API but not for the extended one.

So a hotfix would be to request output_format="csv" and then parse the result with pandas.read_csv, for example.
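
A sketch of the same idea outside the wrapper, hitting the endpoint directly with requests and parsing the CSV body with pandas.read_csv (the parameter dict follows the Alpha Vantage docs; the direct-requests route itself is my illustration, not from the thread):

import io
import os

import pandas as pd
import requests

# The extended intraday endpoint only serves CSV, so parse the body as CSV
# instead of expecting JSON.
params = {
    'function': 'TIME_SERIES_INTRADAY_EXTENDED',
    'symbol': 'SPY',
    'interval': '15min',
    'slice': 'year1month1',
    'apikey': os.getenv('ALPHAVANTAGE_API_KEY'),
}
resp = requests.get('https://www.alphavantage.co/query', params=params)
df = pd.read_csv(io.StringIO(resp.text))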

btlk avatar Mar 26 '22 14:03 btlk

If you don't have a premium API key, you get only 5 API requests per minute.

quanthero avatar Jul 24 '22 18:07 quanthero

Not an API issue. I can get a response if I specify CSV format; the issue becomes converting it. As mentioned in https://github.com/RomelTorres/alpha_vantage/issues/287#issuecomment-803411808, the output is a _csv.reader object, so pipe it through list and then use pandas.DataFrame.from_records to convert it back to a DataFrame:

reader = ts.get_intraday_extended('SPY', interval='15min', slice='year1month1')  # ts built with output_format='csv'
csv_list = list(reader)  # first row of the CSV is the header
df = pandas.DataFrame.from_records(csv_list[1:], columns=csv_list[0])
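
One caveat (my note, not from the thread): from_records leaves every column as strings, so you will likely want to coerce the timestamp and numeric columns yourself, assuming the extended endpoint's usual time,open,high,low,close,volume header:

df['time'] = pandas.to_datetime(df['time'])
df = df.set_index('time').astype(float)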

As for your API limit: if you want to iterate through the 24 slices, you can simply add a throttling delay to avoid hitting the limit, as in the sketch below.
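
A minimal throttled loop over all 24 slices (a sketch, not tested against the live API; the 13-second sleep is my choice based on the free tier's 5-requests-per-minute limit):

import os
import time

import pandas
from alpha_vantage.timeseries import TimeSeries

ts = TimeSeries(os.getenv('ALPHAVANTAGE_API_KEY'), output_format='csv')

frames = []
for year in (1, 2):
    for month in range(1, 13):
        # Slice names run year1month1 (most recent) through year2month12.
        reader = ts.get_intraday_extended('SPY', interval='15min',
                                          slice=f'year{year}month{month}')
        rows = list(reader)
        frames.append(pandas.DataFrame.from_records(rows[1:], columns=rows[0]))
        time.sleep(13)  # stay under 5 requests/minute on the free tier

full_df = pandas.concat(frames, ignore_index=True)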

Dracozny avatar Nov 27 '22 18:11 Dracozny

Why not implement this in the package? This solution works great.

arturdaraujo avatar Nov 28 '22 15:11 arturdaraujo

[2023-07] Temporarily closing this issue due to inactivity.

AlphaVantageSupport avatar Jul 07 '23 06:07 AlphaVantageSupport