
How to use it?

hardy110 opened this issue 5 years ago · 1 comment

First install it. Then what? python ./cli.py optimize or python ./optimize.py? Then python ./cli.py train? Then python ./cli.py test?

hardy110 · Aug 12 '19

I've been using RLTrader heavily for weeks now.

The essence of what you do is: optimize, train, test, rinse, and repeat.
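In practice that's the loop from the question, assuming the stock cli.py entry points from the README:

python ./cli.py optimize
python ./cli.py train
python ./cli.py test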

Tips:

  • Read the README and the recommended series:
    • https://towardsdatascience.com/creating-bitcoin-trading-bots-that-dont-lose-money-2e7165fb0b29
  • Get a meaningful set of data. It comes with a couple of OHLCV data files. At the moment I'm using FOREX EUR/USD D1.
import pandas as pd

data_path = "data/"
file_name = "EURUSD_D1.csv"
df = pd.read_csv(data_path + file_name)
columns = ["Date", "Open", "High", "Low", "Close", "Volume"]
df.columns = columns
  • Adjust the Date column (if needed)
df["Date"] = pd.to_datetime(df["Date"], format="%Y-%m-%d")
  • Make the data a time series (if wanted)
df.set_index('Date', inplace=True)
  • Sort the data by Date
df.sort_index(ascending=True, inplace=True)
  • Add some TA features and clip the data into segments
from ta.volatility import BollingerBands
from ta.momentum import rsi
from ta.trend import macd, macd_diff, macd_signal, ema_indicator

dataset = df.copy(deep=True)

close = dataset["Close"]

# Exponential moving average (EMA)
dataset["ema"] = ema_indicator(close=close, window=10)

dataset['rsi'] = rsi(close=close)
dataset['macd'] = macd(close=close)
dataset['macd_diff'] = macd_diff(close=close)
dataset['macd_signal'] = macd_signal(close=close)

clipped_data = dataset.iloc[0:5000].copy()  # take the first 5000 rows
clipped_data["Close"].plot()
clipped_data.to_csv("GBPUSD_M30_0_5000.csv")
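One caveat: the indicators are NaN during their warm-up window at the start of the series, so you may want to drop those rows before writing the CSV:

clipped_data.dropna(inplace=True)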
  • If TensorFlow grabs all your GPU memory and crashes your machine, turn the GPU off (set this before TensorFlow is imported):
import os

os.environ["CUDA_VISIBLE_DEVICES"] = "-1"
  • If the standard optimize.py uses all your CPU and memory, create your own version and limit the number of CPUs it uses in its pool, or remove/bypass the multiprocessing:
# n_processes = multiprocessing.cpu_count()
n_processes = 6  # cap the pool instead of using every core
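For context, a minimal sketch of what a capped pool looks like (run_optimize is a hypothetical stand-in for the real per-process work in optimize.py):

import multiprocessing

n_processes = 6

def run_optimize(run_id):
    # stand-in for the real per-process optimize call
    return run_id

if __name__ == "__main__":
    with multiprocessing.Pool(processes=n_processes) as pool:
        pool.map(run_optimize, range(n_processes))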

Or bypass the multiprocessing entirely and write your own optimize script:

# Imports assume the stock RLTrader repo layout; adjust to your checkout.
from lib.RLTrader import RLTrader
from lib.data.providers import StaticDataProvider
from lib.data.providers.dates import ProviderDateFormat

input_path = "data"  # wherever your CSV files live
data_columns = ["Date", "Open", "High", "Low", "Close", "Volume"]  # adjust to what your provider expects
file_name = "EURUSD_D1.csv"
input_data_path = f"{input_path}/{file_name}"
date_format = ProviderDateFormat.DATETIME_HOUR_24
data_provider = StaticDataProvider(date_format=date_format,
                                   csv_data_path=input_data_path,
                                   data_columns=data_columns)

params = {
    # "tensorboard_path": tensorboard_path,
    "data_provider": data_provider,
    "reward_strategy": CustomRewards,  # your own reward strategy class (the repo ships e.g. WeightedUnrealizedProfit)
    "show_debug": True,
}

if __name__ == '__main__':
    for i in range(1):  # bump the range to rinse and repeat
        trader = RLTrader(**params)
        trader.optimize()
        trader.train(render_test_env=False, render_report=False, save_report=True)
  • Learn some basics about using optuna

E.g., show all the studies saved to SQLite:

optuna studies --storage sqlite:///data/params.db

Or delete a study:

optuna delete-study --study-name PPO2_MlpLnLstmPolicy_WeightedUnrealizedProfit --storage sqlite:///data/params.db
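The same housekeeping works from Python, if you'd rather stay in a notebook:

import optuna

storage = "sqlite:///data/params.db"

# list all studies saved in the storage
for summary in optuna.get_all_study_summaries(storage=storage):
    print(summary.study_name, summary.n_trials)

# delete a study by name
optuna.delete_study(study_name="PPO2_MlpLnLstmPolicy_WeightedUnrealizedProfit",
                    storage=storage)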
  • Monitor your optuna study progress in a Jupyter notebook:
import optuna
db_path = "sqlite:///data/params.db"
study_name = "PPO2__MlpLnLstmPolicy__CustomRewards"
direction = "maximize"

optuna_study = optuna.create_study(
    study_name=study_name, storage=db_path, direction=direction, load_if_exists=True
)
len(optuna_study.get_trials())

trials_df = optuna_study.trials_dataframe()

trials_df.rename(
    {
        "params_cliprange": "cliprange",
        "params_ent_coef": "ent_coef",
        "params_gamma": "gamma",
        "params_lam": "lam",
        "params_learning_rate": "lrn_rate",
        "params_n_steps": "n_steps",
        "params_noptepochs": "noptepochs",
        "system_attrs_fail_reason": "fail",
    },
    axis=1,
    inplace=True,
)
trials_df.drop('datetime_complete', axis=1, inplace=True)

trials_df.sort_values(by="value", ascending=False).head(5)

optuna_study.best_value

trials_df[["value"]].plot()

fig = optuna.visualization.plot_optimization_history(optuna_study)
fig.show()

optuna.visualization.plot_slice(optuna_study)
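If your optuna version is recent enough, hyperparameter importances are also worth a look:

optuna.visualization.plot_param_importances(optuna_study)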
  • Add logging statements to get insight, set breakpoints, and step through the code in an IDE like PyCharm.
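For example, a minimal logger (the step and net-worth values here are made up; in practice you'd log from inside the environment or trader):

import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

current_step, net_worth = 42, 10250.75  # hypothetical values
logger.info("step %s: net worth %.2f", current_step, net_worth)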
  • If the logs start getting too large, use rolling logs and compress them:
import logging
from concurrent_log_handler import ConcurrentRotatingFileHandler

info_log = "logs/info.log"  # path to your log file
max_bytes = 10485760  # roll over at 10 MiB
backup_count = 20
use_gzip = True

fh = ConcurrentRotatingFileHandler(info_log, maxBytes=max_bytes, backupCount=backup_count, use_gzip=use_gzip)
logging.getLogger().addHandler(fh)

MichaelQuaMan · Feb 05 '22