201 KiB
Multi-Asset Strategy Simulation¶
In this section, we will run the Double Bollinger Band Strategy from our earlier tutorial on multiple assets. But before we do that, we have to bring the quote value of all our forex currency pairs to the account currency (USD).
import numpy as np import pandas as pd import vectorbtpro as vbt
## Forex Data hdf_data = vbt.HDFData.fetch('/Users/dilip.rajkumar/Documents/vbtpro_tuts_private/data/MultiAsset_OHLCV_3Y_m1.h5', # missing_index = 'drop', silence_warnings=True) ## Crypto Data # m1_data = vbt.HDFData.fetch('../data/Binance_MultiAsset_OHLCV_3Y_m1.h5')
0%| | 0/8 [00:00<?, ?it/s]
/opt/miniconda3/envs/vbt/lib/python3.10/site-packages/vectorbtpro/data/base.py:688: UserWarning: Symbols have mismatching index. Setting missing data points to NaN. data = cls.align_index(data, missing=missing_index, silence_warnings=silence_warnings)
Convert FX pairs where quote_currency != account currency ( US$ )¶
We will be converting OHLC price columns for the following currency pairs to the account currency (USD), as in these pairs either the quote currency or both the base currency & quote currency are not the same as the account currency which in our case is USD.
symbols = hdf_data.symbols print('Multi-Asset DataFrame Symbols:',symbols)
Multi-Asset DataFrame Symbols: ['AUDUSD', 'EURGBP', 'EURUSD', 'GBPAUD', 'GBPJPY', 'GBPUSD', 'USDCAD', 'USDJPY']
## Convert FX pairs where quote_currency is != USD (account currency) price_cols = ["Open", "High", "Low", "Close"] symbols_to_convert = ["USDJPY", "USDCAD", "GBPJPY", "EURGBP", "GBPAUD"]
def convert_to_account_currency(price_data : pd.Series, account_currency : str = "USD", bridge_pair_price_data: pd.Series = None) -> pd.Series: """ Convert prices of different FX pairs to account currency. Parameters ========== price_data : pd.Series, Price data from (OHLC) columns of the pair to be converted account_currency: str, default = 'USD' bridge_pair_price_data: pd.Series, price data to be used when neither, the base or quote currency is = account currency Returns ======= new_instrument_price : pd.Series, converted price data """ symbol = price_data.name base_currency = symbol[0:3].upper() quote_currency = symbol[3:6].upper() ## a.k.a Counter_currency if base_currency == account_currency: ## case 1 - Eg: USDJPY # print(f"BaseCurrency: {base_currency} is same as AccountCurrency: {account_currency} for Symbol:- {symbol}."+ \ # "Performing price inversion") new_instrument_price = (1/price_data) elif (quote_currency != account_currency) and (base_currency != account_currency): ## Case 2 - Eg: GBPJPY bridge_pair_symbol = account_currency + quote_currency ## Bridge Pair symbol is : USDJPY print(f"Applying currency conversion for {symbol} with {bridge_pair_symbol} price data") if (bridge_pair_price_data is None): raise Exception(f"Price data for {bridge_pair_symbol} is missing. Please provide the same") elif (bridge_pair_symbol != bridge_pair_price_data.name.upper()): message = f"Mismatched data. Price data for {bridge_pair_symbol} is expected, but" + \ f"{bridge_pair_price_data.name.upper()} price data is provided" print(message) ## Eg: When AUDUSD is provided instead of USDAUD new_instrument_price = price_data * bridge_pair_price_data else: new_instrument_price = price_data/ bridge_pair_price_data ## Divide GBPJPY / USDJPY else: # print(f"No currency conversion needed for {symbol} as QuoteCurreny: {quote_currency} == Account Currency") new_instrument_price = price_data return new_instrument_price
We copy the data from the origina hdf_data file and store them in a dictionary of dataframes.
For symbols whose price columns are to be converted we create an empty pd.DataFrame which we will be filling with the converted price values
new_data = {} for symbol, df in hdf_data.data.items(): if symbol in symbols_to_convert: ## symbols whose price columns needs to be converted to account currency new_data[symbol] = pd.DataFrame(columns=['Open','High','Low','Close','Volume']) else: ## for other symbols store the data as it is new_data[symbol] = df
## Quick Sanity Check to see if empty dataframe was created new_data['USDCAD']
| Open | High | Low | Close | Volume |
|---|
Here we call our convert_to_account_currency() function to convert the price data to account cuurency.
For pairs like USDJPY and USDCAD a simple price inversion (Eg: 1 / USDJPY ) alone is sufficient, so for these cases we will be setting bridge_pair == None.
bridge_pairs = [None, None, "USDJPY", "GBPUSD", "AUDUSD"] for ticker_source, ticker_bridge in zip(symbols_to_convert, bridge_pairs): new_data[ticker_source]["Volume"] = hdf_data.get("Volume")[ticker_source] for col in price_cols: print("Source Symbol:", ticker_source, "|| Bridge Pair:", ticker_bridge, "|| Column:", col) new_data[ticker_source][col] = convert_to_account_currency( price_data = hdf_data.get(col)[ticker_source], bridge_pair_price_data = None if ticker_bridge is None else hdf_data.get(col)[ticker_bridge] )
Source Symbol: USDJPY || Bridge Pair: None || Column: Open Source Symbol: USDJPY || Bridge Pair: None || Column: High Source Symbol: USDJPY || Bridge Pair: None || Column: Low Source Symbol: USDJPY || Bridge Pair: None || Column: Close Source Symbol: USDCAD || Bridge Pair: None || Column: Open Source Symbol: USDCAD || Bridge Pair: None || Column: High Source Symbol: USDCAD || Bridge Pair: None || Column: Low Source Symbol: USDCAD || Bridge Pair: None || Column: Close Source Symbol: GBPJPY || Bridge Pair: USDJPY || Column: Open Applying currency conversion for GBPJPY with USDJPY price data Source Symbol: GBPJPY || Bridge Pair: USDJPY || Column: High Applying currency conversion for GBPJPY with USDJPY price data Source Symbol: GBPJPY || Bridge Pair: USDJPY || Column: Low Applying currency conversion for GBPJPY with USDJPY price data Source Symbol: GBPJPY || Bridge Pair: USDJPY || Column: Close Applying currency conversion for GBPJPY with USDJPY price data Source Symbol: EURGBP || Bridge Pair: GBPUSD || Column: Open Applying currency conversion for EURGBP with USDGBP price data Mismatched data. Price data for USDGBP is expected, but GBPUSD price data is provided Source Symbol: EURGBP || Bridge Pair: GBPUSD || Column: High Applying currency conversion for EURGBP with USDGBP price data Mismatched data. Price data for USDGBP is expected, but GBPUSD price data is provided Source Symbol: EURGBP || Bridge Pair: GBPUSD || Column: Low Applying currency conversion for EURGBP with USDGBP price data Mismatched data. Price data for USDGBP is expected, but GBPUSD price data is provided Source Symbol: EURGBP || Bridge Pair: GBPUSD || Column: Close Applying currency conversion for EURGBP with USDGBP price data Mismatched data. Price data for USDGBP is expected, but GBPUSD price data is provided Source Symbol: GBPAUD || Bridge Pair: AUDUSD || Column: Open Applying currency conversion for GBPAUD with USDAUD price data Mismatched data. Price data for USDAUD is expected, but AUDUSD price data is provided Source Symbol: GBPAUD || Bridge Pair: AUDUSD || Column: High Applying currency conversion for GBPAUD with USDAUD price data Mismatched data. Price data for USDAUD is expected, but AUDUSD price data is provided Source Symbol: GBPAUD || Bridge Pair: AUDUSD || Column: Low Applying currency conversion for GBPAUD with USDAUD price data Mismatched data. Price data for USDAUD is expected, but AUDUSD price data is provided Source Symbol: GBPAUD || Bridge Pair: AUDUSD || Column: Close Applying currency conversion for GBPAUD with USDAUD price data Mismatched data. Price data for USDAUD is expected, but AUDUSD price data is provided
## Converts this `new_data` dict of dataframes into a vbt.Data object m1_data = vbt.Data.from_data(new_data)
Ensuring Correct data for High and Low columns¶
Once we have the converted OHLC price columns for a particular symbol (ticker_source), we recalculate the High and Low by getting the max and min of each row in the OHLC columns respectively using df.max(axis=1) and df.min(axis=1)
for ticker_source in symbols: m1_data.data[ticker_source]['High'] = m1_data.data[ticker_source][price_cols].max(axis=1) m1_data.data[ticker_source]['Low'] = m1_data.data[ticker_source][price_cols].min(axis=1) # m1_data.data[ticker_source].dropna(inplace = True) ## This creates out of Bounds error
What need is there for above step?
Lets assume for a symbol X if low is 10 and high is 20, then when we do a simple price inversion ( 1/X ) new high would become 1/10 = 0.1 and new low would become 1/20 = 0.05 which will result in complications and thus arises the need for the above step
## Sanity check to see if empty pd.DataFrame got filled now m1_data.data['USDCAD'].dropna()
| Open | High | Low | Close | Volume | |
|---|---|---|---|---|---|
| time | |||||
| 2019-01-01 22:06:00+00:00 | 0.733353 | 0.733501 | 0.733353 | 0.733450 | 21.220 |
| 2019-01-01 22:07:00+00:00 | 0.733415 | 0.733452 | 0.733396 | 0.733396 | 7.875 |
| 2019-01-01 22:08:00+00:00 | 0.733393 | 0.733399 | 0.733393 | 0.733396 | 3.000 |
| 2019-01-01 22:09:00+00:00 | 0.733399 | 0.733469 | 0.733350 | 0.733350 | 17.750 |
| 2019-01-01 22:10:00+00:00 | 0.733348 | 0.733522 | 0.733348 | 0.733469 | 12.625 |
| ... | ... | ... | ... | ... | ... |
| 2022-04-27 15:02:30+00:00 | 0.771129 | 0.771197 | 0.771129 | 0.771159 | 93.290 |
| 2022-04-27 15:03:30+00:00 | 0.771156 | 0.771206 | 0.771156 | 0.771168 | 70.050 |
| 2022-04-27 15:04:30+00:00 | 0.771165 | 0.771239 | 0.771165 | 0.771224 | 91.840 |
| 2022-04-27 15:05:30+00:00 | 0.771230 | 0.771236 | 0.771209 | 0.771209 | 30.620 |
| 2022-04-27 15:06:30+00:00 | 0.771209 | 0.771340 | 0.771209 | 0.771304 | 72.445 |
296614 rows × 5 columns
Double Bollinger Band Strategy over Multi-Asset portfolio¶
The following steps are very similar we already saw in the Alignment and Resampling and Strategy Development tutorials, except now they are applied over multiple symbols (assets) in a portfolio. So I will just put the code here and won't be explaining anything here in detail, when in doubt refer back to the above two tutorials.
m15_data = m1_data.resample('15T') # Convert 1 minute to 15 mins h1_data = m1_data.resample("1h") # Convert 1 minute to 1 hour h4_data = m1_data.resample('4h') # Convert 1 minute to 4 hour
m15_data.wrapper.index
DatetimeIndex(['2019-01-01 22:00:00+00:00', '2019-01-01 22:15:00+00:00',
'2019-01-01 22:30:00+00:00', '2019-01-01 22:45:00+00:00',
'2019-01-01 23:00:00+00:00', '2019-01-01 23:15:00+00:00',
'2019-01-01 23:30:00+00:00', '2019-01-01 23:45:00+00:00',
'2019-01-02 00:00:00+00:00', '2019-01-02 00:15:00+00:00',
...
'2023-01-16 04:30:00+00:00', '2023-01-16 04:45:00+00:00',
'2023-01-16 05:00:00+00:00', '2023-01-16 05:15:00+00:00',
'2023-01-16 05:30:00+00:00', '2023-01-16 05:45:00+00:00',
'2023-01-16 06:00:00+00:00', '2023-01-16 06:15:00+00:00',
'2023-01-16 06:30:00+00:00', '2023-01-16 06:45:00+00:00'],
dtype='datetime64[ns, UTC]', name='time', length=141636, freq='15T')
# Obtain all the required prices using the .get() method m15_close = m15_data.get('Close') ## h1 data h1_open = h1_data.get('Open') h1_close = h1_data.get('Close') h1_high = h1_data.get('High') h1_low = h1_data.get('Low') ## h4 data h4_open = h4_data.get('Open') h4_close = h4_data.get('Close') h4_high = h4_data.get('High') h4_low = h4_data.get('Low')
Create (manually) the indicators for Multi-Time Frames¶
rsi_period = 21 ## 15m indicators m15_rsi = vbt.talib("RSI", timeperiod = rsi_period).run(m15_close, skipna=True).real.ffill() m15_bbands = vbt.talib("BBANDS").run(m15_close, skipna=True) m15_bbands_rsi = vbt.talib("BBANDS").run(m15_rsi, skipna=True) ## h1 indicators h1_rsi = vbt.talib("RSI", timeperiod = rsi_period).run(h1_close, skipna=True).real.ffill() h1_bbands = vbt.talib("BBANDS").run(h1_close, skipna=True) h1_bbands_rsi = vbt.talib("BBANDS").run(h1_rsi, skipna=True) ## h4 indicators h4_rsi = vbt.talib("RSI", timeperiod = rsi_period).run(h4_close, skipna=True).real.ffill() h4_bbands = vbt.talib("BBANDS").run(h4_close, skipna=True) h4_bbands_rsi = vbt.talib("BBANDS").run(h4_rsi, skipna=True)
def create_resamplers(result_dict_keys_list : list, source_indices : list, source_frequencies :list, target_index : pd.Series, target_freq : str): """ Creates a dictionary of vbtpro resampler objects. Parameters ========== result_dict_keys_list : list, list of strings, which are keys of the output dictionary source_indices : list, list of pd.time series objects of the higher timeframes source_frequencies : list(str), which are short form representation of time series order. Eg:["1D", "4h"] target_index : pd.Series, target time series for the resampler objects target_freq : str, target time frequency for the resampler objects Returns =========== resamplers_dict : dict, vbt pro resampler objects """ resamplers = [] for si, sf in zip(source_indices, source_frequencies): resamplers.append(vbt.Resampler(source_index = si, target_index = target_index, source_freq = sf, target_freq = target_freq)) return dict(zip(result_dict_keys_list, resamplers))
## Initialize dictionary mtf_data = {} col_values = [ m15_close, m15_rsi, m15_bbands.upperband, m15_bbands.middleband, m15_bbands.lowerband, m15_bbands_rsi.upperband, m15_bbands_rsi.middleband, m15_bbands_rsi.lowerband ] col_keys = [ "m15_close", "m15_rsi", "m15_bband_price_upper", "m15_bband_price_middle", "m15_bband_price_lower", "m15_bband_rsi_upper", "m15_bband_rsi_middle", "m15_bband_rsi_lower" ] # Assign key, value pairs for method of time series data to store in data dict for key, time_series in zip(col_keys, col_values): mtf_data[key] = time_series.ffill()
## Create Resampler Objects for upsampling src_indices = [h1_close.index, h4_close.index] src_frequencies = ["1H","4H"] resampler_dict_keys = ["h1_m15","h4_m15"] list_resamplers = create_resamplers(resampler_dict_keys, src_indices, src_frequencies, m15_close.index, "15T") list_resamplers
{'h1_m15': <vectorbtpro.base.resampling.base.Resampler at 0x28c2d54e0>,
'h4_m15': <vectorbtpro.base.resampling.base.Resampler at 0x28c2d7280>}
## Use along with Manual indicator creation method for MTF series_to_resample = [ [h1_open, h1_high, h1_low, h1_close, h1_rsi, h1_bbands.upperband, h1_bbands.middleband, h1_bbands.lowerband, h1_bbands_rsi.upperband, h1_bbands_rsi.middleband, h1_bbands_rsi.lowerband], [h4_high, h4_low, h4_close, h4_rsi, h4_bbands.upperband, h4_bbands.middleband, h4_bbands.lowerband, h4_bbands_rsi.upperband, h4_bbands_rsi.middleband, h4_bbands_rsi.lowerband] ] data_keys = [ ["h1_open","h1_high", "h1_low", "h1_close", "h1_rsi", "h1_bband_price_upper", "h1_bband_price_middle", "h1_bband_price_lower", "h1_bband_rsi_upper", "h1_bband_rsi_middle", "h1_bband_rsi_lower"], ["h4_open","h4_high", "h4_low", "h4_close", "h4_rsi", "h4_bband_price_upper", "h4_bband_price_middle", "h4_bband_price_lower", "h4_bband_rsi_upper", "h4_bband_rsi_middle", "h4_bband_rsi_lower"] ]
for lst_series, lst_keys, resampler in zip(series_to_resample, data_keys, resampler_dict_keys): for key, time_series in zip(lst_keys, lst_series): if key.lower().endswith('open'): print(f'Resampling {key} differently using vbt.resample_opening using "{resampler}" resampler') resampled_time_series = time_series.vbt.resample_opening(list_resamplers[resampler]) else: resampled_time_series = time_series.vbt.resample_closing(list_resamplers[resampler]) mtf_data[key] = resampled_time_series
Resampling h1_open differently using vbt.resample_opening using "h1_m15" resampler Resampling h4_open differently using vbt.resample_opening using "h4_m15" resampler
cols_order = ['m15_close', 'm15_rsi', 'm15_bband_price_upper','m15_bband_price_middle', 'm15_bband_price_lower', 'm15_bband_rsi_upper','m15_bband_rsi_middle', 'm15_bband_rsi_lower', 'h1_open', 'h1_high', 'h1_low', 'h1_close', 'h1_rsi', 'h1_bband_price_upper', 'h1_bband_price_middle', 'h1_bband_price_lower', 'h1_bband_rsi_upper', 'h1_bband_rsi_middle', 'h1_bband_rsi_lower', 'h4_open', 'h4_high', 'h4_low', 'h4_close', 'h4_rsi', 'h4_bband_price_upper', 'h4_bband_price_middle', 'h4_bband_price_lower', 'h4_bband_rsi_upper', 'h4_bband_rsi_middle', 'h4_bband_rsi_lower' ]
mtf_data.get('m15_rsi')
| symbol | AUDUSD | EURGBP | EURUSD | GBPAUD | GBPJPY | GBPUSD | USDCAD | USDJPY |
|---|---|---|---|---|---|---|---|---|
| time | ||||||||
| 2019-01-01 22:00:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 22:15:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 22:30:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 22:45:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 23:00:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 2023-01-16 05:45:00+00:00 | 68.223309 | 35.73201 | 52.757665 | 65.976674 | 40.581654 | 60.242668 | 60.559401 | 84.30117 |
| 2023-01-16 06:00:00+00:00 | 68.223309 | 35.73201 | 52.757665 | 65.976674 | 40.581654 | 60.242668 | 60.559401 | 84.30117 |
| 2023-01-16 06:15:00+00:00 | 68.223309 | 35.73201 | 52.757665 | 65.976674 | 40.581654 | 60.242668 | 60.559401 | 84.30117 |
| 2023-01-16 06:30:00+00:00 | 68.223309 | 35.73201 | 52.757665 | 65.976674 | 40.581654 | 60.242668 | 60.559401 | 84.30117 |
| 2023-01-16 06:45:00+00:00 | 68.223309 | 35.73201 | 52.757665 | 65.976674 | 40.581654 | 60.242668 | 60.559401 | 84.30117 |
141636 rows × 8 columns
Double Bollinger Band - Strategy Conditions¶
required_cols = ['m15_close','m15_rsi','m15_bband_rsi_lower', 'm15_bband_rsi_upper', 'h4_low', "h4_rsi", "h4_bband_price_lower", "h4_bband_price_upper" ]
## Higher values greater than 1.0 are like moving up the lower RSI b-band, ## signifying if the lowerband rsi is anywhere around 1% of the lower b-band validate that case as True bb_upper_fract = 0.99 bb_lower_fract = 1.01 ## Long Entry Conditions # c1_long_entry = (mtf_data['h1_low'] <= mtf_data['h1_bband_price_lower']) c1_long_entry = (mtf_data['h4_low'] <= mtf_data['h4_bband_price_lower']) c2_long_entry = (mtf_data['m15_rsi'] <= (bb_lower_fract * mtf_data['m15_bband_rsi_lower']) ) ## Long Exit Conditions # c1_long_exit = (mtf_data['h1_high'] >= mtf_data['h1_bband_price_upper']) c1_long_exit = (mtf_data['h4_high'] >= mtf_data['h4_bband_price_upper']) c2_long_exit = (mtf_data['m15_rsi'] >= (bb_upper_fract * mtf_data['m15_bband_rsi_upper']))
c1_long_entry
| symbol | AUDUSD | EURGBP | EURUSD | GBPAUD | GBPJPY | GBPUSD | USDCAD | USDJPY |
|---|---|---|---|---|---|---|---|---|
| time | ||||||||
| 2019-01-01 22:00:00+00:00 | False | False | False | False | False | False | False | False |
| 2019-01-01 22:15:00+00:00 | False | False | False | False | False | False | False | False |
| 2019-01-01 22:30:00+00:00 | False | False | False | False | False | False | False | False |
| 2019-01-01 22:45:00+00:00 | False | False | False | False | False | False | False | False |
| 2019-01-01 23:00:00+00:00 | False | False | False | False | False | False | False | False |
| ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 2023-01-16 05:45:00+00:00 | True | True | True | True | True | True | True | True |
| 2023-01-16 06:00:00+00:00 | True | True | True | True | True | True | True | True |
| 2023-01-16 06:15:00+00:00 | True | True | True | True | True | True | True | True |
| 2023-01-16 06:30:00+00:00 | True | True | True | True | True | True | True | True |
| 2023-01-16 06:45:00+00:00 | True | True | True | True | True | True | True | True |
141636 rows × 8 columns
pd.concat([mtf_data[col][c1_long_entry].add_suffix(f"_{col}") for col in required_cols], axis = 1)
| symbol | AUDUSD_m15_close | EURGBP_m15_close | EURUSD_m15_close | GBPAUD_m15_close | GBPJPY_m15_close | GBPUSD_m15_close | USDCAD_m15_close | USDJPY_m15_close | AUDUSD_m15_rsi | EURGBP_m15_rsi | ... | USDCAD_h4_bband_price_lower | USDJPY_h4_bband_price_lower | AUDUSD_h4_bband_price_upper | EURGBP_h4_bband_price_upper | EURUSD_h4_bband_price_upper | GBPAUD_h4_bband_price_upper | GBPJPY_h4_bband_price_upper | GBPUSD_h4_bband_price_upper | USDCAD_h4_bband_price_upper | USDJPY_h4_bband_price_upper |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| time | |||||||||||||||||||||
| 2019-01-01 22:00:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 22:15:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 22:30:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 22:45:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 23:00:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 2023-01-16 05:45:00+00:00 | 0.695595 | 1.122143 | 1.054005 | 1.309694 | 1.225058 | 1.198255 | 0.771304 | 0.008549 | 68.223309 | 35.73201 | ... | 52.28795 | 69.81101 | 0.692467 | 1.15173 | 1.054172 | 1.303757 | 1.228194 | 1.196343 | 0.770143 | 0.008446 |
| 2023-01-16 06:00:00+00:00 | 0.695595 | 1.122143 | 1.054005 | 1.309694 | 1.225058 | 1.198255 | 0.771304 | 0.008549 | 68.223309 | 35.73201 | ... | 52.28795 | 69.81101 | 0.692467 | 1.15173 | 1.054172 | 1.303757 | 1.228194 | 1.196343 | 0.770143 | 0.008446 |
| 2023-01-16 06:15:00+00:00 | 0.695595 | 1.122143 | 1.054005 | 1.309694 | 1.225058 | 1.198255 | 0.771304 | 0.008549 | 68.223309 | 35.73201 | ... | 52.28795 | 69.81101 | 0.692467 | 1.15173 | 1.054172 | 1.303757 | 1.228194 | 1.196343 | 0.770143 | 0.008446 |
| 2023-01-16 06:30:00+00:00 | 0.695595 | 1.122143 | 1.054005 | 1.309694 | 1.225058 | 1.198255 | 0.771304 | 0.008549 | 68.223309 | 35.73201 | ... | 52.28795 | 69.81101 | 0.692467 | 1.15173 | 1.054172 | 1.303757 | 1.228194 | 1.196343 | 0.770143 | 0.008446 |
| 2023-01-16 06:45:00+00:00 | 0.695595 | 1.122143 | 1.054005 | 1.309694 | 1.225058 | 1.198255 | 0.771304 | 0.008549 | 68.223309 | 35.73201 | ... | 52.28795 | 69.81101 | 0.692467 | 1.15173 | 1.054172 | 1.303757 | 1.228194 | 1.196343 | 0.770143 | 0.008446 |
141636 rows × 64 columns
pd.concat([mtf_data[col][c2_long_entry].add_suffix(f"_{col}") for col in required_cols], axis = 1)
| symbol | AUDUSD_m15_close | EURGBP_m15_close | EURUSD_m15_close | GBPAUD_m15_close | GBPJPY_m15_close | GBPUSD_m15_close | USDCAD_m15_close | USDJPY_m15_close | AUDUSD_m15_rsi | EURGBP_m15_rsi | ... | USDCAD_h4_bband_price_lower | USDJPY_h4_bband_price_lower | AUDUSD_h4_bband_price_upper | EURGBP_h4_bband_price_upper | EURUSD_h4_bband_price_upper | GBPAUD_h4_bband_price_upper | GBPJPY_h4_bband_price_upper | GBPUSD_h4_bband_price_upper | USDCAD_h4_bband_price_upper | USDJPY_h4_bband_price_upper |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| time | |||||||||||||||||||||
| 2019-01-01 22:00:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 22:15:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 22:30:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 22:45:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| 2019-01-01 23:00:00+00:00 | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | ... | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 2023-01-16 05:45:00+00:00 | 0.695595 | 1.122143 | 1.054005 | 1.309694 | 1.225058 | 1.198255 | 0.771304 | 0.008549 | 68.223309 | 35.73201 | ... | 52.28795 | 69.81101 | 0.692467 | 1.15173 | 1.054172 | 1.303757 | 1.228194 | 1.196343 | 0.770143 | 0.008446 |
| 2023-01-16 06:00:00+00:00 | 0.695595 | 1.122143 | 1.054005 | 1.309694 | 1.225058 | 1.198255 | 0.771304 | 0.008549 | 68.223309 | 35.73201 | ... | 52.28795 | 69.81101 | 0.692467 | 1.15173 | 1.054172 | 1.303757 | 1.228194 | 1.196343 | 0.770143 | 0.008446 |
| 2023-01-16 06:15:00+00:00 | 0.695595 | 1.122143 | 1.054005 | 1.309694 | 1.225058 | 1.198255 | 0.771304 | 0.008549 | 68.223309 | 35.73201 | ... | 52.28795 | 69.81101 | 0.692467 | 1.15173 | 1.054172 | 1.303757 | 1.228194 | 1.196343 | 0.770143 | 0.008446 |
| 2023-01-16 06:30:00+00:00 | 0.695595 | 1.122143 | 1.054005 | 1.309694 | 1.225058 | 1.198255 | 0.771304 | 0.008549 | 68.223309 | 35.73201 | ... | 52.28795 | 69.81101 | 0.692467 | 1.15173 | 1.054172 | 1.303757 | 1.228194 | 1.196343 | 0.770143 | 0.008446 |
| 2023-01-16 06:45:00+00:00 | 0.695595 | 1.122143 | 1.054005 | 1.309694 | 1.225058 | 1.198255 | 0.771304 | 0.008549 | 68.223309 | 35.73201 | ... | 52.28795 | 69.81101 | 0.692467 | 1.15173 | 1.054172 | 1.303757 | 1.228194 | 1.196343 | 0.770143 | 0.008446 |
141636 rows × 64 columns
## Strategy conditions check - Using m15 and h4 data mtf_data['entries'] = c1_long_entry & c2_long_entry mtf_data['exits'] = c1_long_exit & c2_long_exit
mtf_data['signal'] = 0 mtf_data['signal'] = np.where( mtf_data['entries'], 1, 0) mtf_data['signal'] = np.where( mtf_data['exits'] , -1, mtf_data['signal'])
After the above np.where, we can use this pd.df.where to return a pandas object
# After the above `np.where`, we can use this pd.df.where() # (https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.where.html) to return a pandas object mtf_data['signal'] = mtf_data['entries'].vbt.wrapper.wrap(mtf_data['signal']) mtf_data['signal'] = mtf_data['exits'].vbt.wrapper.wrap(mtf_data['signal'])
# list(mtf_data['signal']['GBPUSD'].unique()) symbols = m1_data.symbols for symbol in symbols: print(f"{symbol} Unique Signal col Values: {list(mtf_data['signal'][symbol].unique())}")
AUDUSD Unique Signal col Values: [0, -1, 1] EURGBP Unique Signal col Values: [0, 1, -1] EURUSD Unique Signal col Values: [0, -1, 1] GBPAUD Unique Signal col Values: [0, -1, 1] GBPJPY Unique Signal col Values: [0, -1, 1] GBPUSD Unique Signal col Values: [0, -1, 1] USDCAD Unique Signal col Values: [0, -1, 1] USDJPY Unique Signal col Values: [0, -1, 1]
print(type(mtf_data['signal']), "\nShape:", mtf_data['signal'].shape ) mtf_data['signal']
<class 'pandas.core.frame.DataFrame'> Shape: (141636, 8)
| symbol | AUDUSD | EURGBP | EURUSD | GBPAUD | GBPJPY | GBPUSD | USDCAD | USDJPY |
|---|---|---|---|---|---|---|---|---|
| time | ||||||||
| 2019-01-01 22:00:00+00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2019-01-01 22:15:00+00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2019-01-01 22:30:00+00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2019-01-01 22:45:00+00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2019-01-01 23:00:00+00:00 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 2023-01-16 05:45:00+00:00 | -1 | 1 | 1 | -1 | 1 | -1 | -1 | -1 |
| 2023-01-16 06:00:00+00:00 | -1 | 1 | 1 | -1 | 1 | -1 | -1 | -1 |
| 2023-01-16 06:15:00+00:00 | -1 | 1 | 1 | -1 | 1 | -1 | -1 | -1 |
| 2023-01-16 06:30:00+00:00 | -1 | 1 | 1 | -1 | 1 | -1 | -1 | -1 |
| 2023-01-16 06:45:00+00:00 | -1 | 1 | 1 | -1 | 1 | -1 | -1 | -1 |
141636 rows × 8 columns
mtf_data.keys()
dict_keys(['m15_close', 'm15_rsi', 'm15_bband_price_upper', 'm15_bband_price_middle', 'm15_bband_price_lower', 'm15_bband_rsi_upper', 'm15_bband_rsi_middle', 'm15_bband_rsi_lower', 'h1_open', 'h1_high', 'h1_low', 'h1_close', 'h1_rsi', 'h1_bband_price_upper', 'h1_bband_price_middle', 'h1_bband_price_lower', 'h1_bband_rsi_upper', 'h1_bband_rsi_middle', 'h1_bband_rsi_lower', 'h4_open', 'h4_high', 'h4_low', 'h4_close', 'h4_rsi', 'h4_bband_price_upper', 'h4_bband_price_middle', 'h4_bband_price_lower', 'h4_bband_rsi_upper', 'h4_bband_rsi_middle', 'entries', 'exits', 'signal'])
Cleaning and Resample entries and exits to H4 timeframe
entries = mtf_data['signal'] == 1.0 exits = mtf_data['signal'] == -1.0 # print(f"Total Nr. of Entry Signals:\n {entries.vbt.signals.total()}\n") # print(f"Total Nr. of Exit Signals:\n {exits.vbt.signals.total()}")
## Clean redundant and duplicate signals clean_entries, clean_exits = entries.vbt.signals.clean(exits) print(f"Total nr. of Signals in Clean_Entries and Clean_Exits") pd.DataFrame(data = {"Entries":clean_entries.vbt.signals.total(), "Exits": clean_exits.vbt.signals.total()})
Total nr. of Signals in Clean_Entries and Clean_Exits
| Entries | Exits | |
|---|---|---|
| symbol | ||
| AUDUSD | 343 | 343 |
| EURGBP | 173 | 172 |
| EURUSD | 396 | 395 |
| GBPAUD | 214 | 214 |
| GBPJPY | 131 | 130 |
| GBPUSD | 432 | 432 |
| USDCAD | 376 | 376 |
| USDJPY | 290 | 290 |
We can resample the entries and exits for plotting purposes on H4 chart, but this always produces some loss in the nr. of signals as the entries / exits in our strategy is based on M15 timeframe. So just be aware of this.
## Resample clean entries to H$ timeframe clean_h4_entries = clean_entries.vbt.resample_apply("4h", "any", wrap_kwargs=dict(dtype=bool)) clean_h4_exits = clean_exits.vbt.resample_apply("4h", "any", wrap_kwargs=dict(dtype=bool)) print("Total nr. of signals in H4_Entries and H4_Exits Signals:") pd.DataFrame(data = {"H4_Entries":clean_h4_entries.vbt.signals.total(), "h4_Exits": clean_h4_exits.vbt.signals.total()})
Total nr. of signals in H4_Entries and H4_Exits Signals:
| H4_Entries | h4_Exits | |
|---|---|---|
| symbol | ||
| AUDUSD | 310 | 307 |
| EURGBP | 136 | 143 |
| EURUSD | 353 | 357 |
| GBPAUD | 182 | 194 |
| GBPJPY | 104 | 119 |
| GBPUSD | 386 | 382 |
| USDCAD | 333 | 332 |
| USDJPY | 269 | 272 |
Saving Data to .pickle file¶
For the purposes of plotting, we will be saving various data like:
- price data across various timeframes
- indicator data across various timeframes
- entries & exits
- finally, the
vectorbt.portfolioobjects after running each type of portfolio simulation
## Save Specific Data to pickle file for plotting purposes price_data = {"h4_data": h4_data, "m15_data" : m15_data} vbt_indicators = {'m15_rsi': m15_rsi,'m15_price_bbands': m15_bbands, 'm15_rsi_bbands' : m15_bbands_rsi, 'h4_rsi': h4_rsi, 'h4_price_bbands':h4_bbands, 'h4_rsi_bbands' : h4_bbands_rsi} entries_exits_data = {'clean_entries' : clean_entries, 'clean_exits' : clean_exits} print(type(h4_data), '||' ,type(m15_data)) print(type(h4_bbands), '||', type(h4_bbands_rsi), '||', type(h1_rsi)) print(type(m15_bbands), '||', type(m15_bbands_rsi), '||', type(m15_rsi)) file_path1 = '../vbt_dashboard/data/price_data' file_path2 = '../vbt_dashboard/data/indicators_data' file_path3 = '../vbt_dashboard/data/entries_exits_data' vbt.save(price_data, file_path1) vbt.save(vbt_indicators, file_path2) vbt.save(entries_exits_data, file_path3)
<class 'vectorbtpro.data.base.Data'> || <class 'vectorbtpro.data.base.Data'> <class 'vectorbtpro.indicators.factory.talib.BBANDS'> || <class 'vectorbtpro.indicators.factory.talib.BBANDS'> || <class 'pandas.core.frame.DataFrame'> <class 'vectorbtpro.indicators.factory.talib.BBANDS'> || <class 'vectorbtpro.indicators.factory.talib.BBANDS'> || <class 'pandas.core.frame.DataFrame'>
Multi-asset Portfolio Backtesting simulation using vbt.Portfolio.from_signals()¶
In this section, we will see different ways to run this portfolio.from_signals() simulation and save the results as .pickle files to be used in a plotly-dash data visualization dashboard later (in another tutorial)
1.) Asset-wise Discrete Portfolio Simulation¶
In this section we will see how to run the portfolio simulation for each asset in the portfolio independently. If we start with the default from_signals() function as we had from the previous tutorial, the simulation is run for each symbol independently, which means the account balance is not connected between the various trades executed across symbols
pf_from_signals_v1 = vbt.Portfolio.from_signals( close = mtf_data['m15_close'], entries = mtf_data['entries'], exits = mtf_data['exits'], direction = "both", ## This setting trades both long and short signals freq = pd.Timedelta(minutes=15), init_cash = 100000 ) ## Save portfolio simulation as a pickle file pf_from_signals_v1.save("../vbt_dashboard/data/pf_sim_discrete") ## Load saved portfolio simulation from pickle file pf = vbt.Portfolio.load('../vbt_dashboard/data/pf_sim_discrete') ## View Trading History of pf.simulation pf_trade_history = pf.trade_history print("Unique Symbols:", list(pf_trade_history['Column'].unique()) ) pf_trade_history
Unique Symbols: ['AUDUSD', 'EURGBP', 'EURUSD', 'GBPAUD', 'GBPJPY', 'GBPUSD', 'USDCAD', 'USDJPY']
| Order Id | Column | Signal Index | Creation Index | Fill Index | Side | Type | Stop Type | Size | Price | Fees | PnL | Return | Direction | Status | Entry Trade Id | Exit Trade Id | Position Id | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | AUDUSD | 2019-01-08 16:00:00+00:00 | 2019-01-08 16:00:00+00:00 | 2019-01-08 16:00:00+00:00 | Sell | Market | None | 1.415919e+05 | 0.706255 | 0.0 | -934.506658 | -0.009345 | Short | Closed | 0 | -1 | 0 |
| 2 | 1 | AUDUSD | 2019-01-16 23:45:00+00:00 | 2019-01-16 23:45:00+00:00 | 2019-01-16 23:45:00+00:00 | Buy | Market | None | 1.415919e+05 | 0.712855 | 0.0 | -934.506658 | -0.009345 | Short | Closed | -1 | 0 | 0 |
| 1 | 1 | AUDUSD | 2019-01-16 23:45:00+00:00 | 2019-01-16 23:45:00+00:00 | 2019-01-16 23:45:00+00:00 | Buy | Market | None | 1.389700e+05 | 0.712855 | 0.0 | 1419.579037 | 0.014330 | Long | Closed | 1 | -1 | 1 |
| 4 | 2 | AUDUSD | 2019-01-20 23:00:00+00:00 | 2019-01-20 23:00:00+00:00 | 2019-01-20 23:00:00+00:00 | Sell | Market | None | 1.389700e+05 | 0.723070 | 0.0 | 1419.579037 | 0.014330 | Long | Closed | -1 | 1 | 1 |
| 3 | 2 | AUDUSD | 2019-01-20 23:00:00+00:00 | 2019-01-20 23:00:00+00:00 | 2019-01-20 23:00:00+00:00 | Sell | Market | None | 1.389700e+05 | 0.723070 | 0.0 | 25.014609 | 0.000249 | Short | Closed | 2 | -1 | 2 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 7652 | 497 | USDJPY | 2022-01-09 08:15:00+00:00 | 2022-01-09 08:15:00+00:00 | 2022-01-09 08:15:00+00:00 | Buy | Market | None | 1.246473e+07 | 0.008489 | 0.0 | -223.637166 | -0.002114 | Long | Closed | 497 | -1 | 497 |
| 7655 | 498 | USDJPY | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | Sell | Market | None | 1.246473e+07 | 0.008471 | 0.0 | -223.637166 | -0.002114 | Long | Closed | -1 | 497 | 497 |
| 7654 | 498 | USDJPY | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | Sell | Market | None | 1.246473e+07 | 0.008471 | 0.0 | 125.074237 | 0.001185 | Short | Closed | 498 | -1 | 498 |
| 7657 | 499 | USDJPY | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | Buy | Market | None | 1.246473e+07 | 0.008461 | 0.0 | 125.074237 | 0.001185 | Short | Closed | -1 | 498 | 498 |
| 7656 | 499 | USDJPY | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | Buy | Market | None | 1.249429e+07 | 0.008461 | 0.0 | 1103.518449 | 0.010439 | Long | Open | 499 | -1 | 499 |
7658 rows × 18 columns
## View Portfolio Stats as a dataframe for pf_from_signals_v1 case stats_df = pd.concat([pf.stats()] + [pf[symbol].stats() for symbol in symbols], axis = 1) stats_df.loc['Avg Winning Trade Duration'] = [x.floor('s') for x in stats_df.iloc[21]] stats_df.loc['Avg Losing Trade Duration'] = [x.floor('s') for x in stats_df.iloc[22]] stats_df = stats_df.reset_index() stats_df.rename(inplace = True, columns = {'agg_stats':'Agg_Stats', 'index' : 'Metrics' }) stats_df
/var/folders/v1/9vbsmmyj7ml0rx9r62b_03c80000gn/T/ipykernel_15728/903157885.py:2: UserWarning: Object has multiple columns. Aggregated some metrics using <function mean at 0x1068cc3a0>. Pass column to select a single column/group. stats_df = pd.concat([pf.stats()] + [pf[symbol].stats() for symbol in symbols], axis = 1)
| Metrics | Agg_Stats | AUDUSD | EURGBP | EURUSD | GBPAUD | GBPJPY | GBPUSD | USDCAD | USDJPY | |
|---|---|---|---|---|---|---|---|---|---|---|
| 0 | Start | 2019-01-01 22:00:00+00:00 | 2019-01-01 22:00:00+00:00 | 2019-01-01 22:00:00+00:00 | 2019-01-01 22:00:00+00:00 | 2019-01-01 22:00:00+00:00 | 2019-01-01 22:00:00+00:00 | 2019-01-01 22:00:00+00:00 | 2019-01-01 22:00:00+00:00 | 2019-01-01 22:00:00+00:00 |
| 1 | End | 2023-01-16 06:45:00+00:00 | 2023-01-16 06:45:00+00:00 | 2023-01-16 06:45:00+00:00 | 2023-01-16 06:45:00+00:00 | 2023-01-16 06:45:00+00:00 | 2023-01-16 06:45:00+00:00 | 2023-01-16 06:45:00+00:00 | 2023-01-16 06:45:00+00:00 | 2023-01-16 06:45:00+00:00 |
| 2 | Period | 1475 days 09:00:00 | 1475 days 09:00:00 | 1475 days 09:00:00 | 1475 days 09:00:00 | 1475 days 09:00:00 | 1475 days 09:00:00 | 1475 days 09:00:00 | 1475 days 09:00:00 | 1475 days 09:00:00 |
| 3 | Start Value | 100000.0 | 100000.0 | 100000.0 | 100000.0 | 100000.0 | 100000.0 | 100000.0 | 100000.0 | 100000.0 |
| 4 | Min Value | 91702.349208 | 90045.131523 | 95322.347798 | 98627.645459 | 91206.754921 | 88023.335074 | 85813.084412 | 92669.521096 | 91910.973382 |
| 5 | Max Value | 107556.598818 | 109189.793819 | 106013.882745 | 111388.137845 | 112422.030122 | 107282.960571 | 101280.220766 | 105983.225022 | 106892.539653 |
| 6 | End Value | 100367.541502 | 99956.64516 | 101133.781763 | 104162.736879 | 97718.99519 | 97861.297648 | 91182.110407 | 104107.161632 | 106817.603339 |
| 7 | Total Return [%] | 0.367542 | -0.043355 | 1.133782 | 4.162737 | -2.281005 | -2.138702 | -8.81789 | 4.107162 | 6.817603 |
| 8 | Benchmark Return [%] | -2.566604 | -1.315145 | -2.391998 | -8.042332 | 2.733491 | -3.966899 | -6.427266 | 5.16234 | -6.285025 |
| 9 | Total Time Exposure [%] | 98.93371 | 99.542489 | 98.605581 | 98.738315 | 97.170917 | 98.401536 | 99.260075 | 99.867265 | 99.883504 |
| 10 | Max Gross Exposure [%] | 104.022697 | 102.613517 | 106.638701 | 101.206839 | 106.234838 | 106.373637 | 102.547336 | 105.004417 | 101.56229 |
| 11 | Max Drawdown [%] | 13.50183 | 17.533381 | 10.085033 | 11.455881 | 18.871101 | 15.141241 | 14.760923 | 9.236595 | 10.930482 |
| 12 | Max Drawdown Duration | 801 days 01:26:15 | 809 days 02:00:00 | 1215 days 23:15:00 | 479 days 16:45:00 | 1027 days 17:30:00 | 704 days 01:15:00 | 836 days 23:45:00 | 378 days 07:30:00 | 956 days 15:30:00 |
| 13 | Total Orders | 479.125 | 573 | 251 | 674 | 328 | 176 | 700 | 631 | 500 |
| 14 | Total Fees Paid | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 15 | Total Trades | 479.125 | 573 | 251 | 674 | 328 | 176 | 700 | 631 | 500 |
| 16 | Win Rate [%] | 62.855149 | 61.888112 | 66.4 | 64.041605 | 62.996942 | 58.857143 | 62.088698 | 64.444444 | 62.124248 |
| 17 | Best Trade [%] | 2.868856 | 2.696122 | 3.32835 | 1.69205 | 4.06637 | 4.057935 | 2.662538 | 1.170649 | 3.276834 |
| 18 | Worst Trade [%] | -4.567903 | -7.984396 | -3.586285 | -2.145795 | -6.862178 | -4.234196 | -6.089636 | -2.404781 | -3.235954 |
| 19 | Avg Winning Trade [%] | 0.23967 | 0.212419 | 0.277743 | 0.144288 | 0.361669 | 0.450155 | 0.169079 | 0.121248 | 0.180762 |
| 20 | Avg Losing Trade [%] | -0.392531 | -0.344853 | -0.521055 | -0.24255 | -0.626788 | -0.623333 | -0.314004 | -0.20067 | -0.266994 |
| 21 | Avg Winning Trade Duration | 1 days 19:21:39 | 1 days 04:19:47 | 3 days 02:11:06 | 1 days 08:06:38 | 1 days 19:07:08 | 3 days 09:30:08 | 1 days 06:56:38 | 1 days 02:53:58 | 1 days 05:47:51 |
| 22 | Avg Losing Trade Duration | 4 days 15:07:36 | 2 days 15:40:48 | 7 days 06:38:53 | 3 days 07:28:10 | 4 days 20:41:07 | 8 days 09:34:35 | 3 days 08:03:57 | 3 days 08:51:21 | 3 days 20:01:55 |
| 23 | Profit Factor | 1.027429 | 0.999423 | 1.076212 | 1.067599 | 0.973194 | 1.015027 | 0.884548 | 1.089144 | 1.114286 |
| 24 | Expectancy | 2.699415 | -0.075795 | 13.126114 | 6.124405 | -6.315343 | 3.805774 | -12.908394 | 6.387491 | 11.451072 |
| 25 | Sharpe Ratio | 0.07429 | 0.031206 | 0.074893 | 0.224996 | -0.020954 | -0.029906 | -0.29204 | 0.253851 | 0.352274 |
| 26 | Calmar Ratio | 0.019512 | -0.000612 | 0.027695 | 0.088521 | -0.030163 | -0.035229 | -0.152962 | 0.108346 | 0.150498 |
| 27 | Omega Ratio | 1.00543 | 1.001928 | 1.012664 | 1.012787 | 0.997674 | 0.993371 | 0.981075 | 1.018555 | 1.025383 |
| 28 | Sortino Ratio | 0.108364 | 0.042061 | 0.103337 | 0.328878 | -0.028837 | -0.043598 | -0.432771 | 0.336553 | 0.56129 |
print("Mean Total Return [%] (across cols):", np.round(np.mean(stats_df.iloc[[7]].values.tolist()[0][1:]), 4) ) print("Mean Total Orders (across cols):", np.round(np.mean(stats_df.iloc[[13]].values.tolist()[0][1:]), 4) ) print("Mean Sortino Ratio (across cols):", np.round(np.mean(stats_df.iloc[[28]].values.tolist()[0][1:]), 4) )
Mean Total Return [%] (across cols): 0.3675 Mean Total Orders (across cols): 479.125 Mean Sortino Ratio (across cols): 0.1084
2.) Run potfolio simulation treating the entire portfolio as a singular asset by enabling the following parameters in the pf.from_signals():
cash_sharing = Truegroup_by = Truesize = 100call_seq = "auto"
pf_from_signals_v2 = vbt.Portfolio.from_signals( close = mtf_data['m15_close'], entries = mtf_data['entries'], exits = mtf_data['exits'], direction = "both", ## This setting trades both long and short signals freq = pd.Timedelta(minutes=15), init_cash = "auto", size = 100000, group_by = True, cash_sharing = True, call_seq = "auto" ) ## Save portfolio simulation as a pickle file pf_from_signals_v2.save("../vbt_dashboard/data/pf_sim_single") ## Load portfolio simulation from pickle file pf = vbt.Portfolio.load('../vbt_dashboard/data/pf_sim_single') ## View Trading History of pf.simulation pf_trade_history = pf.trade_history print("Unique Symbols:", list(pf_trade_history['Column'].unique()) ) pf_trade_history
Unique Symbols: ['AUDUSD', 'EURGBP', 'EURUSD', 'GBPAUD', 'GBPJPY', 'GBPUSD', 'USDCAD', 'USDJPY']
| Order Id | Column | Signal Index | Creation Index | Fill Index | Side | Type | Stop Type | Size | Price | Fees | PnL | Return | Direction | Status | Entry Trade Id | Exit Trade Id | Position Id | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | AUDUSD | 2019-01-08 16:00:00+00:00 | 2019-01-08 16:00:00+00:00 | 2019-01-08 16:00:00+00:00 | Sell | Market | None | 100000.0 | 0.706255 | 0.0 | -660.000000 | -0.009345 | Short | Closed | 0 | -1 | 0 |
| 2 | 1 | AUDUSD | 2019-01-16 23:45:00+00:00 | 2019-01-16 23:45:00+00:00 | 2019-01-16 23:45:00+00:00 | Buy | Market | None | 100000.0 | 0.712855 | 0.0 | -660.000000 | -0.009345 | Short | Closed | -1 | 0 | 0 |
| 1 | 1 | AUDUSD | 2019-01-16 23:45:00+00:00 | 2019-01-16 23:45:00+00:00 | 2019-01-16 23:45:00+00:00 | Buy | Market | None | 100000.0 | 0.712855 | 0.0 | 1021.500000 | 0.014330 | Long | Closed | 1 | -1 | 1 |
| 4 | 2 | AUDUSD | 2019-01-20 23:00:00+00:00 | 2019-01-20 23:00:00+00:00 | 2019-01-20 23:00:00+00:00 | Sell | Market | None | 100000.0 | 0.723070 | 0.0 | 1021.500000 | 0.014330 | Long | Closed | -1 | 1 | 1 |
| 3 | 2 | AUDUSD | 2019-01-20 23:00:00+00:00 | 2019-01-20 23:00:00+00:00 | 2019-01-20 23:00:00+00:00 | Sell | Market | None | 100000.0 | 0.723070 | 0.0 | 18.000000 | 0.000249 | Short | Closed | 2 | -1 | 2 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 7652 | 497 | USDJPY | 2022-01-09 08:15:00+00:00 | 2022-01-09 08:15:00+00:00 | 2022-01-09 08:15:00+00:00 | Buy | Market | None | 100000.0 | 0.008489 | 0.0 | -1.794160 | -0.002114 | Long | Closed | 497 | -1 | 497 |
| 7655 | 498 | USDJPY | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | Sell | Market | None | 100000.0 | 0.008471 | 0.0 | -1.794160 | -0.002114 | Long | Closed | -1 | 497 | 497 |
| 7654 | 498 | USDJPY | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | Sell | Market | None | 100000.0 | 0.008471 | 0.0 | 1.003425 | 0.001185 | Short | Closed | 498 | -1 | 498 |
| 7657 | 499 | USDJPY | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | Buy | Market | None | 100000.0 | 0.008461 | 0.0 | 1.003425 | 0.001185 | Short | Closed | -1 | 498 | 498 |
| 7656 | 499 | USDJPY | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | Buy | Market | None | 100000.0 | 0.008461 | 0.0 | 8.832179 | 0.010439 | Long | Open | 499 | -1 | 499 |
7658 rows × 18 columns
## View Portfolio Stats as a dataframe for pf_from_signals_v2 case pf.stats()
Start 2019-01-01 22:00:00+00:00 End 2023-01-16 06:45:00+00:00 Period 1475 days 09:00:00 Start Value 781099.026861 Min Value 751459.25085 Max Value 808290.908182 End Value 778580.017067 Total Return [%] -0.322496 Benchmark Return [%] 0.055682 Total Time Exposure [%] 99.883504 Max Gross Exposure [%] 99.851773 Max Drawdown [%] 4.745308 Max Drawdown Duration 740 days 04:15:00 Total Orders 3833 Total Fees Paid 0.0 Total Trades 3833 Win Rate [%] 63.006536 Best Trade [%] 4.06637 Worst Trade [%] -7.984396 Avg Winning Trade [%] 0.200416 Avg Losing Trade [%] -0.336912 Avg Winning Trade Duration 1 days 12:07:11.327800829 Avg Losing Trade Duration 3 days 22:03:11.201716738 Profit Factor 1.007163 Expectancy 0.858292 Sharpe Ratio -0.015093 Calmar Ratio -0.016834 Omega Ratio 0.999318 Sortino Ratio -0.021298 Name: group, dtype: object
3.) Run portfolio simulation by grouping individual instruments in the portfolio basket into two groups and enabling the following parameters in the pf.from_signals():
cash_sharing = Truegroup_by = 0call_seq = "auto"size = 100000
print("Symbols:",list(pf_from_signals_v2.wrapper.columns)) grp_type = ['USDPairs', 'NonUSDPairs', 'USDPairs', 'NonUSDPairs', 'NonUSDPairs', 'USDPairs', 'USDPairs', 'USDPairs'] unique_grp_types = list(set(grp_type)) print("Group Types:", grp_type) print("Nr. of Unique Groups:", unique_grp_types)
Symbols: ['AUDUSD', 'EURGBP', 'EURUSD', 'GBPAUD', 'GBPJPY', 'GBPUSD', 'USDCAD', 'USDJPY'] Group Types: ['USDPairs', 'NonUSDPairs', 'USDPairs', 'NonUSDPairs', 'NonUSDPairs', 'USDPairs', 'USDPairs', 'USDPairs'] Nr. of Unique Groups: ['USDPairs', 'NonUSDPairs']
def reorder_columns(df, group_by): return df.vbt.stack_index(group_by).sort_index(axis=1, level=0) pf_from_signals_v3 = vbt.Portfolio.from_signals( close = reorder_columns(mtf_data["m15_close"], group_by = grp_type), entries = reorder_columns(mtf_data['entries'], group_by = grp_type), exits = reorder_columns(mtf_data['exits'], group_by = grp_type), direction = "both", ## This setting trades both long and short signals freq = pd.Timedelta(minutes=15), init_cash = "auto", size = 100000, group_by = 0, cash_sharing=True, call_seq="auto" ) ## Save portfolio simulation as a pickle file pf_from_signals_v3.save("../vbt_dashboard/data/pf_sim_grouped") ## Load portfolio simulation from a pickle file pf = vbt.Portfolio.load('../vbt_dashboard/data/pf_sim_grouped')
Here we basically appended grp_type list as the top-most level to the columns of each dataframe, which makes it the first in the hierarchy.
Group-by can accept both level position and level name (which we don't have in this case, since we passed the grp_type list).
Refer the following pandas documentation to understand more hierarchical indexing: https://pandas.pydata.org/docs/user_guide/advanced.html
## View Trading History of pf.simulation pf_trade_history = pf.trade_history print("Unique Symbols:", list(pf_trade_history['Column'].unique()) ) pf_trade_history
Unique Symbols: [('NonUSDPairs', 'EURGBP'), ('NonUSDPairs', 'GBPAUD'), ('NonUSDPairs', 'GBPJPY'), ('USDPairs', 'AUDUSD'), ('USDPairs', 'EURUSD'), ('USDPairs', 'GBPUSD'), ('USDPairs', 'USDCAD'), ('USDPairs', 'USDJPY')]
| Order Id | Column | Signal Index | Creation Index | Fill Index | Side | Type | Stop Type | Size | Price | Fees | PnL | Return | Direction | Status | Entry Trade Id | Exit Trade Id | Position Id | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | (NonUSDPairs, EURGBP) | 2019-01-22 11:45:00+00:00 | 2019-01-22 11:45:00+00:00 | 2019-01-22 11:45:00+00:00 | Buy | Market | None | 100000.0 | 1.139290 | 0.0 | -1946.875030 | -0.017088 | Long | Closed | 0 | -1 | 0 |
| 2 | 1 | (NonUSDPairs, EURGBP) | 2019-01-28 21:00:00+00:00 | 2019-01-28 21:00:00+00:00 | 2019-01-28 21:00:00+00:00 | Sell | Market | None | 100000.0 | 1.119821 | 0.0 | -1946.875030 | -0.017088 | Long | Closed | -1 | 0 | 0 |
| 1 | 1 | (NonUSDPairs, EURGBP) | 2019-01-28 21:00:00+00:00 | 2019-01-28 21:00:00+00:00 | 2019-01-28 21:00:00+00:00 | Sell | Market | None | 100000.0 | 1.119821 | 0.0 | 71.304000 | 0.000637 | Short | Closed | 1 | -1 | 1 |
| 4 | 2 | (NonUSDPairs, EURGBP) | 2019-01-28 21:15:00+00:00 | 2019-01-28 21:15:00+00:00 | 2019-01-28 21:15:00+00:00 | Buy | Market | None | 100000.0 | 1.119108 | 0.0 | 71.304000 | 0.000637 | Short | Closed | -1 | 1 | 1 |
| 3 | 2 | (NonUSDPairs, EURGBP) | 2019-01-28 21:15:00+00:00 | 2019-01-28 21:15:00+00:00 | 2019-01-28 21:15:00+00:00 | Buy | Market | None | 100000.0 | 1.119108 | 0.0 | 102.818190 | 0.000919 | Long | Closed | 2 | -1 | 2 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 7652 | 497 | (USDPairs, USDJPY) | 2022-01-09 08:15:00+00:00 | 2022-01-09 08:15:00+00:00 | 2022-01-09 08:15:00+00:00 | Buy | Market | None | 100000.0 | 0.008489 | 0.0 | -1.794160 | -0.002114 | Long | Closed | 497 | -1 | 497 |
| 7655 | 498 | (USDPairs, USDJPY) | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | Sell | Market | None | 100000.0 | 0.008471 | 0.0 | -1.794160 | -0.002114 | Long | Closed | -1 | 497 | 497 |
| 7654 | 498 | (USDPairs, USDJPY) | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | 2022-01-09 15:45:00+00:00 | Sell | Market | None | 100000.0 | 0.008471 | 0.0 | 1.003425 | 0.001185 | Short | Closed | 498 | -1 | 498 |
| 7657 | 499 | (USDPairs, USDJPY) | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | Buy | Market | None | 100000.0 | 0.008461 | 0.0 | 1.003425 | 0.001185 | Short | Closed | -1 | 498 | 498 |
| 7656 | 499 | (USDPairs, USDJPY) | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | 2022-01-09 16:00:00+00:00 | Buy | Market | None | 100000.0 | 0.008461 | 0.0 | 8.832179 | 0.010439 | Long | Open | 499 | -1 | 499 |
7658 rows × 18 columns
# For pf_from_signals_v3 case # stats_df = pd.concat([pf.stats()] + [pf[grp].stats() for grp in unique_grp_types], axis = 1) stats_df = pd.concat([pf[grp].stats() for grp in unique_grp_types], axis = 1) stats_df.loc['Avg Winning Trade Duration'] = [x.floor('s') for x in stats_df.iloc[21]] stats_df.loc['Avg Losing Trade Duration'] = [x.floor('s') for x in stats_df.iloc[22]] stats_df = stats_df.reset_index() stats_df.rename(inplace = True, columns = {'agg_stats':'Agg_Stats', 'index' : 'Metrics' }) stats_df
| Metrics | USDPairs | NonUSDPairs | |
|---|---|---|---|
| 0 | Start | 2019-01-01 22:00:00+00:00 | 2019-01-01 22:00:00+00:00 |
| 1 | End | 2023-01-16 06:45:00+00:00 | 2023-01-16 06:45:00+00:00 |
| 2 | Period | 1475 days 09:00:00 | 1475 days 09:00:00 |
| 3 | Start Value | 401135.388932 | 382831.001776 |
| 4 | Min Value | 385507.804066 | 366381.544007 |
| 5 | Max Value | 416965.770321 | 401539.384864 |
| 6 | End Value | 398793.379763 | 382654.001151 |
| 7 | Total Return [%] | -0.583845 | -0.046235 |
| 8 | Benchmark Return [%] | -2.182048 | 1.650999 |
| 9 | Total Time Exposure [%] | 99.883504 | 98.605581 |
| 10 | Max Gross Exposure [%] | 100.0 | 100.0 |
| 11 | Max Drawdown [%] | 6.431557 | 7.693616 |
| 12 | Max Drawdown Duration | 604 days 16:15:00 | 740 days 10:15:00 |
| 13 | Total Orders | 3078 | 755 |
| 14 | Total Fees Paid | 0.0 | 0.0 |
| 15 | Total Trades | 3078 | 755 |
| 16 | Win Rate [%] | 62.967784 | 63.164894 |
| 17 | Best Trade [%] | 3.276834 | 4.06637 |
| 18 | Worst Trade [%] | -7.984396 | -6.862178 |
| 19 | Avg Winning Trade [%] | 0.163322 | 0.351527 |
| 20 | Avg Losing Trade [%] | -0.274403 | -0.594505 |
| 21 | Avg Winning Trade Duration | 1 days 05:41:36 | 2 days 14:17:56 |
| 22 | Avg Losing Trade Duration | 3 days 06:57:16 | 6 days 12:16:19 |
| 23 | Profit Factor | 0.989343 | 1.029618 |
| 24 | Expectancy | -0.886256 | 7.987277 |
| 25 | Sharpe Ratio | -0.034462 | 0.020401 |
| 26 | Calmar Ratio | -0.022508 | -0.001487 |
| 27 | Omega Ratio | 0.998725 | 1.001834 |
| 28 | Sortino Ratio | -0.049602 | 0.028554 |