45 Commits

Author SHA1 Message Date
1e5e6a311f toml validation to frontend 2024-03-14 17:36:48 +01:00
8de1356aa8 #163 transferables (#172) 2024-03-14 14:16:01 +01:00
7f47890cad #168 #166 and additional fixes (#169) 2024-03-13 12:31:06 +01:00
8cf1aea2a8 run update 2024-03-07 14:07:46 +01:00
9231c1d273 bugfix - the maxloss check is performed only at the FILL event, when the total amount is known 2024-03-06 15:50:16 +01:00
9391d89aab #148 #158 config refactoring to support profiles/reloading (#165) 2024-03-06 14:30:24 +01:00
9cff5fe6a1 #155 + moved row_to from db.py to transform.py 2024-03-06 13:31:09 +01:00
0e5cf5f3e0 Merge pull request #161 from drew2323/local (Minor changes for installation on windows) 2024-03-04 17:03:50 +01:00
90c33c0528 Delete run.sh 2024-03-04 17:01:47 +01:00
e9e6534d2b primary live account api and secret changed 2024-03-04 16:57:10 +01:00
5874528d23 removed integrity and crossorigin values from line 29 2024-02-28 08:08:21 +01:00
985445d814 user_data_dir function has a second parameter author; ACCOUNT1_LIVE still uses PAPER_API_KEY and SECRET_KEY 2024-02-28 08:04:02 +01:00
6c1f7f0e2e changed VIRTUAL_ENV_DIR and PYTHON_TO_USE 2024-02-27 18:15:35 +01:00
20aaa2ac23 #135 -> BT same period button 2024-02-27 12:03:57 +07:00
691514b102 all dates in gui are in market time zone (even start/stop) 2024-02-27 10:53:30 +07:00
84903aff77 batchprofit/batchcount columns hidden from archiverunners gui 2024-02-27 08:15:07 +07:00
4887e32665 #149 2024-02-26 22:42:03 +07:00
ce99448a48 moved config related services into separated package 2024-02-26 19:35:19 +07:00
887ea0ef00 #147 2024-02-26 11:30:13 +07:00
af7b678699 restored the debug condition 2024-02-24 21:23:17 +07:00
04c63df045 temporarily disabled for testing 2024-02-24 21:17:10 +07:00
ebac207489 #143 2024-02-24 20:32:01 +07:00
9f99ddc86a live_data_feed stored in runner_archive 2024-02-23 21:20:07 +07:00
e75fbc7194 bugfix 2024-02-23 21:04:23 +07:00
c4d05f47ff #139 LIVE_DATA_FEED configuration 2024-02-23 12:35:02 +07:00
f6e31f45f9 #136 bugfix properly closing ws 2024-02-23 10:30:12 +07:00
c42b1c4e1e fix 2024-02-22 23:23:20 +07:00
1bf11d0dc4 fix 2024-02-22 23:20:54 +07:00
1abbb07390 Scheduler support #24sched 2024-02-22 23:05:49 +07:00
b58639454b unknown symbol msg 2024-02-12 10:45:23 +07:00
a7e83fe051 bugfix create batch image (check for None from Alpaca) 2024-02-11 15:26:15 +07:00
6795338eba createbatch image tool + send-to-telegram enrichment 2024-02-11 12:37:19 +07:00
9aa8b58877 updated requirements.txt 2024-02-10 21:35:53 +07:00
eff78e8157 keys to env variables, optimizations 2024-02-10 21:02:00 +07:00
d8bcc4bb8f Merge branch 'master' of https://github.com/drew2323/v2trading 2024-02-06 11:16:58 +07:00
7abdf47545 ok 2024-02-06 11:16:09 +07:00
1f8afef042 calendar wrapper with retry, histo bars with retry 2024-02-06 11:14:38 +07:00
df60d16eb4 Update README.md 2024-02-06 09:52:53 +07:00
535c2824b0 Update README.md 2024-02-06 09:34:33 +07:00
9cf936672d Update README.md 2024-02-06 09:30:56 +07:00
c1ad713a12 bugfix None in trade response 2024-02-05 10:22:20 +07:00
e9bb8b84ec fixes 2024-02-04 17:55:43 +07:00
603736d441 Merge branch 'master' of https://github.com/drew2323/v2trading 2024-02-04 17:54:09 +07:00
2c968691d1 Update README.md 2024-01-31 13:39:33 +07:00
435b4d899a Create README.md 2024-01-31 13:37:45 +07:00
107 changed files with 955 additions and 1832739 deletions

View File

@@ -1,9 +1,9 @@
# V2TRADING - Advanced Algorithmic Trading Platform
**README - V2TRADING - Advanced Algorithmic Trading Platform**
## Overview
A custom-built algorithmic trading platform for research, backtesting and live trading. Its trading engine processes tick data, provides custom aggregation, manages trades, and supports backtesting in a highly accurate and efficient manner.
**Overview**
A custom-built algorithmic trading platform for research, backtesting and automated trading. Its trading engine processes tick data, manages trades, and supports backtesting in a highly accurate and efficient manner.
## Key Features
**Key Features**
- **Trading Engine**: At the core of the platform is a trading engine that processes tick data in real time. This engine is responsible for aggregating data and managing the execution of trades, ensuring precision and speed in trade placement and execution.
- **High-Fidelity Backtesting Environment**: the ability to backtest strategies with 1:1 precision, meaning tick-by-tick backtesting. This level of precision, down to millisecond accuracy, mirrors live trading environments and is vital for developing and testing high-frequency trading strategies.
@@ -51,78 +51,3 @@ This repository represents a sophisticated and evolving tool for algorithmic tra
</p>
# Installation Instructions
This document outlines the steps for installing and setting up the necessary environment for the application. These instructions are applicable for both Windows and Linux operating systems. Please follow the steps carefully to ensure a smooth setup.
## Prerequisites
Before beginning the installation process, ensure the following prerequisites are met:
- TA-Lib Library:
- Windows: Download and build the TA-Lib library. Install Visual Studio Community with the Visual C++ feature. Navigate to `C:\ta-lib\c\make\cdr\win32\msvc` in the command prompt and build the library using the available makefile.
- Linux: Install TA-Lib using your distribution's package manager or compile from source following the instructions available on the TA-Lib GitHub repository.
- Alpaca Paper Trading Account: Create an account at [Alpaca Markets](https://alpaca.markets/) and generate `API_KEY` and `SECRET_KEY` for your paper trading account.
## Installation Steps
**Clone the Repository:** Clone the remote repository to your local machine.
`git clone git@github.com:drew2323/v2trading.git <name_of_local_folder>`
**Install Python:** Ensure Python 3.10.11 is installed on your system.
**Create a Virtual Environment:** Set up a Python virtual environment.
`python -m venv <path_to_venv_folder>`
**Activate Virtual Environment:**
- Windows: `source ./<venv_folder>/Scripts/activate`
- Linux: `source ./<venv_folder>/bin/activate`
**Install Dependencies:** Install the program requirements.
`pip install -r requirements.txt`
Note: It's permissible to comment out references to the `keras` and `tensorflow` modules, as well as the `mlroom` repository, in `requirements.txt`.
**Environment Variables:** In `run.sh`, modify the `VIRTUAL_ENV_DIR` and `PYTHON_TO_USE` variables as necessary.
**Data Directory:** Navigate to `DATA_DIR` and create folders: `aggcache`, `tradecache`, and `models`.
**Media and Static Folders:** Create `media` and `static` folders one level above the repository directory. Also create `.env` file there.
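The folder setup in the two steps above can be scripted. The snippet below is a minimal sketch using a `<path_to_DATA_DIR>` placeholder (not part of the repository), and it assumes you run it from inside the cloned repository directory; adjust the paths to your layout.
```
from pathlib import Path

DATA_DIR = Path("<path_to_DATA_DIR>")  # replace with your actual DATA_DIR location

# cache and model folders inside DATA_DIR
for sub in ("aggcache", "tradecache", "models"):
    (DATA_DIR / sub).mkdir(parents=True, exist_ok=True)

# media/, static/ and .env one level above the repository directory
parent = Path.cwd().parent  # assumes the current directory is the cloned repository
(parent / "media").mkdir(exist_ok=True)
(parent / "static").mkdir(exist_ok=True)
(parent / ".env").touch(exist_ok=True)
```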
**Database Setup:** Create the `v2trading.db` file using SQL commands from `v2trading_create_db.sql`.
```
import sqlite3
with open("v2trading_create_db.sql", "r") as f:
    sql_statements = f.read()
conn = sqlite3.connect('v2trading.db')
cursor = conn.cursor()
cursor.executescript(sql_statements)
conn.commit()
conn.close()
```
Ensure the `config_table` is not empty by making an initial entry.
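If you are unsure what an initial entry should contain, you can first inspect the schema created by `v2trading_create_db.sql`. The sketch below only reads the column definitions and the current row count; it does not assume any particular column names.
```
import sqlite3

conn = sqlite3.connect('v2trading.db')
cursor = conn.cursor()
# columns of config_table as created by v2trading_create_db.sql
for _cid, name, coltype, *_rest in cursor.execute("PRAGMA table_info(config_table)"):
    print(name, coltype)
# how many rows the table currently holds
print("rows:", cursor.execute("SELECT COUNT(*) FROM config_table").fetchone()[0])
conn.close()
```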
**Start the Application:** Run `main.py` in VSCode to start the application.
**Accessing the Application:** If the uvicorn server runs successfully at `http://0.0.0.0:8000`, access the application at `http://localhost:8000/static/`.
**Database Configuration:** Add dynamic button and JS configurations to the `config_table` in `v2trading.db` via the "Config" section on the main page.
Please replace placeholders (e.g., `<name_of_local_folder>`, `<path_to_venv_folder>`) with your actual paths and details. Follow these instructions to ensure the application is set up correctly and ready for use.
## Environment variables
The trading platform can support N different accounts. Their API keys are stored as environment variables in a `.env` file located in the root directory.
The account used for the trading API is selected when each strategy is run. However, for realtime websocket data, ACCOUNT1 is always used for all strategies. The data feed selection (iex vs sip) is set by the LIVE_DATA_FEED environment variable.
The `.env` file should contain:
```
ACCOUNT1_LIVE_API_KEY=<ACCOUNT1_LIVE_API_KEY>
ACCOUNT1_LIVE_SECRET_KEY=<ACCOUNT1_LIVE_SECRET_KEY>
ACCOUNT1_LIVE_FEED=sip
ACCOUNT1_PAPER_API_KEY=<ACCOUNT1_PAPER_API_KEY>
ACCOUNT1_PAPER_SECRET_KEY=<ACCOUNT1_PAPER_SECRET_KEY>
ACCOUNT1_PAPER_FEED=sip
ACCOUNT2_PAPER_API_KEY=<ACCOUNT2_PAPER_API_KEY>
ACCOUNT2_PAPER_SECRET_KEY=<ACCOUNT2_PAPER_SECRET_KEY>
ACCOUNT2_PAPER_FEED=iex
WEB_API_KEY=<pass-for-webapi>
```
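As a rough sketch of how these variables can be consumed, the snippet below uses the `python-dotenv` package (pinned in `requirements.txt`) together with `os.getenv`; the project's own config module may load them differently.
```
import os
from dotenv import load_dotenv

load_dotenv()  # loads variables from the nearest .env file into the process environment

api_key = os.getenv("ACCOUNT1_PAPER_API_KEY")
secret_key = os.getenv("ACCOUNT1_PAPER_SECRET_KEY")
feed = os.getenv("ACCOUNT1_PAPER_FEED", "iex")  # default to iex when unset

if api_key is None or secret_key is None:
    raise RuntimeError("ACCOUNT1 paper keys are missing from the environment")
```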

View File

@@ -1,34 +1,21 @@
absl-py==2.0.0
alpaca==1.0.0
alpaca-py==0.18.1
alpaca-py==0.7.1
altair==4.2.2
annotated-types==0.6.0
anyio==3.6.2
appdirs==1.4.4
appnope==0.1.3
APScheduler==3.10.4
argon2-cffi==23.1.0
argon2-cffi-bindings==21.2.0
arrow==1.3.0
asttokens==2.2.1
astunparse==1.6.3
async-lru==2.0.4
attrs==22.2.0
Babel==2.15.0
beautifulsoup4==4.12.3
better-exceptions==0.3.3
bleach==6.0.0
blinker==1.5
bottle==0.12.25
cachetools==5.3.0
CD==1.1.0
certifi==2022.12.7
cffi==1.16.0
chardet==5.1.0
charset-normalizer==3.0.1
click==8.1.3
colorama==0.4.6
comm==0.1.4
contourpy==1.0.7
cycler==0.11.0
dash==2.9.1
@@ -36,189 +23,90 @@ dash-bootstrap-components==1.4.1
dash-core-components==2.0.0
dash-html-components==2.0.0
dash-table==5.0.0
dateparser==1.1.8
debugpy==1.8.1
decorator==5.1.1
defusedxml==0.7.1
dill==0.3.7
dm-tree==0.1.8
entrypoints==0.4
exceptiongroup==1.1.3
executing==1.2.0
fastapi==0.109.2
fastjsonschema==2.19.1
filelock==3.13.1
fastapi==0.95.0
Flask==2.2.3
flatbuffers==23.5.26
fonttools==4.39.0
fpdf2==2.7.6
fqdn==1.5.1
gast==0.4.0
gitdb==4.0.10
GitPython==3.1.31
google-auth==2.23.0
google-auth-oauthlib==1.0.0
google-pasta==0.2.0
greenlet==3.0.3
grpcio==1.58.0
h11==0.14.0
h5py==3.10.0
html2text==2024.2.26
httpcore==1.0.5
httpx==0.27.0
humanize==4.9.0
h5py==3.9.0
icecream==2.1.3
idna==3.4
imageio==2.31.6
importlib-metadata==6.1.0
ipykernel==6.29.4
ipython==8.17.2
ipywidgets==8.1.1
isoduration==20.11.0
itables==2.0.1
itsdangerous==2.1.2
jax==0.4.23
jaxlib==0.4.23
jedi==0.19.1
Jinja2==3.1.2
joblib==1.3.2
json5==0.9.25
jsonpointer==2.4
jsonschema==4.22.0
jsonschema-specifications==2023.12.1
jupyter-events==0.10.0
jupyter-lsp==2.2.5
jupyter_client==8.6.1
jupyter_core==5.7.2
jupyter_server==2.14.0
jupyter_server_terminals==0.5.3
jupyterlab==4.1.8
jupyterlab-widgets==3.0.9
jupyterlab_pygments==0.3.0
jupyterlab_server==2.27.1
kaleido==0.2.1
keras==3.0.2
keras-core==0.1.7
keras-nightly==3.0.3.dev2024010203
keras-nlp-nightly==0.7.0.dev2024010203
keras-tcn @ git+https://github.com/drew2323/keras-tcn.git@4bddb17a02cb2f31c9fe2e8f616b357b1ddb0e11
jsonschema==4.17.3
keras==2.13.1
kiwisolver==1.4.4
libclang==16.0.6
lightweight-charts @ git+https://github.com/drew2323/lightweight-charts-python@10fd42f785182edfbf6b46a19a4ef66e85985a23
llvmlite==0.39.1
Markdown==3.4.3
markdown-it-py==2.2.0
MarkupSafe==2.1.2
matplotlib==3.8.2
matplotlib-inline==0.1.6
matplotlib==3.7.1
mdurl==0.1.2
mistune==3.0.2
ml-dtypes==0.3.1
mlroom @ git+https://github.com/drew2323/mlroom.git@692900e274c4e0542d945d231645c270fc508437
mplfinance==0.12.10b0
msgpack==1.0.4
mypy-extensions==1.0.0
namex==0.0.7
nbclient==0.10.0
nbconvert==7.16.4
nbformat==5.10.4
nest-asyncio==1.6.0
newtulipy==0.4.6
notebook_shim==0.2.4
numba==0.56.4
numpy==1.23.5
numpy==1.24.2
oauthlib==3.2.2
opt-einsum==3.3.0
orjson==3.9.10
overrides==7.7.0
packaging==23.0
pandas==2.2.1
pandocfilters==1.5.1
pandas==1.5.3
param==1.13.0
parso==0.8.3
patsy==0.5.6
pexpect==4.8.0
Pillow==9.4.0
platformdirs==4.2.0
plotly==5.22.0
prometheus_client==0.20.0
prompt-toolkit==3.0.39
plotly==5.13.1
proto-plus==1.22.2
protobuf==3.20.3
proxy-tools==0.1.0
psutil==5.9.8
ptyprocess==0.7.0
pure-eval==0.2.2
pyarrow==11.0.0
pyasn1==0.4.8
pyasn1-modules==0.2.8
pycparser==2.22
pyct==0.5.0
pydantic==2.6.4
pydantic_core==2.16.3
pydantic==1.10.5
pydeck==0.8.0
Pygments==2.14.0
pyinstrument==4.5.3
Pympler==1.0.1
pyobjc-core==10.3
pyobjc-framework-Cocoa==10.3
pyobjc-framework-Security==10.3
pyobjc-framework-WebKit==10.3
pyparsing==3.0.9
pyrsistent==0.19.3
pysos==1.3.0
python-dateutil==2.8.2
python-dotenv==1.0.0
python-json-logger==2.0.7
python-multipart==0.0.6
pytz==2022.7.1
pytz-deprecation-shim==0.1.0.post0
pyviz-comms==2.2.1
PyWavelets==1.5.0
pywebview==5.1
PyYAML==6.0
pyzmq==25.1.2
referencing==0.35.1
regex==2023.10.3
requests==2.31.0
requests-oauthlib==1.3.1
rfc3339-validator==0.1.4
rfc3986-validator==0.1.1
rich==13.3.1
rpds-py==0.18.0
rsa==4.9
schedule==1.2.1
scikit-learn==1.3.2
scikit-learn==1.3.1
scipy==1.11.2
seaborn==0.12.2
semver==2.13.0
Send2Trash==1.8.3
six==1.16.0
smmap==5.0.0
sniffio==1.3.0
soupsieve==2.5
SQLAlchemy==2.0.27
sseclient-py==1.7.2
stack-data==0.6.3
starlette==0.36.3
statsmodels==0.14.1
starlette==0.26.1
streamlit==1.20.0
structlog==23.1.0
TA-Lib==0.4.28
tb-nightly==2.16.0a20240102
tenacity==8.2.2
tensorboard==2.15.1
tensorboard==2.13.0
tensorboard-data-server==0.7.1
tensorflow-addons==0.23.0
tensorflow-estimator==2.15.0
tensorflow==2.13.0
tensorflow-estimator==2.13.0
tensorflow-io-gcs-filesystem==0.34.0
termcolor==2.3.0
terminado==0.18.1
tf-estimator-nightly==2.14.0.dev2023080308
tf-nightly==2.16.0.dev20240101
tf_keras-nightly==2.16.0.dev2023123010
threadpoolctl==3.2.0
tinycss2==1.3.0
tinydb==4.7.1
tinydb-serialization==2.1.0
tinyflux==0.4.0
@@ -227,24 +115,15 @@ tomli==2.0.1
toolz==0.12.0
tornado==6.2
tqdm==4.65.0
traitlets==5.13.0
typeguard==2.13.3
types-python-dateutil==2.9.0.20240316
typing_extensions==4.9.0
typing_extensions==4.5.0
tzdata==2023.2
tzlocal==4.3
uri-template==1.3.0
urllib3==1.26.14
uvicorn==0.21.1
-e git+https://github.com/drew2323/v2trading.git@1f85b271dba2b9baf2c61b591a08849e9d684374#egg=v2realbot
#-e git+https://github.com/drew2323/v2trading.git@940348412f67ecd551ef8d0aaedf84452abf1320#egg=v2realbot
validators==0.20.0
vectorbtpro @ file:///Users/davidbrazda/Downloads/vectorbt.pro-2024.2.22
wcwidth==0.2.9
webcolors==1.13
webencodings==0.5.1
websocket-client==1.7.0
websockets==11.0.3
websockets==10.4
Werkzeug==2.2.3
widgetsnbextension==4.0.9
wrapt==1.14.1
wrapt==1.15.0
zipp==3.15.0

View File

@@ -1,243 +0,0 @@
absl-py
alpaca
alpaca-py
altair
annotated-types
anyio
appdirs
appnope
APScheduler
argon2-cffi
argon2-cffi-bindings
arrow
asttokens
astunparse
async-lru
attrs
Babel
beautifulsoup4
better-exceptions
bleach
blinker
bottle
cachetools
CD
certifi
cffi
chardet
charset-normalizer
click
colorama
comm
contourpy
cycler
dash
dash-bootstrap-components
dash-core-components
dash-html-components
dash-table
dateparser
debugpy
decorator
defusedxml
dill
dm-tree
entrypoints
exceptiongroup
executing
fastapi
fastjsonschema
filelock
Flask
flatbuffers
fonttools
fpdf2
fqdn
gast
gitdb
GitPython
google-auth
google-auth-oauthlib
google-pasta
greenlet
grpcio
h11
h5py
html2text
httpcore
httpx
humanize
icecream
idna
imageio
importlib-metadata
ipykernel
ipython
ipywidgets
isoduration
itables
itsdangerous
jax
jaxlib
jedi
Jinja2
joblib
json5
jsonpointer
jsonschema
jsonschema-specifications
jupyter-events
jupyter-lsp
jupyter_client
jupyter_core
jupyter_server
jupyter_server_terminals
jupyterlab
jupyterlab-widgets
jupyterlab_pygments
jupyterlab_server
kaleido
keras
keras-core
keras-nightly
keras-nlp-nightly
keras-tcn @ git+https://github.com/drew2323/keras-tcn.git
kiwisolver
libclang
lightweight-charts @ git+https://github.com/drew2323/lightweight-charts-python.git
llvmlite
Markdown
markdown-it-py
MarkupSafe
matplotlib
matplotlib-inline
mdurl
mistune
ml-dtypes
mlroom @ git+https://github.com/drew2323/mlroom.git
mplfinance
msgpack
mypy-extensions
namex
nbclient
nbconvert
nbformat
nest-asyncio
newtulipy
notebook_shim
numba
numpy
oauthlib
opt-einsum
orjson
overrides
packaging
pandas
pandocfilters
param
parso
patsy
pexpect
Pillow
platformdirs
plotly
prometheus_client
prompt-toolkit
proto-plus
protobuf
proxy-tools
psutil
ptyprocess
pure-eval
pyarrow
pyasn1
pyasn1-modules
pycparser
pyct
pydantic
pydantic_core
pydeck
Pygments
pyinstrument
pyparsing
pyrsistent
pysos
python-dateutil
python-dotenv
python-json-logger
python-multipart
pytz
pytz-deprecation-shim
pyviz-comms
PyWavelets
pywebview
PyYAML
pyzmq
referencing
regex
requests
requests-oauthlib
rfc3339-validator
rfc3986-validator
rich
rpds-py
rsa
schedule
scikit-learn
scipy
seaborn
semver
Send2Trash
six
smmap
sniffio
soupsieve
SQLAlchemy
sseclient-py
stack-data
starlette
statsmodels
streamlit
structlog
TA-Lib
tb-nightly
tenacity
tensorboard
tensorboard-data-server
tensorflow-addons
tensorflow-estimator
tensorflow-io-gcs-filesystem
termcolor
terminado
tf-estimator-nightly
tf-nightly
tf_keras-nightly
threadpoolctl
tinycss2
tinydb
tinydb-serialization
tinyflux
toml
tomli
toolz
tornado
tqdm
traitlets
typeguard
types-python-dateutil
typing_extensions
tzdata
tzlocal
uri-template
urllib3
uvicorn
validators
wcwidth
webcolors
webencodings
websocket-client
websockets
Werkzeug
widgetsnbextension
wrapt
zipp

File diff suppressed because it is too large

View File

@@ -1,410 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Loading trades and vectorized aggregation\n",
"Describes how to fetch trades (remote/cached) and use new vectorized aggregation to aggregate bars of given type (time, volume, dollar) and resolution\n",
"\n",
"`fetch_trades_parallel` enables to fetch trades of given symbol and interval, also can filter conditions and minimum size. return `trades_df`\n",
"`aggregate_trades` acceptss `trades_df` and ressolution and type of bars (VOLUME, TIME, DOLLAR) and return aggregated ohlcv dataframe `ohlcv_df`"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">Activating profile profile1\n",
"</pre>\n"
],
"text/plain": [
"Activating profile profile1\n"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"trades_df-BAC-2024-01-11T09:30:00-2024-01-12T16:00:00.parquet\n",
"trades_df-SPY-2024-01-01T09:30:00-2024-05-14T16:00:00.parquet\n",
"ohlcv_df-BAC-2024-01-11T09:30:00-2024-01-12T16:00:00.parquet\n",
"ohlcv_df-SPY-2024-01-01T09:30:00-2024-05-14T16:00:00.parquet\n"
]
}
],
"source": [
"import pandas as pd\n",
"import numpy as np\n",
"from numba import jit\n",
"from alpaca.data.historical import StockHistoricalDataClient\n",
"from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR\n",
"from alpaca.data.requests import StockTradesRequest\n",
"from v2realbot.enums.enums import BarType\n",
"import time\n",
"from datetime import datetime\n",
"from v2realbot.utils.utils import parse_alpaca_timestamp, ltp, zoneNY, send_to_telegram, fetch_calendar_data\n",
"import pyarrow\n",
"from v2realbot.loader.aggregator_vectorized import fetch_daily_stock_trades, fetch_trades_parallel, generate_time_bars_nb, aggregate_trades\n",
"import vectorbtpro as vbt\n",
"import v2realbot.utils.config_handler as cfh\n",
"\n",
"vbt.settings.set_theme(\"dark\")\n",
"vbt.settings['plotting']['layout']['width'] = 1280\n",
"vbt.settings.plotting.auto_rangebreaks = True\n",
"# Set the option to display with pagination\n",
"pd.set_option('display.notebook_repr_html', True)\n",
"pd.set_option('display.max_rows', 20) # Number of rows per page\n",
"# pd.set_option('display.float_format', '{:.9f}'.format)\n",
"\n",
"\n",
"#trade filtering\n",
"exclude_conditions = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES') #standard ['C','O','4','B','7','V','P','W','U','Z','F']\n",
"minsize = 100\n",
"\n",
"symbol = \"SPY\"\n",
"#datetime in zoneNY \n",
"day_start = datetime(2024, 1, 1, 9, 30, 0)\n",
"day_stop = datetime(2024, 1, 14, 16, 00, 0)\n",
"day_start = zoneNY.localize(day_start)\n",
"day_stop = zoneNY.localize(day_stop)\n",
"#filename of trades_df parquet, date are in isoformat but without time zone part\n",
"dir = DATA_DIR + \"/notebooks/\"\n",
"#parquet interval cache contains exclude conditions and minsize filtering\n",
"file_trades = dir + f\"trades_df-{symbol}-{day_start.strftime('%Y-%m-%dT%H:%M:%S')}-{day_stop.strftime('%Y-%m-%dT%H:%M:%S')}-{exclude_conditions}-{minsize}.parquet\"\n",
"#file_trades = dir + f\"trades_df-{symbol}-{day_start.strftime('%Y-%m-%dT%H:%M:%S')}-{day_stop.strftime('%Y-%m-%dT%H:%M:%S')}.parquet\"\n",
"file_ohlcv = dir + f\"ohlcv_df-{symbol}-{day_start.strftime('%Y-%m-%dT%H:%M:%S')}-{day_stop.strftime('%Y-%m-%dT%H:%M:%S')}-{exclude_conditions}-{minsize}.parquet\"\n",
"\n",
"#PRINT all parquet in directory\n",
"import os\n",
"files = [f for f in os.listdir(dir) if f.endswith(\".parquet\")]\n",
"for f in files:\n",
" print(f)"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"NOT FOUND. Fetching from remote\n"
]
},
{
"ename": "KeyboardInterrupt",
"evalue": "",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[2], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m trades_df \u001b[38;5;241m=\u001b[39m \u001b[43mfetch_daily_stock_trades\u001b[49m\u001b[43m(\u001b[49m\u001b[43msymbol\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mday_start\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mday_stop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mexclude_conditions\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mexclude_conditions\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mminsize\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mminsize\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mforce_remote\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mmax_retries\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;241;43m5\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mbackoff_factor\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;241;43m1\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[1;32m 2\u001b[0m trades_df\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/v2realbot/loader/aggregator_vectorized.py:200\u001b[0m, in \u001b[0;36mfetch_daily_stock_trades\u001b[0;34m(symbol, start, end, exclude_conditions, minsize, force_remote, max_retries, backoff_factor)\u001b[0m\n\u001b[1;32m 198\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m attempt \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mrange\u001b[39m(max_retries):\n\u001b[1;32m 199\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 200\u001b[0m tradesResponse \u001b[38;5;241m=\u001b[39m \u001b[43mclient\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget_stock_trades\u001b[49m\u001b[43m(\u001b[49m\u001b[43mstockTradeRequest\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 201\u001b[0m is_empty \u001b[38;5;241m=\u001b[39m \u001b[38;5;129;01mnot\u001b[39;00m tradesResponse[symbol]\n\u001b[1;32m 202\u001b[0m \u001b[38;5;28mprint\u001b[39m(\u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mRemote fetched: \u001b[39m\u001b[38;5;132;01m{\u001b[39;00mis_empty\u001b[38;5;132;01m=}\u001b[39;00m\u001b[38;5;124m\"\u001b[39m, start, end)\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/alpaca/data/historical/stock.py:144\u001b[0m, in \u001b[0;36mStockHistoricalDataClient.get_stock_trades\u001b[0;34m(self, request_params)\u001b[0m\n\u001b[1;32m 141\u001b[0m params \u001b[38;5;241m=\u001b[39m request_params\u001b[38;5;241m.\u001b[39mto_request_fields()\n\u001b[1;32m 143\u001b[0m \u001b[38;5;66;03m# paginated get request for market data api\u001b[39;00m\n\u001b[0;32m--> 144\u001b[0m raw_trades \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_data_get\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 145\u001b[0m \u001b[43m \u001b[49m\u001b[43mendpoint_data_type\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mtrades\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[1;32m 146\u001b[0m \u001b[43m \u001b[49m\u001b[43mendpoint_asset_class\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mstocks\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[1;32m 147\u001b[0m \u001b[43m \u001b[49m\u001b[43mapi_version\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mv2\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[1;32m 148\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mparams\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 149\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 151\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_use_raw_data:\n\u001b[1;32m 152\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m raw_trades\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/alpaca/data/historical/stock.py:338\u001b[0m, in \u001b[0;36mStockHistoricalDataClient._data_get\u001b[0;34m(self, endpoint_asset_class, endpoint_data_type, api_version, symbol_or_symbols, limit, page_limit, extension, **kwargs)\u001b[0m\n\u001b[1;32m 335\u001b[0m params[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mlimit\u001b[39m\u001b[38;5;124m\"\u001b[39m] \u001b[38;5;241m=\u001b[39m actual_limit\n\u001b[1;32m 336\u001b[0m params[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mpage_token\u001b[39m\u001b[38;5;124m\"\u001b[39m] \u001b[38;5;241m=\u001b[39m page_token\n\u001b[0;32m--> 338\u001b[0m response \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[43mpath\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mpath\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mdata\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mparams\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mapi_version\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mapi_version\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 340\u001b[0m \u001b[38;5;66;03m# TODO: Merge parsing if possible\u001b[39;00m\n\u001b[1;32m 341\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m extension \u001b[38;5;241m==\u001b[39m DataExtensionType\u001b[38;5;241m.\u001b[39mSNAPSHOT:\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/alpaca/common/rest.py:221\u001b[0m, in \u001b[0;36mRESTClient.get\u001b[0;34m(self, path, data, **kwargs)\u001b[0m\n\u001b[1;32m 210\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mget\u001b[39m(\u001b[38;5;28mself\u001b[39m, path: \u001b[38;5;28mstr\u001b[39m, data: Union[\u001b[38;5;28mdict\u001b[39m, \u001b[38;5;28mstr\u001b[39m] \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m HTTPResult:\n\u001b[1;32m 211\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Performs a single GET request\u001b[39;00m\n\u001b[1;32m 212\u001b[0m \n\u001b[1;32m 213\u001b[0m \u001b[38;5;124;03m Args:\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 219\u001b[0m \u001b[38;5;124;03m dict: The response\u001b[39;00m\n\u001b[1;32m 220\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 221\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_request\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mGET\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mpath\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mdata\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/alpaca/common/rest.py:129\u001b[0m, in \u001b[0;36mRESTClient._request\u001b[0;34m(self, method, path, data, base_url, api_version)\u001b[0m\n\u001b[1;32m 127\u001b[0m \u001b[38;5;28;01mwhile\u001b[39;00m retry \u001b[38;5;241m>\u001b[39m\u001b[38;5;241m=\u001b[39m \u001b[38;5;241m0\u001b[39m:\n\u001b[1;32m 128\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 129\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_one_request\u001b[49m\u001b[43m(\u001b[49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mopts\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mretry\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 130\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m RetryException:\n\u001b[1;32m 131\u001b[0m time\u001b[38;5;241m.\u001b[39msleep(\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_retry_wait)\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/alpaca/common/rest.py:193\u001b[0m, in \u001b[0;36mRESTClient._one_request\u001b[0;34m(self, method, url, opts, retry)\u001b[0m\n\u001b[1;32m 174\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_one_request\u001b[39m(\u001b[38;5;28mself\u001b[39m, method: \u001b[38;5;28mstr\u001b[39m, url: \u001b[38;5;28mstr\u001b[39m, opts: \u001b[38;5;28mdict\u001b[39m, retry: \u001b[38;5;28mint\u001b[39m) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m \u001b[38;5;28mdict\u001b[39m:\n\u001b[1;32m 175\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Perform one request, possibly raising RetryException in the case\u001b[39;00m\n\u001b[1;32m 176\u001b[0m \u001b[38;5;124;03m the response is 429. Otherwise, if error text contain \"code\" string,\u001b[39;00m\n\u001b[1;32m 177\u001b[0m \u001b[38;5;124;03m then it decodes to json object and returns APIError.\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 191\u001b[0m \u001b[38;5;124;03m dict: The response data\u001b[39;00m\n\u001b[1;32m 192\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 193\u001b[0m response \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_session\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[43m(\u001b[49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mopts\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 195\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 196\u001b[0m response\u001b[38;5;241m.\u001b[39mraise_for_status()\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/requests/sessions.py:589\u001b[0m, in \u001b[0;36mSession.request\u001b[0;34m(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)\u001b[0m\n\u001b[1;32m 584\u001b[0m send_kwargs \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 585\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mtimeout\u001b[39m\u001b[38;5;124m\"\u001b[39m: timeout,\n\u001b[1;32m 586\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mallow_redirects\u001b[39m\u001b[38;5;124m\"\u001b[39m: allow_redirects,\n\u001b[1;32m 587\u001b[0m }\n\u001b[1;32m 588\u001b[0m send_kwargs\u001b[38;5;241m.\u001b[39mupdate(settings)\n\u001b[0;32m--> 589\u001b[0m resp \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43msend\u001b[49m\u001b[43m(\u001b[49m\u001b[43mprep\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43msend_kwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 591\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m resp\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/requests/sessions.py:703\u001b[0m, in \u001b[0;36mSession.send\u001b[0;34m(self, request, **kwargs)\u001b[0m\n\u001b[1;32m 700\u001b[0m start \u001b[38;5;241m=\u001b[39m preferred_clock()\n\u001b[1;32m 702\u001b[0m \u001b[38;5;66;03m# Send the request\u001b[39;00m\n\u001b[0;32m--> 703\u001b[0m r \u001b[38;5;241m=\u001b[39m \u001b[43madapter\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43msend\u001b[49m\u001b[43m(\u001b[49m\u001b[43mrequest\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 705\u001b[0m \u001b[38;5;66;03m# Total elapsed time of the request (approximately)\u001b[39;00m\n\u001b[1;32m 706\u001b[0m elapsed \u001b[38;5;241m=\u001b[39m preferred_clock() \u001b[38;5;241m-\u001b[39m start\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/requests/adapters.py:486\u001b[0m, in \u001b[0;36mHTTPAdapter.send\u001b[0;34m(self, request, stream, timeout, verify, cert, proxies)\u001b[0m\n\u001b[1;32m 483\u001b[0m timeout \u001b[38;5;241m=\u001b[39m TimeoutSauce(connect\u001b[38;5;241m=\u001b[39mtimeout, read\u001b[38;5;241m=\u001b[39mtimeout)\n\u001b[1;32m 485\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 486\u001b[0m resp \u001b[38;5;241m=\u001b[39m \u001b[43mconn\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43murlopen\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 487\u001b[0m \u001b[43m \u001b[49m\u001b[43mmethod\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 488\u001b[0m \u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 489\u001b[0m \u001b[43m \u001b[49m\u001b[43mbody\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbody\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 490\u001b[0m \u001b[43m \u001b[49m\u001b[43mheaders\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mheaders\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 491\u001b[0m \u001b[43m \u001b[49m\u001b[43mredirect\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 492\u001b[0m \u001b[43m \u001b[49m\u001b[43massert_same_host\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 493\u001b[0m \u001b[43m \u001b[49m\u001b[43mpreload_content\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 494\u001b[0m \u001b[43m \u001b[49m\u001b[43mdecode_content\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 495\u001b[0m \u001b[43m \u001b[49m\u001b[43mretries\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mmax_retries\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 496\u001b[0m \u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mtimeout\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 497\u001b[0m \u001b[43m \u001b[49m\u001b[43mchunked\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mchunked\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 498\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 500\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m (ProtocolError, \u001b[38;5;167;01mOSError\u001b[39;00m) \u001b[38;5;28;01mas\u001b[39;00m err:\n\u001b[1;32m 501\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mConnectionError\u001b[39;00m(err, request\u001b[38;5;241m=\u001b[39mrequest)\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/urllib3/connectionpool.py:703\u001b[0m, in \u001b[0;36mHTTPConnectionPool.urlopen\u001b[0;34m(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)\u001b[0m\n\u001b[1;32m 700\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_prepare_proxy(conn)\n\u001b[1;32m 702\u001b[0m \u001b[38;5;66;03m# Make the request on the httplib connection object.\u001b[39;00m\n\u001b[0;32m--> 703\u001b[0m httplib_response \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_make_request\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 704\u001b[0m \u001b[43m \u001b[49m\u001b[43mconn\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 705\u001b[0m \u001b[43m \u001b[49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 706\u001b[0m \u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 707\u001b[0m \u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mtimeout_obj\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 708\u001b[0m \u001b[43m \u001b[49m\u001b[43mbody\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mbody\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 709\u001b[0m \u001b[43m \u001b[49m\u001b[43mheaders\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mheaders\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 710\u001b[0m \u001b[43m \u001b[49m\u001b[43mchunked\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mchunked\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 711\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 713\u001b[0m \u001b[38;5;66;03m# If we're going to release the connection in ``finally:``, then\u001b[39;00m\n\u001b[1;32m 714\u001b[0m \u001b[38;5;66;03m# the response doesn't need to know about the connection. Otherwise\u001b[39;00m\n\u001b[1;32m 715\u001b[0m \u001b[38;5;66;03m# it will also try to release it and we'll have a double-release\u001b[39;00m\n\u001b[1;32m 716\u001b[0m \u001b[38;5;66;03m# mess.\u001b[39;00m\n\u001b[1;32m 717\u001b[0m response_conn \u001b[38;5;241m=\u001b[39m conn \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m release_conn \u001b[38;5;28;01melse\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/urllib3/connectionpool.py:449\u001b[0m, in \u001b[0;36mHTTPConnectionPool._make_request\u001b[0;34m(self, conn, method, url, timeout, chunked, **httplib_request_kw)\u001b[0m\n\u001b[1;32m 444\u001b[0m httplib_response \u001b[38;5;241m=\u001b[39m conn\u001b[38;5;241m.\u001b[39mgetresponse()\n\u001b[1;32m 445\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 446\u001b[0m \u001b[38;5;66;03m# Remove the TypeError from the exception chain in\u001b[39;00m\n\u001b[1;32m 447\u001b[0m \u001b[38;5;66;03m# Python 3 (including for exceptions like SystemExit).\u001b[39;00m\n\u001b[1;32m 448\u001b[0m \u001b[38;5;66;03m# Otherwise it looks like a bug in the code.\u001b[39;00m\n\u001b[0;32m--> 449\u001b[0m \u001b[43msix\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mraise_from\u001b[49m\u001b[43m(\u001b[49m\u001b[43me\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m)\u001b[49m\n\u001b[1;32m 450\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m (SocketTimeout, BaseSSLError, SocketError) \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 451\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_raise_timeout(err\u001b[38;5;241m=\u001b[39me, url\u001b[38;5;241m=\u001b[39murl, timeout_value\u001b[38;5;241m=\u001b[39mread_timeout)\n",
"File \u001b[0;32m<string>:3\u001b[0m, in \u001b[0;36mraise_from\u001b[0;34m(value, from_value)\u001b[0m\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/urllib3/connectionpool.py:444\u001b[0m, in \u001b[0;36mHTTPConnectionPool._make_request\u001b[0;34m(self, conn, method, url, timeout, chunked, **httplib_request_kw)\u001b[0m\n\u001b[1;32m 441\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mTypeError\u001b[39;00m:\n\u001b[1;32m 442\u001b[0m \u001b[38;5;66;03m# Python 3\u001b[39;00m\n\u001b[1;32m 443\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 444\u001b[0m httplib_response \u001b[38;5;241m=\u001b[39m \u001b[43mconn\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mgetresponse\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 445\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 446\u001b[0m \u001b[38;5;66;03m# Remove the TypeError from the exception chain in\u001b[39;00m\n\u001b[1;32m 447\u001b[0m \u001b[38;5;66;03m# Python 3 (including for exceptions like SystemExit).\u001b[39;00m\n\u001b[1;32m 448\u001b[0m \u001b[38;5;66;03m# Otherwise it looks like a bug in the code.\u001b[39;00m\n\u001b[1;32m 449\u001b[0m six\u001b[38;5;241m.\u001b[39mraise_from(e, \u001b[38;5;28;01mNone\u001b[39;00m)\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py:1375\u001b[0m, in \u001b[0;36mHTTPConnection.getresponse\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 1373\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 1374\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m-> 1375\u001b[0m \u001b[43mresponse\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbegin\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1376\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mConnectionError\u001b[39;00m:\n\u001b[1;32m 1377\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mclose()\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py:318\u001b[0m, in \u001b[0;36mHTTPResponse.begin\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 316\u001b[0m \u001b[38;5;66;03m# read until we get a non-100 response\u001b[39;00m\n\u001b[1;32m 317\u001b[0m \u001b[38;5;28;01mwhile\u001b[39;00m \u001b[38;5;28;01mTrue\u001b[39;00m:\n\u001b[0;32m--> 318\u001b[0m version, status, reason \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_read_status\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 319\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m status \u001b[38;5;241m!=\u001b[39m CONTINUE:\n\u001b[1;32m 320\u001b[0m \u001b[38;5;28;01mbreak\u001b[39;00m\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py:279\u001b[0m, in \u001b[0;36mHTTPResponse._read_status\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 278\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_read_status\u001b[39m(\u001b[38;5;28mself\u001b[39m):\n\u001b[0;32m--> 279\u001b[0m line \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mstr\u001b[39m(\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mfp\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mreadline\u001b[49m\u001b[43m(\u001b[49m\u001b[43m_MAXLINE\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m+\u001b[39;49m\u001b[43m \u001b[49m\u001b[38;5;241;43m1\u001b[39;49m\u001b[43m)\u001b[49m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124miso-8859-1\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[1;32m 280\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mlen\u001b[39m(line) \u001b[38;5;241m>\u001b[39m _MAXLINE:\n\u001b[1;32m 281\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m LineTooLong(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mstatus line\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/socket.py:705\u001b[0m, in \u001b[0;36mSocketIO.readinto\u001b[0;34m(self, b)\u001b[0m\n\u001b[1;32m 703\u001b[0m \u001b[38;5;28;01mwhile\u001b[39;00m \u001b[38;5;28;01mTrue\u001b[39;00m:\n\u001b[1;32m 704\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 705\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_sock\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrecv_into\u001b[49m\u001b[43m(\u001b[49m\u001b[43mb\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 706\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m timeout:\n\u001b[1;32m 707\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_timeout_occurred \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mTrue\u001b[39;00m\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/ssl.py:1274\u001b[0m, in \u001b[0;36mSSLSocket.recv_into\u001b[0;34m(self, buffer, nbytes, flags)\u001b[0m\n\u001b[1;32m 1270\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m flags \u001b[38;5;241m!=\u001b[39m \u001b[38;5;241m0\u001b[39m:\n\u001b[1;32m 1271\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[1;32m 1272\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mnon-zero flags not allowed in calls to recv_into() on \u001b[39m\u001b[38;5;132;01m%s\u001b[39;00m\u001b[38;5;124m\"\u001b[39m \u001b[38;5;241m%\u001b[39m\n\u001b[1;32m 1273\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__class__\u001b[39m)\n\u001b[0;32m-> 1274\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mread\u001b[49m\u001b[43m(\u001b[49m\u001b[43mnbytes\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mbuffer\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1275\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 1276\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28msuper\u001b[39m()\u001b[38;5;241m.\u001b[39mrecv_into(buffer, nbytes, flags)\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/ssl.py:1130\u001b[0m, in \u001b[0;36mSSLSocket.read\u001b[0;34m(self, len, buffer)\u001b[0m\n\u001b[1;32m 1128\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 1129\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m buffer \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[0;32m-> 1130\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_sslobj\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mread\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mlen\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mbuffer\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1131\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 1132\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_sslobj\u001b[38;5;241m.\u001b[39mread(\u001b[38;5;28mlen\u001b[39m)\n",
"\u001b[0;31mKeyboardInterrupt\u001b[0m: "
]
}
],
"source": [
"trades_df = fetch_daily_stock_trades(symbol, day_start, day_stop, exclude_conditions=exclude_conditions, minsize=minsize, force_remote=False, max_retries=5, backoff_factor=1)\n",
"trades_df"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"#Either load trades or ohlcv from parquet if exists\n",
"\n",
"#trades_df = fetch_trades_parallel(symbol, day_start, day_stop, exclude_conditions=exclude_conditions, minsize=50, max_workers=20) #exclude_conditions=['C','O','4','B','7','V','P','W','U','Z','F'])\n",
"# trades_df.to_parquet(file_trades, engine='pyarrow', compression='gzip')\n",
"\n",
"trades_df = pd.read_parquet(file_trades,engine='pyarrow')\n",
"ohlcv_df = aggregate_trades(symbol=symbol, trades_df=trades_df, resolution=1, type=BarType.TIME)\n",
"ohlcv_df.to_parquet(file_ohlcv, engine='pyarrow', compression='gzip')\n",
"\n",
"# ohlcv_df = pd.read_parquet(file_ohlcv,engine='pyarrow')\n",
"# trades_df = pd.read_parquet(file_trades,engine='pyarrow')\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#list all files is dir directory with parquet extension\n",
"dir = DATA_DIR + \"/notebooks/\"\n",
"import os\n",
"files = [f for f in os.listdir(dir) if f.endswith(\".parquet\")]\n",
"file_name = \"\"\n",
"ohlcv_df = pd.read_parquet(file_ohlcv,engine='pyarrow')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ohlcv_df"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"import seaborn as sns\n",
"# Calculate daily returns\n",
"ohlcv_df['returns'] = ohlcv_df['close'].pct_change().dropna()\n",
"#same as above but pct_change is from 3 datapoints back, but only if it is the same date, else na\n",
"\n",
"\n",
"# Plot the probability distribution curve\n",
"plt.figure(figsize=(10, 6))\n",
"sns.histplot(df['returns'].dropna(), kde=True, stat='probability', bins=30)\n",
"plt.title('Probability Distribution of Daily Returns')\n",
"plt.xlabel('Daily Returns')\n",
"plt.ylabel('Probability')\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"import numpy as np\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.preprocessing import StandardScaler\n",
"from sklearn.linear_model import LogisticRegression\n",
"from sklearn.metrics import accuracy_score\n",
"\n",
"# Define the intervals from 5 to 20 s, returns for each interval\n",
"#maybe use rolling window?\n",
"intervals = range(5, 21, 5)\n",
"\n",
"# Create columns for percentage returns\n",
"rolling_window = 50\n",
"\n",
"# Normalize the returns using rolling mean and std\n",
"for N in intervals:\n",
" column_name = f'returns_{N}'\n",
" rolling_mean = ohlcv_df[column_name].rolling(window=rolling_window).mean()\n",
" rolling_std = ohlcv_df[column_name].rolling(window=rolling_window).std()\n",
" ohlcv_df[f'norm_{column_name}'] = (ohlcv_df[column_name] - rolling_mean) / rolling_std\n",
"\n",
"# Display the dataframe with normalized return columns\n",
"ohlcv_df\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Calculate the sum of the normalized return columns for each row\n",
"ohlcv_df['sum_norm_returns'] = ohlcv_df[[f'norm_returns_{N}' for N in intervals]].sum(axis=1)\n",
"\n",
"# Sort the DataFrame based on the sum of normalized returns in descending order\n",
"df_sorted = ohlcv_df.sort_values(by='sum_norm_returns', ascending=False)\n",
"\n",
"# Display the top rows with the highest sum of normalized returns\n",
"df_sorted\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Drop initial rows with NaN values due to pct_change\n",
"ohlcv_df.dropna(inplace=True)\n",
"\n",
"# Plotting the probability distribution curves\n",
"plt.figure(figsize=(14, 8))\n",
"for N in intervals:\n",
" sns.kdeplot(ohlcv_df[f'returns_{N}'].dropna(), label=f'Returns {N}', fill=True)\n",
"\n",
"plt.title('Probability Distribution of Percentage Returns')\n",
"plt.xlabel('Percentage Return')\n",
"plt.ylabel('Density')\n",
"plt.legend()\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"import seaborn as sns\n",
"# Plot the probability distribution curve\n",
"plt.figure(figsize=(10, 6))\n",
"sns.histplot(ohlcv_df['returns'].dropna(), kde=True, stat='probability', bins=30)\n",
"plt.title('Probability Distribution of Daily Returns')\n",
"plt.xlabel('Daily Returns')\n",
"plt.ylabel('Probability')\n",
"plt.show()\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#show only rows from ohlcv_df where returns > 0.005\n",
"ohlcv_df[ohlcv_df['returns'] > 0.0005]\n",
"\n",
"#ohlcv_df[ohlcv_df['returns'] < -0.005]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#ohlcv where index = date 2024-03-13 and between hour 12\n",
"\n",
"a = ohlcv_df.loc['2024-03-13 12:00:00':'2024-03-13 13:00:00']\n",
"a"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ohlcv_df"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"trades_df"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ohlcv_df.info()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"trades_df.to_parquet(\"trades_df-spy-0111-0111.parquett\", engine='pyarrow', compression='gzip')\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"trades_df.to_parquet(\"trades_df-spy-111-0516.parquett\", engine='pyarrow', compression='gzip', allow_truncated_timestamps=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ohlcv_df.to_parquet(\"ohlcv_df-spy-111-0516.parquett\", engine='pyarrow', compression='gzip')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"basic_data = vbt.Data.from_data(vbt.symbol_dict({symbol: ohlcv_df}), tz_convert=zoneNY)\n",
"vbt.settings['plotting']['auto_rangebreaks'] = True\n",
"basic_data.ohlcv.plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#access just BCA\n",
"#df_filtered = df.loc[\"BAC\"]"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.10"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -1,932 +0,0 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from v2realbot.tools.loadbatch import load_batch\n",
"from v2realbot.utils.utils import zoneNY\n",
"import pandas as pd\n",
"import numpy as np\n",
"import vectorbtpro as vbt\n",
"from itables import init_notebook_mode, show\n",
"import datetime\n",
"from itertools import product\n",
"from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR\n",
"\n",
"init_notebook_mode(all_interactive=True)\n",
"\n",
"vbt.settings.set_theme(\"dark\")\n",
"vbt.settings['plotting']['layout']['width'] = 1280\n",
"vbt.settings.plotting.auto_rangebreaks = True\n",
"# Set the option to display with pagination\n",
"pd.set_option('display.notebook_repr_html', True)\n",
"pd.set_option('display.max_rows', 10) # Number of rows per page\n",
"\n",
"# Define the market open and close times\n",
"market_open = datetime.time(9, 30)\n",
"market_close = datetime.time(16, 0)\n",
"entry_window_opens = 1\n",
"entry_window_closes = 370\n",
"\n",
"forced_exit_start = 380\n",
"forced_exit_end = 390\n",
"\n",
"#LOAD FROM PARQUET\n",
"#list all files is dir directory with parquet extension\n",
"dir = DATA_DIR + \"/notebooks/\"\n",
"import os\n",
"files = [f for f in os.listdir(dir) if f.endswith(\".parquet\")]\n",
"#print('\\n'.join(map(str, files)))\n",
"file_name = \"ohlcv_df-SPY-2024-01-01T09:30:00-2024-05-14T16:00:00.parquet\"\n",
"ohlcv_df = pd.read_parquet(dir+file_name,engine='pyarrow')\n",
"basic_data = vbt.Data.from_data(vbt.symbol_dict({\"SPY\": ohlcv_df}), tz_convert=zoneNY)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#parameters (primary y line, secondary y line, close)\n",
"def plot_2y_close(priminds, secinds, close):\n",
" fig = vbt.make_subplots(rows=1, cols=1, shared_xaxes=True, specs=[[{\"secondary_y\": True}]], vertical_spacing=0.02, subplot_titles=(\"MOM\", \"Price\" ))\n",
" close.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False), trace_kwargs=dict(line=dict(color=\"blue\")))\n",
" for ind in priminds:\n",
" ind.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False))\n",
" for ind in secinds:\n",
" ind.plot(fig=fig, add_trace_kwargs=dict(secondary_y=True))\n",
" return fig\n",
"\n",
"# close = basic_data.xloc[\"09:30\":\"10:00\"].close"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#PIPELINE - FOR - LOOP\n",
"\n",
"#indicator parameters\n",
"mom_timeperiod = list(range(2, 12))\n",
"\n",
"#uzavreni okna od 1 do 200\n",
"#entry_window_closes = list(range(2, 50, 3))\n",
"entry_window_closes = [5, 10, 30, 45]\n",
"#entry_window_closes = 30\n",
"#threshold entries parameters\n",
"#long\n",
"mom_th = np.round(np.arange(0.01, 0.5 + 0.02, 0.02),4).tolist()#-0.02\n",
"# short\n",
"#mom_th = np.round(np.arange(-0.01, -0.3 - 0.02, -0.02),4).tolist()#-0.02\n",
"roc_th = np.round(np.arange(-0.2, -0.8 - 0.05, -0.05),4).tolist()#-0.2\n",
"#print(mom_th, roc_th)\n",
"\n",
"#portfolio simulation parameters\n",
"sl_stop =np.round(np.arange(0.02/100, 0.7/100, 0.05/100),4).tolist()\n",
"tp_stop = np.round(np.arange(0.02/100, 0.7/100, 0.05/100),4).tolist()\n",
"\n",
"combs = list(product(mom_timeperiod, mom_th, roc_th, sl_stop, tp_stop))\n",
"\n",
"@vbt.parameterized(merge_func = \"concat\", random_subset = 2000, show_progress=True) \n",
"def test_strat(entry_window_closes=60,\n",
" mom_timeperiod=2,\n",
" mom_th=-0.04,\n",
" #roc_th=-0.2,\n",
" sl_stop=0.19/100,\n",
" tp_stop=0.19/100):\n",
" # mom_timeperiod=2\n",
" # mom_th=-0.06\n",
" # roc_th=-0.2\n",
" # sl_stop=0.04/100\n",
" # tp_stop=0.04/100\n",
"\n",
" momshort = vbt.indicator(\"talib:MOM\").run(basic_data.close, timeperiod=mom_timeperiod, short_name = \"slope_short\")\n",
" rocp = vbt.indicator(\"talib:ROC\").run(basic_data.close, short_name = \"rocp\")\n",
" #rate of change + momentum\n",
"\n",
" #momshort.plot rocp.real_crossed_below(roc_th) & \n",
" #short_signal = momshort.real_crossed_below(mom_th)\n",
" long_signal = momshort.real_crossed_above(mom_th)\n",
" # print(\"short signal\")\n",
" # print(short_signal.value_counts())\n",
"\n",
" #forced_exit = pd.Series(False, index=close.index)\n",
" forced_exit = basic_data.symbol_wrapper.fill(False)\n",
" #entry_window_open = pd.Series(False, index=close.index)\n",
" entry_window_open= basic_data.symbol_wrapper.fill(False)\n",
"\n",
" #print(entry_window_closes, \"entry window closes\")\n",
" # Calculate the time difference in minutes from market open for each timestamp\n",
" elapsed_min_from_open = (forced_exit.index.hour - market_open.hour) * 60 + (forced_exit.index.minute - market_open.minute)\n",
"\n",
" entry_window_open[(elapsed_min_from_open >= entry_window_opens) & (elapsed_min_from_open < entry_window_closes)] = True\n",
"\n",
" #print(entry_window_open.value_counts())\n",
"\n",
" forced_exit[(elapsed_min_from_open >= forced_exit_start) & (elapsed_min_from_open < forced_exit_end)] = True\n",
" #short_entries = (short_signal & entry_window_open)\n",
" #short_exits = forced_exit\n",
" entries = (long_signal & entry_window_open)\n",
" exits = forced_exit\n",
" #long_entries.info()\n",
" #number of trues and falses in long_entries\n",
" #print(short_exits.value_counts())\n",
" #print(short_entries.value_counts())\n",
"\n",
" #fig = plot_2y_close([],[momshort, rocp], close)\n",
" #short_signal.vbt.signals.plot_as_entries(close, fig=fig, add_trace_kwargs=dict(secondary_y=False))\n",
" #print(sl_stop)\n",
" #tsl_th=sl_stop, \n",
" #short_entries=short_entries, short_exits=short_exits,\n",
" pf = vbt.Portfolio.from_signals(close=basic_data.close, entries=entries, exits=exits, tsl_stop=sl_stop, tp_stop = tp_stop, fees=0.0167/100, freq=\"1s\", price=\"close\") #sl_stop=sl_stop, tp_stop = sl_stop,\n",
" \n",
" return pf.stats([\n",
" 'total_return',\n",
" 'max_dd', \n",
" 'total_trades', \n",
" 'win_rate', \n",
" 'expectancy'\n",
" ])\n",
"\n",
"pf_results = test_strat(vbt.Param(entry_window_closes),\n",
" vbt.Param(mom_timeperiod),\n",
" vbt.Param(mom_th),\n",
" #vbt.Param(roc_th)\n",
" vbt.Param(sl_stop),\n",
" vbt.Param(tp_stop, condition=\"tp_stop > sl_stop\"))\n",
"pf_results = pf_results.unstack(level=-1)\n",
"pf_results.sort_values(by=[\"Total Return [%]\", \"Max Drawdown [%]\"], ascending=[False, True])\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#pf_results.load(\"10tiscomb.pickle\")\n",
"#pf_results.info()\n",
"\n",
"vbt.save(pf_results, \"8tiscomb_tsl.pickle\")\n",
"\n",
"# pf_results = vbt.load(\"8tiscomb_tsl.pickle\")\n",
"# pf_results\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# parallel_coordinates method¶\n",
"\n",
"# attach_px_methods.<locals>.plot_func(\n",
"# *args,\n",
"# layout=None,\n",
"# **kwargs\n",
"# )\n",
"\n",
"# pf_results.vbt.px.parallel_coordinates() #ocdf\n",
"\n",
"res = pf_results.reset_index()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf_results"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"from sklearn.decomposition import PCA\n",
"from sklearn.preprocessing import StandardScaler\n",
"import matplotlib.pyplot as plt\n",
"\n",
"# Assuming pf_results is your DataFrame\n",
"# Convert columns to numeric, assuming NaNs where conversion fails\n",
"metrics = ['Total Return [%]', 'Max Drawdown [%]', 'Total Trades']\n",
"for metric in metrics:\n",
" pf_results[metric] = pd.to_numeric(pf_results[metric], errors='coerce')\n",
"\n",
"# Handle missing values, for example filling with the median\n",
"pf_results['Max Drawdown [%]'].fillna(pf_results['Max Drawdown [%]'].median(), inplace=True)\n",
"\n",
"# Extract the metrics into a new DataFrame\n",
"data_for_pca = pf_results[metrics]\n",
"\n",
"# Standardize the data before applying PCA\n",
"scaler = StandardScaler()\n",
"data_scaled = scaler.fit_transform(data_for_pca)\n",
"\n",
"# Apply PCA\n",
"pca = PCA(n_components=2) # Adjust components as needed\n",
"principal_components = pca.fit_transform(data_scaled)\n",
"\n",
"# Create a DataFrame with the principal components\n",
"pca_results = pd.DataFrame(data=principal_components, columns=['PC1', 'PC2'])\n",
"\n",
"# Visualize the results\n",
"plt.figure(figsize=(8,6))\n",
"plt.scatter(pca_results['PC1'], pca_results['PC2'], alpha=0.5)\n",
"plt.xlabel('Principal Component 1')\n",
"plt.ylabel('Principal Component 2')\n",
"plt.title('PCA of Strategy Optimization Results')\n",
"plt.grid(True)\n",
"plt.savefig(\"ddd.png\")\n"
]
},
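{
"cell_type": "markdown",
"metadata": {},
"source": [
"A hedged follow-up to the PCA above (not part of the original analysis): `explained_variance_ratio_` and `components_` are standard attributes of a fitted scikit-learn `PCA`, so they show how much variance the 2D scatter actually captures and which metrics drive each component. Assumes the previous cell has been run."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Share of variance captured by PC1 and PC2 of the fitted PCA from the previous cell\n",
"print(pca.explained_variance_ratio_)\n",
"\n",
"# Loadings: how strongly each metric contributes to each principal component\n",
"loadings = pd.DataFrame(pca.components_.T, index=metrics, columns=['PC1', 'PC2'])\n",
"loadings"
]
},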
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Check if there is any unnamed level and rename it\n",
"if None in df.index.names:\n",
" # Generate new names list replacing None with 'stat'\n",
" new_names = ['stat' if name is None else name for name in df.index.names]\n",
" df.index.set_names(new_names, inplace=True)\n",
"\n",
"rs= df\n",
"\n",
"rs.info()\n",
"\n",
"\n",
"# # Now, 'stat' is the name of the previously unnamed level\n",
"\n",
"# # Filter for 'Total Return' assuming it is a correct identifier in the 'stat' level\n",
"# total_return_series = df.xs('Total Return [%]', level='stat')\n",
"\n",
"# # Sort the Series to get the largest 'Total Return' values\n",
"# sorted_series = total_return_series.sort_values(ascending=False)\n",
"\n",
"# # Print the sorted filtered data\n",
"# sorted_series.head(20)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sorted_series.vbt.save()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#df.info()\n",
"total_return_series = df.xs('Total Return [%]')\n",
"sorted_series = total_return_series.sort_values(ascending=False)\n",
"\n",
"# Display the top N entries, e.g., top 5\n",
"sorted_series.head(5)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"df"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"comb_stats_df.nlargest(10, 'Total Return [%]')\n",
"#stats_df.info()\n",
"\n",
"\n",
"8\t-0.06\t-0.2\t0.0028\t0.0048\t4.156254\n",
"4 -0.02 -0.25 0.0028 0.0048 0.84433\n",
"3 -0.02 -0.25 0.0033 0.0023 Total Return [%] 0.846753\n",
"#2\t-0.04\t-0.2\t0.0019\t0.0019\n",
"# 2\t-0.04\t-0.2\t0.0019\t0.0019\t0.556919\t91\t60.43956\t0.00612\n",
"# 2\t-0.04\t-0.25\t0.0019\t0.0019\t0.556919\t91\t60.43956\t0.00612\n",
"# 2\t-0.04\t-0.3\t0.0019\t0.0019\t0.556919\t91\t60.43956\t0.00612\n",
"# 2\t-0.04\t-0.35\t0.0019\t0.0019\t0.556919\t91\t60.43956\t0.00612\n",
"# 2\t-0.04\t-0.4\t0.0019\t0.0019\t0.556919\t91\t60.43956\t0.00612\n",
"# 2\t-0.04\t-0.2\t0.0019\t0.0017\t0.451338\t93\t63.44086\t0.004853\n",
"# 2\t-0.04\t-0.25\t0.0019\t0.0017\t0.451338\t93\t63.44086\t0.004853\n",
"# 2\t-0.04\t-0.3\t0.0019\t0.0017\t0.451338\t93\t63.44086\t0.004853\n",
"# 2\t-0.04\t-0.35\t0.0019\t0.0017\t0.451338\t93\t63.44086\t0.004853\n",
"# 2\t-0.04\t-0.4\t0.0019\t0.0017\t0.451338\t93\t63.44086\t0.004853"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf.plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"basic_data.symbols"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
">>> def apply_func(ts, entries, exits, fastw, sloww, minp=None):\n",
"... fast_ma = vbt.nb.rolling_mean_nb(ts, fastw, minp=minp)\n",
"... slow_ma = vbt.nb.rolling_mean_nb(ts, sloww, minp=minp)\n",
"... entries[:] = vbt.nb.crossed_above_nb(fast_ma, slow_ma) \n",
"... exits[:] = vbt.nb.crossed_above_nb(slow_ma, fast_ma)\n",
"... return (fast_ma, slow_ma) \n",
"\n",
">>> CrossSig = vbt.IF(\n",
"... class_name=\"CrossSig\",\n",
"... input_names=['ts'],\n",
"... in_output_names=['entries', 'exits'],\n",
"... param_names=['fastw', 'sloww'],\n",
"... output_names=['fast_ma', 'slow_ma']\n",
"... ).with_apply_func(\n",
"... apply_func,\n",
"... in_output_settings=dict(\n",
"... entries=dict(dtype=np.bool_), #initialize output with bool\n",
"... exits=dict(dtype=np.bool_)\n",
"... )\n",
"... )\n",
">>> cross_sig = CrossSig.run(ts2, 2, 4)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#PIPELINE - parameters in one go\n",
"\n",
"\n",
"#TOTO prepsat do FOR-LOOPu\n",
"\n",
"\n",
"#indicator parameters\n",
"mom_timeperiod = list(range(2, 6))\n",
"\n",
"#threshold entries parameters\n",
"mom_th = np.round(np.arange(-0.02, -0.1 - 0.02, -0.02),4).tolist()#-0.02\n",
"roc_th = np.round(np.arange(-0.2, -0.4 - 0.05, -0.05),4).tolist()#-0.2\n",
"#print(mom_th, roc_th)\n",
"#jejich product\n",
"# mom_th_prod, roc_th_prod = zip(*product(mom_th, roc_th))\n",
"\n",
"# #convert threshold to vbt param\n",
"# mom_th_index = vbt.Param(mom_th_prod, name='mom_th_th') \n",
"# roc_th_index = vbt.Param(roc_th_prod, name='roc_th_th')\n",
"\n",
"mom_th = vbt.Param(mom_th, name='mom_th')\n",
"roc_th = vbt.Param(roc_th, name='roc_th')\n",
"\n",
"#portfolio simulation parameters\n",
"sl_stop = np.arange(0.03/100, 0.2/100, 0.02/100).tolist()\n",
"# Using the round function\n",
"sl_stop = [round(val, 4) for val in sl_stop]\n",
"tp_stop = np.arange(0.03/100, 0.2/100, 0.02/100).tolist()\n",
"# Using the round function\n",
"tp_stop = [round(val, 4) for val in tp_stop]\n",
"sl_stop = vbt.Param(sl_stop) #np.nan mean s no stoploss\n",
"tp_stop = vbt.Param(tp_stop) #np.nan mean s no stoploss\n",
"\n",
"\n",
"#def test_mom(window=14, mom_th=0.2, roc_th=0.2, sl_stop=0.03/100, tp_stop=0.03/100):\n",
"#close = basic_data.xloc[\"09:30\":\"10:00\"].close\n",
"momshort = vbt.indicator(\"talib:MOM\").run(basic_data.get(\"Close\"), timeperiod=mom_timeperiod, short_name = \"slope_short\")\n",
"\n",
"#ht_trendline = vbt.indicator(\"talib:HT_TRENDLINE\").run(close, short_name = \"httrendline\")\n",
"rocp = vbt.indicator(\"talib:ROC\").run(basic_data.get(\"Close\"), short_name = \"rocp\")\n",
"#rate of change + momentum\n",
"\n",
"rocp_signal = rocp.real_crossed_below(mom_th)\n",
"mom_signal = momshort.real_crossed_below(roc_th)\n",
"\n",
"#mom_signal\n",
"print(rocp_signal.info())\n",
"print(mom_signal.info())\n",
"#print(rocp.real)\n",
"\n",
"\n",
"short_signal = (mom_signal.vbt & rocp_signal)\n",
"\n",
"# #short_signal = (rocp.real_crossed_below(roc_th_index) & momshort.real_crossed_below(mom_th_index))\n",
"# forced_exit = m1_data.symbol_wrapper.fill(False)\n",
"# entry_window_open= m1_data.symbol_wrapper.fill(False)\n",
"\n",
"\n",
"# # Calculate the time difference in minutes from market open for each timestamp\n",
"# elapsed_min_from_open = (forced_exit.index.hour - market_open.hour) * 60 + (forced_exit.index.minute - market_open.minute)\n",
"\n",
"# entry_window_open[(elapsed_min_from_open >= entry_window_opens) & (elapsed_min_from_open < entry_window_closes)] = True\n",
"# forced_exit[(elapsed_min_from_open >= forced_exit_start) & (elapsed_min_from_open < forced_exit_end)] = True\n",
"# short_entries = (short_signal & entry_window_open)\n",
"# short_exits = forced_exit\n",
"# #long_entries.info()\n",
"# #number of trues and falses in long_entries\n",
"# #short_exits.value_counts()\n",
"# #short_entries.value_counts()\n",
"\n",
"\n",
"# pf = vbt.Portfolio.from_signals(close=close, short_entries=short_entries, short_exits=short_exits, sl_stop=sl_stop, tp_stop = tp_stop, fees=0.0167/100, freq=\"1s\") #sl_stop=sl_stop, tp_stop = sl_stop,\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# filter dates"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#filter na dny\n",
"dates_of_interest = pd.to_datetime(['2024-04-22']).tz_localize('US/Eastern')\n",
"filtered_df = df.loc[df.index.normalize().isin(dates_of_interest)]\n",
"\n",
"df = filtered_df\n",
"df.info()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# import plotly.io as pio\n",
"# pio.renderers.default = 'notebook'\n",
"\n",
"#naloadujeme do vbt symbol as column\n",
"basic_data = vbt.Data.from_data({\"BAC\": df}, tz_convert=zoneNY)\n",
"\n",
"vbt.settings.plotting.auto_rangebreaks = True\n",
"#basic_data.data[\"BAC\"].vbt.ohlcv.plot()\n",
"\n",
"#basic_data.data[\"BAC\"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"m1_data = basic_data[['Open', 'High', 'Low', 'Close', 'Volume']]\n",
"\n",
"m1_data.data[\"BAC\"]\n",
"#m5_data = m1_data.resample(\"5T\")\n",
"\n",
"#m5_data.data[\"BAC\"].head(10)\n",
"\n",
"# m15_data = m1_data.resample(\"15T\")\n",
"\n",
"# m15 = m15_data.data[\"BAC\"]\n",
"\n",
"# m15.vbt.ohlcv.plot()\n",
"\n",
"# m1_data.wrapper.index\n",
"\n",
"# m1_resampler = m1_data.wrapper.get_resampler(\"1T\")\n",
"# m1_resampler.index_difference(reverse=True)\n",
"\n",
"\n",
"# m5_resampler.prettify()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# MOM indicator"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"vbt.phelp(vbt.indicator(\"talib:ROCP\").run)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"vyuzití rychleho klesani na sekundove urovni behem open rush\n",
"- MOM + ROC during open rush\n",
"- short signal\n",
"- pipeline kombinace thresholdu pro vstup mom_th, roc_th + hodnota sl_stop a tp_stop (pripadne trailing) - nalezeni optimalni kombinace atributu"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# fig = plot_2y_close([ht_trendline],[momshort, rocp], close)\n",
"# short_signal.vbt.signals.plot_as_entries(close, fig=fig, add_trace_kwargs=dict(secondary_y=False)) \n",
"\n",
"#parameters (primary y line, secondary y line, close)\n",
"def plot_2y_close(priminds, secinds, close):\n",
" fig = vbt.make_subplots(rows=1, cols=1, shared_xaxes=True, specs=[[{\"secondary_y\": True}]], vertical_spacing=0.02, subplot_titles=(\"MOM\", \"Price\" ))\n",
" close.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False), trace_kwargs=dict(line=dict(color=\"blue\")))\n",
" for ind in priminds:\n",
" ind.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False))\n",
" for ind in secinds:\n",
" ind.plot(fig=fig, add_trace_kwargs=dict(secondary_y=True))\n",
" return fig\n",
"\n",
"close = m1_data.xloc[\"09:30\":\"10:00\"].close\n",
"momshort = vbt.indicator(\"talib:MOM\").run(close, timeperiod=3, short_name = \"slope_short\")\n",
"ht_trendline = vbt.indicator(\"talib:HT_TRENDLINE\").run(close, short_name = \"httrendline\")\n",
"rocp = vbt.indicator(\"talib:ROC\").run(close, short_name = \"rocp\")\n",
"#rate of change + momentum\n",
"short_signal = (rocp.real_crossed_below(-0.2) & momshort.real_crossed_below(-0.02))\n",
"#indlong = vbt.indicator(\"talib:MOM\").run(close, timeperiod=10, short_name = \"slope_long\")\n",
"fig = plot_2y_close([ht_trendline],[momshort, rocp], close)\n",
"short_signal.vbt.signals.plot_as_entries(close, fig=fig, add_trace_kwargs=dict(secondary_y=False)) "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"close = m1_data.close\n",
"#vbt.phelp(vbt.OLS.run)\n",
"\n",
"#oer steepmnes of regression line\n",
"#talib.LINEARREG_SLOPE(close, timeperiod=timeperiod)\n",
"#a také ON BALANCE VOLUME - http://5.161.179.223:8000/static/js/vbt/api/indicators/custom/obv/index.html\n",
"\n",
"\n",
"\n",
"mom_ind = vbt.indicator(\"talib:MOM\") \n",
"#vbt.phelp(mom_ind.run)\n",
"\n",
"mom = mom_ind.run(close, timeperiod=10)\n",
"\n",
"plot_2y_close(mom, close)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# defining ENTRY WINDOW and forced EXIT window"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#m1_data.data[\"BAC\"].info()\n",
"import datetime\n",
"# Define the market open and close times\n",
"market_open = datetime.time(9, 30)\n",
"market_close = datetime.time(16, 0)\n",
"entry_window_opens = 2\n",
"entry_window_closes = 30\n",
"\n",
"forced_exit_start = 380\n",
"forced_exit_end = 390\n",
"\n",
"forced_exit = m1_data.symbol_wrapper.fill(False)\n",
"entry_window_open= m1_data.symbol_wrapper.fill(False)\n",
"\n",
"# Calculate the time difference in minutes from market open for each timestamp\n",
"elapsed_min_from_open = (forced_exit.index.hour - market_open.hour) * 60 + (forced_exit.index.minute - market_open.minute)\n",
"\n",
"entry_window_open[(elapsed_min_from_open >= entry_window_opens) & (elapsed_min_from_open < entry_window_closes)] = True\n",
"forced_exit[(elapsed_min_from_open >= forced_exit_start) & (elapsed_min_from_open < forced_exit_end)] = True\n",
"\n",
"#entry_window_open.info()\n",
"# forced_exit.tail(100)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"close = m1_data.close\n",
"\n",
"#rsi = vbt.RSI.run(close, window=14)\n",
"\n",
"short_entries = (short_signal & entry_window_open)\n",
"short_exits = forced_exit\n",
"#long_entries.info()\n",
"#number of trues and falses in long_entries\n",
"#short_exits.value_counts()\n",
"short_entries.value_counts()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def plot_rsi(close, entries, exits):\n",
" fig = vbt.make_subplots(rows=1, cols=1, shared_xaxes=True, specs=[[{\"secondary_y\": True}]], vertical_spacing=0.02, subplot_titles=(\"RSI\", \"Price\" ))\n",
" close.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=True))\n",
" #rsi.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False))\n",
" entries.vbt.signals.plot_as_entries(close, fig=fig, add_trace_kwargs=dict(secondary_y=False)) \n",
" exits.vbt.signals.plot_as_exits(close, fig=fig, add_trace_kwargs=dict(secondary_y=False)) \n",
" return fig\n",
"\n",
"plot_rsi(close, short_entries, short_exits)\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"vbt.phelp(vbt.Portfolio.from_signals)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sl_stop = np.arange(0.03/100, 0.2/100, 0.02/100).tolist()\n",
"# Using the round function\n",
"sl_stop = [round(val, 4) for val in sl_stop]\n",
"print(sl_stop)\n",
"sl_stop = vbt.Param(sl_stop) #np.nan mean s no stoploss\n",
"\n",
"pf = vbt.Portfolio.from_signals(close=close, short_entries=short_entries, short_exits=short_exits, sl_stop=0.03/100, tp_stop = 0.03/100, fees=0.0167/100, freq=\"1s\") #sl_stop=sl_stop, tp_stop = sl_stop,\n",
"\n",
"#pf.stats()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#list of orders\n",
"#pf.orders.records_readable\n",
"#pf.orders.plots()\n",
"#pf.stats()\n",
"pf.stats()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf.plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf[(0.0015,0.0013)].plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf[0.03].plot_trade_signals()\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# pristup k pf jako multi index"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#pf[0.03].plot()\n",
"#pf.order_records\n",
"pf[(0.03)].stats()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#zgrupovane statistiky\n",
"stats_df = pf.stats([\n",
" 'total_return',\n",
" 'total_trades',\n",
" 'win_rate',\n",
" 'expectancy'\n",
"], agg_func=None)\n",
"stats_df\n",
"\n",
"\n",
"stats_df.nlargest(10, 'Total Return [%]')\n",
"#stats_df.info()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf[(0.0011,0.0013000000000000002)].plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pandas.tseries.offsets import DateOffset\n",
"\n",
"temp_data = basic_data['2024-4-22']\n",
"temp_data\n",
"res1m = temp_data[[\"Open\", \"High\", \"Low\", \"Close\", \"Volume\"]]\n",
"\n",
"# Define a custom date offset that starts at 9:30 AM and spans 4 hours\n",
"custom_offset = DateOffset(hours=4, minutes=30)\n",
"\n",
"# res1m = res1m.get().resample(\"4H\").agg({ \n",
"# \"Open\": \"first\",\n",
"# \"High\": \"max\",\n",
"# \"Low\": \"min\",\n",
"# \"Close\": \"last\",\n",
"# \"Volume\": \"sum\"\n",
"# })\n",
"\n",
"res4h = res1m.resample(\"1h\", resample_kwargs=dict(origin=\"start\"))\n",
"\n",
"res4h.data\n",
"\n",
"res15m = res1m.resample(\"15T\", resample_kwargs=dict(origin=\"start\"))\n",
"\n",
"res15m.data[\"BAC\"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"@vbt.njit\n",
"def long_entry_place_func_nb(c, low, close, time_in_ns, rsi14, window_open, window_close):\n",
" market_open_minutes = 570 # 9 hours * 60 minutes + 30 minutes\n",
"\n",
" for out_i in range(len(c.out)):\n",
" i = c.from_i + out_i\n",
"\n",
" current_minutes = vbt.dt_nb.hour_nb(time_in_ns[i]) * 60 + vbt.dt_nb.minute_nb(time_in_ns[i])\n",
" #print(\"current_minutes\", current_minutes)\n",
" # Calculate elapsed minutes since market open at 9:30 AM\n",
" elapsed_from_open = current_minutes - market_open_minutes\n",
" elapsed_from_open = elapsed_from_open if elapsed_from_open >= 0 else 0\n",
" #print( \"elapsed_from_open\", elapsed_from_open)\n",
"\n",
" #elapsed_from_open = elapsed_minutes_from_open_nb(time_in_ns) \n",
" in_window = elapsed_from_open > window_open and elapsed_from_open < window_close\n",
" #print(\"in_window\", in_window)\n",
" # if in_window:\n",
" # print(\"in window\")\n",
"\n",
" if in_window and rsi14[i] > 60: # and low[i, c.col] <= hit_price: # and hour == 9: # (4)!\n",
" return out_i\n",
" return -1\n",
"\n",
"@vbt.njit\n",
"def long_exit_place_func_nb(c, high, close, time_index, tp, sl): # (5)!\n",
" entry_i = c.from_i - c.wait\n",
" entry_price = close[entry_i, c.col]\n",
" hit_price = entry_price * (1 + tp)\n",
" stop_price = entry_price * (1 - sl)\n",
" for out_i in range(len(c.out)):\n",
" i = c.from_i + out_i\n",
" last_bar_of_day = vbt.dt_nb.day_changed_nb(time_index[i], time_index[i + 1])\n",
"\n",
" #print(next_day)\n",
" if last_bar_of_day: #pokud je dalsi next day, tak zavirame posledni\n",
" print(\"ted\",out_i)\n",
" return out_i\n",
" if close[i, c.col] >= hit_price or close[i, c.col] <= stop_price :\n",
" return out_i\n",
" return -1\n",
"\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"df = pd.DataFrame(np.random.random(size=(5, 10)), columns=list('abcdefghij'))\n",
"\n",
"df"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"df.sum()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.11"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@ -1,782 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# SUPERTREND\n",
"\n",
"* kombinace supertrendu na vice urovnich"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
"from dotenv import load_dotenv\n",
"\n",
"#as V2realbot is client , load env variables here\n",
"env_file = \"/Users/davidbrazda/Documents/Development/python/.env\"\n",
"# Load the .env file\n",
"load_dotenv(env_file)\n",
"\n",
"from v2realbot.utils.utils import zoneNY\n",
"import pandas as pd\n",
"import numpy as np\n",
"import vectorbtpro as vbt\n",
"# from itables import init_notebook_mode, show\n",
"import datetime\n",
"from itertools import product\n",
"from v2realbot.config import DATA_DIR\n",
"from lightweight_charts import JupyterChart, chart, Panel, PlotAccessor\n",
"from IPython.display import display\n",
"\n",
"# init_notebook_mode(all_interactive=True)\n",
"\n",
"vbt.settings.set_theme(\"dark\")\n",
"vbt.settings['plotting']['layout']['width'] = 1280\n",
"vbt.settings.plotting.auto_rangebreaks = True\n",
"# Set the option to display with pagination\n",
"pd.set_option('display.notebook_repr_html', True)\n",
"pd.set_option('display.max_rows', 10) # Number of rows per page"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"trades_df-BAC-2024-01-01T09_30_00-2024-05-14T16_00_00-CO4B7VPWUZF-100.parquet\n",
"trades_df-BAC-2024-01-11T09:30:00-2024-01-12T16:00:00.parquet\n",
"trades_df-SPY-2024-01-01T09:30:00-2024-05-14T16:00:00.parquet\n",
"trades_df-BAC-2023-01-01T09_30_00-2024-05-25T16_00_00-47BCFOPUVWZ-100.parquet\n",
"ohlcv_df-BAC-2024-01-11T09:30:00-2024-01-12T16:00:00.parquet\n",
"trades_df-BAC-2024-05-15T09_30_00-2024-05-25T16_00_00-47BCFOPUVWZ-100.parquet\n",
"ohlcv_df-BAC-2024-01-01T09_30_00-2024-05-25T16_00_00-47BCFOPUVWZ-100.parquet\n",
"ohlcv_df-SPY-2024-01-01T09:30:00-2024-05-14T16:00:00.parquet\n",
"ohlcv_df-BAC-2024-01-01T09_30_00-2024-05-14T16_00_00-CO4B7VPWUZF-100.parquet\n",
"ohlcv_df-BAC-2023-01-01T09_30_00-2024-05-25T16_00_00-47BCFOPUVWZ-100.parquet\n",
"ohlcv_df-BAC-2023-01-01T09_30_00-2024-05-25T15_30_00-47BCFOPUVWZ-100.parquet\n"
]
},
{
"data": {
"text/plain": [
"351"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Define the market open and close times\n",
"market_open = datetime.time(9, 30)\n",
"market_close = datetime.time(16, 0)\n",
"entry_window_opens = 1\n",
"entry_window_closes = 370\n",
"forced_exit_start = 380\n",
"forced_exit_end = 390\n",
"\n",
"#LOAD FROM PARQUET\n",
"#list all files is dir directory with parquet extension\n",
"dir = DATA_DIR + \"/notebooks/\"\n",
"import os\n",
"files = [f for f in os.listdir(dir) if f.endswith(\".parquet\")]\n",
"print('\\n'.join(map(str, files)))\n",
"file_name = \"ohlcv_df-BAC-2023-01-01T09_30_00-2024-05-25T15_30_00-47BCFOPUVWZ-100.parquet\"\n",
"ohlcv_df = pd.read_parquet(dir+file_name,engine='pyarrow')\n",
"#filter ohlcv_df to certain date range (assuming datetime index)\n",
"#ohlcv_df = ohlcv_df.loc[\"2024-02-12 9:30\":\"2024-02-14 16:00\"]\n",
"\n",
"#add vwap column to ohlcv_df\n",
"#ohlcv_df[\"hlcc4\"] = (ohlcv_df[\"close\"] + ohlcv_df[\"high\"] + ohlcv_df[\"low\"] + ohlcv_df[\"close\"]) / 4\n",
"\n",
"basic_data = vbt.Data.from_data(vbt.symbol_dict({\"BAC\": ohlcv_df}), tz_convert=zoneNY)\n",
"ohlcv_df= None\n",
"basic_data.wrapper.index.normalize().nunique()"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"<class 'pandas.core.frame.DataFrame'>\n",
"DatetimeIndex: 4549772 entries, 2023-01-03 09:30:01-05:00 to 2024-05-24 15:59:59-04:00\n",
"Data columns (total 10 columns):\n",
" # Column Dtype \n",
"--- ------ ----- \n",
" 0 open float64 \n",
" 1 high float64 \n",
" 2 low float64 \n",
" 3 close float64 \n",
" 4 volume float64 \n",
" 5 trades float64 \n",
" 6 updated datetime64[ns, US/Eastern]\n",
" 7 vwap float64 \n",
" 8 buyvolume float64 \n",
" 9 sellvolume float64 \n",
"dtypes: datetime64[ns, US/Eastern](1), float64(9)\n",
"memory usage: 381.8 MB\n"
]
}
],
"source": [
"basic_data.data[\"BAC\"].info()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Add resample function to custom columns"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"from vectorbtpro.utils.config import merge_dicts, Config, HybridConfig\n",
"from vectorbtpro import _typing as tp\n",
"from vectorbtpro.generic import nb as generic_nb\n",
"\n",
"_feature_config: tp.ClassVar[Config] = HybridConfig(\n",
" {\n",
" \"buyvolume\": dict(\n",
" resample_func=lambda self, obj, resampler: obj.vbt.resample_apply(\n",
" resampler,\n",
" generic_nb.sum_reduce_nb,\n",
" )\n",
" ),\n",
" \"sellvolume\": dict(\n",
" resample_func=lambda self, obj, resampler: obj.vbt.resample_apply(\n",
" resampler,\n",
" generic_nb.sum_reduce_nb,\n",
" )\n",
" ),\n",
" \"trades\": dict(\n",
" resample_func=lambda self, obj, resampler: obj.vbt.resample_apply(\n",
" resampler,\n",
" generic_nb.sum_reduce_nb,\n",
" )\n",
" )\n",
" }\n",
")\n",
"\n",
"basic_data._feature_config = _feature_config"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"s1data = basic_data[['open', 'high', 'low', 'close', 'volume','vwap','buyvolume','trades','sellvolume']]\n",
"\n",
"s5data = s1data.resample(\"5s\")\n",
"s5data = s5data.transform(lambda df: df.between_time('09:30', '16:00').dropna())\n",
"\n",
"t1data = basic_data[['open', 'high', 'low', 'close', 'volume','vwap','buyvolume','trades','sellvolume']].resample(\"1T\")\n",
"t1data = t1data.transform(lambda df: df.between_time('09:30', '16:00').dropna())\n",
"# t1data.data[\"BAC\"].info()\n",
"\n",
"t30data = basic_data[['open', 'high', 'low', 'close', 'volume','vwap','buyvolume','trades','sellvolume']].resample(\"30T\")\n",
"t30data = t30data.transform(lambda df: df.between_time('09:30', '16:00').dropna())\n",
"# t30data.data[\"BAC\"].info()\n",
"\n",
"s1close = s1data.close\n",
"t1close = t1data.close\n",
"t30close = t30data.close\n",
"t30volume = t30data.volume\n",
"\n",
"#resample on specific index \n",
"resampler = vbt.Resampler(t30data.index, s1data.index, source_freq=\"30T\", target_freq=\"1s\")\n",
"t30close_realigned = t30close.vbt.realign_closing(resampler)\n",
"\n",
"#resample 1min to s\n",
"resampler_s = vbt.Resampler(t1data.index, s1data.index, source_freq=\"1T\", target_freq=\"1s\")\n",
"t1close_realigned = t1close.vbt.realign_closing(resampler_s)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"datetime64[ns, US/Eastern]\n",
"datetime64[ns, US/Eastern]\n"
]
}
],
"source": [
"print(t30data.index.dtype)\n",
"print(s1data.index.dtype)"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"<class 'pandas.core.frame.DataFrame'>\n",
"DatetimeIndex: 4551 entries, 2023-01-03 09:30:00-05:00 to 2024-05-24 15:30:00-04:00\n",
"Data columns (total 9 columns):\n",
" # Column Non-Null Count Dtype \n",
"--- ------ -------------- ----- \n",
" 0 open 4551 non-null float64\n",
" 1 high 4551 non-null float64\n",
" 2 low 4551 non-null float64\n",
" 3 close 4551 non-null float64\n",
" 4 volume 4551 non-null float64\n",
" 5 vwap 4551 non-null float64\n",
" 6 buyvolume 4551 non-null float64\n",
" 7 trades 4551 non-null float64\n",
" 8 sellvolume 4551 non-null float64\n",
"dtypes: float64(9)\n",
"memory usage: 355.5 KB\n"
]
}
],
"source": [
"t30data.data[\"BAC\"].info()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"vbt.IF.list_indicators(\"*vwap\")\n",
"vbt.phelp(vbt.VWAP.run)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# VWAP"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"\n",
"t1vwap_h = vbt.VWAP.run(t1data.high, t1data.low, t1data.close, t1data.volume, anchor=\"H\")\n",
"t1vwap_d = vbt.VWAP.run(t1data.high, t1data.low, t1data.close, t1data.volume, anchor=\"D\")\n",
"t1vwap_t = vbt.VWAP.run(t1data.high, t1data.low, t1data.close, t1data.volume, anchor=\"T\")\n",
"\n",
"t1vwap_h_real = t1vwap_h.vwap.vbt.realign_closing(resampler_s)\n",
"t1vwap_d_real = t1vwap_d.vwap.vbt.realign_closing(resampler_s)\n",
"t1vwap_t_real = t1vwap_t.vwap.vbt.realign_closing(resampler_s)\n",
"\n",
"#t1vwap_5t.xloc[\"2024-01-3 09:30:00\":\"2024-01-03 16:00:00\"].plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#m30data.close.lw.plot()\n",
"#quick few liner\n",
"pane1 = Panel(\n",
" histogram=[\n",
" #(s1data.volume, \"volume\",None, 0.8),\n",
" #(m30volume, \"m30volume\",None, 1)\n",
" ], # [(series, name, \"rgba(53, 94, 59, 0.6)\", opacity)]\n",
" right=[\n",
" (s1data.close, \"1s close\"),\n",
" (t1data.close, \"1min close\"),\n",
" (t1vwap_t, \"1mvwap_t\"),\n",
" (t1vwap_h, \"1mvwap_h\"),\n",
" (t1vwap_d, \"1mvwap_d\"),\n",
" (t1vwap_t_real, \"1mvwap_t_real\"),\n",
" (t1vwap_h_real, \"1mvwap_h_real\"),\n",
" (t1vwap_d_real, \"1mvwap_d_real\")\n",
" # (t1close_realigned, \"1min close realigned\"),\n",
" # (m30data.close, \"30min-close\"),\n",
" # (m30close_realigned, \"30min close realigned\"),\n",
" ],\n",
")\n",
"ch = chart([pane1], size=\"s\", xloc=slice(\"2024-05-1 09:30:00\",\"2024-05-25 16:00:00\"))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# SUPERTREND"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"supertrend_s1 = vbt.SUPERTREND.run(s1data.high, s1data.low, s1data.close, period=5, multiplier=3)\n",
"direction_series_s1 = supertrend_s1.direction\n",
"supertrend_t1 = vbt.SUPERTREND.run(t1data.high, t1data.low, t1data.close, period=14, multiplier=3)\n",
"direction_series_t1 = supertrend_t1.direction\n",
"supertrend_t30 = vbt.SUPERTREND.run(t30data.high, t30data.low, t30data.close, period=14, multiplier=3)\n",
"direction_series_t30 = supertrend_t30.direction\n",
"\n",
"resampler_1t_sec = vbt.Resampler(direction_series_t1.index, direction_series_s1.index, source_freq=\"1T\", target_freq=\"1s\")\n",
"resampler_30t_sec = vbt.Resampler(direction_series_t30.index, direction_series_s1.index, source_freq=\"30T\", target_freq=\"1s\")\n",
"direction_series_t1_realigned = direction_series_t1.vbt.realign_closing(resampler_1t_sec)\n",
"direction_series_t30_realigned = direction_series_t30.vbt.realign_closing(resampler_30t_sec)\n",
"\n",
"#supertrend_s1.xloc[\"2024-01-3 09:30:00\":\"2024-01-03 16:00:00\"].plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# aligned_ups= pd.Series(False, index=direction_real.index)\n",
"# aligned_downs= pd.Series(False, index=direction_real.index)\n",
"\n",
"# aligned_ups = direction_real == 1 & supertrend.direction == 1\n",
"# aligned_ups"
]
},
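{
"cell_type": "markdown",
"metadata": {},
"source": [
"A minimal sketch of the multi-timeframe confluence idea from the heading above, assuming (as the commented-out attempt suggests) that `direction == 1` marks an uptrend and `direction == -1` a downtrend. The realigned 1-minute and 30-minute series already sit on the 1-second index, so plain elementwise comparisons are enough; this is illustrative, not a tuned signal."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# All three timeframes agree on an uptrend (assumed convention: 1 = up, -1 = down)\n",
"aligned_ups = (\n",
"    (direction_series_s1 == 1)\n",
"    & (direction_series_t1_realigned == 1)\n",
"    & (direction_series_t30_realigned == 1)\n",
")\n",
"# ...and agree on a downtrend\n",
"aligned_downs = (\n",
"    (direction_series_s1 == -1)\n",
"    & (direction_series_t1_realigned == -1)\n",
"    & (direction_series_t30_realigned == -1)\n",
")\n",
"aligned_ups.value_counts()"
]
},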
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s5close = s5data.data[\"BAC\"].close\n",
"s5open = s5data.data[\"BAC\"].open\n",
"s5high = s5data.data[\"BAC\"].high\n",
"s5close_prev = s5close.shift(1)\n",
"s5open_prev = s5open.shift(1)\n",
"s5high_prev = s5high.shift(1)\n",
"#gap nahoru od byci svicky a nevraci se zpet na jeji uroven\n",
"entry_ups = (s5close_prev > s5open_prev) & (s5open > s5high_prev + 0.010) & (s5close > s5close_prev)\n",
"\n",
"entry_ups.value_counts()\n",
"\n",
"#entry_ups.info()\n"
]
},
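{
"cell_type": "markdown",
"metadata": {},
"source": [
"Quick synthetic check of the gap-up condition above (purely illustrative, made-up prices): only the third bar opens more than one cent above the previous bar's high after a bullish previous bar and keeps climbing, so only that bar should come out True."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"\n",
"toy = pd.DataFrame({\n",
"    \"open\":  [10.00, 10.01, 10.06, 10.05],\n",
"    \"high\":  [10.02, 10.04, 10.08, 10.07],\n",
"    \"close\": [10.01, 10.03, 10.07, 10.04],\n",
"})\n",
"prev = toy.shift(1)\n",
"toy_entry = (prev.close > prev.open) & (toy.open > prev.high + 0.010) & (toy.close > prev.close)\n",
"toy_entry  # expected: only index 2 is True"
]
},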
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Entry window"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"market_open = datetime.time(9, 30)\n",
"market_close = datetime.time(16, 0)\n",
"entry_window_opens = 10\n",
"entry_window_closes = 370"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"entry_window_open= pd.Series(False, index=entry_ups.index)\n",
"# Calculate the time difference in minutes from market open for each timestamp\n",
"elapsed_min_from_open = (entry_ups.index.hour - market_open.hour) * 60 + (entry_ups.index.minute - market_open.minute)\n",
"entry_window_open[(elapsed_min_from_open >= entry_window_opens) & (elapsed_min_from_open < entry_window_closes)] = True\n",
"#entry_window_open\n",
"\n",
"entry_ups = entry_ups & entry_window_open\n",
"# entry_ups\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"s5vwap_h = vbt.VWAP.run(s5data.high, s5data.low, s5data.close, s5data.volume, anchor=\"H\")\n",
"s5vwap_d = vbt.VWAP.run(s5data.high, s5data.low, s5data.close, s5data.volume, anchor=\"D\")\n",
"\n",
"# s5vwap_h_real = s5vwap_h.vwap.vbt.realign_closing(resampler_s)\n",
"# s5vwap_d_real = s5vwap_d.vwap.vbt.realign_closing(resampler_s)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pane1 = Panel(\n",
" ohlcv=(s5data.data[\"BAC\"],), #(series, entries, exits, other_markers)\n",
" histogram=[], # [(series, name, \"rgba(53, 94, 59, 0.6), opacity\")]\n",
" right=[#(bbands,), #[(series, name, entries, exits, other_markers)]\n",
" (s5data.data[\"BAC\"].close, \"close\", entry_ups),\n",
" (s5data.data[\"BAC\"].open, \"open\"),\n",
" (s5vwap_h, \"vwap5s_H\",),\n",
" (s5vwap_d, \"vwap5s_D\",)\n",
" # (t1data.data[\"BAC\"].vwap, \"vwap\"),\n",
" # (t1data.close, \"1min close\"),\n",
" # (supertrend_s1.trend,\"STtrend\"),\n",
" # (supertrend_s1.long,\"STlong\"),\n",
" # (supertrend_s1.short,\"STshort\")\n",
" ],\n",
" left = [\n",
" #(direction_series_s1,\"direction_s1\"),\n",
" # (direction_series_t1,\"direction_t1\"),\n",
" # (direction_series_t30,\"direction_t30\")\n",
" \n",
" ],\n",
" # right=[(bbands.upperband, \"upperband\",),\n",
" # (bbands.lowerband, \"lowerband\",),\n",
" # (bbands.middleband, \"middleband\",)\n",
" # ], #[(series, name, entries, exits, other_markers)]\n",
" middle1=[],\n",
" middle2=[],\n",
")\n",
"\n",
"# pane2 = Panel(\n",
"# ohlcv=(t1data.data[\"BAC\"],uptrend_m30, downtrend_m30), #(series, entries, exits, other_markers)\n",
"# histogram=[], # [(series, name, \"rgba(53, 94, 59, 0.6), opacity\")]\n",
"# left=[#(bbands,), #[(series, name, entries, exits, other_markers)]\n",
"# (direction_real,\"direction30min_real\"),\n",
"# ],\n",
"# # left = [(supertrendm30.direction,\"STdirection30\")],\n",
"# # # right=[(bbands.upperband, \"upperband\",),\n",
"# # # (bbands.lowerband, \"lowerband\",),\n",
"# # # (bbands.middleband, \"middleband\",)\n",
"# # # ], #[(series, name, entries, exits, other_markers)]\n",
"# middle1=[],\n",
"# middle2=[],\n",
"# title = \"1m\")\n",
"\n",
"ch = chart([pane1], sync=True, size=\"s\", xloc=slice(\"2024-02-20 09:30:00\",\"2024-02-22 16:00:00\"), precision=6)\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# data = s5data.xloc[\"2024-01-03 09:30:00\":\"2024-03-10 16:00:00\"]\n",
"# entry = entry_ups.vbt.xloc[\"2024-01-03 09:30:00\":\"2024-03-10 16:00:00\"].obj\n",
"\n",
"pf = vbt.Portfolio.from_signals(close=s5data, entries=entry_ups, direction=\"longonly\", sl_stop=0.05/100, tp_stop = 0.05/100, fees=0.0167/100, freq=\"5s\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf.xloc[\"2024-01-26 09:30:00\":\"2024-02-28 16:00:00\"].positions.plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf.xloc[\"2024-01-26 09:30:00\":\"2024-01-28 16:00:00\"].plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pd.set_option('display.max_rows', None)\n",
"pf.stats()\n",
"# pf.xloc[\"monday\"].stats()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"buyvolume = t1data.data[\"BAC\"].buyvolume\n",
"sellvolume = t1data.data[\"BAC\"].sellvolume\n",
"totalvolume = buyvolume + sellvolume\n",
"\n",
"#adjust to minimal value to avoid division by zero\n",
"sellvolume_adjusted = sellvolume.replace(0, 1e-10)\n",
"oibratio = buyvolume / sellvolume\n",
"\n",
"#cumulative order flow (net difference)\n",
"cof = buyvolume - sellvolume\n",
"\n",
"# Calculate the order imbalance (net differene) normalize the order imbalance by calculating the difference between buy and sell volumes and then scaling it by the total volume.\n",
"order_imbalance = cof / totalvolume\n",
"order_imbalance = order_imbalance.fillna(0) #nan nahradime 0\n",
"\n",
"order_imbalance_allvolume = cof / t1data.data[\"BAC\"].volume\n",
"\n",
"order_imbalance_sma = vbt.indicator(\"talib:EMA\").run(order_imbalance, timeperiod=5)\n",
"short_signals = order_imbalance.vbt < -0.5\n",
"#short_entries = oibratio.vbt < 0.01\n",
"short_signals.value_counts()\n",
"short_signals.name = \"short_entries\"\n",
"#.fillna(False)\n",
"short_exits = short_signals.shift(-2).fillna(False).astype(bool)"
]
},
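{
"cell_type": "markdown",
"metadata": {},
"source": [
"Sanity check of the normalization above on made-up volumes: (buy - sell) / (buy + sell) is bounded in [-1, 1], where -1 means all classified volume was sell-initiated, which is what the -0.5 threshold keys on."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"\n",
"toy_buy = pd.Series([100.0, 20.0, 0.0])\n",
"toy_sell = pd.Series([100.0, 80.0, 50.0])\n",
"toy_oib = (toy_buy - toy_sell) / (toy_buy + toy_sell)\n",
"toy_oib  # expected: 0.0, -0.6, -1.0"
]
},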
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pane1 = Panel(\n",
" ohlcv=(t1data.data[\"BAC\"],), #(series, entries, exits, other_markers)\n",
" histogram=[(order_imbalance_allvolume, \"oib_allvolume\", \"rgba(53, 94, 59, 0.6)\",0.5),\n",
" (t1data.data[\"BAC\"].trades, \"trades\",None,0.4),\n",
" ], # [(series, name, \"rgba(53, 94, 59, 0.6)\", opacity)]\n",
" # right=[\n",
" # (supertrend.trend,\"STtrend\"),\n",
" # (supertrend.long,\"STlong\"),\n",
" # (supertrend.short,\"STshort\")\n",
" # ],\n",
" # left = [(supertrend.direction,\"STdirection\")],\n",
" # right=[(bbands.upperband, \"upperband\",),\n",
" # (bbands.lowerband, \"lowerband\",),\n",
" # (bbands.middleband, \"middleband\",)\n",
" # ], #[(series, name, entries, exits, other_markers)]\n",
" middle1=[],\n",
" middle2=[],\n",
")\n",
"\n",
"pane2 = Panel(\n",
" ohlcv=(basic_data.data[\"BAC\"],), #(series, entries, exits, other_markers)\n",
" left=[(basic_data.data[\"BAC\"].trades, \"trades\")],\n",
" histogram=[(basic_data.data[\"BAC\"].trades, \"trades_hist\", \"white\", 0.5)], #\"rgba(53, 94, 59, 0.6)\"\n",
" # ], # [(series, name, \"rgba(53, 94, 59, 0.6)\")]\n",
" # right=[\n",
" # (supertrend.trend,\"STtrend\"),\n",
" # (supertrend.long,\"STlong\"),\n",
" # (supertrend.short,\"STshort\")\n",
" # ],\n",
" # left = [(supertrend.direction,\"STdirection\")],\n",
" # right=[(bbands.upperband, \"upperband\",),\n",
" # (bbands.lowerband, \"lowerband\",),\n",
" # (bbands.middleband, \"middleband\",)\n",
" # ], #[(series, name, entries, exits, other_markers)]\n",
" middle1=[],\n",
" middle2=[],\n",
")\n",
"\n",
"\n",
"ch = chart([pane1, pane2], size=\"m\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#short_signal = t1slope.real_below(t1_th) & t2slope.real_below(t2_th) & t3slope.real_below(t3_th) & t4slope.real_below(t4_th)\n",
"#long_signal = t1slope.real_above(t1_th) & t2slope.real_above(t2_th) & t3slope.real_above(t3_th) & t4slope.real_above(t4_th)\n",
"\n",
"#test na daily s reversem crossed 0\n",
"short_signal = t2slope.vbt < -0.01 & t3slope.vbt < -0.01 #min value of threshold\n",
"long_signal = t2slope.vbt > 0.01 & t3slope.vbt > 0.01 #min\n",
"\n",
"# thirty_up_signal = t3slope.vbt.crossed_above(0.01)\n",
"# thirty_down_signal = t3slope.vbt.crossed_below(-0.01)\n",
"\n",
"fig = plot_2y_close(priminds=[], secinds=[t3slope], close=t1data.close)\n",
"#short_signal.vbt.signals.plot_as_entries(basic_data.close, fig=fig)\n",
"\n",
"short_signal.vbt.signals.plot_as_entries(t1data.close, fig=fig, trace_kwargs=dict(name=\"SHORTS\",\n",
" line=dict(color=\"#ffe476\"),\n",
" marker=dict(color=\"red\", symbol=\"triangle-down\"),\n",
" fill=None,\n",
" connectgaps=True,\n",
" ))\n",
"long_signal.vbt.signals.plot_as_entries(t1data.close, fig=fig, trace_kwargs=dict(name=\"LONGS\",\n",
" line=dict(color=\"#ffe476\"),\n",
" marker=dict(color=\"limegreen\"),\n",
" fill=None,\n",
" connectgaps=True,\n",
" ))\n",
"\n",
"# thirty_down_signal.vbt.signals.plot_as_entries(t1data.close, fig=fig, trace_kwargs=dict(name=\"DOWN30\",\n",
"# line=dict(color=\"#ffe476\"),\n",
"# marker=dict(color=\"yellow\", symbol=\"triangle-down\"),\n",
"# fill=None,\n",
"# connectgaps=True,\n",
"# ))\n",
"# thirty_up_signal.vbt.signals.plot_as_entries(t1data.close, fig=fig, trace_kwargs=dict(name=\"UP30\",\n",
"# line=dict(color=\"#ffe476\"),\n",
"# marker=dict(color=\"grey\"),\n",
"# fill=None,\n",
"# connectgaps=True,\n",
"# ))\n",
"\n",
"# thirtymin_slope_to_compare.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=True), trace_kwargs=dict(name=\"30min slope\",\n",
"# line=dict(color=\"yellow\"), \n",
"# fill=None,\n",
"# connectgaps=True,\n",
"# ))\n",
"\n",
"fig.show()\n",
"# print(\"short signal\")\n",
"# print(short_signal.value_counts())\n",
"\n",
"#forced_exit = pd.Series(False, index=close.index)\n",
"forced_exit = basic_data.symbol_wrapper.fill(False)\n",
"#entry_window_open = pd.Series(False, index=close.index)\n",
"entry_window_open= basic_data.symbol_wrapper.fill(False)\n",
"\n",
"# Calculate the time difference in minutes from market open for each timestamp\n",
"elapsed_min_from_open = (forced_exit.index.hour - market_open.hour) * 60 + (forced_exit.index.minute - market_open.minute)\n",
"\n",
"entry_window_open[(elapsed_min_from_open >= entry_window_opens) & (elapsed_min_from_open < entry_window_closes)] = True\n",
"\n",
"#print(entry_window_open.value_counts())\n",
"\n",
"forced_exit[(elapsed_min_from_open >= forced_exit_start) & (elapsed_min_from_open < forced_exit_end)] = True\n",
"short_entries = (short_signal & entry_window_open)\n",
"short_exits = forced_exit\n",
"\n",
"entries = (long_signal & entry_window_open)\n",
"exits = forced_exit\n",
"#long_entries.info()\n",
"#number of trues and falses in long_entries\n",
"# print(short_exits.value_counts())\n",
"# print(short_entries.value_counts())\n",
"\n",
"#fig = plot_2y_close([],[momshort, rocp], close)\n",
"#short_signal.vbt.signals.plot_as_entries(close, fig=fig, add_trace_kwargs=dict(secondary_y=False))\n",
"#print(sl_stop)\n",
"#short_entries=short_entries, short_exits=short_exits,\n",
"# pf = vbt.Portfolio.from_signals(close=basic_data, entries=short_entries, exits=exits, tsl_stop=0.005, tp_stop = 0.05, fees=0.0167/100, freq=\"1s\") #sl_stop=sl_stop, tp_stop = sl_stop,\n",
"\n",
"# pf.stats()\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"forced_exit = t1data.symbol_wrapper.fill(False)\n",
"#entry_window_open = pd.Series(False, index=close.index)\n",
"entry_window_open= t1data.symbol_wrapper.fill(False)\n",
"\n",
"# Calculate the time difference in minutes from market open for each timestamp\n",
"elapsed_min_from_open = (forced_exit.index.hour - market_open.hour) * 60 + (forced_exit.index.minute - market_open.minute)\n",
"\n",
"entry_window_open[(elapsed_min_from_open >= entry_window_opens) & (elapsed_min_from_open < entry_window_closes)] = True\n",
"\n",
"#print(entry_window_open.value_counts())\n",
"\n",
"forced_exit[(elapsed_min_from_open >= forced_exit_start) & (elapsed_min_from_open < forced_exit_end)] = True\n",
"short_entries = (short_signals & entry_window_open)\n",
"short_exits = forced_exit\n",
"\n",
"entries = (long_signals & entry_window_open)\n",
"exits = forced_exit\n",
"\n",
"pf = vbt.Portfolio.from_signals(close=t1data, entries=entries, exits=exits, short_entries=short_entries, short_exits=exits,\n",
"td_stop=2, time_delta_format=\"rows\",\n",
"tsl_stop=0.005, tp_stop = 0.005, fees=0.0167/100)#, freq=\"1s\") #sl_stop=sl_stop, tp_stop = sl_stop,\n",
"\n",
"pf.stats()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf.plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf.get_drawdowns().records_readable"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf.orders.records_readable"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.11"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@ -1,421 +0,0 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from v2realbot.tools.loadbatch import load_batch\n",
"from v2realbot.utils.utils import zoneNY\n",
"import pandas as pd\n",
"import numpy as np\n",
"import vectorbtpro as vbt\n",
"from itables import init_notebook_mode, show\n",
"\n",
"init_notebook_mode(all_interactive=True)\n",
"\n",
"vbt.settings.set_theme(\"dark\")\n",
"vbt.settings['plotting']['layout']['width'] = 1280\n",
"vbt.settings.plotting.auto_rangebreaks = True\n",
"# Set the option to display with pagination\n",
"pd.set_option('display.notebook_repr_html', True)\n",
"pd.set_option('display.max_rows', 10) # Number of rows per page\n",
"\n",
"res, df = load_batch(batch_id=\"0fb5043a\", #46 days 1.3 - 6.5.\n",
" space_resolution_evenly=False,\n",
" indicators_columns=[\"Rsi14\"],\n",
" main_session_only=True,\n",
" verbose = False)\n",
"if res < 0:\n",
" print(\"Error\" + str(res) + str(df))\n",
"df = df[\"bars\"]\n",
"\n",
"df"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# filter dates"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#filter na dny\n",
"# dates_of_interest = pd.to_datetime(['2024-04-22', '2024-04-23']).tz_localize('US/Eastern')\n",
"# filtered_df = df.loc[df.index.normalize().isin(dates_of_interest)]\n",
"\n",
"# df = filtered_df\n",
"# df.info()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import plotly.io as pio\n",
"pio.renderers.default = 'notebook'\n",
"\n",
"#naloadujeme do vbt symbol as column\n",
"basic_data = vbt.Data.from_data({\"BAC\": df}, tz_convert=zoneNY)\n",
"start_date = pd.Timestamp('2024-03-12 09:30', tz=zoneNY)\n",
"end_date = pd.Timestamp('2024-03-13 16:00', tz=zoneNY)\n",
"\n",
"#basic_data = basic_data.transform(lambda df: df[df.index.date == start_date.date()])\n",
"#basic_data = basic_data.transform(lambda df: df[(df.index >= start_date) & (df.index <= end_date)])\n",
"#basic_data.data[\"BAC\"].info()\n",
"\n",
"# fig = basic_data.plot(plot_volume=False)\n",
"# pivot_info = basic_data.run(\"pivotinfo\", up_th=0.003, down_th=0.002)\n",
"# #pivot_info.plot()\n",
"# pivot_info.plot(fig=fig, conf_value_trace_kwargs=dict(visible=True))\n",
"# fig.show()\n",
"\n",
"\n",
"# rsi14 = basic_data.data[\"BAC\"][\"Rsi14\"].rename(\"Rsi14\")\n",
"\n",
"# rsi14.vbt.plot().show()\n",
"#basic_data.xloc[\"09:30\":\"10:00\"].data[\"BAC\"].vbt.ohlcv.plot().show()\n",
"\n",
"vbt.settings.plotting.auto_rangebreaks = True\n",
"#basic_data.data[\"BAC\"].vbt.ohlcv.plot()\n",
"\n",
"#basic_data.data[\"BAC\"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"m1_data = basic_data[['Open', 'High', 'Low', 'Close', 'Volume']]\n",
"\n",
"m1_data.data[\"BAC\"]\n",
"#m5_data = m1_data.resample(\"5T\")\n",
"\n",
"#m5_data.data[\"BAC\"].head(10)\n",
"\n",
"# m15_data = m1_data.resample(\"15T\")\n",
"\n",
"# m15 = m15_data.data[\"BAC\"]\n",
"\n",
"# m15.vbt.ohlcv.plot()\n",
"\n",
"# m1_data.wrapper.index\n",
"\n",
"# m1_resampler = m1_data.wrapper.get_resampler(\"1T\")\n",
"# m1_resampler.index_difference(reverse=True)\n",
"\n",
"\n",
"# m5_resampler.prettify()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# defining ENTRY WINDOW and forced EXIT window"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#m1_data.data[\"BAC\"].info()\n",
"import datetime\n",
"# Define the market open and close times\n",
"market_open = datetime.time(9, 30)\n",
"market_close = datetime.time(16, 0)\n",
"entry_window_opens = 1\n",
"entry_window_closes = 350\n",
"\n",
"forced_exit_start = 380\n",
"forced_exit_end = 390\n",
"\n",
"forced_exit = m1_data.symbol_wrapper.fill(False)\n",
"entry_window_open= m1_data.symbol_wrapper.fill(False)\n",
"\n",
"# Calculate the time difference in minutes from market open for each timestamp\n",
"elapsed_min_from_open = (forced_exit.index.hour - market_open.hour) * 60 + (forced_exit.index.minute - market_open.minute)\n",
"\n",
"entry_window_open[(elapsed_min_from_open >= entry_window_opens) & (elapsed_min_from_open < entry_window_closes)] = True\n",
"forced_exit[(elapsed_min_from_open >= forced_exit_start) & (elapsed_min_from_open < forced_exit_end)] = True\n",
"\n",
"#entry_window_open.info()\n",
"# forced_exit.tail(100)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"close = m1_data.close\n",
"\n",
"rsi = vbt.RSI.run(close, window=14)\n",
"\n",
"long_entries = (rsi.rsi.vbt.crossed_below(20) & entry_window_open)\n",
"long_exits = (rsi.rsi.vbt.crossed_above(70) | forced_exit)\n",
"#long_entries.info()\n",
"#number of trues and falses in long_entries\n",
"long_entries.value_counts()\n",
"#long_exits.value_counts()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def plot_rsi(rsi, close, entries, exits):\n",
" fig = vbt.make_subplots(rows=1, cols=1, shared_xaxes=True, specs=[[{\"secondary_y\": True}]], vertical_spacing=0.02, subplot_titles=(\"RSI\", \"Price\" ))\n",
" close.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=True))\n",
" rsi.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False))\n",
" entries.vbt.signals.plot_as_entries(rsi.rsi, fig=fig, add_trace_kwargs=dict(secondary_y=False)) \n",
" exits.vbt.signals.plot_as_exits(rsi.rsi, fig=fig, add_trace_kwargs=dict(secondary_y=False)) \n",
" return fig\n",
"\n",
"plot_rsi(rsi, close, long_entries, long_exits)\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"vbt.phelp(vbt.Portfolio.from_signals)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sl_stop = np.arange(0.03/100, 0.2/100, 0.02/100).tolist()\n",
"# Using the round function\n",
"sl_stop = [round(val, 4) for val in sl_stop]\n",
"print(sl_stop)\n",
"sl_stop = vbt.Param(sl_stop) #np.nan mean s no stoploss\n",
"\n",
"pf = vbt.Portfolio.from_signals(close=close, entries=long_entries, sl_stop=sl_stop, tp_stop = sl_stop, exits=long_exits,fees=0.0167/100, freq=\"1s\") #sl_stop=sl_stop, tp_stop = sl_stop, \n",
"\n",
"#pf.stats()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf.plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf[(0.0015,0.0013)].plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf[0.03].plot_trade_signals()\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# pristup k pf jako multi index"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#pf[0.03].plot()\n",
"#pf.order_records\n",
"pf[(0.03)].stats()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#zgrupovane statistiky\n",
"stats_df = pf.stats([\n",
" 'total_return',\n",
" 'total_trades',\n",
" 'win_rate',\n",
" 'expectancy'\n",
"], agg_func=None)\n",
"stats_df\n",
"\n",
"\n",
"stats_df.nlargest(50, 'Total Return [%]')\n",
"#stats_df.info()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf[(0.0011,0.0013)].plot()\n",
"\n",
"#pf[(0.0011,0.0013000000000000002)].plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pandas.tseries.offsets import DateOffset\n",
"\n",
"temp_data = basic_data['2024-4-22']\n",
"temp_data\n",
"res1m = temp_data[[\"Open\", \"High\", \"Low\", \"Close\", \"Volume\"]]\n",
"\n",
"# Define a custom date offset that starts at 9:30 AM and spans 4 hours\n",
"custom_offset = DateOffset(hours=4, minutes=30)\n",
"\n",
"# res1m = res1m.get().resample(\"4H\").agg({ \n",
"# \"Open\": \"first\",\n",
"# \"High\": \"max\",\n",
"# \"Low\": \"min\",\n",
"# \"Close\": \"last\",\n",
"# \"Volume\": \"sum\"\n",
"# })\n",
"\n",
"res4h = res1m.resample(\"1h\", resample_kwargs=dict(origin=\"start\"))\n",
"\n",
"res4h.data\n",
"\n",
"res15m = res1m.resample(\"15T\", resample_kwargs=dict(origin=\"start\"))\n",
"\n",
"res15m.data[\"BAC\"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"@vbt.njit\n",
"def long_entry_place_func_nb(c, low, close, time_in_ns, rsi14, window_open, window_close):\n",
" market_open_minutes = 570 # 9 hours * 60 minutes + 30 minutes\n",
"\n",
" for out_i in range(len(c.out)):\n",
" i = c.from_i + out_i\n",
"\n",
" current_minutes = vbt.dt_nb.hour_nb(time_in_ns[i]) * 60 + vbt.dt_nb.minute_nb(time_in_ns[i])\n",
" #print(\"current_minutes\", current_minutes)\n",
" # Calculate elapsed minutes since market open at 9:30 AM\n",
" elapsed_from_open = current_minutes - market_open_minutes\n",
" elapsed_from_open = elapsed_from_open if elapsed_from_open >= 0 else 0\n",
" #print( \"elapsed_from_open\", elapsed_from_open)\n",
"\n",
" #elapsed_from_open = elapsed_minutes_from_open_nb(time_in_ns) \n",
" in_window = elapsed_from_open > window_open and elapsed_from_open < window_close\n",
" #print(\"in_window\", in_window)\n",
" # if in_window:\n",
" # print(\"in window\")\n",
"\n",
" if in_window and rsi14[i] > 60: # and low[i, c.col] <= hit_price: # and hour == 9: # (4)!\n",
" return out_i\n",
" return -1\n",
"\n",
"@vbt.njit\n",
"def long_exit_place_func_nb(c, high, close, time_index, tp, sl): # (5)!\n",
" entry_i = c.from_i - c.wait\n",
" entry_price = close[entry_i, c.col]\n",
" hit_price = entry_price * (1 + tp)\n",
" stop_price = entry_price * (1 - sl)\n",
" for out_i in range(len(c.out)):\n",
" i = c.from_i + out_i\n",
" last_bar_of_day = vbt.dt_nb.day_changed_nb(time_index[i], time_index[i + 1])\n",
"\n",
" #print(next_day)\n",
" if last_bar_of_day: #pokud je dalsi next day, tak zavirame posledni\n",
" print(\"ted\",out_i)\n",
" return out_i\n",
" if close[i, c.col] >= hit_price or close[i, c.col] <= stop_price :\n",
" return out_i\n",
" return -1\n",
"\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"df = pd.DataFrame(np.random.random(size=(5, 10)), columns=list('abcdefghij'))\n",
"\n",
"df"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"df.sum()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.11"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

File diff suppressed because one or more lines are too long

View File

@ -1,7 +1,7 @@
#!/bin/bash
# file: restart.sh
# Usage: ./restart.sh [test|prod|all]
# Define server addresses

View File

@ -1,107 +0,0 @@
# Plotly
* MAKE_SUBPLOT defines the layout (needed if more than a 1x1 grid or a secondary y axis is required)
```python
fig = vbt.make_subplots(rows=2, cols=1, shared_xaxes=True,
specs=[[{"secondary_y": True}], [{"secondary_y": False}]],
vertical_spacing=0.02, subplot_titles=("Row 1 title", "Row 2 title"))
```
Then the individual traces are added with the [sr/df generic accessors](http://5.161.179.223:8000/static/js/vbt/api/generic/accessors/index.html#vectorbtpro.generic.accessors.GenericAccessor.areaplot), positioned and styled via ADD_TRACE_KWARGS and TRACE_KWARGS. Other plot types are available in the [plotting module](http://5.161.179.223:8000/static/js/vbt/api/generic/plotting/index.html)
```python
#using accessor
close.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False,row=1, col=1), trace_kwargs=dict(line=dict(color="blue")))
indvolume.vbt.barplot(fig=fig, add_trace_kwargs=dict(secondary_y=False, row=2, col=1))
#using plotting module
vbt.Bar(indvolume, fig=fig, add_trace_kwargs=dict(secondary_y=False, row=2, col=1))
```
* ADD_TRACE_KWARGS - determines positioning within the subplot
```python
add_trace_kwargs=dict(secondary_y=False,row=1, col=1)
```
* TRACE_KWARGS - additional styling of the trace
```python
trace_kwargs=dict(name="LONGS",
line=dict(color="#ffe476"),
marker=dict(color="limegreen"),
fill=None,
connectgaps=True)
```
## Example
```python
fig = vbt.make_subplots(rows=2, cols=1, shared_xaxes=True,
specs=[[{"secondary_y": True}], [{"secondary_y": False}]],
vertical_spacing=0.02, subplot_titles=("Price and Indicators", "Volume"))
# Plotting the close price
close.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False,row=1, col=1), trace_kwargs=dict(line=dict(color="blue")))
```
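The second row of that layout is still empty; it can be filled with the volume bars using the barplot accessor shown above (a minimal sketch, assuming a `volume` series aligned with `close`):
```python
# Plotting the volume into the second row of the same figure
volume.vbt.barplot(fig=fig, add_trace_kwargs=dict(secondary_y=False, row=2, col=1))
fig.show()
```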
# Data
## Resampling
```python
t1data = basic_data[['open', 'high', 'low', 'close', 'volume','vwap','buyvolume','sellvolume']].resample("1T")
t1data = t1data.transform(lambda df: df.between_time('09:30', '16:00').dropna()) #main session data only, no nans
t5data = basic_data[['open', 'high', 'low', 'close', 'volume','vwap','buyvolume','sellvolume']].resample("5T")
t5data = t5data.transform(lambda df: df.between_time('09:30', '16:00').dropna())
dailydata = basic_data[['open', 'high', 'low', 'close', 'volume', 'vwap']].resample("D").dropna()
#realign 5min close to 1min so it can be compared with 1min
t5data_close_realigned = t5data.close.vbt.realign_closing("1T").between_time('09:30', '16:00').dropna()
#same with open
t5data.open.vbt.realign_opening("1h")
```
### Define a resample function for a custom column
Example of a custom feature config: [Binance Data](http://5.161.179.223:8000/static/js/vbt/api/data/custom/binance/index.html#vectorbtpro.data.custom.binance.BinanceData.feature_config).
Other [reduce functions are available](http://5.161.179.223:8000/static/js/vbt/api/generic/nb/apply_reduce/index.html) (mean, min, max, median, nth, ...).
```python
from vectorbtpro.utils.config import merge_dicts, Config, HybridConfig
from vectorbtpro import _typing as tp
from vectorbtpro.generic import nb as generic_nb
_feature_config: tp.ClassVar[Config] = HybridConfig(
{
"buyvolume": dict(
resample_func=lambda self, obj, resampler: obj.vbt.resample_apply(
resampler,
generic_nb.sum_reduce_nb,
)
),
"sellvolume": dict(
resample_func=lambda self, obj, resampler: obj.vbt.resample_apply(
resampler,
generic_nb.sum_reduce_nb,
)
)
}
)
basic_data._feature_config = _feature_config
```
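With the feature config registered, resampling should aggregate the custom columns with the configured reduce function (a hypothetical check, assuming `basic_data` carries a `buyvolume` feature):
```python
# buyvolume should now be summed per 5-minute bucket via sum_reduce_nb
t5data = basic_data.resample("5T")
t5data.data["BAC"]["buyvolume"].head()
```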
### Validate resample
```python
t2dataclose = t2data.close.rename("15MIN - realigned").vbt.realign_closing("1T")
fig = t1data.close.rename("1MIN").vbt.plot()
t2data.close.rename("15MIN").vbt.plot(fig=fig)
t2dataclose.vbt.plot(fig=fig)
```
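The realigned close can then be compared element-wise with the lower timeframe without lookahead (a sketch, assuming `t1data` and `t5data_close_realigned` from the snippets above):
```python
# At each 1-minute stamp the realigned series holds the last completed 5-minute close,
# so the comparison uses only information already known at that bar.
above_htf_close = t1data.close > t5data_close_realigned
cross_up = above_htf_close & ~above_htf_close.shift(1, fill_value=False)  # first 1-minute bar above the 5-minute close
```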
## Persisting
```python
basic_data.to_parquet(partition_by="day", compression="gzip")
day_data = vbt.ParquetData.pull("BAC", filters=[("group", "==", "2024-05-03")])
vbt.print_dir_tree("BTC-USD") #check the resulting directory structure
```
# Discover
```python
vbt.phelp(vbt.talib("atr").run) #parameters it accepts
vbt.pdir(pf) #get available properties and methods
vbt.pprint(basic_data) #to get correct shape, info about instance
```

View File

@ -1,3 +1,3 @@
API_KEY = ''
SECRET_KEY = ''
API_KEY = 'PKGGEWIEYZOVQFDRY70L'
SECRET_KEY = 'O5Kt8X4RLceIOvM98i5LdbalItsX7hVZlbPYHy8Y'
MAX_BATCH_SIZE = 1

View File

@ -1,9 +1,9 @@
import numpy as np
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from typing import Tuple
from copy import deepcopy
from v2realbot.strategy.base import StrategyState
from v2realbot.strategyblocks.activetrade.helpers import get_max_profit_price, get_profit_target_price, get_signal_section_directive, keyword_conditions_met
from v2realbot.strategyblocks.activetrade.helpers import get_max_profit_price, get_profit_target_price, get_override_for_active_trade, keyword_conditions_met
from v2realbot.utils.utils import safe_get
# FIBONACCI FOR PROFIT AND SL
@ -63,10 +63,10 @@ class SLOptimizer:
def initialize_levels(self, state):
directive_name = 'SL_opt_exit_levels_'+str(self.direction)
SL_opt_exit_levels = get_signal_section_directive(state=state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
SL_opt_exit_levels = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
directive_name = 'SL_opt_exit_sizes_'+str(self.direction)
SL_opt_exit_sizes = get_signal_section_directive(state=state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
SL_opt_exit_sizes = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
if SL_opt_exit_levels is None or SL_opt_exit_sizes is not None:
print("no directives found: SL_opt_exit_levels/SL_opt_exit_sizes")

View File

@ -1,9 +1,7 @@
import os,sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
print(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import pandas as pd
import numpy as np
from alpaca.data.historical import StockHistoricalDataClient
from alpaca.data.historical import CryptoHistoricalDataClient, StockHistoricalDataClient
from alpaca.data.requests import CryptoLatestTradeRequest, StockLatestTradeRequest, StockLatestBarRequest, StockTradesRequest
from alpaca.data.enums import DataFeed
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY

View File

@ -1,2 +0,0 @@
import locale
print(locale.getdefaultlocale())

View File

@ -1,66 +0,0 @@
import os
from bs4 import BeautifulSoup
import html2text
def convert_html_to_markdown(html_content, link_mapping):
h = html2text.HTML2Text()
h.ignore_links = False
# Update internal links to point to the relevant sections in the Markdown
soup = BeautifulSoup(html_content, 'html.parser')
for a in soup.find_all('a', href=True):
href = a['href']
if href in link_mapping:
a['href'] = f"#{link_mapping[href]}"
return h.handle(str(soup))
def create_link_mapping(root_dir):
link_mapping = {}
for subdir, _, files in os.walk(root_dir):
for file in files:
if file == "index.html":
relative_path = os.path.relpath(os.path.join(subdir, file), root_dir)
chapter_id = relative_path.replace(os.sep, '-').replace('index.html', '')
link_mapping[relative_path] = chapter_id
link_mapping[relative_path.replace(os.sep, '/')] = chapter_id # for URLs with slashes
return link_mapping
def read_html_files(root_dir, link_mapping):
markdown_content = []
for subdir, _, files in os.walk(root_dir):
relative_path = os.path.relpath(subdir, root_dir)
if files and any(file == "index.html" for file in files):
# Add directory as a heading based on its depth
heading_level = relative_path.count(os.sep) + 1
markdown_content.append(f"{'#' * heading_level} {relative_path}\n")
for file in files:
if file == "index.html":
file_path = os.path.join(subdir, file)
with open(file_path, 'r', encoding='utf-8') as f:
html_content = f.read()
soup = BeautifulSoup(html_content, 'html.parser')
title = soup.title.string if soup.title else "No Title"
chapter_id = os.path.relpath(file_path, root_dir).replace(os.sep, '-').replace('index.html', '')
markdown_content.append(f"<a id='{chapter_id}'></a>\n")
markdown_content.append(f"{'#' * (heading_level + 1)} {title}\n")
markdown_content.append(convert_html_to_markdown(html_content, link_mapping))
return "\n".join(markdown_content)
def save_to_markdown_file(content, output_file):
with open(output_file, 'w', encoding='utf-8') as f:
f.write(content)
def main():
root_dir = "./v2realbot/static/js/vbt/"
output_file = "output.md"
link_mapping = create_link_mapping(root_dir)
markdown_content = read_html_files(root_dir, link_mapping)
save_to_markdown_file(markdown_content, output_file)
print(f"Markdown document created at {output_file}")
if __name__ == "__main__":
main()

View File

@ -59,12 +59,25 @@ Hlavní loop:
"""
def next(data, state: StrategyState):
print(10*"*", state.account_variables)
##print(10*"*","NEXT START",10*"*")
# important vars state.avgp, state.positions, state.vars, data
#indicators moved to call_next in upper class
execute_prescribed_trades(state, data)
signal_search(state, data)
execute_prescribed_trades(state, data) #process right away just to be safe
manage_active_trade(state, data)
#if positions are empty and nothing is pending
if state.positions == 0 and state.vars.pending is None:
#execute the queued trades
execute_prescribed_trades(state, data)
#if no trade was activated, run the signal search - but only once per bar?
#if conf_bar == 1:
if state.vars.pending is None:
signal_search(state, data)
#process right away just to be safe
execute_prescribed_trades(state, data)
#we have an active trade and nothing is pending
elif state.vars.activeTrade and state.vars.pending is None:
manage_active_trade(state, data)
def init(state: StrategyState):
#place to declare new vars
@ -75,13 +88,13 @@ def init(state: StrategyState):
#new attributes for trade management
#identifies a change made to the Trade (we make no changes until confirmation arrives from the notification)
#state.vars.pending = None #replaced by pending under the account: state.account_variables[account.name].pending
state.vars.pending = None
#holds the active Trade and its settings
#state.vars.activeTrade = None #pending/Trade moved to account_variables
state.vars.activeTrade = None #pending/Trade
#holds the prepared Trades in the queue
state.vars.prescribedTrades = []
#reversal flag
#state.vars.requested_followup = None #replaced under the account
state.vars.requested_followup = None
#TODO move work_dict initialization for conditions - values could then no longer be changed, but performance would improve
#or alternatively refresh it every x iterations
@ -89,8 +102,9 @@ def init(state: StrategyState):
state.vars.mode = None
state.vars.last_50_deltas = []
state.vars.next_new = 0
state.vars.last_entry_index = None #kept generic for all accounts
state.vars.last_exit_index = None #generic variant kept
state.vars.last_buy_index = None
state.vars.last_exit_index = None
state.vars.last_in_index = None
state.vars.last_update_time = 0
state.vars.reverse_position_waiting_amount = 0
#INIT variables that were unnecessary in stratvars

View File

@ -39,7 +39,7 @@
"""
from uuid import UUID, uuid4
from alpaca.trading.enums import OrderSide, OrderStatus, TradeEvent, OrderType
from v2realbot.common.model import TradeUpdate, Order, Account
from v2realbot.common.model import TradeUpdate, Order
from rich import print as printanyway
import threading
import asyncio
@ -61,7 +61,6 @@ import dash_bootstrap_components as dbc
from dash.dependencies import Input, Output
from dash import dcc, html, dash_table, Dash
import v2realbot.utils.config_handler as cfh
from typing import Set
""""
LATENCY DELAYS
.000 trigger - last_trade_time (.4246266)
@ -75,20 +74,7 @@ lock = threading.Lock
#todo probably move into classes so backtesting can run in parallel
#currently last_time_now, self.account and cash are global variables
class Backtester:
"""
Initializes a new instance of the Backtester class.
Args:
symbol (str): The symbol of the security being backtested.
accounts (set): A set of accounts to use for backtesting.
order_fill_callback (callable): A callback function to handle order fills.
btdata (list): A list of backtesting data.
bp_from (datetime): The start date of the backtesting period.
bp_to (datetime): The end date of the backtesting period.
cash (float, optional): The initial cash balance. Defaults to 100000.
Returns:
None
"""
def __init__(self, symbol: str, accounts: Set, order_fill_callback: callable, btdata: list, bp_from: datetime, bp_to: datetime, cash: float = 100000):
def __init__(self, symbol: str, order_fill_callback: callable, btdata: list, bp_from: datetime, bp_to: datetime, cash: float = 100000):
#this TIME value determines true time for submit, replace, cancel order should happen (allowing past)
#it is set by every iteration of BT or before fill callback - allowing past events to happen
self.time = None
@ -97,7 +83,6 @@ class Backtester:
self.btdata = btdata
self.backtest_start = None
self.backtest_end = None
self.accounts = accounts
self.cash_init = cash
#backtesting period
self.bp_from = bp_from
@ -105,10 +90,9 @@ class Backtester:
self.cash = cash
self.cash_reserved_for_shorting = 0
self.trades = []
self.internal_account = { account.name:{self.symbol: [0, 0]} for account in accounts }
# { "ACCOUNT1": {}"BAC": [avgp, size]}, .... }
self.open_orders =[] #open orders shared for all accounts, account being an attribute
self.account = { "BAC": [0, 0] }
# { "BAC": [avgp, size] }
self.open_orders =[]
# self.open_orders = [Order(id=uuid4(),
# submitted_at = datetime(2023, 3, 17, 9, 30, 0, 0, tzinfo=zoneNY),
# symbol = "BAC",
@ -126,8 +110,6 @@ class Backtester:
# side = OrderSide.BUY)]
#
def execute_orders_and_callbacks(self, intime: float):
"""""
Called from the strategy before every iteration with time T.
@ -184,7 +166,7 @@ class Backtester:
for order in self.open_orders:
#if a symbol is given, process only those orders, otherwise all
print(order.account.name, order.id, datetime.timestamp(order.submitted_at), order.symbol, order.side, order.order_type, order.qty, order.limit_price, order.submitted_at)
print(order.id, datetime.timestamp(order.submitted_at), order.symbol, order.side, order.order_type, order.qty, order.limit_price, order.submitted_at)
if order.canceled_at:
#ic("deleting canceled order",order.id)
todel.append(order)
@ -366,22 +348,21 @@ class Backtester:
#ic(o.filled_at, o.filled_avg_price)
a = self.update_internal_account(o = o)
a = self.update_account(o = o)
if a < 0:
tlog("update_account ERROR")
return -1
trade = TradeUpdate(account=o.account,
order = o,
trade = TradeUpdate(order = o,
event = TradeEvent.FILL,
execution_id = str(uuid4()),
timestamp = datetime.fromtimestamp(fill_time),
position_qty= self.internal_account[o.account.name][o.symbol][0],
position_qty= self.account[o.symbol][0],
price=float(fill_price),
qty = o.qty,
value = float(o.qty*fill_price),
cash = self.cash,
pos_avg_price = self.internal_account[o.account.name][o.symbol][1])
pos_avg_price = self.account[o.symbol][1])
self.trades.append(trade)
@ -398,49 +379,49 @@ class Backtester:
self.time = time + float(cfh.config_handler.get_val('BT_DELAYS','fill_to_not'))
print("current bt.time",self.time)
#print("FILL NOTIFICATION: ", tradeupdate)
res = asyncio.run(self.order_fill_callback(tradeupdate, tradeupdate.account))
res = asyncio.run(self.order_fill_callback(tradeupdate))
return 0
def update_internal_account(self, o: Order):
def update_account(self, o: Order):
#update self.account
#if the key does not exist in the account, create it
if o.symbol not in self.internal_account[o.account.name]:
if o.symbol not in self.account:
# { "BAC": [size, avgp] }
self.internal_account[o.account.name][o.symbol] = [0,0]
self.account[o.symbol] = [0,0]
if o.side == OrderSide.BUY:
#[size, avgp]
newsize = (self.internal_account[o.account.name][o.symbol][0] + o.qty)
newsize = (self.account[o.symbol][0] + o.qty)
#FULL CLOSE OF A SHORT (avgp 0)
if newsize == 0: newavgp = 0
#PARTIAL CLOSE OF A SHORT (avgp unchanged)
elif newsize < 0: newavgp = self.internal_account[o.account.name][o.symbol][1]
elif newsize < 0: newavgp = self.account[o.symbol][1]
#IT IS A LONG (new avgp)
else:
newavgp = ((self.internal_account[o.account.name][o.symbol][0] * self.internal_account[o.account.name][o.symbol][1]) + (o.qty * o.filled_avg_price)) / (self.internal_account[o.account.name][o.symbol][0] + o.qty)
newavgp = ((self.account[o.symbol][0] * self.account[o.symbol][1]) + (o.qty * o.filled_avg_price)) / (self.account[o.symbol][0] + o.qty)
self.internal_account[o.account.name][o.symbol] = [newsize, newavgp]
self.account[o.symbol] = [newsize, newavgp]
self.cash = self.cash - (o.qty * o.filled_avg_price)
return 1
#sell
elif o.side == OrderSide.SELL:
newsize = self.internal_account[o.account.name][o.symbol][0]-o.qty
newsize = self.account[o.symbol][0]-o.qty
#FULL CLOSE OF A LONG (avgp 0)
if newsize == 0: newavgp = 0
#PARTIAL CLOSE OF A LONG (avgp unchanged)
elif newsize > 0: newavgp = self.internal_account[o.account.name][o.symbol][1]
elif newsize > 0: newavgp = self.account[o.symbol][1]
#it is a SHORT (new avgp)
else:
#if the previous value is 0 - i.e. it is the first short
if self.internal_account[o.account.name][o.symbol][1] == 0:
if self.account[o.symbol][1] == 0:
newavgp = o.filled_avg_price
else:
newavgp = ((abs(self.internal_account[o.account.name][o.symbol][0]) * self.internal_account[o.account.name][o.symbol][1]) + (o.qty * o.filled_avg_price)) / (abs(self.internal_account[o.account.name][o.symbol][0]) + o.qty)
newavgp = ((abs(self.account[o.symbol][0]) * self.account[o.symbol][1]) + (o.qty * o.filled_avg_price)) / (abs(self.account[o.symbol][0]) + o.qty)
self.internal_account[o.account.name][o.symbol] = [newsize, newavgp]
self.account[o.symbol] = [newsize, newavgp]
#if this is a sale of a long (new position >= 0), adjust cash
if self.internal_account[o.account.name][o.symbol][0] >= 0:
if self.account[o.symbol][0] >= 0:
self.cash = float(self.cash + (o.qty * o.filled_avg_price))
print("uprava cashe, jde o prodej longu")
else:
@ -485,7 +466,7 @@ class Backtester:
# #ic("get last price")
# return self.btdata[i-1][1]
def submit_order(self, time: float, symbol: str, side: OrderSide, size: int, order_type: OrderType, account: Account, price: float = None):
def submit_order(self, time: float, symbol: str, side: OrderSide, size: int, order_type: OrderType, price: float = None):
"""submit order
- basic validation
- inserts into self.open_orders with the given time
@ -518,9 +499,9 @@ class Backtester:
return -1
#if the key does not exist in the account, create it
if symbol not in self.internal_account[account.name]:
if symbol not in self.account:
# { "BAC": [size, avgp] }
self.internal_account[account.name][symbol] = [0,0]
self.account[symbol] = [0,0]
#check for available quantity
if side == OrderSide.SELL:
@ -528,22 +509,22 @@ class Backtester:
reserved_price = 0
#with lock:
for o in self.open_orders:
if o.side == OrderSide.SELL and o.symbol == symbol and o.canceled_at is None and o.account==account:
if o.side == OrderSide.SELL and o.symbol == symbol and o.canceled_at is None:
reserved += o.qty
cena = o.limit_price if o.limit_price else self.get_last_price(time, o.symbol)
reserved_price += o.qty * cena
print("blokovano v open orders pro sell qty: ", reserved, "celkem:", reserved_price)
actual_minus_reserved = int(self.internal_account[account.name][symbol][0]) - reserved
actual_minus_reserved = int(self.account[symbol][0]) - reserved
if actual_minus_reserved > 0 and actual_minus_reserved - int(size) < 0:
printanyway("not enough shares available to sell or shorting while long position",self.internal_account[account.name][symbol][0],"reserved",reserved,"available",int(self.internal_account[account.name][symbol][0]) - reserved,"selling",size)
printanyway("not enough shares available to sell or shorting while long position",self.account[symbol][0],"reserved",reserved,"available",int(self.account[symbol][0]) - reserved,"selling",size)
return -1
#if is shorting - check available cash to short
if actual_minus_reserved <= 0:
cena = price if price else self.get_last_price(time, self.symbol)
if (self.cash - reserved_price - float(int(size)*float(cena))) < 0:
printanyway("ERROR: not enough cash for shorting. cash",self.cash,"reserved",reserved,"available",self.cash-reserved,"needed",float(int(size)*float(cena)))
printanyway("not enough cash for shorting. cash",self.cash,"reserved",reserved,"available",self.cash-reserved,"needed",float(int(size)*float(cena)))
return -1
#check for available cash
@ -552,13 +533,13 @@ class Backtester:
reserved_price = 0
#with lock:
for o in self.open_orders:
if o.side == OrderSide.BUY and o.canceled_at is None and o.account==account:
if o.side == OrderSide.BUY and o.canceled_at is None:
cena = o.limit_price if o.limit_price else self.get_last_price(time, o.symbol)
reserved_price += o.qty * cena
reserved_qty += o.qty
print("blokovano v open orders for buy: qty, price", reserved_qty, reserved_price)
actual_plus_reserved_qty = int(self.internal_account[account.name][symbol][0]) + reserved_qty
actual_plus_reserved_qty = int(self.account[symbol][0]) + reserved_qty
#this is a short being closed
if actual_plus_reserved_qty < 0 and (actual_plus_reserved_qty + int(size)) > 0:
@ -569,12 +550,11 @@ class Backtester:
if actual_plus_reserved_qty >= 0:
cena = price if price else self.get_last_price(time, self.symbol)
if (self.cash - reserved_price - float(int(size)*float(cena))) < 0:
printanyway("ERROR: not enough cash to buy long. cash",self.cash,"reserved_qty",reserved_qty,"reserved_price",reserved_price, "available",self.cash-reserved_price,"needed",float(int(size)*float(cena)))
printanyway("not enough cash to buy long. cash",self.cash,"reserved_qty",reserved_qty,"reserved_price",reserved_price, "available",self.cash-reserved_price,"needed",float(int(size)*float(cena)))
return -1
id = str(uuid4())
order = Order(id=id,
account=account,
submitted_at = datetime.fromtimestamp(float(time)),
symbol=symbol,
order_type = order_type,
@ -589,7 +569,7 @@ class Backtester:
return id
def replace_order(self, id: str, time: float, account: Account, size: int = None, price: float = None):
def replace_order(self, id: str, time: float, size: int = None, price: float = None):
"""replace order
- basic validation, returns synchronously
- returns the id of the new order
@ -606,7 +586,7 @@ class Backtester:
#with lock:
for o in self.open_orders:
print(o.id)
if str(o.id) == str(id) and o.account == account:
if str(o.id) == str(id):
newid = str(uuid4())
o.id = newid
o.submitted_at = datetime.fromtimestamp(time)
@ -617,7 +597,7 @@ class Backtester:
print("BT: replacement order doesnt exist")
return 0
def cancel_order(self, time: float, id: str, account: Account):
def cancel_order(self, time: float, id: str):
"""cancel order
- basic validation, returns synchronously
- removes the order from open orders
@ -633,22 +613,22 @@ class Backtester:
return 0
#with lock:
for o in self.open_orders:
if str(o.id) == id and o.account == account:
if str(o.id) == id:
o.canceled_at = time
print("set as canceled in self.open_orders")
return 1
print("BTC: cantchange. open order doesnt exist")
return 0
def get_open_position(self, symbol: str, account: Account):
def get_open_position(self, symbol: str):
"""get positions ->(avg,size)"""
#print("BT:get open positions entry")
try:
return self.internal_account[account.name][symbol][1], self.internal_account[account.name][symbol][0]
return self.account[symbol][1], self.account[symbol][0]
except:
return (0,0)
def get_open_orders(self, side: OrderSide, symbol: str, account: Account):
def get_open_orders(self, side: OrderSide, symbol: str):
"""get open orders ->list(OrderNotification)"""
print("BT:get open orders entry")
if len(self.open_orders) == 0:
@ -658,7 +638,7 @@ class Backtester:
#with lock:
for o in self.open_orders:
#print(o)
if o.symbol == symbol and o.canceled_at is None and o.account == account:
if o.symbol == symbol and o.canceled_at is None:
if side is None or o.side == side:
res.append(o)
return res

View File

@ -0,0 +1,41 @@
from enum import Enum
from datetime import datetime
from pydantic import BaseModel
from typing import Any, Optional, List, Union
from uuid import UUID
class TradeStatus(str, Enum):
READY = "ready"
ACTIVATED = "activated"
CLOSED = "closed"
#FINISHED = "finished"
class TradeDirection(str, Enum):
LONG = "long"
SHORT = "short"
class TradeStoplossType(str, Enum):
FIXED = "fixed"
TRAILING = "trailing"
#Trade prescription generated by a signal; it is the umbrella unit
#to which the individual FILLs (represented by model.TradeUpdate) are then linked - e.g. partial exits etc.
class Trade(BaseModel):
id: UUID
last_update: datetime
entry_time: Optional[datetime] = None
exit_time: Optional[datetime] = None
status: TradeStatus
generated_by: Optional[str] = None
direction: TradeDirection
entry_price: Optional[float] = None
goal_price: Optional[float] = None
size: Optional[int] = None
# size_multiplier is a helper variable for computing the relative daily profit
size_multiplier: Optional[float] = None
# stoploss_type: TradeStoplossType
stoploss_value: Optional[float] = None
profit: Optional[float] = 0
profit_sum: Optional[float] = 0
rel_profit: Optional[float] = 0
rel_profit_cum: Optional[float] = 0

View File

@ -5,75 +5,10 @@ from rich import print
from typing import Any, Optional, List, Union
from datetime import datetime, date
from pydantic import BaseModel, Field
from v2realbot.enums.enums import Mode, Account, SchedulerStatus, Moddus, Market, Followup
from v2realbot.enums.enums import Mode, Account, SchedulerStatus, Moddus
from alpaca.data.enums import Exchange
from enum import Enum
from datetime import datetime
from pydantic import BaseModel
from typing import Any, Optional, List, Union
from uuid import UUID
#prescribed model
#from prescribed model
class InstantIndicator(BaseModel):
name: str
toml: str
class TradeStatus(str, Enum):
READY = "ready"
ACTIVATED = "activated"
CLOSED = "closed"
#FINISHED = "finished"
class TradeDirection(str, Enum):
LONG = "long"
SHORT = "short"
class TradeStoplossType(str, Enum):
FIXED = "fixed"
TRAILING = "trailing"
#Trade prescription generated by a signal; it is the umbrella unit
#to which the individual FILLs (represented by model.TradeUpdate) are then linked - e.g. partial exits etc.
class Trade(BaseModel):
account: Account
id: UUID
last_update: datetime
entry_time: Optional[datetime] = None
exit_time: Optional[datetime] = None
status: TradeStatus
generated_by: Optional[str] = None
direction: TradeDirection
entry_price: Optional[float] = None
goal_price: Optional[float] = None
size: Optional[int] = None
# size_multiplier is a helper variable for computing the relative daily profit
size_multiplier: Optional[float] = None
# stoploss_type: TradeStoplossType
stoploss_value: Optional[float] = None
profit: Optional[float] = 0
profit_sum: Optional[float] = 0
rel_profit: Optional[float] = 0
rel_profit_cum: Optional[float] = 0
#account variables that can be accessed by ACCOUNT key dictionary
class AccountVariables(BaseModel):
positions: float = 0
avgp: float = 0
pending: str = None
blockbuy: int = 0
wait_for_fill: float = None
profit: float = 0
docasny_rel_profit: list = []
rel_profit_cum: list = []
last_entry_index: int = None #account variant, we also have the generic state.vars.last_entry_index
requested_followup: Followup = None
activeTrade: Trade = None
dont_exit_already_activated: bool = False
#activeTrade, prescribedTrades
#tbd transferables?
#models for server side datatables
@ -156,7 +91,7 @@ class TestList(BaseModel):
dates: List[Intervals]
#for GUI to fetch historical trades on given symbol
class TradeView(BaseModel):
class Trade(BaseModel):
symbol: str
timestamp: datetime
exchange: Optional[Union[Exchange, str]] = None
@ -224,7 +159,6 @@ class RunManagerRecord(BaseModel):
mode: Mode
note: Optional[str] = None
ilog_save: bool = False
market: Optional[Market] = Market.US
bt_from: Optional[datetime] = None
bt_to: Optional[datetime] = None
#weekdays filter
@ -254,8 +188,8 @@ class RunnerView(BaseModel):
run_symbol: Optional[str] = None
run_trade_count: Optional[int] = 0
run_profit: Optional[float] = 0
run_positions: Optional[dict] = 0
run_avgp: Optional[dict] = 0
run_positions: Optional[int] = 0
run_avgp: Optional[float] = 0
run_stopped: Optional[datetime] = None
run_paused: Optional[datetime] = None
@ -273,8 +207,8 @@ class Runner(BaseModel):
run_ilog_save: Optional[bool] = False
run_trade_count: Optional[int] = None
run_profit: Optional[float] = None
run_positions: Optional[dict] = None
run_avgp: Optional[dict] = None
run_positions: Optional[int] = None
run_avgp: Optional[float] = None
run_strat_json: Optional[str] = None
run_stopped: Optional[datetime] = None
run_paused: Optional[datetime] = None
@ -312,7 +246,6 @@ class Bar(BaseModel):
vwap: Optional[float] = 0
class Order(BaseModel):
account: Account
id: UUID
submitted_at: datetime
filled_at: Optional[datetime] = None
@ -328,7 +261,6 @@ class Order(BaseModel):
#entita pro kazdy kompletni FILL, je navazana na prescribed_trade
class TradeUpdate(BaseModel):
account: Account
event: Union[TradeEvent, str]
execution_id: Optional[UUID] = None
order: Order
@ -374,8 +306,8 @@ class RunArchive(BaseModel):
ilog_save: Optional[bool] = False
profit: float = 0
trade_count: int = 0
end_positions: Union[dict,str] = None
end_positions_avgp: Union[dict,str] = None
end_positions: int = 0
end_positions_avgp: float = 0
metrics: Union[dict, str] = None
stratvars_toml: Optional[str] = None
@ -396,8 +328,8 @@ class RunArchiveView(BaseModel):
ilog_save: Optional[bool] = False
profit: float = 0
trade_count: int = 0
end_positions: Union[dict,int] = None
end_positions_avgp: Union[dict,float] = None
end_positions: int = 0
end_positions_avgp: float = 0
metrics: Union[dict, str] = None
batch_profit: float = 0 # Total profit for the batch - now calculated during query
batch_count: int = 0 # Count of runs in the batch - now calculated during query
@ -414,8 +346,6 @@ class SLHistory(BaseModel):
id: Optional[UUID] = None
time: datetime
sl_val: float
direction: TradeDirection
account: Account
#Contains archive of running strategies (runner) - detail data
class RunArchiveDetail(BaseModel):
@ -428,3 +358,9 @@ class RunArchiveDetail(BaseModel):
trades: List[TradeUpdate]
ext_data: Optional[dict] = None
class InstantIndicator(BaseModel):
name: str
toml: str

View File

@ -5,7 +5,9 @@ import v2realbot.controller.services as cs
#converts a row dict back into an object, including re-typing
def row_to_runmanager(row: dict) -> RunManagerRecord:
is_running = cs.is_runner_running(row['runner_id']) if row['runner_id'] else False
res = RunManagerRecord(
moddus=row['moddus'],
id=row['id'],
@ -15,7 +17,6 @@ def row_to_runmanager(row: dict) -> RunManagerRecord:
account=row['account'],
note=row['note'],
ilog_save=bool(row['ilog_save']),
market=row['market'] if row['market'] is not None else None,
bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
weekdays_filter=[int(x) for x in row['weekdays_filter'].split(',')] if row['weekdays_filter'] else [],
@ -51,8 +52,8 @@ def row_to_runarchiveview(row: dict) -> RunArchiveView:
ilog_save=bool(row['ilog_save']),
profit=float(row['profit']),
trade_count=int(row['trade_count']),
end_positions=orjson.loads(row['end_positions']),
end_positions_avgp=orjson.loads(row['end_positions_avgp']),
end_positions=int(row['end_positions']),
end_positions_avgp=float(row['end_positions_avgp']),
metrics=orjson.loads(row['metrics']) if row['metrics'] else None,
batch_profit=int(row['batch_profit']) if row['batch_profit'] and row['batch_id'] else 0,
batch_count=int(row['batch_count']) if row['batch_count'] and row['batch_id'] else 0,
@ -79,8 +80,8 @@ def row_to_runarchive(row: dict) -> RunArchive:
ilog_save=bool(row['ilog_save']),
profit=float(row['profit']),
trade_count=int(row['trade_count']),
end_positions=str(row['end_positions']),
end_positions_avgp=str(row['end_positions_avgp']),
end_positions=int(row['end_positions']),
end_positions_avgp=float(row['end_positions_avgp']),
metrics=orjson.loads(row['metrics']),
stratvars_toml=row['stratvars_toml'],
transferables=orjson.loads(row['transferables']) if row['transferables'] else None

View File

@ -4,7 +4,6 @@ from appdirs import user_data_dir
from pathlib import Path
import os
from collections import defaultdict
from dotenv import load_dotenv
# Global flag to track if the ml module has been imported (solution for long import times of tensorflow)
#the first occurrence of its use will load it globally
_ml_module_loaded = False
@ -27,32 +26,6 @@ MODEL_DIR = Path(DATA_DIR)/"models"
PROFILING_NEXT_ENABLED = False
PROFILING_OUTPUT_DIR = DATA_DIR
def find_dotenv(start_path):
"""
Searches for a .env file in the given directory or its parents and returns the path.
Args:
start_path: The directory to start searching from.
Returns:
Path to the .env file if found, otherwise None.
"""
current_path = Path(start_path)
for _ in range(6): # Limit search depth to 5 levels
dotenv_path = current_path / '.env'
if dotenv_path.exists():
return dotenv_path
current_path = current_path.parent
return None
ENV_FILE = find_dotenv(__file__)
#LOAD ENV VARIABLES FROM THE DOTENV FILE
if load_dotenv(ENV_FILE, verbose=True) is False:
print(f"Error loading.env file {ENV_FILE}. Now depending on ENV VARIABLES set externally.")
else:
print(f"Loaded env variables from file {ENV_FILE}")
#WIP - FILL CONFIGURATION CLASS FOR BACKTESTING
class BT_FILL_CONF:
""""
@ -83,10 +56,10 @@ def get_key(mode: Mode, account: Account):
return None
dict = globals()
try:
API_KEY = dict[str.upper(str(account.name)) + "_" + str.upper(str(mode.name)) + "_API_KEY" ]
SECRET_KEY = dict[str.upper(str(account.name)) + "_" + str.upper(str(mode.name)) + "_SECRET_KEY" ]
PAPER = dict[str.upper(str(account.name)) + "_" + str.upper(str(mode.name)) + "_PAPER" ]
FEED = dict[str.upper(str(account.name)) + "_" + str.upper(str(mode.name)) + "_FEED" ]
API_KEY = dict[str.upper(str(account.value)) + "_" + str.upper(str(mode.value)) + "_API_KEY" ]
SECRET_KEY = dict[str.upper(str(account.value)) + "_" + str.upper(str(mode.value)) + "_SECRET_KEY" ]
PAPER = dict[str.upper(str(account.value)) + "_" + str.upper(str(mode.value)) + "_PAPER" ]
FEED = dict[str.upper(str(account.value)) + "_" + str.upper(str(mode.value)) + "_FEED" ]
return Keys(API_KEY, SECRET_KEY, PAPER, FEED)
except KeyError:
print("Not valid combination to get keys for", mode, account)
@ -95,7 +68,7 @@ def get_key(mode: Mode, account: Account):
#strategy instance main loop heartbeat
HEARTBEAT_TIMEOUT=5
WEB_API_KEY=os.environ.get('WEB_API_KEY')
WEB_API_KEY="david"
#PRIMARY PAPER
ACCOUNT1_PAPER_API_KEY = os.environ.get('ACCOUNT1_PAPER_API_KEY')
@ -110,7 +83,7 @@ data_feed_type_str = os.environ.get('ACCOUNT1_PAPER_FEED', 'iex') # Default to
# Convert the string to DataFeed enum
try:
ACCOUNT1_PAPER_FEED = DataFeed(data_feed_type_str)
except nameError:
except ValueError:
# Handle the case where the environment variable does not match any enum member
print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT1_PAPER_FEED defaulting to 'iex'")
ACCOUNT1_PAPER_FEED = DataFeed.SIP
@ -128,7 +101,7 @@ data_feed_type_str = os.environ.get('ACCOUNT1_LIVE_FEED', 'iex') # Default to '
# Convert the string to DataFeed enum
try:
ACCOUNT1_LIVE_FEED = DataFeed(data_feed_type_str)
except nameError:
except ValueError:
# Handle the case where the environment variable does not match any enum member
print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT1_LIVE_FEED defaulting to 'iex'")
ACCOUNT1_LIVE_FEED = DataFeed.IEX
@ -146,7 +119,7 @@ data_feed_type_str = os.environ.get('ACCOUNT2_PAPER_FEED', 'iex') # Default to
# Convert the string to DataFeed enum
try:
ACCOUNT2_PAPER_FEED = DataFeed(data_feed_type_str)
except nameError:
except ValueError:
# Handle the case where the environment variable does not match any enum member
print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT2_PAPER_FEED defaulting to 'iex'")
ACCOUNT2_PAPER_FEED = DataFeed.IEX
@ -165,7 +138,7 @@ except nameError:
# # Convert the string to DataFeed enum
# try:
# ACCOUNT2_LIVE_FEED = DataFeed(data_feed_type_str)
# except nameError:
# except ValueError:
# # Handle the case where the environment variable does not match any enum member
# print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT2_LIVE_FEED defaulting to 'iex'")
# ACCOUNT2_LIVE_FEED = DataFeed.IEX

View File

@ -3,7 +3,7 @@ from uuid import UUID, uuid4
from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunDay, StrategyInstance, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest
from v2realbot.utils.utils import validate_and_format_time, AttributeDict, zoneNY, zonePRG, safe_get, dict_replace_value, Store, parse_toml_string, json_serial, is_open_hours, send_to_telegram, concatenate_weekdays, transform_data
from v2realbot.utils.ilog import delete_logs
from v2realbot.common.model import Trade, TradeDirection, TradeStatus, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus, TradeStoplossType
from datetime import datetime
from v2realbot.loader.trade_offline_streamer import Trade_Offline_Streamer
from threading import Thread, current_thread, Event, enumerate
@ -172,14 +172,14 @@ def add_run_manager_record(new_record: RunManagerRecord):
# Construct a suitable INSERT query based on your RunManagerRecord fields
insert_query = """
INSERT INTO run_manager (moddus, id, strat_id, symbol,account, mode, note,ilog_save,
market, bt_from, bt_to, weekdays_filter, batch_id,
bt_from, bt_to, weekdays_filter, batch_id,
start_time, stop_time, status, last_processed,
history, valid_from, valid_to, testlist_id)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?,?)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
"""
values = [
new_record.moddus, str(new_record.id), str(new_record.strat_id), new_record.symbol, new_record.account, new_record.mode, new_record.note,
int(new_record.ilog_save), new_record.market,
int(new_record.ilog_save),
new_record.bt_from.isoformat() if new_record.bt_from is not None else None,
new_record.bt_to.isoformat() if new_record.bt_to is not None else None,
",".join(str(x) for x in new_record.weekdays_filter) if new_record.weekdays_filter else None,

View File

@ -3,14 +3,14 @@ from uuid import UUID, uuid4
import pickle
from alpaca.data.historical import StockHistoricalDataClient
from alpaca.data.requests import StockTradesRequest, StockBarsRequest
from alpaca.data.enums import DataFeed
from alpaca.data.enums import DataFeed
from alpaca.data.timeframe import TimeFrame
from v2realbot.strategy.base import StrategyState
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, OrderSide
from v2realbot.common.model import RunDay, StrategyInstance, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest
from v2realbot.utils.utils import AttributeDict, zoneNY, zonePRG, safe_get, dict_replace_value, Store, parse_toml_string, json_serial, is_open_hours, send_to_telegram, concatenate_weekdays, transform_data, gaka
from v2realbot.utils.utils import AttributeDict, zoneNY, zonePRG, safe_get, dict_replace_value, Store, parse_toml_string, json_serial, is_open_hours, send_to_telegram, concatenate_weekdays, transform_data
from v2realbot.utils.ilog import delete_logs
from v2realbot.common.model import Trade, TradeDirection, TradeStatus, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus, TradeStoplossType
from datetime import datetime
from v2realbot.loader.trade_offline_streamer import Trade_Offline_Streamer
from threading import Thread, current_thread, Event, enumerate
@ -71,8 +71,8 @@ def get_all_runners():
if i.run_instance:
i.run_profit = round(float(i.run_instance.state.profit),2)
i.run_trade_count = len(i.run_instance.state.tradeList)
i.run_positions = gaka(i.run_instance.state.account_variables, "positions")
i.run_avgp = gaka(i.run_instance.state.account_variables, "avgp", lambda x: round(float(x),3))
i.run_positions = i.run_instance.state.positions
i.run_avgp = round(float(i.run_instance.state.avgp),3)
return (0, db.runners)
else:
return (0, [])
@ -94,8 +94,8 @@ def get_runner(id: UUID):
if str(i.id) == str(id):
i.run_profit = round(float(i.run_instance.state.profit),2)
i.run_trade_count = len(i.run_instance.state.tradeList)
i.run_positions =gaka(i.run_instance.state.account_variables, "positions")
i.run_avgp = gaka(i.run_instance.state.account_variables, "avgp", lambda x: round(float(x),3))
i.run_positions = i.run_instance.state.positions
i.run_avgp = round(float(i.run_instance.state.avgp),3)
return (0, i)
return (-2, "not found")
@ -738,14 +738,13 @@ def populate_metrics_output_directory(strat: StrategyInstance, inter_batch_param
tradeList = strat.state.tradeList
trade_dict = AttributeDict(account=[],orderid=[],timestamp=[],symbol=[],side=[],order_type=[],qty=[],price=[],position_qty=[])
trade_dict = AttributeDict(orderid=[],timestamp=[],symbol=[],side=[],order_type=[],qty=[],price=[],position_qty=[])
if strat.mode == Mode.BT:
trade_dict["value"] = []
trade_dict["cash"] = []
trade_dict["pos_avg_price"] = []
for t in tradeList:
if t.event == TradeEvent.FILL:
trade_dict.account.append(t.account)
trade_dict.orderid.append(str(t.order.id))
trade_dict.timestamp.append(t.timestamp)
trade_dict.symbol.append(t.order.symbol)
@ -769,12 +768,10 @@ def populate_metrics_output_directory(strat: StrategyInstance, inter_batch_param
max_positions = max_positions[max_positions['side'] == OrderSide.SELL]
max_positions = max_positions.drop(columns=['side'], axis=1)
res = dict(account_variables={}, profit={})
res = dict(profit={})
#filt = max_positions['side'] == 'OrderSide.BUY'
res["account_variables"] = transform_data(strat.state.account_variables, json_serial)
res["pos_cnt"] = dict(zip(str(max_positions['qty']), max_positions['count']))
#populate the batch sum profit
if inter_batch_params is not None:
res["profit"]["batch_sum_profit"] = int(inter_batch_params["batch_profit"])
@ -926,8 +923,8 @@ def archive_runner(runner: Runner, strat: StrategyInstance, inter_batch_params:
settings = settings,
profit=round(float(strat.state.profit),2),
trade_count=len(strat.state.tradeList),
end_positions=gaka(strat.state.account_variables, "positions"),
end_positions_avgp=gaka(strat.state.account_variables, "avgp", lambda x: round(float(x),3)),
end_positions=strat.state.positions,
end_positions_avgp=round(float(strat.state.avgp),3),
metrics=results_metrics,
stratvars_toml=runner.run_stratvars_toml,
transferables=strat.state.vars["transferables"]
@ -1117,7 +1114,7 @@ def get_all_archived_runners_p(request: DataTablesRequest) -> Tuple[int, RunArch
# Total count query
total_count_query = """
SELECT COUNT(*) FROM runner_header
WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value OR symbol like :search_value OR name like :search_value)
WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value)
"""
c.execute(total_count_query, {'search_value': f'%{search_value}%'})
total_count = c.fetchone()[0]
@ -1132,7 +1129,7 @@ def get_all_archived_runners_p(request: DataTablesRequest) -> Tuple[int, RunArch
SUM(profit) OVER (PARTITION BY batch_id) AS batch_profit,
COUNT(*) OVER (PARTITION BY batch_id) AS batch_count
FROM runner_header
WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value OR symbol like :search_value OR name like :search_value)
WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value)
),
InterleavedGroups AS (
SELECT *,
@ -1159,7 +1156,7 @@ def get_all_archived_runners_p(request: DataTablesRequest) -> Tuple[int, RunArch
# Filtered count query
filtered_count_query = """
SELECT COUNT(*) FROM runner_header
WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value OR symbol like :search_value OR name like :search_value)
WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value)
"""
c.execute(filtered_count_query, {'search_value': f'%{search_value}%'})
filtered_count = c.fetchone()[0]
@ -1267,7 +1264,6 @@ def insert_archive_header(archeader: RunArchive):
try:
c = conn.cursor()
#json_string = orjson.dumps(archeader, default=json_serial, option=orjson.OPT_PASSTHROUGH_DATETIME)
#print(archeader)
res = c.execute("""
INSERT INTO runner_header
@ -1275,7 +1271,7 @@ def insert_archive_header(archeader: RunArchive):
VALUES
(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
""",
(str(archeader.id), str(archeader.strat_id), archeader.batch_id, archeader.symbol, archeader.name, archeader.note, archeader.started, archeader.stopped, archeader.mode, archeader.account, archeader.bt_from, archeader.bt_to, orjson.dumps(archeader.strat_json).decode('utf-8'), orjson.dumps(archeader.settings).decode('utf-8'), archeader.ilog_save, archeader.profit, archeader.trade_count, orjson.dumps(archeader.end_positions).decode('utf-8'), orjson.dumps(archeader.end_positions_avgp).decode('utf-8'), orjson.dumps(archeader.metrics, default=json_serial, option=orjson.OPT_PASSTHROUGH_DATETIME).decode('utf-8'), archeader.stratvars_toml, orjson.dumps(archeader.transferables).decode('utf-8')))
(str(archeader.id), str(archeader.strat_id), archeader.batch_id, archeader.symbol, archeader.name, archeader.note, archeader.started, archeader.stopped, archeader.mode, archeader.account, archeader.bt_from, archeader.bt_to, orjson.dumps(archeader.strat_json).decode('utf-8'), orjson.dumps(archeader.settings).decode('utf-8'), archeader.ilog_save, archeader.profit, archeader.trade_count, archeader.end_positions, archeader.end_positions_avgp, orjson.dumps(archeader.metrics, default=json_serial, option=orjson.OPT_PASSTHROUGH_DATETIME).decode('utf-8'), archeader.stratvars_toml, orjson.dumps(archeader.transferables).decode('utf-8')))
#retry not yet supported for statement format above
#res = execute_with_retry(c,statement)
@ -1650,14 +1646,13 @@ def preview_indicator_byTOML(id: UUID, indicator: InstantIndicator, save: bool =
new_inds = AttributeDict(**new_inds)
new_tick_inds = {key: [] for key in detail.indicators[1].keys()}
new_tick_inds = AttributeDict(**new_tick_inds)
def_account = Account("ACCOUNT1")
interface = BacktestInterface(symbol="X", bt=None, account=def_account)
interface = BacktestInterface(symbol="X", bt=None)
##put the indicator settings into the shape stratvars expects (for dynamic initialization)
stratvars = AttributeDict(indicators=AttributeDict(**{jmeno:toml_parsed}))
#print("stratvars", stratvars)
state = StrategyState(name="XX", symbol = "X", stratvars = AttributeDict(**stratvars), interface=interface, accounts=[def_account], account=def_account)
state = StrategyState(name="XX", symbol = "X", stratvars = AttributeDict(**stratvars), interface=interface)
#initialize the state variables and the new indicator in the target dict
if output == "bar":

View File

@ -1,11 +1,6 @@
from enum import Enum
from alpaca.trading.enums import OrderSide, OrderStatus, OrderType
class BarType(str, Enum):
TIME = "time"
VOLUME = "volume"
DOLLAR = "dollar"
class Env(str, Enum):
PROD = "prod"
TEST = "test"
@ -108,10 +103,4 @@ class StartBarAlign(str, Enum):
RANDOM = first bar starts when first trade occurs
"""
ROUND = "round"
RANDOM = "random"
class Market(str, Enum):
US = "US"
CRYPTO = "CRYPTO"
RANDOM = "random"

View File

@ -5,7 +5,6 @@ from v2realbot.backtesting.backtester import Backtester
from datetime import datetime
from v2realbot.utils.utils import zoneNY
import v2realbot.utils.config_handler as cfh
from v2realbot.common.model import Account
""""
backtester methods can be called
@ -17,9 +16,8 @@ both should be backtestable
if methods are called for the past, self.time must be set accordingly
"""
class BacktestInterface(GeneralInterface):
def __init__(self, symbol, bt: Backtester, account: Account) -> None:
def __init__(self, symbol, bt: Backtester) -> None:
self.symbol = symbol
self.account = account
self.bt = bt
self.count_api_requests = cfh.config_handler.get_val('COUNT_API_REQUESTS')
self.mincnt = list([dict(minute=0,count=0)])
@ -45,48 +43,48 @@ class BacktestInterface(GeneralInterface):
def buy(self, size = 1, repeat: bool = False):
self.count()
#add REST API latency
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.BUY,size=size,order_type = OrderType.MARKET, account=self.account)
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.BUY,size=size,order_type = OrderType.MARKET)
"""buy limit"""
def buy_l(self, price: float, size: int = 1, repeat: bool = False, force: int = 0):
self.count()
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.BUY,size=size,price=price,order_type = OrderType.LIMIT, account=self.account)
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.BUY,size=size,price=price,order_type = OrderType.LIMIT)
"""sell market"""
def sell(self, size = 1, repeat: bool = False):
self.count()
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.SELL,size=size,order_type = OrderType.MARKET, account=self.account)
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.SELL,size=size,order_type = OrderType.MARKET)
"""sell limit"""
async def sell_l(self, price: float, size = 1, repeat: bool = False):
self.count()
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.SELL,size=size,price=price,order_type = OrderType.LIMIT, account=self.account)
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.SELL,size=size,price=price,order_type = OrderType.LIMIT)
"""replace order"""
async def repl(self, orderid: str, price: float = None, size: int = None, repeat: bool = False):
self.count()
return self.bt.replace_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),id=orderid,size=size,price=price, account=self.account)
return self.bt.replace_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),id=orderid,size=size,price=price)
"""cancel order"""
#TBD exec predtim?
def cancel(self, orderid: str):
self.count()
return self.bt.cancel_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'), id=orderid, account=self.account)
return self.bt.cancel_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'), id=orderid)
"""get positions ->(size,avgp)"""
#TBD exec predtim?
def pos(self):
self.count()
return self.bt.get_open_position(symbol=self.symbol, account=self.account)
return self.bt.get_open_position(symbol=self.symbol)
"""get open orders ->list(Order)"""
def get_open_orders(self, side: OrderSide, symbol: str):
self.count()
return self.bt.get_open_orders(side=side, symbol=symbol, account=self.account)
return self.bt.get_open_orders(side=side, symbol=symbol)
def get_last_price(self, symbol: str):
self.count()
return self.bt.get_last_price(time=self.bt.time, account=self.account)
return self.bt.get_last_price(time=self.bt.time)

View File

@ -97,7 +97,7 @@ class LiveInterface(GeneralInterface):
return -1
"""sell limit"""
def sell_l(self, price: float, size = 1, repeat: bool = False):
async def sell_l(self, price: float, size = 1, repeat: bool = False):
self.size = size
self.repeat = repeat
@ -124,7 +124,7 @@ class LiveInterface(GeneralInterface):
return -1
"""order replace"""
def repl(self, orderid: str, price: float = None, size: int = None, repeatl: bool = False):
async def repl(self, orderid: str, price: float = None, size: int = None, repeatl: bool = False):
if not price and not size:
print("price or size has to be filled")

File diff suppressed because it is too large Load Diff

View File

@ -1,570 +0,0 @@
import pandas as pd
import numpy as np
from numba import jit
from alpaca.data.historical import StockHistoricalDataClient
from sqlalchemy import column
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR
from alpaca.data.requests import StockTradesRequest
import time as time_module
from v2realbot.utils.utils import parse_alpaca_timestamp, ltp, zoneNY, send_to_telegram, fetch_calendar_data
import pyarrow
from traceback import format_exc
from datetime import timedelta, datetime, time
from concurrent.futures import ThreadPoolExecutor
import os
import gzip
import pickle
import random
from alpaca.data.models import BarSet, QuoteSet, TradeSet
import v2realbot.utils.config_handler as cfh
from v2realbot.enums.enums import BarType
from tqdm import tqdm
"""
Module used for vectorized aggregation of trades.
Includes fetch (remote/cached) methods and Numba-optimized aggregator functions for TIME, VOLUME and DOLLAR bars.
"""
def aggregate_trades(symbol: str, trades_df: pd.DataFrame, resolution: int, type: BarType = BarType.TIME):
"""
Accepts a DataFrame with trades keyed by symbol. Prepares the data as a NumPy array
and calls the Numba-optimized aggregator for the given bar type (time/volume/dollar).
"""
trades_df = trades_df.loc[symbol]
trades_df= trades_df.reset_index()
ticks = trades_df[['timestamp', 'price', 'size']].to_numpy()
# Extract the timestamps column (assuming it's the first column)
timestamps = ticks[:, 0]
# Convert the timestamps to Unix timestamps in seconds with microsecond precision
unix_timestamps_s = np.array([ts.timestamp() for ts in timestamps], dtype='float64')
# Replace the original timestamps in the NumPy array with the converted Unix timestamps
ticks[:, 0] = unix_timestamps_s
ticks = ticks.astype(np.float64)
#based on type, specific aggregator function is called
match type:
case BarType.TIME:
ohlcv_bars = generate_time_bars_nb(ticks, resolution)
case BarType.VOLUME:
ohlcv_bars = generate_volume_bars_nb(ticks, resolution)
case BarType.DOLLAR:
ohlcv_bars = generate_dollar_bars_nb(ticks, resolution)
case _:
raise ValueError("Invalid bar type. Supported types are 'time', 'volume' and 'dollar'.")
# Convert the resulting array back to a DataFrame
columns = ['time', 'open', 'high', 'low', 'close', 'volume', 'trades']
if type == BarType.DOLLAR:
columns.append('amount')
columns.append('updated')
if type == BarType.TIME:
columns.append('vwap')
columns.append('buyvolume')
columns.append('sellvolume')
if type == BarType.VOLUME:
columns.append('buyvolume')
columns.append('sellvolume')
ohlcv_df = pd.DataFrame(ohlcv_bars, columns=columns)
ohlcv_df['time'] = pd.to_datetime(ohlcv_df['time'], unit='s').dt.tz_localize('UTC').dt.tz_convert(zoneNY)
#print(ohlcv_df['updated'])
ohlcv_df['updated'] = pd.to_datetime(ohlcv_df['updated'], unit="s").dt.tz_localize('UTC').dt.tz_convert(zoneNY)
# Round to microseconds to maintain six decimal places
ohlcv_df['updated'] = ohlcv_df['updated'].dt.round('us')
ohlcv_df.set_index('time', inplace=True)
#ohlcv_df.index = ohlcv_df.index.tz_localize('UTC').tz_convert(zoneNY)
return ohlcv_df
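# --- Hedged usage sketch (added for illustration; not part of the original module) ---
# `resolution` is interpreted per bar type: seconds for TIME, shares for VOLUME and
# dollars for DOLLAR bars. `trades_df` is the MultiIndex (symbol, timestamp) frame that
# dict_to_df()/fetch_trades_parallel() below return; the symbol and sizes are made up.
def _example_bars(trades_df):
    time_bars = aggregate_trades("BAC", trades_df, resolution=60, type=BarType.TIME)             # 60-second bars
    volume_bars = aggregate_trades("BAC", trades_df, resolution=50_000, type=BarType.VOLUME)     # 50k-share bars
    dollar_bars = aggregate_trades("BAC", trades_df, resolution=1_000_000, type=BarType.DOLLAR)  # $1M bars
    return time_bars, volume_bars, dollar_bars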
# Function to ensure fractional seconds are present
def ensure_fractional_seconds(timestamp):
if '.' not in timestamp:
# Inserting .000000 before the timezone indicator 'Z'
return timestamp.replace('Z', '.000000Z')
else:
return timestamp
def convert_dict_to_multiindex_df(tradesResponse):
"""
Converts a dictionary from cache or from remote (raw input) to a MultiIndex DataFrame
with microsecond precision (truncated from the nanoseconds in the raw data).
"""
# Create a DataFrame for each key and add the key as part of the MultiIndex
dfs = []
for key, values in tradesResponse.items():
df = pd.DataFrame(values)
# Rename columns
# Select and order columns explicitly
#print(df)
df = df[['t', 'x', 'p', 's', 'i', 'c','z']]
df.rename(columns={'t': 'timestamp', 'c': 'conditions', 'p': 'price', 's': 'size', 'x': 'exchange', 'z':'tape', 'i':'id'}, inplace=True)
df['symbol'] = key # Add ticker as a column
# Apply the function to ensure all timestamps have fractional seconds
#consider whether to keep this, or apply it only on a specific error from to_datetime
#possibly add a more efficient approach later, i.e. replacing NaT - https://chatgpt.com/c/d2be6f87-b38f-4050-a1c6-541d100b1474
df['timestamp'] = df['timestamp'].apply(ensure_fractional_seconds)
df['timestamp'] = pd.to_datetime(df['timestamp'], errors='coerce') # Convert 't' from string to datetime before setting it as an index
#Adjust to microsecond precision
df.loc[df['timestamp'].notna(), 'timestamp'] = df['timestamp'].dt.floor('us')
df.set_index(['symbol', 'timestamp'], inplace=True) # Set the multi-level index using both 'ticker' and 't'
df = df.tz_convert(zoneNY, level='timestamp')
dfs.append(df)
# Concatenate all DataFrames into a single DataFrame with MultiIndex
final_df = pd.concat(dfs)
return final_df
def dict_to_df(tradesResponse, start, end, exclude_conditions = None, minsize = None):
"""
Transforms the dict to a TradeSet, then to a DataFrame, and makes it timezone aware.
Also filters to start and end if necessary (e.g. only 9:30 to 15:40 is required).
NOTE: we assume tradesResponse is a dict of raw data (cached/remote).
"""
df = convert_dict_to_multiindex_df(tradesResponse)
#REQUIRED FILTERING
#if the start is later or the end is earlier, we trim
if (start.time() > time(9, 30) or end.time() < time(16, 0)):
print(f"filtrujeme {start.time()} {end.time()}")
# Define the time range
# start_time = pd.Timestamp(start.time(), tz=zoneNY).time()
# end_time = pd.Timestamp(end.time(), tz=zoneNY).time()
# Create a mask to filter rows within the specified time range
mask = (df.index.get_level_values('timestamp') >= start) & \
(df.index.get_level_values('timestamp') <= end)
# Apply the mask to the DataFrame
df = df[mask]
if exclude_conditions is not None:
print(f"excluding conditions {exclude_conditions}")
# Create a mask to exclude rows with any of the specified conditions
mask = df['conditions'].apply(lambda x: any(cond in exclude_conditions for cond in x))
# Filter out the rows with specified conditions
df = df[~mask]
if minsize is not None:
print(f"minsize {minsize}")
#exclude conditions
df = df[df['size'] >= minsize]
return df
def fetch_daily_stock_trades(symbol, start, end, exclude_conditions=None, minsize=None, force_remote=False, max_retries=5, backoff_factor=1):
#doc for this function
"""
Attempts to fetch stock trades either from the cache or remotely. Remote fetches use a retry mechanism with exponential backoff.
It also stores the data to the cache if it is not already there.
Using force_remote forces remote data to be used always, thus refreshing the cache for these dates.
Attributes:
:param symbol: The stock symbol to fetch trades for.
:param start: The start time for the trade data.
:param end: The end time for the trade data.
:exclude_conditions: list of string conditions to exclude from the data
:minsize minimum size of trade to be included in the data
:force_remote will always use remote data and refresh cache
:param max_retries: Maximum number of retries.
:param backoff_factor: Factor to determine the next sleep time.
:return: TradesResponse object.
:raises: ConnectionError if all retries fail.
We use the trade cache only for main session requests (9:30 to 16:00).
In the future, store the whole day as BAC-20240203.cache.gz and filter either the main session or extended hours from it.
For now, only the main session is stored as BAC-timestampopen-timestampclose.cache.gz.
"""
is_same_day = start.date() == end.date()
# Determine if the requested times fall within the main session
in_main_session = (time(9, 30) <= start.time() < time(16, 0)) and (time(9, 30) <= end.time() <= time(16, 0))
file_path = ''
if in_main_session:
filename_start = zoneNY.localize(datetime.combine(start.date(), time(9, 30)))
filename_end = zoneNY.localize(datetime.combine(end.date(), time(16, 0)))
daily_file = f"{symbol}-{int(filename_start.timestamp())}-{int(filename_end.timestamp())}.cache.gz"
file_path = f"{DATA_DIR}/tradecache/{daily_file}"
if not force_remote and os.path.exists(file_path):
print(f"Searching {str(start.date())} cache: " + daily_file)
with gzip.open(file_path, 'rb') as fp:
tradesResponse = pickle.load(fp)
print("FOUND in CACHE", daily_file)
return dict_to_df(tradesResponse, start, end, exclude_conditions, minsize)
print("NOT FOUND. Fetching from remote")
client = StockHistoricalDataClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=True)
stockTradeRequest = StockTradesRequest(symbol_or_symbols=symbol, start=start, end=end)
last_exception = None
for attempt in range(max_retries):
try:
tradesResponse = client.get_stock_trades(stockTradeRequest)
is_empty = not tradesResponse[symbol]
print(f"Remote fetched: {is_empty=}", start, end)
if in_main_session and not is_empty:
current_time = datetime.now().astimezone(zoneNY)
if not (start < current_time < end):
with gzip.open(file_path, 'wb') as fp:
pickle.dump(tradesResponse, fp)
print("Saving to Trade CACHE", file_path)
else: # Don't save the cache if the market is still open
print("Not saving trade cache, market still open today")
return pd.DataFrame() if is_empty else dict_to_df(tradesResponse, start, end, exclude_conditions, minsize)
except Exception as e:
print(f"Attempt {attempt + 1} failed: {e}")
last_exception = e
time_module.sleep(backoff_factor * (2 ** attempt) + random.uniform(0, 1)) # Adding random jitter
print("All attempts to fetch data failed.")
raise ConnectionError(f"Failed to fetch stock trades after {max_retries} retries. Last exception: {str(last_exception)} and {format_exc()}")
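# --- Worked example of the retry schedule above (illustrative values, added note) -----
# With the defaults max_retries=5 and backoff_factor=1, a persistently failing request
# sleeps roughly 1, 2, 4, 8 and 16 seconds (backoff_factor * 2**attempt plus up to 1 s of
# jitter) after each failed attempt, then raises ConnectionError after the fifth failure.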
def fetch_trades_parallel(symbol, start_date, end_date, exclude_conditions = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES'), minsize = 100, force_remote = False, max_workers=None):
"""
Fetches trades for each day between start_date and end_date during market hours (9:30-16:00) in parallel and concatenates them into a single DataFrame.
:param symbol: Stock symbol.
:param start_date: Start date as datetime.
:param end_date: End date as datetime.
:return: DataFrame containing all trades from start_date to end_date.
"""
futures = []
results = []
market_open_days = fetch_calendar_data(start_date, end_date)
day_count = len(market_open_days)
print("Contains", day_count, " market days")
max_workers = min(10, max(2, day_count // 2)) if max_workers is None else max_workers # Heuristic: half the number of days to process, but at least 2 and no more than 10
with ThreadPoolExecutor(max_workers=max_workers) as executor:
#for single_date in (start_date + timedelta(days=i) for i in range((end_date - start_date).days + 1)):
for market_day in tqdm(market_open_days, desc="Processing market days"):
#start = datetime.combine(single_date, time(9, 30)) # Market opens at 9:30 AM
#end = datetime.combine(single_date, time(16, 0)) # Market closes at 4:00 PM
interval_from = zoneNY.localize(market_day.open)
interval_to = zoneNY.localize(market_day.close)
#optionally trim if a later start or an earlier end is requested
start = start_date if interval_from < start_date else interval_from
#start = max(start_date, interval_from)
end = end_date if interval_to > end_date else interval_to
#end = min(end_date, interval_to)
future = executor.submit(fetch_daily_stock_trades, symbol, start, end, exclude_conditions, minsize, force_remote)
futures.append(future)
for future in tqdm(futures, desc="Fetching data"):
try:
result = future.result()
results.append(result)
except Exception as e:
print(f"Error fetching data for a day: {e}")
# Batch concatenation to improve speed
batch_size = 10
batches = [results[i:i + batch_size] for i in range(0, len(results), batch_size)]
final_df = pd.concat([pd.concat(batch, ignore_index=False) for batch in batches], ignore_index=False)
return final_df
#original version
#return pd.concat(results, ignore_index=False)
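# --- Hedged end-to-end sketch (added for illustration; symbol and dates are made up) ---
# Fetch a few market days in parallel (served from the per-day trade cache when present),
# build 2-minute bars, and derive a simple tick-rule order-flow imbalance per bar.
def _example_fetch_and_aggregate():
    start = zoneNY.localize(datetime(2024, 2, 5, 9, 30))
    end = zoneNY.localize(datetime(2024, 2, 7, 16, 0))
    trades = fetch_trades_parallel("BAC", start, end, minsize=100)
    bars = aggregate_trades("BAC", trades, resolution=120, type=BarType.TIME)
    bars["imbalance"] = (bars["buyvolume"] - bars["sellvolume"]) / bars["volume"]
    return bars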
@jit(nopython=True)
def generate_dollar_bars_nb(ticks, amount_per_bar):
"""
Generates dollar-based bars from ticks.
Aggregation across different days is prevented, as described here https://chatgpt.com/c/17804fc1-a7bc-495d-8686-b8392f3640a2
Downside: days are split by UTC (fine for the main session; extended hours would require preprocessing a new column identifying the session).
When a trade is split across multiple bars it is counted as a trade in each of those bars.
Alternative option: the trade count could be distributed proportionally by weight (0.2 to the 1st bar, 0.8 to the 2nd bar) - not implemented yet.
https://chatgpt.com/c/ff4802d9-22a2-4b72-8ab7-97a91e7a515f
"""
ohlcv_bars = []
remaining_amount = amount_per_bar
# Initialize bar values based on the first tick to avoid uninitialized values
open_price = ticks[0, 1]
high_price = ticks[0, 1]
low_price = ticks[0, 1]
close_price = ticks[0, 1]
volume = 0
trades_count = 0
current_day = np.floor(ticks[0, 0] / 86400) # Calculate the initial day from the first tick timestamp
bar_time = ticks[0, 0] # Initialize bar time with the time of the first tick
for tick in ticks:
tick_time = tick[0]
price = tick[1]
tick_volume = tick[2]
tick_amount = price * tick_volume
tick_day = np.floor(tick_time / 86400) # Calculate the day of the current tick
# Check if the new tick is from a different day, then close the current bar
if tick_day != current_day:
if trades_count > 0:
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, amount_per_bar, tick_time])
# Reset for the new day using the current tick data
open_price = price
high_price = price
low_price = price
close_price = price
volume = 0
trades_count = 0
remaining_amount = amount_per_bar
current_day = tick_day
bar_time = tick_time
# Start new bar if needed because of the dollar value
while tick_amount > 0:
if tick_amount < remaining_amount:
# Add the entire tick to the current bar
high_price = max(high_price, price)
low_price = min(low_price, price)
close_price = price
volume += tick_volume
remaining_amount -= tick_amount
trades_count += 1
tick_amount = 0
else:
# Calculate the amount of volume that fits within the remaining dollar amount
volume_to_add = remaining_amount / price
volume += volume_to_add # Update the volume here before appending and resetting
# Append the partially filled bar to the list
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count + 1, amount_per_bar, tick_time])
# Fill the current bar and continue with a new bar
tick_volume -= volume_to_add
tick_amount -= remaining_amount
# Reset bar values for the new bar using the current tick data
open_price = price
high_price = price
low_price = price
close_price = price
volume = 0 # Reset volume for the new bar
trades_count = 0
remaining_amount = amount_per_bar
# Increment bar time if splitting a trade
if tick_volume > 0: #if part of the trade remains, set the bar time a microsecond later
bar_time = tick_time + 1e-6
else:
bar_time = tick_time #otherwise use the tick time
#bar_time = tick_time
# Add the last bar if it contains any trades
if trades_count > 0:
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, amount_per_bar, tick_time])
return np.array(ohlcv_bars)
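# --- Worked example of the splitting logic above (illustrative numbers, added note) ----
# With amount_per_bar = $1,000,000, a single 30,000-share trade at $50 ($1.5M notional)
# arriving when the open bar still needs $400,000 is split three ways: 8,000 shares close
# the current bar, 20,000 shares form a complete new $1M bar, and the remaining 2,000
# shares ($100k) open a third bar. The trade increments trades_count in every bar it
# touches, and each continuation bar opens at the tick time plus 1e-6 s.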
@jit(nopython=True)
def generate_volume_bars_nb(ticks, volume_per_bar):
"""
Generates volume-based bars from ticks.
NOTE: days are split by UTC here (trades from different days are not aggregated together),
which is fine for the main session but needs a rework for extended hours by preprocessing ticks_df and introducing a session column.
When a trade is split across multiple bars it is counted as a trade in each of those bars.
Alternative option: the trade count could be distributed proportionally by weight (0.2 to the 1st bar, 0.8 to the 2nd bar) - not implemented yet.
https://chatgpt.com/c/ff4802d9-22a2-4b72-8ab7-97a91e7a515f
"""
ohlcv_bars = []
remaining_volume = volume_per_bar
# Initialize bar values based on the first tick to avoid uninitialized values
open_price = ticks[0, 1]
high_price = ticks[0, 1]
low_price = ticks[0, 1]
close_price = ticks[0, 1]
volume = 0
trades_count = 0
current_day = np.floor(ticks[0, 0] / 86400) # Calculate the initial day from the first tick timestamp
bar_time = ticks[0, 0] # Initialize bar time with the time of the first tick
buy_volume = 0 # Volume of buy trades
sell_volume = 0 # Volume of sell trades
prev_price = ticks[0, 1] # Initialize previous price for the first tick
for tick in ticks:
tick_time = tick[0]
price = tick[1]
tick_volume = tick[2]
tick_day = np.floor(tick_time / 86400) # Calculate the day of the current tick
# Check if the new tick is from a different day, then close the current bar
if tick_day != current_day:
if trades_count > 0:
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, tick_time, buy_volume, sell_volume])
# Reset for the new day using the current tick data
open_price = price
high_price = price
low_price = price
close_price = price
volume = 0
trades_count = 0
remaining_volume = volume_per_bar
current_day = tick_day
bar_time = tick_time # Update bar time to the current tick time
buy_volume = 0
sell_volume = 0
# Reset previous tick price (calculating imbalance for each day from the start)
prev_price = price
# Start new bar if needed because of the volume
while tick_volume > 0:
if tick_volume < remaining_volume:
# Add the entire tick to the current bar
high_price = max(high_price, price)
low_price = min(low_price, price)
close_price = price
volume += tick_volume
remaining_volume -= tick_volume
trades_count += 1
# Update buy and sell volumes
if price > prev_price:
buy_volume += tick_volume
elif price < prev_price:
sell_volume += tick_volume
tick_volume = 0
else:
# Fill the current bar and continue with a new bar
volume_to_add = remaining_volume
volume += volume_to_add
tick_volume -= volume_to_add
trades_count += 1
# Update buy and sell volumes
if price > prev_price:
buy_volume += volume_to_add
elif price < prev_price:
sell_volume += volume_to_add
# Append the completed bar to the list
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, tick_time, buy_volume, sell_volume])
# Reset bar values for the new bar using the current tick data
open_price = price
high_price = price
low_price = price
close_price = price
volume = 0
trades_count = 0
remaining_volume = volume_per_bar
buy_volume = 0
sell_volume = 0
# Increment bar time if splitting a trade
if tick_volume > 0: # If there's remaining volume in the trade, set bar time slightly later
bar_time = tick_time + 1e-6
else:
bar_time = tick_time # Otherwise, set bar time to the tick time
prev_price = price
# Add the last bar if it contains any trades
if trades_count > 0:
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, tick_time, buy_volume, sell_volume])
return np.array(ohlcv_bars)
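# --- Worked example of the tick rule and splitting above (illustrative numbers) --------
# With volume_per_bar = 10,000, a 3,000-share trade arriving when only 1,000 shares are
# left in the open bar closes that bar with 1,000 of its shares and starts the next bar
# with the remaining 2,000; trades_count is incremented in both bars. The traded size is
# added to buy_volume when the price is above the previous trade's price, to sell_volume
# when below, and to neither when unchanged.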
@jit(nopython=True)
def generate_time_bars_nb(ticks, resolution):
# Initialize the start and end time
start_time = np.floor(ticks[0, 0] / resolution) * resolution
end_time = np.floor(ticks[-1, 0] / resolution) * resolution
# # Calculate number of bars
# num_bars = int((end_time - start_time) // resolution + 1)
# Using a list to append data only when trades exist
ohlcv_bars = []
# Variables to track the current bar
current_bar_index = -1
open_price = 0
high_price = -np.inf
low_price = np.inf
close_price = 0
volume = 0
trades_count = 0
vwap_cum_volume_price = 0 # Cumulative volume * price
cum_volume = 0 # Cumulative volume for VWAP
buy_volume = 0 # Volume of buy trades
sell_volume = 0 # Volume of sell trades
prev_price = ticks[0, 1] # Initialize previous price for the first tick
prev_day = np.floor(ticks[0, 0] / 86400) # Calculate the initial day from the first tick timestamp
for tick in ticks:
curr_time = tick[0] #updated time
tick_time = np.floor(tick[0] / resolution) * resolution
price = tick[1]
tick_volume = tick[2]
tick_day = np.floor(tick_time / 86400) # Calculate the day of the current tick
#if the new tick is from a new day, reset previous tick price (calculating imbalance starts over)
if tick_day != prev_day:
prev_price = price
prev_day = tick_day
# Check if the tick belongs to a new bar
if tick_time != start_time + current_bar_index * resolution:
if current_bar_index >= 0 and trades_count > 0: # Save the previous bar if trades happened
vwap = vwap_cum_volume_price / cum_volume if cum_volume > 0 else 0
ohlcv_bars.append([start_time + current_bar_index * resolution, open_price, high_price, low_price, close_price, volume, trades_count, curr_time, vwap, buy_volume, sell_volume])
# Reset bar values
current_bar_index = int((tick_time - start_time) / resolution)
open_price = price
high_price = price
low_price = price
volume = 0
trades_count = 0
vwap_cum_volume_price = 0
cum_volume = 0
buy_volume = 0
sell_volume = 0
# Update the OHLCV values for the current bar
high_price = max(high_price, price)
low_price = min(low_price, price)
close_price = price
volume += tick_volume
trades_count += 1
vwap_cum_volume_price += price * tick_volume
cum_volume += tick_volume
# Update buy and sell volumes
if price > prev_price:
buy_volume += tick_volume
elif price < prev_price:
sell_volume += tick_volume
prev_price = price
# Save the last processed bar
if trades_count > 0:
vwap = vwap_cum_volume_price / cum_volume if cum_volume > 0 else 0
ohlcv_bars.append([start_time + current_bar_index * resolution, open_price, high_price, low_price, close_price, volume, trades_count, curr_time, vwap, buy_volume, sell_volume])
return np.array(ohlcv_bars)
# Example usage
if __name__ == '__main__':
pass
#example in agg_vect.ipynb

View File

@ -1,7 +1,6 @@
from threading import Thread, current_thread
from threading import Thread
from alpaca.trading.stream import TradingStream
from v2realbot.config import Keys
from v2realbot.common.model import Account
#since Alpaca supports connecting any number of websocket instances to order updates,
#we create a dedicated websocket service for each running instance (otherwise we would need one instance per combination ACCOUNT1 - LIVE, ACCOUNT1 - PAPER, ACCOUNT2 - PAPER ..)
@ -15,16 +14,15 @@ As Alpaca supports connecting of any number of trade updates clients
a new instance of this websocket thread is created for each strategy instance.
"""""
class LiveOrderUpdatesStreamer(Thread):
def __init__(self, key: Keys, name: str, account: Account) -> None:
def __init__(self, key: Keys, name: str) -> None:
self.key = key
self.account = account
self.strategy = None
self.client = TradingStream(api_key=key.API_KEY, secret_key=key.SECRET_KEY, paper=key.PAPER)
Thread.__init__(self, name=name)
#notification dispatcher - only 1 strategy
async def distributor(self,data):
if self.strategy.symbol == data.order.symbol: await self.strategy.order_updates(data, self.account)
if self.strategy.symbol == data.order.symbol: await self.strategy.order_updates(data)
# connects callback to interface object - responses for given symbol are routed to interface callback
def connect_callback(self, st):
@ -41,6 +39,6 @@ class LiveOrderUpdatesStreamer(Thread):
print("connect strategy first")
return
self.client.subscribe_trade_updates(self.distributor)
print("*"*10, "WS Order Update Streamer started for", current_thread().name,"*"*10)
print("*"*10, "WS Order Update Streamer started for", self.strategy.name, "*"*10)
self.client.run()
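Both sides of the diff keep the same wiring: one TradingStream thread per running strategy, with trade updates for the strategy's symbol routed into its order_updates callback. A hedged sketch of starting one, using the variant without the account argument; how Keys is constructed and the strategy object itself are placeholders:

# hedged sketch - Keys construction and `strategy` are placeholders
key = Keys(API_KEY="...", SECRET_KEY="...", PAPER=True)
streamer = LiveOrderUpdatesStreamer(key=key, name=f"order-updates-{strategy.name}")
streamer.connect_callback(strategy)   # fills for strategy.symbol reach strategy.order_updates(data)
streamer.start()                      # Thread.start(); run() subscribes and blocks in client.run()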

View File

@ -10,15 +10,14 @@ from fastapi.security import APIKeyHeader
import uvicorn
from uuid import UUID
from v2realbot.utils.ilog import get_log_window
from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunnerView, RunRequest, TradeView, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, Bar, RunArchiveChange, TestList, ConfigItem, InstantIndicator, DataTablesRequest, AnalyzerInputs
from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, HTTPException, status, WebSocketException, Cookie, Query, Request
from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunnerView, RunRequest, Trade, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, Bar, RunArchiveChange, TestList, ConfigItem, InstantIndicator, DataTablesRequest, AnalyzerInputs
from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, HTTPException, status, WebSocketException, Cookie, Query
from fastapi.responses import FileResponse, StreamingResponse, JSONResponse
from fastapi.staticfiles import StaticFiles
from fastapi.security import HTTPBasic, HTTPBasicCredentials
from v2realbot.enums.enums import Env, Mode
from typing import Annotated
import os
import psutil
import uvicorn
import orjson
from queue import Queue, Empty
@ -36,7 +35,7 @@ from traceback import format_exc
#from v2realbot.reporting.optimizecutoffs import find_optimal_cutoff
import v2realbot.reporting.analyzer as ci
import shutil
from starlette.responses import JSONResponse, HTMLResponse, FileResponse, RedirectResponse
from starlette.responses import JSONResponse
import mlroom
import mlroom.utils.mlutils as ml
from typing import List
@ -75,52 +74,14 @@ def api_key_auth(api_key: str = Depends(X_API_KEY)):
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Forbidden"
)
def authenticate_user(credentials: HTTPBasicCredentials = Depends(HTTPBasic())):
correct_username = "david"
correct_password = "david"
if credentials.username == correct_username and credentials.password == correct_password:
return True
else:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Incorrect username or password",
headers={"WWW-Authenticate": "Basic"},
)
app = FastAPI()
root = os.path.dirname(os.path.abspath(__file__))
#app.mount("/static", StaticFiles(html=True, directory=os.path.join(root, 'static')), name="static")
app.mount("/static", StaticFiles(html=True, directory=os.path.join(root, 'static')), name="static")
app.mount("/media", StaticFiles(directory=str(MEDIA_DIRECTORY)), name="media")
#app.mount("/", StaticFiles(html=True, directory=os.path.join(root, 'static')), name="www")
security = HTTPBasic()
@app.get("/static/{path:path}")
async def static_files(request: Request, path: str, authenticated: bool = Depends(authenticate_user)):
root = os.path.dirname(os.path.abspath(__file__))
static_dir = os.path.join(root, 'static')
if not path or path == "/":
file_path = os.path.join(static_dir, 'index.html')
else:
file_path = os.path.join(static_dir, path)
# Check if path is a directory
if os.path.isdir(file_path):
# If it's a directory, try to serve index.html within that directory
index_path = os.path.join(file_path, 'index.html')
if os.path.exists(index_path):
return FileResponse(index_path)
else:
# Optionally, you can return a directory listing or a custom 404 page here
return HTMLResponse("Directory listing not enabled.", status_code=403)
if not os.path.exists(file_path):
raise HTTPException(status_code=404, detail="File not found")
return FileResponse(file_path)
def get_current_username(
credentials: Annotated[HTTPBasicCredentials, Depends(security)]
@ -142,9 +103,9 @@ async def get_api_key(
return session or api_key
#TODO rework from Async?
# @app.get("/static")
# async def get(username: Annotated[str, Depends(get_current_username)]):
# return FileResponse("index.html")
@app.get("/static")
async def get(username: Annotated[str, Depends(get_current_username)]):
return FileResponse("index.html")
@app.websocket("/runners/{runner_id}/ws")
async def websocket_endpoint(
@ -334,7 +295,7 @@ def stop_all_runners():
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"Error: {res}:{id}")
@app.get("/tradehistory/{symbol}", dependencies=[Depends(api_key_auth)])
def get_trade_history(symbol: str, timestamp_from: float, timestamp_to:float) -> list[TradeView]:
def get_trade_history(symbol: str, timestamp_from: float, timestamp_to:float) -> list[Trade]:
res, set = cs.get_trade_history(symbol, timestamp_from, timestamp_to)
if res == 0:
return set
@ -1026,25 +987,7 @@ def get_metadata(model_name: str):
# "last_modified": os.path.getmtime(model_path),
# # ... other metadata fields ...
# }
@app.get("/system-info")
def get_system_info():
"""Get system info, e.g. disk free space, used percentage ... """
disk_total = round(psutil.disk_usage('/').total / 1024**3, 1)
disk_used = round(psutil.disk_usage('/').used / 1024**3, 1)
disk_free = round(psutil.disk_usage('/').free / 1024**3, 1)
disk_used_percentage = round(psutil.disk_usage('/').percent, 1)
# memory_total = round(psutil.virtual_memory().total / 1024**3, 1)
# memory_perc = round(psutil.virtual_memory().percent, 1)
# cpu_time_user = round(psutil.cpu_times().user,1)
# cpu_time_system = round(psutil.cpu_times().system,1)
# cpu_time_idle = round(psutil.cpu_times().idle,1)
# network_sent = round(psutil.net_io_counters().bytes_sent / 1024**3, 6)
# network_recv = round(psutil.net_io_counters().bytes_recv / 1024**3, 6)
return {"disk_space": {"total": disk_total, "used": disk_used, "free" : disk_free, "used_percentage" : disk_used_percentage},
# "memory": {"total": memory_total, "used_percentage": memory_perc},
# "cpu_time" : {"user": cpu_time_user, "system": cpu_time_system, "idle": cpu_time_idle},
# "network": {"sent": network_sent, "received": network_recv}
}
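# --- Hedged usage sketch (added for illustration) ----------------------------------
# The disk gauge in static/js/systeminfo.js reads this endpoint; it can also be
# queried directly. Host, port and the API key value below are assumptions.
#
#   import requests
#   r = requests.get("http://localhost:8000/system-info", headers={"X-API-Key": "<api key>"})
#   r.json()["disk_space"]   # {"total": ..., "used": ..., "free": ..., "used_percentage": ...}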
# Thread function to insert data from the queue into the database
def insert_queue2db():

View File

@ -12,7 +12,7 @@ import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -12,7 +12,7 @@ import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -11,7 +11,7 @@ import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -11,7 +11,7 @@ import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -11,7 +11,7 @@ import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -11,7 +11,7 @@ import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -11,7 +11,7 @@ import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -11,7 +11,7 @@ import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -12,7 +12,7 @@ import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -10,7 +10,7 @@ from enum import Enum
import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -10,7 +10,7 @@ from enum import Enum
import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -10,7 +10,7 @@ from enum import Enum
import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -11,7 +11,7 @@ import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY
@ -23,7 +23,7 @@ from collections import defaultdict
from scipy.stats import zscore
from io import BytesIO
from typing import Tuple, Optional, List
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
def load_trades(runner_ids: List = None, batch_id: str = None) -> Tuple[int, List[Trade], int]:
if runner_ids is None and batch_id is None:

View File

@ -10,7 +10,7 @@ from enum import Enum
import numpy as np
import v2realbot.controller.services as cs
from rich import print
from v2realbot.common.model import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY

View File

@ -2,9 +2,9 @@ from uuid import UUID
from typing import Any, List, Tuple
from uuid import UUID, uuid4
from v2realbot.enums.enums import Moddus, SchedulerStatus, RecordType, StartBarAlign, Mode, Account, OrderSide
from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunDay, StrategyInstance, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest, Market
from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunDay, StrategyInstance, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest
from v2realbot.utils.utils import validate_and_format_time, AttributeDict, zoneNY, zonePRG, safe_get, dict_replace_value, Store, parse_toml_string, json_serial, is_open_hours, send_to_telegram, concatenate_weekdays, transform_data
from v2realbot.common.model import Trade, TradeDirection, TradeStatus, TradeStoplossType
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus, TradeStoplossType
from datetime import datetime
from v2realbot.config import JOB_LOG_FILE, STRATVARS_UNCHANGEABLES, ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, DATA_DIR, MEDIA_DIRECTORY, RUNNER_DETAIL_DIRECTORY
import numpy as np
@ -116,8 +116,7 @@ def initialize_jobs(run_manager_records: RunManagerRecord = None):
scheduler.add_job(start_runman_record, start_trigger, id=f"scheduler_start_{record.id}", args=[record.id])
scheduler.add_job(stop_runman_record, stop_trigger, id=f"scheduler_stop_{record.id}", args=[record.id])
#scheduler.add_job(print_hello, 'interval', seconds=10, id=
# f"scheduler_testinterval")
#scheduler.add_job(print_hello, 'interval', seconds=10, id=f"scheduler_testinterval")
scheduled_jobs = scheduler.get_jobs()
print(f"APS jobs refreshed ({len(scheduled_jobs)})")
current_jobs_dict = format_apscheduler_jobs(scheduled_jobs)
@ -125,9 +124,9 @@ def initialize_jobs(run_manager_records: RunManagerRecord = None):
return 0, current_jobs_dict
#umbrella function that handles error handling and printing
def start_runman_record(id: UUID, debug_date = None):
def start_runman_record(id: UUID, market = "US", debug_date = None):
record = None
res, record, msg = _start_runman_record(id=id, debug_date=debug_date)
res, record, msg = _start_runman_record(id=id, market=market, debug_date=debug_date)
if record is not None:
market_time_now = datetime.now().astimezone(zoneNY) if debug_date is None else debug_date
@ -166,8 +165,8 @@ def update_runman_record(record: RunManagerRecord):
err_msg = f"STOP: Error updating {record.id} error {set} with values {record}"
return -2, err_msg #this stops processing of further records on error; consider continue instead
def stop_runman_record(id: UUID, debug_date = None):
res, record, msg = _stop_runman_record(id=id, debug_date=debug_date)
def stop_runman_record(id: UUID, market = "US", debug_date = None):
res, record, msg = _stop_runman_record(id=id, market=market, debug_date=debug_date)
#results : 0 - ok, -1 not running/already running/not specific, -2 error
#we always write the report to history if record is not None; any error occurred after the record was fetched
@ -197,7 +196,7 @@ def stop_runman_record(id: UUID, debug_date = None):
print(f"STOP JOB: {id} FINISHED")
#start function that is called from the job
def _start_runman_record(id: UUID, debug_date = None):
def _start_runman_record(id: UUID, market = "US", debug_date = None):
print(f"Start scheduled record {id}")
record : RunManagerRecord = None
@ -208,16 +207,15 @@ def _start_runman_record(id: UUID, debug_date = None):
record = result
if record.market == Market.US or record.market == Market.CRYPTO:
res, sada = sch.get_todays_market_times(market=record.market, debug_date=debug_date)
if market is not None and market == "US":
res, sada = sch.get_todays_market_times(market=market, debug_date=debug_date)
if res == 0:
market_time_now, market_open_datetime, market_close_datetime = sada
print(f"OPEN:{market_open_datetime} CLOSE:{market_close_datetime}")
else:
sada = f"Market {record.market} Error getting market times (CLOSED): " + str(sada)
sada = f"Market {market} Error getting market times (CLOSED): " + str(sada)
return res, record, sada
else:
print("Market type is unknown.")
if cs.is_stratin_running(record.strat_id):
return -1, record, f"Stratin {record.strat_id} is already running"
@ -231,7 +229,7 @@ def _start_runman_record(id: UUID, debug_date = None):
return 0, record, record.runner_id
#stop function that is called from the job
def _stop_runman_record(id: UUID, debug_date = None):
def _stop_runman_record(id: UUID, market = "US", debug_date = None):
record = None
#get all records
print(f"Stopping record {id}")
@ -306,5 +304,5 @@ if __name__ == "__main__":
# print(f"CALL FINISHED, with {debug_date} RESULT: {res}, {result}")
res, result = stop_runman_record(id=id, debug_date = debug_date)
res, result = stop_runman_record(id=id, market = "US", debug_date = debug_date)
print(f"CALL FINISHED, with {debug_date} RESULT: {res}, {result}")
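For context, initialize_jobs() above registers one start job and one stop job per RunManagerRecord; the triggers themselves are built outside the hunks shown here. A hedged sketch of what that wiring might look like for a US main-session record (the CronTrigger fields are assumptions, not the repository's actual trigger construction):

from apscheduler.triggers.cron import CronTrigger
start_trigger = CronTrigger(day_of_week="mon-fri", hour=9, minute=30, timezone=zoneNY)
stop_trigger = CronTrigger(day_of_week="mon-fri", hour=16, minute=0, timezone=zoneNY)
scheduler.add_job(start_runman_record, start_trigger, id=f"scheduler_start_{record.id}", args=[record.id])
scheduler.add_job(stop_runman_record, stop_trigger, id=f"scheduler_stop_{record.id}", args=[record.id])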

View File

@ -2,10 +2,10 @@ import json
import datetime
import v2realbot.controller.services as cs
import v2realbot.controller.run_manager as rm
from v2realbot.common.model import RunnerView, RunManagerRecord, StrategyInstance, Runner, RunRequest, Trade, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, Bar, RunArchiveChange, TestList, ConfigItem, InstantIndicator, DataTablesRequest, AnalyzerInputs, Market
from v2realbot.common.model import RunnerView, RunManagerRecord, StrategyInstance, Runner, RunRequest, Trade, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, Bar, RunArchiveChange, TestList, ConfigItem, InstantIndicator, DataTablesRequest, AnalyzerInputs
from uuid import uuid4, UUID
from v2realbot.utils.utils import json_serial, send_to_telegram, zoneNY, zonePRG, zoneUTC, fetch_calendar_data
from datetime import datetime, timedelta, time
from v2realbot.utils.utils import json_serial, send_to_telegram, zoneNY, zonePRG, fetch_calendar_data
from datetime import datetime, timedelta
from traceback import format_exc
from rich import print
import requests
@ -18,18 +18,9 @@ from v2realbot.config import WEB_API_KEY
#scheduled as a standalone job, triggered only once at the given time for start and stop
#new code in aps_scheduler.py
def is_US_market_day(date):
cal_dates = fetch_calendar_data(date, date)
if len(cal_dates) == 0:
print("Today is not a market day.")
return False, cal_dates
else:
print("Market is open")
return True, cal_dates
def get_todays_market_times(market, debug_date = None):
def get_todays_market_times(market = "US", debug_date = None):
try:
if market == Market.US:
if market == "US":
#check all the conditions - maybe loop - the conditions are on the left
if debug_date is not None:
nowNY = debug_date
@ -37,20 +28,17 @@ def get_todays_market_times(market, debug_date = None):
nowNY = datetime.now().astimezone(zoneNY)
nowNY_date = nowNY.date()
#is the market open - currently US only
stat, calendar_dates = is_US_market_day(nowNY_date)
if stat:
cal_dates = fetch_calendar_data(nowNY_date, nowNY_date)
if len(cal_dates) == 0:
print("No Market Day today")
return -1, "Market Closed"
#only the main session is supported for now
#main session only
market_open_datetime = zoneNY.localize(calendar_dates[0].open)
market_close_datetime = zoneNY.localize(calendar_dates[0].close)
return 0, (nowNY, market_open_datetime, market_close_datetime)
else:
return -1, "Market is closed."
elif market == Market.CRYPTO:
now_market_datetime = datetime.now().astimezone(zoneUTC)
market_open_datetime = datetime.combine(datetime.now(), time.min)
market_close_datetime = datetime.combine(datetime.now(), time.max)
return 0, (now_market_datetime, market_open_datetime, market_close_datetime)
market_open_datetime = zoneNY.localize(cal_dates[0].open)
market_close_datetime = zoneNY.localize(cal_dates[0].close)
return 0, (nowNY, market_open_datetime, market_close_datetime)
else:
return -1, "Market not supported"
except Exception as e:
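The helper above returns the current New York time together with today's open and close when the (US) market is open, and a (-1, message) pair otherwise. A hedged sketch of exercising it with a debug date, matching the string-based market parameter shown on one side of the diff; the date itself is made up:

debug_date = zoneNY.localize(datetime(2024, 2, 5, 10, 0))
res, payload = get_todays_market_times(market="US", debug_date=debug_date)
if res == 0:
    now_ny, market_open, market_close = payload
else:
    print(payload)  # e.g. "Market Closed" or "Market not supported"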

View File

@ -131,29 +131,9 @@
<!-- <script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.41.0/min/vs/editor/editor.main.js"></script> -->
<!-- <script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.41.0/min/vs/loader.min.js"></script> -->
<script src="/static/js/systeminfo.js"> </script>
</head>
<body>
<div id="main" class="mainConteiner flex-container content">
<div id="system-info" class="flex-items">
<label data-bs-toggle="collapse" data-bs-target="#system-info-inner" aria-expanded="true">
<h4>System Info </h4>
</label>
<div id="system-info-inner" class="collapse">
<div id="system-info-output"></div>
<div id="graphical-output">
<div id="disk-gauge-container">
<span id="title"> Disk Space: </span>
<span id="free-space">Free: -- GB</span> |
<span id="total-space">Total: -- GB</span> |
<span id="used-percent">Used: -- %</span>
<div id="disk-gauge">
<div id="disk-gauge-bar"></div>
</div>
</div>
</div>
</div>
</div>
<div id="chartContainer" class="flex-items">
<label data-bs-toggle="collapse" data-bs-target="#chartContainerInner" aria-expanded="true">
<h4>Chart</h4>
@ -235,7 +215,7 @@
</div>
</div>
<div id="hist-trades" class="flex-items">
<div id="form-trades">
<div id="form-trades">
<label data-bs-toggle="collapse" data-bs-target="#trades-data">
<h4>Trade History</h4>
</label>
@ -250,7 +230,6 @@
<!-- <table id="trades-data-table" class="dataTable no-footer" style="width: 300px;display: contents;"></table> -->
</div>
</div>
<div id="runner-table" class="flex-items">
<label data-bs-toggle="collapse" data-bs-target="#runner-table-inner">
<h4>Running Strategies</h4>
@ -368,7 +347,6 @@
<th>testlist_id</th>
<th>Running</th>
<th>RunnerId</th>
<th>Market</th>
</tr>
</thead>
<tbody></tbody>
@ -689,14 +667,14 @@
</div>
<div class="form-group mt-3">
<label for="logHere" class="form-label">Log</label>
<div id="log-container"style="height:700px;border:1px solid black;">
<!-- <pre id="log-content"></pre> -->
<div id="log-container">
<pre id="log-content"></pre>
</div>
</div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-primary" id="logRefreshButton" value="Refresh">Refresh</button>
<button type="button" class="btn btn-secondary" id="closeLogModal" data-bs-dismiss="modal">Close</button>
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
</div>
</div>
</div>
@ -1171,7 +1149,7 @@
<script src="/static/js/config.js?v=1.04"></script>
<!-- tady zacina polska docasna lokalizace -->
<!-- <script type="text/javascript" src="https://unpkg.com/lightweight-charts/dist/lightweight-charts.standalone.production.js"></script> -->
<script type="text/javascript" src="/static/js/libs/lightweightcharts/lightweight-charts.standalone.production413.js"></script>
<script type="text/javascript" src="/static/js/libs/lightweightcharts/lightweight-charts.standalone.production410.js"></script>
<script src="/static/js/dynamicbuttons.js?v=1.05"></script>
@ -1188,9 +1166,9 @@
<!-- <script src="/static/js/archivetables.js?v=1.05"></script> -->
<!-- archiveTables split into separate files -->
<script src="/static/js/tables/archivetable/init.js?v=1.12"></script>
<script src="/static/js/tables/archivetable/functions.js?v=1.11"></script>
<script src="/static/js/tables/archivetable/functions.js?v=1.10"></script>
<script src="/static/js/tables/archivetable/modals.js?v=1.07"></script>
<script src="/static/js/tables/archivetable/handlers.js?v=1.11"></script>
<script src="/static/js/tables/archivetable/handlers.js?v=1.09"></script>
<!-- Runmanager functionality -->
<script src="/static/js/tables/runmanager/init.js?v=1.1"></script>
@ -1200,7 +1178,7 @@
<script src="/static/js/livewebsocket.js?v=1.02"></script>
<script src="/static/js/realtimechart.js?v=1.02"></script>
<script src="/static/js/mytables.js?v=1.03"></script>
<script src="/static/js/mytables.js?v=1.02"></script>
<script src="/static/js/testlist.js?v=1.01"></script>
<script src="/static/js/ml.js?v=1.02"></script>
<script src="/static/js/common.js?v=1.01"></script>

View File

@ -11,44 +11,6 @@ var markersLine = null
var avgBuyLine = null
var profitLine = null
var slLine = []
//create function which for each ACCOUNT1, ACCOUNT2 or ACCOUNT3 returns color for buy and color for sell - which can be strings representing color
//HELPERS FUNCTION - will go to utils
/**
* Returns an object containing the colors for buy and sell for the specified account.
*
* Parameters:
* account (string): The account for which to retrieve the colors (ACCOUNT1, ACCOUNT2, or ACCOUNT3).
*
* Returns:
* object: An object with 'buy' and 'sell' properties containing the corresponding color strings.
*
* Account 1:
#FF6B6B, #FF9999
Account 2:
#4ECDC4, #83E8E1
Account 3:
#FFD93D, #FFE787
Account 4:
#6C5CE7, #A29BFE
Another option for colors:
#1F77B4 (Entry) and #AEC7E8 (Exit)
#FF7F0E (Entry) and #FFBB78 (Exit)
#2CA02C (Entry) and #98DF8A (Exit)
#D62728 (Entry) and #FF9896 (Exit)
*/
function getAccountColors(account) {
const accountColors = {
ACCOUNT1: { accid: 'A1', buy: '#FF7F0E', sell: '#FFBB78' },
ACCOUNT2: { accid: 'A2',buy: '#1F77B4', sell: '#AEC7E8' },
ACCOUNT3: { accid: 'A3',buy: '#2CA02C', sell: '#98DF8A' },
ACCOUNT4: { accid: 'A4',buy: '#D62728', sell: '#FF9896' },
ACCOUNT5: { accid: 'A5',buy: 'purple', sell: 'orange' }
};
return accountColors[account] || { buy: '#37cade', sell: 'red' };
}
//TRANSFORM object returned from REST API get_arch_run_detail
//to series and markers required by lightweigth chart
//input array object bars = { high: [1,2,3], time: [1,2,3], close: [2,2,2]...}
@ -72,11 +34,6 @@ function transform_data(data) {
//time of the first record; sometimes they are the same - need to add a hundredth of a second
prev_cas = 0
if ((data.ext_data !== null) && (data.ext_data.sl_history)) {
///sort sl_history according to order id string - i need all same order id together
data.ext_data.sl_history.sort(function (a, b) {
return a.id.localeCompare(b.id);
});
data.ext_data.sl_history.forEach((histRecord, index, array) => {
//console.log("plnime")
@ -91,7 +48,6 @@ function transform_data(data) {
//init a new set
sl_line_sada = []
sl_line_markers_sada = []
sline_color = "#f5aa42"
}
prev_id = histRecord.id
@ -109,21 +65,12 @@ function transform_data(data) {
sline = {}
sline["time"] = cas
sline["value"] = histRecord.sl_val
if (histRecord.account) {
const accColors = getAccountColors(histRecord.account)
sline_color = histRecord.direction == "long" ? accColors.buy : accColors.sell //ideally
sline["color"] = sline_color
}
sl_line_sada.push(sline)
//I STOPPED HERE
//THE COLOR IS SET IN SERIES OPTIONS LATER - figure something out
sline_markers = {}
sline_markers["time"] = cas
sline_markers["position"] = "inBar"
sline_markers["color"] = sline_color
sline_markers["color"] = "#f5aa42"
//sline_markers["shape"] = "circle"
//console.log("SHOW_SL_DIGITS",SHOW_SL_DIGITS)
sline_markers["text"] = SHOW_SL_DIGITS ? histRecord.sl_val.toFixed(3) : ""
@ -292,33 +239,31 @@ function transform_data(data) {
// //a_markers["text"] = CHART_SHOW_TEXT ? trade.position_qty + "/" + parseFloat(trade.pos_avg_price).toFixed(3) :trade.position_qty
// avgp_markers.push(a_markers)
}
}
const { accid: accountId,buy: buyColor, sell: sellColor } = getAccountColors(trade.account);
}
//buy/sell markers
marker = {}
marker["time"] = timestamp;
// marker["position"] = (trade.order.side == "buy") ? "belowBar" : "aboveBar"
marker["position"] = (trade.order.side == "buy") ? "aboveBar" : "aboveBar"
marker["color"] = (trade.order.side == "buy") ? buyColor : sellColor
marker["color"] = (trade.order.side == "buy") ? "#37cade" : "red"
//marker["shape"] = (trade.order.side == "buy") ? "arrowUp" : "arrowDown"
marker["shape"] = (trade.order.side == "buy") ? "arrowUp" : "arrowDown"
//marker["text"] = trade.qty + "/" + trade.price
qt_optimized = (trade.order.qty % 1000 === 0) ? (trade.order.qty / 1000).toFixed(1) + 'K' : trade.order.qty
marker["text"] = accountId + " " //account shortcut
if (CHART_SHOW_TEXT) {
//including qty
//marker["text"] = qt_optimized + "@" + trade.price
//without qty
marker["text"] += trade.price
marker["text"] = trade.price
closed_trade_marker_and_profit = (trade.profit) ? "c" + trade.profit.toFixed(1) + "/" + trade.profit_sum.toFixed(1) : "c"
marker["text"] += (trade.position_qty == 0) ? closed_trade_marker_and_profit : ""
} else {
closed_trade_marker_and_profit = (trade.profit) ? "c" + trade.profit.toFixed(1) + "/" + trade.profit_sum.toFixed(1) : "c"
marker["text"] += (trade.position_qty == 0) ? closed_trade_marker_and_profit : trade.price.toFixed(3)
marker["text"] = (trade.position_qty == 0) ? closed_trade_marker_and_profit : trade.price.toFixed(3)
}
markers.push(marker)
@ -899,7 +844,7 @@ function display_buy_markers(data) {
//console.log("uvnitr")
slLine_temp = chart.addLineSeries({
// title: "avgpbuyline",
color: slRecord[0]["color"] ? slRecord[0]["color"] : '#e4c76d',
color: '#e4c76d',
// color: 'transparent',
lineWidth: 1,
lastValueVisible: false

View File

@ -90,55 +90,9 @@ $(document).ready(function () {
monaco.languages.register({ id: 'python' });
monaco.languages.register({ id: 'json' });
//Register mylogs language
monaco.languages.register({ id: 'mylogs' });
// Register the TOML language
monaco.languages.setLanguageConfiguration('mylogs', {
comments: {
lineComment: '//', // Adjust if your logs use a different comment symbol
},
brackets: [['[', ']'], ['{', '}']], // Array and object brackets
autoClosingPairs: [
{ open: '{', close: '}', notIn: ['string'] },
{ open: '"', close: '"', notIn: ['string', 'comment'] },
{ open: "'", close: "'", notIn: ['string', 'comment'] },
],
});
monaco.languages.setMonarchTokensProvider('mylogs', {
tokenizer: {
root: [
[/#.*/, 'comment'], // Comments (if applicable)
// Timestamps
[/\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+/, 'timestamp'],
// Log Levels
[/\b(INFO|DEBUG|WARNING|ERROR|CRITICAL)\b/, 'log-level'],
// Strings
[/".*"/, 'string'],
[/'.*'/, 'string'],
// Key-Value Pairs
[/[A-Za-z_]+\s*:/, 'key'],
[/-?\d+\.\d+/, 'number.float'], // Floating-point
[/-?\d+/, 'number.integer'], // Integers
[/\btrue\b/, 'boolean.true'],
[/\bfalse\b/, 'boolean.false'],
// Other Words and Symbols
[/[A-Za-z_]+/, 'identifier'],
[/[ \t\r\n]+/, 'white'],
[/[\[\]{}(),]/, 'delimiter'], // Expand if more delimiters exist
]
}
});
monaco.languages.register({ id: 'toml' });
// Define the TOML language configuration
monaco.languages.setLanguageConfiguration('toml', {
comments: {

View File

@ -1,31 +0,0 @@
function get_system_info() {
console.log('Button get system status clicked')
$.ajax({
url: '/system-info',
type: 'GET',
beforeSend: function (xhr) {
xhr.setRequestHeader('X-API-Key',
API_KEY); },
success: function(response) {
$.each(response, function(index, item) {
if (index=="disk_space") {
$('#disk-gauge-bar').css('width', response.disk_space.used_percentage + '%');
$('#free-space').text('Free: ' + response.disk_space.free + ' GB');
$('#total-space').text('Total: ' + response.disk_space.total + ' GB');
$('#used-percent').text('Used: ' + response.disk_space.used_percentage + '%');
} else {
var formatted_item = JSON.stringify(item, null, 4)
$('#system-info-output').append('<p>' + index + ': ' + formatted_item + '</p>');
}
});
},
error: function(xhr, status, error) {
$('#disk-gauge-bar').html('An error occurred: ' + error + xhr.responseText + status);
}
});
}
$(document).ready(function(){
get_system_info()
});

View File

@ -6,7 +6,6 @@ let editor_diff_arch1
let editor_diff_arch2
var archData = null
var batchHeaders = []
var editorLog = null
function refresh_arch_and_callback(row, callback) {
//console.log("entering refresh")
@ -473,34 +472,13 @@ function refresh_logfile() {
contentType: "application/json",
dataType: "json",
success:function(response){
if (editorLog) {
editorLog.dispose();
}
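// any previous Monaco instance is disposed first, so each refresh rebuilds the log editor from scratch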
if (response.lines.length == 0) {
value = "no records";
// $('#log-content').html("no records");
$('#log-content').html("no records");
}
else {
//console.log(response.lines)
//var escapedLines = response.lines.map(line => escapeHtml(line));
value = response.lines.join('\n')
// $('#log-content').html(escapedLines.join('\n'));
}
require(["vs/editor/editor.main"], () => {
editorLog = monaco.editor.create(document.getElementById('log-container'), {
value: value,
language: 'mylogs',
theme: 'tomlTheme-dark',
automaticLayout: true,
readOnly: true
});
});
// Focus at the end of the file:
const model = editorLog.getModel();
const lastLineNumber = model.getLineCount();
const lastLineColumn = model.getLineMaxColumn(lastLineNumber);
editorLog.setPosition({ lineNumber: lastLineNumber, column: lastLineColumn });
editorLog.revealPosition({ lineNumber: lastLineNumber, column: lastLineColumn });
var escapedLines = response.lines.map(line => escapeHtml(line));
$('#log-content').html(escapedLines.join('\n'));
}
},
error: function(xhr, status, error) {
var err = eval("(" + xhr.responseText + ")");

View File

@ -265,8 +265,8 @@ $(document).ready(function () {
$('#diff_first').text(record1.name);
$('#diff_second').text(record2.name);
$('#diff_first_id').text(data1.id + ' Batch: ' + data1.batch_id);
$('#diff_second_id').text(data2.id + ' Batch: ' + data2.batch_id);
$('#diff_first_id').text(data1.id);
$('#diff_second_id').text(data2.id);
//monaco
require(["vs/editor/editor.main"], () => {
@ -358,13 +358,8 @@ $(document).ready(function () {
})
});
$('#closeLogModal').click(function () {
editorLog.dispose()
});
//button to query log
$('#logRefreshButton').click(function () {
editorLog.dispose()
refresh_logfile()
});

View File

@ -172,7 +172,7 @@ function initialize_archiveRecords() {
{
targets: [13,14,15],
render: function ( data, type, row ) {
return '<div class="tdsmall">'+JSON.stringify(data, null, 2)+'</div>'
return '<div class="tdsmall">'+data+'</div>'
},
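// columns 13-15 presumably hold nested objects, hence the pretty-printed, indented JSON inside a .tdsmall cell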
},
{

View File

@ -45,8 +45,7 @@ function initialize_runmanagerRecords() {
{data: 'valid_to', visible: true},
{data: 'testlist_id', visible: true},
{data: 'strat_running', visible: true},
{data: 'runner_id', visible: true},
{data: 'market', visible: true},
{data: 'runner_id', visible: true},
],
paging: true,
processing: true,

View File

@ -371,10 +371,9 @@ function initialize_chart() {
}
chart = LightweightCharts.createChart(document.getElementById('chart'), chartOptions);
chart.applyOptions({ timeScale: { visible: true, timeVisible: true, secondsVisible: true, minBarSpacing: 0.003}, crosshair: {
chart.applyOptions({ timeScale: { visible: true, timeVisible: true, secondsVisible: true }, crosshair: {
mode: LightweightCharts.CrosshairMode.Normal, labelVisible: true
}})
console.log("chart intiialized")
}
//mozna atributy last value visible

View File

@ -994,24 +994,3 @@ pre {
#datepicker:disabled {
background-color: #f2f2f2;
}
#disk-gauge-container {
text-align: center;
width: 400px;
}
#disk-gauge {
width: 100%;
height: 20px;
background-color: #ddd;
border-radius: 10px;
overflow: hidden;
}
#disk-gauge-bar {
height: 100%;
background-color: #4285F4;
width: 0%; /* Initial state */
border-radius: 10px;
}
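/* the bar width is driven from JS using the used_percentage value returned by /system-info */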

View File

@ -4,7 +4,7 @@ from v2realbot.utils.tlog import tlog, tlog_exception
from v2realbot.enums.enums import Mode, Order, Account, RecordType, Followup
#from alpaca.trading.models import TradeUpdate
from v2realbot.common.model import TradeUpdate
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from alpaca.trading.enums import TradeEvent, OrderStatus
from v2realbot.indicators.indicators import ema
import orjson
@ -35,74 +35,47 @@ class StrategyClassicSL(Strategy):
max_sum_profit_to_quit_rel = safe_get(self.state.vars, "max_sum_profit_to_quit_rel", None)
max_sum_loss_to_quit_rel = safe_get(self.state.vars, "max_sum_loss_to_quit_rel", None)
#load typ direktivy hard/soft cutoff
hard_cutoff = safe_get(self.state.vars, "hard_cutoff", False)
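#hard_cutoff=True stops the whole strategy (hard_stop) once a limit below is hit; otherwise only trading is suspended (soft_stop) while data processing continues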
rel_profit = round(float(np.sum(self.state.rel_profit_cum)),5)
if max_sum_profit_to_quit_rel is not None:
if rel_profit >= float(max_sum_profit_to_quit_rel):
msg = f"QUITTING {hard_cutoff=} MAX SUM REL PROFIT REACHED {max_sum_profit_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}"
printanyway(msg)
self.state.ilog(e=msg)
for account in self.accounts:
self.state.account_variables[account.name].pending = "max_sum_profit_to_quit_rel"
self.state.ilog(e=f"QUITTING MAX SUM REL PROFIT REACHED {max_sum_profit_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.state.vars.pending = "max_sum_profit_to_quit_rel"
if self.mode not in [Mode.BT, Mode.PREP]:
send_to_telegram(msg)
if hard_cutoff:
self.hard_stop = True
else:
self.soft_stop = True
send_to_telegram(f"QUITTING MAX SUM REL PROFIT REACHED {max_sum_profit_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.signal_stop = True
return True
if max_sum_loss_to_quit_rel is not None:
if rel_profit < 0 and rel_profit <= float(max_sum_loss_to_quit_rel):
msg=f"QUITTING {hard_cutoff=} MAX SUM REL LOSS REACHED {max_sum_loss_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}"
printanyway(msg)
self.state.ilog(e=msg)
for account in self.accounts:
self.state.account_variables[account.name].pending = "max_sum_loss_to_quit_rel"
self.state.ilog(e=f"QUITTING MAX SUM REL LOSS REACHED {max_sum_loss_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.state.vars.pending = "max_sum_loss_to_quit_rel"
if self.mode not in [Mode.BT, Mode.PREP]:
send_to_telegram(msg)
if hard_cutoff:
self.hard_stop = True
else:
self.soft_stop = True
send_to_telegram(f"QUITTING MAX SUM REL LOSS REACHED {max_sum_loss_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.signal_stop = True
return True
if max_sum_profit_to_quit is not None:
if float(self.state.profit) >= float(max_sum_profit_to_quit):
msg = f"QUITTING {hard_cutoff=} MAX SUM ABS PROFIT REACHED {max_sum_profit_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}"
printanyway(msg)
self.state.ilog(e=msg)
for account in self.accounts:
self.state.account_variables[account.name].pending = "max_sum_profit_to_quit"
self.state.ilog(e=f"QUITTING MAX SUM ABS PROFIT REACHED {max_sum_profit_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.state.vars.pending = "max_sum_profit_to_quit"
if self.mode not in [Mode.BT, Mode.PREP]:
send_to_telegram(msg)
if hard_cutoff:
self.hard_stop = True
else:
self.soft_stop = True
send_to_telegram(f"QUITTING MAX SUM ABS PROFIT REACHED {max_sum_profit_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.signal_stop = True
return True
if max_sum_loss_to_quit is not None:
if float(self.state.profit) < 0 and float(self.state.profit) <= float(max_sum_loss_to_quit):
msg = f"QUITTING {hard_cutoff=} MAX SUM ABS LOSS REACHED {max_sum_loss_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}"
printanyway(msg)
self.state.ilog(e=msg)
for account in self.accounts:
self.state.account_variables[account.name].pending = "max_sum_loss_to_quit"
self.state.ilog(e=f"QUITTING MAX SUM ABS LOSS REACHED {max_sum_loss_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.state.vars.pending = "max_sum_loss_to_quit"
if self.mode not in [Mode.BT, Mode.PREP]:
send_to_telegram(msg)
if hard_cutoff:
self.hard_stop = True
else:
self.soft_stop = True
send_to_telegram(f"QUITTING MAX SUM ABS LOSS REACHED {max_sum_loss_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.signal_stop = True
return True
return False
async def add_followup(self, direction: TradeDirection, size: int, signal_name: str, account: Account):
async def add_followup(self, direction: TradeDirection, size: int, signal_name: str):
trade_to_add = Trade(
id=uuid4(),
account=account,
last_update=datetime.fromtimestamp(self.state.time).astimezone(zoneNY),
status=TradeStatus.READY,
size=size,
@ -113,48 +86,45 @@ class StrategyClassicSL(Strategy):
self.state.vars.prescribedTrades.append(trade_to_add)
self.state.account_variables[account.name].requested_followup = None
self.state.vars.requested_followup = None
self.state.ilog(e=f"FOLLOWUP {direction} - {account} added to prescr.trades {signal_name=} {size=}", trade=trade_to_add)
self.state.ilog(e=f"FOLLOWUP {direction} added to prescr.trades {signal_name=} {size=}", trade=trade_to_add)
async def orderUpdateBuy(self, data: TradeUpdate):
o: Order = data.order
signal_name = None
account = data.account
##nejak to vymyslet, aby se dal poslat cely Trade a serializoval se
self.state.ilog(e="Příchozí BUY notif"+account, msg=o.status, trade=transform_data(data, json_serial))
self.state.ilog(e="Příchozí BUY notif", msg=o.status, trade=transform_data(data, json_serial))
if data.event == TradeEvent.FILL or data.event == TradeEvent.PARTIAL_FILL:
#pokud jde o fill pred kterym je partial, muze se stat, ze uz budou vynulovany pozice, toto je pojistka
#jde o uzavření short pozice - počítáme PROFIT
if int(self.state.account_variables[account.name].positions) < 0 or (int(self.state.account_variables[account.name].positions) == 0 and self.state.account_variables[account.name].wait_for_fill is not None):
if int(self.state.positions) < 0 or (int(self.state.positions) == 0 and self.state.wait_for_fill is not None):
if data.event == TradeEvent.PARTIAL_FILL and self.state.account_variables[account.name].wait_for_fill is None:
if data.event == TradeEvent.PARTIAL_FILL and self.state.wait_for_fill is None:
#timto si oznacime, ze po partialu s vlivem na PROFIT musime cekat na FILL a zaroven ukladame prum cenu, kterou potrebujeme na vypocet profitu u fillu
self.state.account_variables[account.name].wait_for_fill = float(self.state.account_variables[account.name].avgp)
self.state.wait_for_fill = float(self.state.avgp)
#PROFIT pocitame z TradeUpdate.price a TradeUpdate.qty - aktualne provedene mnozstvi a cena
#naklady vypocteme z prumerne ceny, kterou mame v pozicich
bought_amount = data.qty * data.price
#podle prumerne vstupni ceny, kolik stalo toto mnozstvi
if float(self.state.account_variables[account.name].avgp) > 0:
vstup_cena = float(self.state.account_variables[account.name].avgp)
elif float(self.state.account_variables[account.name].avgp) == 0 and self.state.account_variables[account.name].wait_for_fill is not None:
vstup_cena = float(self.state.account_variables[account.name].wait_for_fill)
if float(self.state.avgp) > 0:
vstup_cena = float(self.state.avgp)
elif float(self.state.avgp) == 0 and self.state.wait_for_fill is not None:
vstup_cena = float(self.state.wait_for_fill)
else:
vstup_cena = 0
avg_costs = vstup_cena * float(data.qty)
if avg_costs == 0:
self.state.ilog(e="ERR: Nemame naklady na PROFIT, AVGP je nula. Zaznamenano jako 0"+account, msg="naklady=utrzena cena. TBD opravit.")
self.state.ilog(e="ERR: Nemame naklady na PROFIT, AVGP je nula. Zaznamenano jako 0", msg="naklady=utrzena cena. TBD opravit.")
avg_costs = bought_amount
trade_profit = round((avg_costs-bought_amount),2)
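#closing a short, e.g. avg entry price 100.0 and buying back 10 shares at 99.0 -> trade_profit = 10*100.0 - 10*99.0 = 10.0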
#celkovy profit
self.state.profit += trade_profit #overall abs profit
self.state.account_variables[account.name].profit += trade_profit #account profit
self.state.profit += trade_profit
rel_profit = 0
#spoctene celkovy relativni profit za trade v procentech ((trade_profit/vstup_naklady)*100)
@ -168,32 +138,30 @@ class StrategyClassicSL(Strategy):
if data.event == TradeEvent.FILL:
#jde o partial EXIT davame si rel.profit do docasne promenne, po poslednim exitu z nich vypocteme skutecny rel.profit
if data.position_qty != 0:
self.state.account_variables[account.name].docasny_rel_profit.append(rel_profit)
self.state.docasny_rel_profit.append(rel_profit)
partial_exit = True
else:
#jde o posledni z PARTIAL EXITU tzn.data.position_qty == 0
if len(self.state.account_variables[account.name].docasny_rel_profit) > 0:
if len(self.state.docasny_rel_profit) > 0:
#pricteme aktualni rel profit
self.state.account_variables[account.name].docasny_rel_profit.append(rel_profit)
self.state.docasny_rel_profit.append(rel_profit)
#a z rel profitu tohoto tradu vypocteme prumer, ktery teprve ulozime
rel_profit = round(np.mean(self.state.account_variables[account.name].docasny_rel_profit),5)
self.state.account_variables[account.name].docasny_rel_profit = []
rel_profit = round(np.mean(self.state.docasny_rel_profit),5)
self.state.docasny_rel_profit = []
partial_last = True
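#example: partial exits with rel profits [0.10, 0.20, 0.30] are averaged, so the whole trade is recorded with rel_profit 0.20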
self.state.rel_profit_cum.append(rel_profit) #overall cum rel profit
self.state.account_variables[account.name].rel_profit_cum.append(rel_profit) #account cum rel profit
self.state.rel_profit_cum.append(rel_profit)
rel_profit_cum_calculated = round(np.sum(self.state.rel_profit_cum),5)
#pro martingale updatujeme loss_series_cnt -
#pro martingale updatujeme loss_series_cnt
self.state.vars["transferables"]["martingale"]["cont_loss_series_cnt"] = 0 if rel_profit > 0 else self.state.vars["transferables"]["martingale"]["cont_loss_series_cnt"]+1
self.state.ilog(lvl=1, e=f"update cont_loss_series_cnt na {self.state.vars['transferables']['martingale']['cont_loss_series_cnt']}")
self.state.ilog(e=f"BUY notif {account} - SHORT PROFIT: {partial_exit=} {partial_last=} {round(float(trade_profit),3)} celkem abs:{round(float(self.state.profit),3)} rel:{float(rel_profit)} rel_cum:{round(rel_profit_cum_calculated,7)}", msg=str(data.event), rel_profit_cum=str(self.state.rel_profit_cum), bought_amount=bought_amount, avg_costs=avg_costs, trade_qty=data.qty, trade_price=data.price, orderid=str(data.order.id))
self.state.ilog(e=f"BUY notif - SHORT PROFIT: {partial_exit=} {partial_last=} {round(float(trade_profit),3)} celkem:{round(float(self.state.profit),3)} rel:{float(rel_profit)} rel_cum:{round(rel_profit_cum_calculated,7)}", msg=str(data.event), rel_profit_cum=str(self.state.rel_profit_cum), bought_amount=bought_amount, avg_costs=avg_costs, trade_qty=data.qty, trade_price=data.price, orderid=str(data.order.id))
#zapsat profit do prescr.trades
for trade in self.state.vars.prescribedTrades:
if trade.id == self.state.account_variables[account.name].pending:
if trade.id == self.state.vars.pending:
trade.last_update = datetime.fromtimestamp(self.state.time).astimezone(zoneNY)
trade.profit += trade_profit
#pro ulozeni do tradeData scitame vsechen zisk z tohoto tradu (kvuli partialum)
@ -211,7 +179,7 @@ class StrategyClassicSL(Strategy):
if data.event == TradeEvent.FILL:
#mazeme self.state.
self.state.account_variables[account.name].wait_for_fill = None
self.state.wait_for_fill = None
#zapsat update profitu do tradeList
for tradeData in self.state.tradeList:
if tradeData.execution_id == data.execution_id:
@ -219,7 +187,7 @@ class StrategyClassicSL(Strategy):
setattr(tradeData, "profit", trade_profit)
setattr(tradeData, "profit_sum", self.state.profit)
setattr(tradeData, "signal_name", signal_name)
setattr(tradeData, "prescribed_trade_id", self.state.account_variables[account.name].pending)
setattr(tradeData, "prescribed_trade_id", self.state.vars.pending)
#self.state.ilog(f"updatnut tradeList o profit", tradeData=orjson.loads(orjson.dumps(tradeData, default=json_serial, option=orjson.OPT_PASSTHROUGH_DATETIME)))
setattr(tradeData, "rel_profit", rel_profit)
setattr(tradeData, "rel_profit_cum", rel_profit_cum_calculated)
@ -230,16 +198,16 @@ class StrategyClassicSL(Strategy):
#IF REVERSAL REQUIRED - reverse position is added to prescr.Trades with same signal name
#jen při celém FILLU
if data.event == TradeEvent.FILL and self.state.account_variables[account.name].requested_followup is not None:
if self.state.account_variables[account.name].requested_followup == Followup.REVERSE:
await self.add_followup(direction=TradeDirection.LONG, size=o.qty, signal_name=signal_name, account=account)
elif self.state.account_variables[account.name].requested_followup == Followup.ADD:
if data.event == TradeEvent.FILL and self.state.vars.requested_followup is not None:
if self.state.vars.requested_followup == Followup.REVERSE:
await self.add_followup(direction=TradeDirection.LONG, size=o.qty, signal_name=signal_name)
elif self.state.vars.requested_followup == Followup.ADD:
#zatim stejna SIZE
await self.add_followup(direction=TradeDirection.SHORT, size=o.qty, signal_name=signal_name, account=account)
await self.add_followup(direction=TradeDirection.SHORT, size=o.qty, signal_name=signal_name)
else:
#zjistime nazev signalu a updatneme do tradeListu - abychom meli svazano
for trade in self.state.vars.prescribedTrades:
if trade.id == self.state.account_variables[account.name].pending:
if trade.id == self.state.vars.pending:
signal_name = trade.generated_by
#zapiseme entry_time (jen pokud to neni partial add) - tzn. jen poprvé
if data.event == TradeEvent.FILL and trade.entry_time is None:
@ -249,7 +217,7 @@ class StrategyClassicSL(Strategy):
for tradeData in self.state.tradeList:
if tradeData.execution_id == data.execution_id:
setattr(tradeData, "signal_name", signal_name)
setattr(tradeData, "prescribed_trade_id", self.state.account_variables[account.name].pending)
setattr(tradeData, "prescribed_trade_id", self.state.vars.pending)
self.state.ilog(e="BUY: Jde o LONG nakuú nepocitame profit zatim")
@ -258,47 +226,43 @@ class StrategyClassicSL(Strategy):
self.state.last_entry_price["long"] = data.price
#pokud neni nastaveno goal_price tak vyplnujeme defaultem
if self.state.account_variables[account.name].activeTrade.goal_price is None:
if self.state.vars.activeTrade.goal_price is None:
dat = dict(close=data.price)
self.state.account_variables[account.name].activeTrade.goal_price = get_profit_target_price(self.state, dat, self.state.account_variables[account.name].activeTrade, TradeDirection.LONG)
self.state.vars.activeTrade.goal_price = get_profit_target_price(self.state, dat, TradeDirection.LONG)
#ic("vstupujeme do orderupdatebuy")
print(data)
#dostavame zde i celkové aktuální množství - ukládáme
self.state.account_variables[account.name].positions = data.position_qty
self.state.account_variables[account.name].avgp, self.state.account_variables[account.name].positions = self.state.interface[account.name].pos()
self.state.positions = data.position_qty
self.state.avgp, self.state.positions = self.state.interface.pos()
if o.status == OrderStatus.FILLED or o.status == OrderStatus.CANCELED:
#davame pryc pending
self.state.account_variables[account.name].pending = None
self.state.vars.pending = None
async def orderUpdateSell(self, data: TradeUpdate):
account = data.account
#TODO tady jsem skoncil, pak projit vsechen kod na state.avgp a prehodit
self.state.ilog(e=f"Příchozí SELL notif - {account}", msg=data.order.status, trade=transform_data(data, json_serial))
async def orderUpdateSell(self, data: TradeUpdate):
self.state.ilog(e="Příchozí SELL notif", msg=data.order.status, trade=transform_data(data, json_serial))
#naklady vypocteme z prumerne ceny, kterou mame v pozicich
if data.event == TradeEvent.FILL or data.event == TradeEvent.PARTIAL_FILL:
#pokud jde o fill pred kterym je partial, muze se stat, ze uz budou vynulovany pozice, toto je pojistka
#jde o uzavření long pozice - počítáme PROFIT
if int(self.state.account_variables[account.name].positions) > 0 or (int(self.state.account_variables[account.name].positions) == 0 and self.state.account_variables[account.name].wait_for_fill is not None):
if int(self.state.positions) > 0 or (int(self.state.positions) == 0 and self.state.wait_for_fill is not None):
if data.event == TradeEvent.PARTIAL_FILL and self.state.account_variables[account.name].wait_for_fill is None:
if data.event == TradeEvent.PARTIAL_FILL and self.state.wait_for_fill is None:
#timto si oznacime, ze po partialu s vlivem na PROFIT musime cekat na FILL a zaroven ukladame prum cenu, kterou potrebujeme na vypocet profitu u fillu
self.state.account_variables[account.name].wait_for_fill = float(self.state.account_variables[account.name].avgp)
self.state.wait_for_fill = float(self.state.avgp)
#PROFIT pocitame z TradeUpdate.price a TradeUpdate.qty - aktualne provedene mnozstvi a cena
#naklady vypocteme z prumerne ceny, kterou mame v pozicich
sold_amount = data.qty * data.price
if float(self.state.account_variables[account.name].avgp) > 0:
vstup_cena = float(self.state.account_variables[account.name].avgp)
elif float(self.state.account_variables[account.name].avgp) == 0 and self.state.account_variables[account.name].wait_for_fill is not None:
vstup_cena = float(self.state.account_variables[account.name].wait_for_fill)
if float(self.state.avgp) > 0:
vstup_cena = float(self.state.avgp)
elif float(self.state.avgp) == 0 and self.state.wait_for_fill is not None:
vstup_cena = float(self.state.wait_for_fill)
else:
vstup_cena = 0
@ -306,13 +270,12 @@ class StrategyClassicSL(Strategy):
avg_costs = vstup_cena * float(data.qty)
if avg_costs == 0:
self.state.ilog(e=f"ERR: {account} Nemame naklady na PROFIT, AVGP je nula. Zaznamenano jako 0", msg="naklady=utrzena cena. TBD opravit.")
self.state.ilog(e="ERR: Nemame naklady na PROFIT, AVGP je nula. Zaznamenano jako 0", msg="naklady=utrzena cena. TBD opravit.")
avg_costs = sold_amount
trade_profit = round((sold_amount - avg_costs),2)
self.state.profit += trade_profit #celkový profit
self.state.account_variables[account.name].profit += trade_profit #account specific profit
self.state.profit += trade_profit
rel_profit = 0
#spoctene celkovy relativni profit za trade v procentech ((trade_profit/vstup_naklady)*100)
if vstup_cena != 0 and data.order.qty != 0:
@ -324,31 +287,30 @@ class StrategyClassicSL(Strategy):
if data.event == TradeEvent.FILL:
#jde o partial EXIT davame si rel.profit do docasne promenne, po poslednim exitu z nich vypocteme skutecny rel.profit
if data.position_qty != 0:
self.state.account_variables[account.name].docasny_rel_profit.append(rel_profit)
self.state.docasny_rel_profit.append(rel_profit)
partial_exit = True
else:
#jde o posledni z PARTIAL EXITU tzn.data.position_qty == 0
if len(self.state.account_variables[account.name].docasny_rel_profit) > 0:
if len(self.state.docasny_rel_profit) > 0:
#pricteme aktualni rel profit
self.state.account_variables[account.name].docasny_rel_profit.append(rel_profit)
self.state.docasny_rel_profit.append(rel_profit)
#a z rel profitu tohoto tradu vypocteme prumer, ktery teprve ulozime
rel_profit = round(np.mean(self.state.account_variables[account.name].docasny_rel_profit),5)
self.state.account_variables[account.name].docasny_rel_profit = []
rel_profit = round(np.mean(self.state.docasny_rel_profit),5)
self.state.docasny_rel_profit = []
partial_last = True
self.state.rel_profit_cum.append(rel_profit) #overall rel profit
self.state.account_variables[account.name].rel_profit_cum.append(rel_profit) #account cum rel profit
self.state.rel_profit_cum.append(rel_profit)
rel_profit_cum_calculated = round(np.sum(self.state.rel_profit_cum),5)
#pro martingale updatujeme loss_series_cnt
self.state.vars["transferables"]["martingale"]["cont_loss_series_cnt"] = 0 if rel_profit > 0 else self.state.vars["transferables"]["martingale"]["cont_loss_series_cnt"]+1
self.state.ilog(lvl=1, e=f"update cont_loss_series_cnt na {self.state.vars['transferables']['martingale']['cont_loss_series_cnt']}")
self.state.ilog(e=f"SELL notif {account.name}- LONG PROFIT {partial_exit=} {partial_last=}:{round(float(trade_profit),3)} celkem:{round(float(self.state.profit),3)} rel:{float(rel_profit)} rel_cum:{round(rel_profit_cum_calculated,7)}", msg=str(data.event), rel_profit_cum = str(self.state.rel_profit_cum), sold_amount=sold_amount, avg_costs=avg_costs, trade_qty=data.qty, trade_price=data.price, orderid=str(data.order.id))
self.state.ilog(e=f"SELL notif - LONG PROFIT {partial_exit=} {partial_last=}:{round(float(trade_profit),3)} celkem:{round(float(self.state.profit),3)} rel:{float(rel_profit)} rel_cum:{round(rel_profit_cum_calculated,7)}", msg=str(data.event), rel_profit_cum = str(self.state.rel_profit_cum), sold_amount=sold_amount, avg_costs=avg_costs, trade_qty=data.qty, trade_price=data.price, orderid=str(data.order.id))
#zapsat profit do prescr.trades
for trade in self.state.vars.prescribedTrades:
if trade.id == self.state.account_variables[account.name].pending:
if trade.id == self.state.vars.pending:
trade.last_update = datetime.fromtimestamp(self.state.time).astimezone(zoneNY)
trade.profit += trade_profit
#pro ulozeni do tradeData scitame vsechen zisk z tohoto tradu (kvuli partialum)
@ -365,7 +327,7 @@ class StrategyClassicSL(Strategy):
if data.event == TradeEvent.FILL:
#mazeme self.state.
self.state.account_variables[account.name].wait_for_fill = None
self.state.wait_for_fill = None
#zapsat update profitu do tradeList
for tradeData in self.state.tradeList:
if tradeData.execution_id == data.execution_id:
@ -373,7 +335,7 @@ class StrategyClassicSL(Strategy):
setattr(tradeData, "profit", trade_profit)
setattr(tradeData, "profit_sum", self.state.profit)
setattr(tradeData, "signal_name", signal_name)
setattr(tradeData, "prescribed_trade_id", self.state.account_variables[account.name].pending)
setattr(tradeData, "prescribed_trade_id", self.state.vars.pending)
#self.state.ilog(f"updatnut tradeList o profi {str(tradeData)}")
setattr(tradeData, "rel_profit", rel_profit)
setattr(tradeData, "rel_profit_cum", rel_profit_cum_calculated)
@ -383,17 +345,17 @@ class StrategyClassicSL(Strategy):
if data.event == TradeEvent.FILL and await self.stop_when_max_profit_loss() is False:
#IF REVERSAL REQUIRED - reverse position is added to prescr.Trades with same signal name
if data.event == TradeEvent.FILL and self.state.account_variables[account.name].requested_followup is not None:
if self.state.account_variables[account.name].requested_followup == Followup.REVERSE:
if data.event == TradeEvent.FILL and self.state.vars.requested_followup is not None:
if self.state.vars.requested_followup == Followup.REVERSE:
await self.add_followup(direction=TradeDirection.SHORT, size=data.order.qty, signal_name=signal_name)
elif self.state.account_variables[account.name].requested_followup == Followup.ADD:
elif self.state.vars.requested_followup == Followup.ADD:
#zatim stejna SIZE
await self.add_followup(direction=TradeDirection.LONG, size=data.order.qty, signal_name=signal_name)
else:
#zjistime nazev signalu a updatneme do tradeListu - abychom meli svazano
for trade in self.state.vars.prescribedTrades:
if trade.id == self.state.account_variables[account.name].pending:
if trade.id == self.state.vars.pending:
signal_name = trade.generated_by
#zapiseme entry_time (jen pokud to neni partial add) - tzn. jen poprvé
if data.event == TradeEvent.FILL and trade.entry_time is None:
@ -403,7 +365,7 @@ class StrategyClassicSL(Strategy):
for tradeData in self.state.tradeList:
if tradeData.execution_id == data.execution_id:
setattr(tradeData, "signal_name", signal_name)
setattr(tradeData, "prescribed_trade_id", self.state.account_variables[account.name].pending)
setattr(tradeData, "prescribed_trade_id", self.state.vars.pending)
self.state.ilog(e="SELL: Jde o SHORT nepocitame profit zatim")
@ -411,32 +373,32 @@ class StrategyClassicSL(Strategy):
#zapisujeme last entry price
self.state.last_entry_price["short"] = data.price
#pokud neni nastaveno goal_price tak vyplnujeme defaultem
if self.state.account_variables[account.name].activeTrade.goal_price is None:
if self.state.vars.activeTrade.goal_price is None:
dat = dict(close=data.price)
self.state.account_variables[account.name].activeTrade.goal_price = get_profit_target_price(self.state, dat, self.state.account_variables[account.name].activeTrade, TradeDirection.SHORT)
self.state.vars.activeTrade.goal_price = get_profit_target_price(self.state, dat, TradeDirection.SHORT)
#sem v budoucnu dat i update SL
#if self.state.vars.activeTrade.stoploss_value is None:
#update pozic, v trade update je i pocet zbylych pozic
old_avgp = self.state.account_variables[account.name].avgp
old_pos = self.state.account_variables[account.name].positions
self.state.account_variables[account.name].positions = int(data.position_qty)
old_avgp = self.state.avgp
old_pos = self.state.positions
self.state.positions = int(data.position_qty)
if int(data.position_qty) == 0:
self.state.account_variables[account.name].avgp = 0
self.state.avgp = 0
self.state.ilog(e="SELL notifikace "+str(data.order.status), msg="update pozic", old_avgp=old_avgp, old_pos=old_pos, avgp=self.state.account_variables[account.name].avgp, pos=self.state.account_variables[account.name].positions, orderid=str(data.order.id))
#self.state.account_variables[account.name].avgp, self.state.account_variables[account.name].positions = self.interface.pos()
self.state.ilog(e="SELL notifikace "+str(data.order.status), msg="update pozic", old_avgp=old_avgp, old_pos=old_pos, avgp=self.state.avgp, pos=self.state.positions, orderid=str(data.order.id))
#self.state.avgp, self.state.positions = self.interface.pos()
if data.event == TradeEvent.FILL or data.event == TradeEvent.CANCELED:
print("Příchozí SELL notifikace - complete FILL nebo CANCEL", data.event)
self.state.account_variables[account.name].pending = None
a,p = self.interface[account.name].pos() #TBD maybe optimize for speed
self.state.vars.pending = None
a,p = self.interface.pos()
#pri chybe api nechavame puvodni hodnoty
if a != -1:
self.state.account_variables[account.name].avgp, self.state.account_variables[account.name].positions = a,p
self.state.avgp, self.state.positions = a,p
else: self.state.ilog(e=f"Chyba pri dotažení self.interface.pos() {a}")
#ic(self.state.account_variables[account.name].avgp, self.state.account_variables[account.name].positions)
#ic(self.state.avgp, self.state.positions)
#this parent method is called by strategy just once before waiting for first data
def strat_init(self):
@ -452,52 +414,53 @@ class StrategyClassicSL(Strategy):
populate_all_indicators(item, self.state)
#pro přípravu dat next nevoláme
if self.mode == Mode.PREP or self.soft_stop:
if self.mode == Mode.PREP:
return
else:
self.next(item, self.state)
#overridden methods
# pouziva se pri vstupu long nebo exitu short
# osetrit uzavreni s vice nez mam
def buy(self, account: Account, size = None, repeat: bool = False):
def buy(self, size = None, repeat: bool = False):
print("overriden buy method")
if size is None:
sizer = self.state.vars.chunk
else:
sizer = size
#jde o uzavreni short pozice
if int(self.state.account_variables[account.name].positions) < 0 and (int(self.state.account_variables[account.name].positions) + int(sizer)) > 0:
self.state.ilog(e="buy nelze nakoupit vic nez shortuji", positions=self.state.account_variables[account.name].positions, size=size)
if int(self.state.positions) < 0 and (int(self.state.positions) + int(sizer)) > 0:
self.state.ilog(e="buy nelze nakoupit vic nez shortuji", positions=self.state.positions, size=size)
printanyway("buy nelze nakoupit vic nez shortuji")
return -2
if int(self.state.account_variables[account.name].positions) >= self.state.vars.maxpozic:
self.state.ilog(e="buy Maxim mnozstvi naplneno", positions=self.state.account_variables[account.name].positions)
if int(self.state.positions) >= self.state.vars.maxpozic:
self.state.ilog(e="buy Maxim mnozstvi naplneno", positions=self.state.positions)
printanyway("max mnostvi naplneno")
return 0
self.state.blockbuy = 1
self.state.vars.lastbuyindex = self.state.bars['index'][-1]
#self.state.ilog(e="send MARKET buy to if", msg="S:"+str(size), ltp=self.state.interface.get_last_price(self.state.symbol))
self.state.ilog(e="send MARKET buy to if", msg="S:"+str(size), ltp=self.state.bars['close'][-1])
return self.state.interface[account.name].buy(size=sizer)
return self.state.interface.buy(size=sizer)
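#guards above: a buy that would overshoot an open short returns -2, exceeding maxpozic returns 0; otherwise a market buy is sent via the interface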
#overridden methods
# pouziva se pri vstupu short nebo exitu long
def sell(self, account: Account, size = None, repeat: bool = False):
def sell(self, size = None, repeat: bool = False):
print("overriden sell method")
if size is None:
size = abs(int(self.state.account_variables[account.name].positions))
size = abs(int(self.state.positions))
#jde o uzavreni long pozice
if int(self.state.account_variables[account.name].positions) > 0 and (int(self.state.account_variables[account.name].positions) - int(size)) < 0:
self.state.ilog(e="nelze prodat vic nez longuji", positions=self.state.account_variables[account.name].positions, size=size)
if int(self.state.positions) > 0 and (int(self.state.positions) - int(size)) < 0:
self.state.ilog(e="nelze prodat vic nez longuji", positions=self.state.positions, size=size)
printanyway("nelze prodat vic nez longuji")
return -2
#pokud shortuji a mam max pozic
if int(self.state.account_variables[account.name].positions) < 0 and abs(int(self.state.account_variables[account.name].positions)) >= self.state.vars.maxpozic:
self.state.ilog(e="short - Maxim mnozstvi naplneno", positions=self.state.account_variables[account.name].positions, size=size)
if int(self.state.positions) < 0 and abs(int(self.state.positions)) >= self.state.vars.maxpozic:
self.state.ilog(e="short - Maxim mnozstvi naplneno", positions=self.state.positions, size=size)
printanyway("short - Maxim mnozstvi naplneno")
return 0
@ -505,4 +468,4 @@ class StrategyClassicSL(Strategy):
#self.state.vars.lastbuyindex = self.state.bars['index'][-1]
#self.state.ilog(e="send MARKET SELL to if", msg="S:"+str(size), ltp=self.state.interface.get_last_price(self.state.symbol))
self.state.ilog(e="send MARKET SELL to if", msg="S:"+str(size), ltp=self.state.bars['close'][-1])
return self.state.interface[account.name].sell(size=size)
return self.state.interface.sell(size=size)

View File

@ -2,7 +2,7 @@
Strategy base class
"""
from datetime import datetime
from v2realbot.utils.utils import AttributeDict, zoneNY, is_open_rush, is_close_rush, json_serial, print, gaka
from v2realbot.utils.utils import AttributeDict, zoneNY, is_open_rush, is_close_rush, json_serial, print
from v2realbot.utils.tlog import tlog
from v2realbot.utils.ilog import insert_log, insert_log_multiple_queue
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Order, Account
@ -16,11 +16,11 @@ from v2realbot.loader.trade_ws_streamer import Trade_WS_Streamer
from v2realbot.interfaces.general_interface import GeneralInterface
from v2realbot.interfaces.backtest_interface import BacktestInterface
from v2realbot.interfaces.live_interface import LiveInterface
import v2realbot.common.model as ptm
import v2realbot.common.PrescribedTradeModel as ptm
from alpaca.trading.enums import OrderSide
from v2realbot.backtesting.backtester import Backtester
#from alpaca.trading.models import TradeUpdate
from v2realbot.common.model import TradeUpdate, AccountVariables
from v2realbot.common.model import TradeUpdate
from alpaca.trading.enums import TradeEvent, OrderStatus
from threading import Event, current_thread
import orjson
@ -30,7 +30,6 @@ from collections import defaultdict
import v2realbot.strategyblocks.activetrade.sl.optimsl as optimsl
from tqdm import tqdm
import v2realbot.utils.config_handler as cfh
from typing import Dict, Set
if PROFILING_NEXT_ENABLED:
from pyinstrument import Profiler
@ -51,8 +50,7 @@ class Strategy:
self.rectype: RecordType = None
self.nextnew = 1
self.btdata: list = []
self.interface: Dict[str, GeneralInterface] = {}
self.order_notifs: Dict[str, LiveOrderUpdatesStreamer] = {}
self.interface: GeneralInterface = None
self.state: StrategyState = None
self.bt: Backtester = None
self.debug = False
@ -62,8 +60,8 @@ class Strategy:
self.open_rush = open_rush
self.close_rush = close_rush
self._streams = []
#primary account from runReqs
self.account = account
self.key = get_key(mode=self.mode, account=self.account)
self.rtqueue = None
self.runner_id = runner_id
self.ilog_save = ilog_save
@ -71,9 +69,6 @@ class Strategy:
self.secondary_res_start_index = dict()
self.last_index = -1
#set of all accounts (Account) including those from stratvars
self.accounts = self.get_accounts_in_stratvars_and_reqs()
#TODO predelat na dynamické queues
self.q1 = queue.Queue()
self.q2 = queue.Queue()
@ -85,27 +80,7 @@ class Strategy:
self.pe = pe
self.se = se
#signal stop - internal
self.hard_stop = False #indikuje hard stop, tedy vypnuti strategie
self.soft_stop = False #indikuje soft stop (napr. při dosažení max zisku/ztráty), tedy pokracovani strategie, vytvareni dat, jen bez obchodu
def get_accounts_in_stratvars_and_reqs(self) -> Set:
"""
Helper that retrieves distinct account values used in stratvars and in runRequest.
Returns:
set: A set of unique account values.
"""
account_keywords = ['account', 'account_long', 'account_short']
account_values = set()
for signal_value in self.stratvars.get('signals', {}).values():
for key in account_keywords:
if key in signal_value:
account_values.add(Account(signal_value[key]))
account_values.add(Account(self.account))
printnow("Distinct account values:", account_values)
return account_values
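#e.g. a signal configured with account_long = "ACCOUNT1" combined with primary account ACCOUNT2 yields {Account.ACCOUNT1, Account.ACCOUNT2} (illustrative values)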
self.signal_stop = False
#predelat queue na dynamic - podle toho jak budu chtit pracovat s multiresolutions
#zatim jen jedna q1
@ -140,33 +115,25 @@ class Strategy:
return -1
self.debug = debug
self.key = get_key(mode=mode, account=self.account)
if mode == Mode.LIVE or mode == Mode.PAPER:
#data loader thread
self.dataloader = Trade_WS_Streamer(name="WS-LDR-"+self.name)
#populate interfaces for each account
for account in self.accounts:
#get key for account
key = get_key(mode=mode, account=Account(account))
self.interface[account.name] = LiveInterface(symbol=self.symbol, key=key)
# order notif thread
self.order_notifs[account.name] = LiveOrderUpdatesStreamer(key=key, name="WS-STRMR-" + account.name + "-" + self.name, account=account)
#propojujeme notifice s interfacem (pro callback)
self.order_notifs[account.name].connect_callback(self)
self.state = StrategyState(name=self.name, accounts=self.accounts, account=self.account, symbol = self.symbol, stratvars = self.stratvars, interface=self.interface, rectype=self.rectype, runner_id=self.runner_id, ilog_save=self.ilog_save)
self.interface = LiveInterface(symbol=self.symbol, key=self.key)
# order notif thread
self.order_notifs = LiveOrderUpdatesStreamer(key=self.key, name="WS-STRMR-" + self.name)
#propojujeme notifice s interfacem (pro callback)
self.order_notifs.connect_callback(self)
self.state = StrategyState(name=self.name, symbol = self.symbol, stratvars = self.stratvars, interface=self.interface, rectype=self.rectype, runner_id=self.runner_id, ilog_save=self.ilog_save)
elif mode == Mode.BT:
self.dataloader = Trade_Offline_Streamer(start, end, btdata=self.btdata)
self.bt = Backtester(symbol = self.symbol, accounts=self.accounts, order_fill_callback= self.order_updates, btdata=self.btdata, cash=cash, bp_from=start, bp_to=end)
self.bt = Backtester(symbol = self.symbol, order_fill_callback= self.order_updates, btdata=self.btdata, cash=cash, bp_from=start, bp_to=end)
#populate interfaces for each account
for account in self.accounts:
#pro backtest volame stejne oklicujeme interface
self.interface[account.name] = BacktestInterface(symbol=self.symbol, bt=self.bt, account=account)
self.state = StrategyState(name=self.name, accounts=self.accounts, account=self.account, symbol = self.symbol, stratvars = self.stratvars, interface=self.interface, rectype=self.rectype, runner_id=self.runner_id, bt=self.bt, ilog_save=self.ilog_save)
#no callback from bt, it is called directly
self.interface = BacktestInterface(symbol=self.symbol, bt=self.bt)
self.state = StrategyState(name=self.name, symbol = self.symbol, stratvars = self.stratvars, interface=self.interface, rectype=self.rectype, runner_id=self.runner_id, bt=self.bt, ilog_save=self.ilog_save)
self.order_notifs = None
##streamer bude plnit trady do listu trades - nad kterym bude pracovat paper trade
@ -174,10 +141,10 @@ class Strategy:
self.dataloader.add_stream(TradeAggregator2List(symbol=self.symbol,btdata=self.btdata,rectype=RecordType.TRADE))
elif mode == Mode.PREP:
#bt je zde jen pro udrzeni BT casu v logu atp. JInak jej nepouzivame.
self.bt = Backtester(symbol = self.symbol, accounts=self.accounts, order_fill_callback= self.order_updates, btdata=self.btdata, cash=cash, bp_from=start, bp_to=end)
self.bt = Backtester(symbol = self.symbol, order_fill_callback= self.order_updates, btdata=self.btdata, cash=cash, bp_from=start, bp_to=end)
self.interface = None
#self.interface = BacktestInterface(symbol=self.symbol, bt=self.bt)
self.state = StrategyState(name=self.name, accounts=self.accounts, account=self.account, symbol = self.symbol, stratvars = self.stratvars, interface=self.interface, rectype=self.rectype, runner_id=self.runner_id, bt=self.bt, ilog_save=self.ilog_save)
self.state = StrategyState(name=self.name, symbol = self.symbol, stratvars = self.stratvars, interface=self.interface, rectype=self.rectype, runner_id=self.runner_id, bt=self.bt, ilog_save=self.ilog_save)
self.order_notifs = None
else:
@ -346,15 +313,13 @@ class Strategy:
""""refresh positions and avgp - for CBAR once per confirmed, for BARS each time"""
def refresh_positions(self, item):
if self.rectype == RecordType.BAR:
for account in self.accounts:
a,p = self.interface[account.name].pos()
if a != -1:
self.state.account_variables[account.name].avgp, self.state.account_variables[account.name].positions = a, p
a,p = self.interface.pos()
if a != -1:
self.state.avgp, self.state.positions = a,p
elif self.rectype in (RecordType.CBAR, RecordType.CBARVOLUME, RecordType.CBARDOLLAR, RecordType.CBARRENKO) and item['confirmed'] == 1:
for account in self.accounts:
a,p = self.interface[account.name].pos()
if a != -1:
self.state.account_variables[account.name].avgp, self.state.account_variables[account.name].positions = a, p
a,p = self.interface.pos()
if a != -1:
self.state.avgp, self.state.positions = a,p
"""update state.last_trade_time a time of iteration"""
def update_times(self, item):
@ -450,11 +415,7 @@ class Strategy:
if self.mode == Mode.LIVE or self.mode == Mode.PAPER:
#live notification thread
#for all keys in self.order_notifs call start()
for key in self.order_notifs:
self.order_notifs[key].start()
#self.order_notifs.start()
self.order_notifs.start()
elif self.mode == Mode.BT or self.mode == Mode.PREP:
self.bt.backtest_start = datetime.now()
@ -472,7 +433,7 @@ class Strategy:
#printnow(current_thread().name, "Items waiting in queue:", self.q1.qsize())
except queue.Empty:
#check internal signals - for profit/loss optim etc - valid for runner
if self.hard_stop:
if self.signal_stop:
print(current_thread().name, "Stopping signal - internal")
break
@ -493,7 +454,7 @@ class Strategy:
if item == "last" or self.se.is_set():
print(current_thread().name, "stopping")
break
elif self.hard_stop:
elif self.signal_stop:
print(current_thread().name, "Stopping signal - internal")
break
elif self.pe.is_set():
@ -524,7 +485,7 @@ class Strategy:
self.stop()
if self.mode == Mode.BT:
print("REQUEST COUNT:", {account_str:self.interface[account_str].mincnt for account_str in self.interface})
print("REQUEST COUNT:", self.interface.mincnt)
self.bt.backtest_end = datetime.now()
#print(40*"*",self.mode, "BACKTEST RESULTS",40*"*")
@ -538,9 +499,7 @@ class Strategy:
#disconnect strategy from websocket trader updates
if self.mode == Mode.LIVE or self.mode == Mode.PAPER:
for key in self.order_notifs:
self.order_notifs[key].disconnect_callback(self)
#self.order_notifs.disconnect_callback(self)
self.order_notifs.disconnect_callback(self)
#necessary only for shared loaders (to keep it running for other strategies)
for i in self._streams:
@ -581,11 +540,11 @@ class Strategy:
#for order updates from LIVE or BACKTEST
#updates are sent only for SYMBOL of strategy
async def order_updates(self, data: TradeUpdate, account: Account):
async def order_updates(self, data: TradeUpdate):
if self.mode == Mode.LIVE or self.mode == Mode.PAPER:
now = datetime.now().timestamp()
#z alpakýho TradeEvent si udelame svuj rozsireny TradeEvent (obsahujici navic profit atp.)
data = TradeUpdate(**data.dict(), account=account)
data = TradeUpdate(**data.dict())
else:
now = self.bt.time
@ -672,14 +631,17 @@ class Strategy:
rt_out["statinds"] = dict()
for key, value in self.state.statinds.items():
rt_out["statinds"][key] = value
#vkladame average price and positions, pokud existuji
#self.state.avgp , self.state.positions
#pro typ strategie Classic, posilame i vysi stoploss
try:
sl_value = gaka(self.state.account_variables, "activeTrade", lambda x: x.stoploss_value)
sl_value = self.state.vars["activeTrade"].stoploss_value
except (KeyError, AttributeError):
sl_value = None
rt_out["positions"] = dict(time=self.state.time, positions=gaka(self.state.account_variables, "positions"), avgp=gaka(self.state.account_variables,), sl_value=sl_value)
rt_out["positions"] = dict(time=self.state.time, positions=self.state.positions, avgp=self.state.avgp, sl_value=sl_value)
#vkladame limitku a pendingbuys
try:
@ -755,21 +717,17 @@ class StrategyState:
"""Strategy Stat object that is passed to callbacks
note:
state.time
state.interface[account.name].time
accounts = set of all accounts (strings)
account = enum of primary account (Account)
state.interface.time
they usually hold the same value, but they can differ, e.g. in a BT callback - orders filled within the window ending at state.time trigger a callback,
which may in turn call e.g. buy (that must happen at the fill time, so the callback sets the interface time to the fill time);
once the BT steps finish, before the next "NEXT" iteration starts, the times are updated back to the original state.time
"""
def __init__(self, name: str, symbol: str, accounts: set, account: Account, stratvars: AttributeDict, bars: AttributeDict = {}, trades: AttributeDict = {}, interface: GeneralInterface = None, rectype: RecordType = RecordType.BAR, runner_id: UUID = None, bt: Backtester = None, ilog_save: bool = False):
def __init__(self, name: str, symbol: str, stratvars: AttributeDict, bars: AttributeDict = {}, trades: AttributeDict = {}, interface: GeneralInterface = None, rectype: RecordType = RecordType.BAR, runner_id: UUID = None, bt: Backtester = None, ilog_save: bool = False):
self.vars = stratvars
self.interface = interface
self.account = account #primary account
self.accounts = accounts
#populate account variables dictionary
self.account_variables: Dict[str, AccountVariables] = {account.name: AccountVariables() for account in self.accounts}
self.positions = 0
self.avgp = 0
self.blockbuy = 0
self.name = name
self.symbol = symbol
self.rectype = rectype
@ -778,10 +736,11 @@ class StrategyState:
self.time = None
#time of last trade processed
self.last_trade_time = 0
self.last_entry_price={key:dict(long=0,short=999) for key in self.accounts}
self.last_entry_price=dict(long=0,short=999)
self.resolution = None
self.runner_id = runner_id
self.bt = bt
self.dont_exit_already_activated = False
self.docasny_rel_profit = []
self.ilog_save = ilog_save
self.sl_optimizer_short = optimsl.SLOptimizer(ptm.TradeDirection.SHORT)
@ -819,14 +778,18 @@ class StrategyState:
#secondary resolution indicators
#self.secondary_indicators = AttributeDict(time=[], sec_price=[])
self.statinds = AttributeDict()
#these methods can be overrided by StrategyType (to add or alter its functionality)
self.buy = self.interface.buy
self.buy_l = self.interface.buy_l
self.sell = self.interface.sell
self.sell_l = self.interface.sell_l
self.cancel_pending_buys = None
self.iter_log_list = []
self.dailyBars = defaultdict(dict)
#celkovy profit (prejmennovat na profit_cum)
self.profit = 0 #TODO key by account?
self.profit = 0
#celkovy relativni profit (obsahuje pole relativnich zisku, z jeho meanu se spocita celkovy rel_profit_cu,)
self.rel_profit_cum = []#TODO key by account?
self.rel_profit_cum = []
self.tradeList = []
#nova promenna pro externi data do ArchiveDetaili, napr. pro zobrazeni v grafu, je zde např. SL history
self.extData = defaultdict(dict)
@ -835,25 +798,6 @@ class StrategyState:
self.today_market_close = None
self.classed_indicators = {}
#quick interface actions to access from state without having to write interface[account.name].buy_l
def buy_l(self, account: Account, price: float, size: int = 1, repeat: bool = False, force: int = 0):
self.interface[account.name].buy_l(price, size, repeat, force)
def buy(self, account: Account, size = 1, repeat: bool = False):
self.interface[account.name].buy(size, repeat)
def sell_l(self, account: Account, price: float, size: int = 1, repeat: bool = False):
self.interface[account.name].sell_l(price, size, repeat)
def sell(self, account: Account, size = 1, repeat: bool = False):
self.interface[account.name].sell(size, repeat)
def repl(self, account: Account, orderid: str, price: float = None, size: int = 1, repeat: bool = False):
self.interface[account.name].repl(orderid, price, size, repeat)
def cancel(self, account: Account, orderid: str):
self.interface[account.name].cancel(orderid)
def release(self):
#release large variables
self.bars = None

View File

@ -1,13 +1,9 @@
from v2realbot.strategyblocks.activetrade.sl.trailsl import trail_SL_management
from v2realbot.strategyblocks.activetrade.close.evaluate_close import eval_close_position
from v2realbot.utils.utils import gaka
def manage_active_trade(state, data):
accountsWithActiveTrade = gaka(state.account_variables, "activeTrade", None, lambda x: x is not None)
# {"account1": activeTrade,
# "account2": activeTrade}
if len(accountsWithActiveTrade.values()) == 0:
return
trail_SL_management(state, accountsWithActiveTrade, data)
eval_close_position(state, accountsWithActiveTrade, data)
def manage_active_trade(state, data):
trade = state.vars.activeTrade
if trade is None:
return -1
trail_SL_management(state, data)
eval_close_position(state, data)

View File

@ -1,7 +1,7 @@
from v2realbot.strategy.base import StrategyState
from v2realbot.strategy.StrategyOrderLimitVykladaciNormalizedMYSELL import StrategyOrderLimitVykladaciNormalizedMYSELL
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, Followup
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get, is_still, is_window_open, eval_cond_dict, crossed_down, crossed_up, crossed, is_pivot, json_serial, pct_diff, create_new_bars, slice_dict_lists
from v2realbot.utils.directive_utils import get_conditions_from_configuration
from v2realbot.common.model import SLHistory
@ -18,61 +18,52 @@ import os
from traceback import format_exc
from v2realbot.strategyblocks.activetrade.helpers import insert_SL_history
#TODO tady jsem taky skoncil a pak zpetna evaluate_close (mozna zde staci jen account?)
# - close means change status in prescribed Trends,update profit, delete from activeTrade
def close_position(state: StrategyState, activeTrade: Trade, data, direction: TradeDirection, reason: str, followup: Followup = None):
def close_position(state, data, direction: TradeDirection, reason: str, followup: Followup = None):
followup_text = str(followup) if followup is not None else ""
positions = state.account_variables[activeTrade.account.name].positions
state.ilog(lvl=1,e=f"CLOSING TRADE {followup_text} {reason} {str(direction)}", curr_price=data["close"], trade=activeTrade)
state.ilog(lvl=1,e=f"CLOSING TRADE {followup_text} {reason} {str(direction)}", curr_price=data["close"], trade=state.vars.activeTrade)
if direction == TradeDirection.SHORT:
res = state.buy(account=activeTrade.account, size=abs(int(positions)))
res = state.buy(size=abs(int(state.positions)))
if isinstance(res, int) and res < 0:
raise Exception(f"error in required operation {reason} {res}")
elif direction == TradeDirection.LONG:
res = state.sell(account=activeTrade.account, size=positions)
res = state.sell(size=state.positions)
if isinstance(res, int) and res < 0:
raise Exception(f"error in required operation STOPLOSS SELL {res}") #TBD error handling
raise Exception(f"error in required operation STOPLOSS SELL {res}")
else:
raise Exception(f"unknow TradeDirection in close_position")
#pri uzavreni tradu zapisujeme SL history - lepsi zobrazeni v grafu
insert_SL_history(state, activeTrade)
state.account_variables[activeTrade.account.name].pending = activeTrade.id
state.account_variables[activeTrade.account.name].activeTrade = None
#state.account_variables[activeTrade.account.name].last_exit_index = data["index"]
state.vars.last_exit_index = data["index"]
state.account_variables[activeTrade.account.name].dont_exit_already_activated = False
insert_SL_history(state)
state.dont_exit_already_activated = False
state.vars.pending = state.vars.activeTrade.id
state.vars.activeTrade = None
state.vars.last_exit_index = data["index"]
if followup is not None:
state.account_variables[activeTrade.account.name].requested_followup = followup
state.vars.requested_followup = followup
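#the closed trade's id is parked in pending so the subsequent fill notification can attribute the realized profit back to the matching prescribed trade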
#close only partial position - no followup here, size multiplier must be between 0 and 1
def close_position_partial(state, activeTrade: Trade,data, direction: TradeDirection, reason: str, size: float):
positions = state.account_variables[activeTrade.account.name].positions
def close_position_partial(state, data, direction: TradeDirection, reason: str, size: float):
if size <= 0 or size >=1:
raise Exception(f"size must be betweem 0 and 1")
size_abs = abs(int(int(positions)*size))
state.ilog(lvl=1,e=f"CLOSING TRADE PART: {size_abs} {size} {reason} {str(direction)}", curr_price=data["close"], trade=activeTrade)
size_abs = abs(int(int(state.positions)*size))
state.ilog(lvl=1,e=f"CLOSING TRADE PART: {size_abs} {size} {reason} {str(direction)}", curr_price=data["close"], trade=state.vars.activeTrade)
if direction == TradeDirection.SHORT:
res = state.buy(account=activeTrade.account, size=size_abs)
res = state.buy(size=size_abs)
if isinstance(res, int) and res < 0:
raise Exception(f"error in required operation STOPLOSS PARTIAL BUY {reason} {res}")
elif direction == TradeDirection.LONG:
res = state.sell(account=activeTrade.account, size=size_abs)
res = state.sell(size=size_abs)
if isinstance(res, int) and res < 0:
raise Exception(f"error in required operation STOPLOSS PARTIAL SELL {res}")
else:
raise Exception(f"unknow TradeDirection in close_position")
#pri uzavreni tradu zapisujeme SL history - lepsi zobrazeni v grafu
insert_SL_history(state, activeTrade)
state.account_variables[activeTrade.account.name].pending = activeTrade.id
state.account_variables[activeTrade.account.name].activeTrade = None
state.account_variables[activeTrade.account.name].dont_exit_already_activated = False
#state.account_variables[activeTrade.account.name].last_exit_index = data["index"]
insert_SL_history(state)
state.vars.pending = state.vars.activeTrade.id
#state.vars.activeTrade = None
state.vars.last_exit_index = data["index"] #ponechano mimo account
#state.vars.last_exit_index = data["index"]

View File

@ -1,7 +1,7 @@
from v2realbot.strategy.base import StrategyState
from v2realbot.strategy.StrategyOrderLimitVykladaciNormalizedMYSELL import StrategyOrderLimitVykladaciNormalizedMYSELL
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, Followup
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get, is_still, is_window_open, eval_cond_dict, crossed_down, crossed_up, crossed, is_pivot, json_serial, pct_diff, create_new_bars, slice_dict_lists
from v2realbot.utils.directive_utils import get_conditions_from_configuration
from v2realbot.common.model import SLHistory
@ -12,9 +12,9 @@ from threading import Event
import os
from traceback import format_exc
from v2realbot.strategyblocks.indicators.helpers import evaluate_directive_conditions
from v2realbot.strategyblocks.activetrade.helpers import get_signal_section_directive, normalize_tick
from v2realbot.strategyblocks.activetrade.helpers import get_override_for_active_trade, normalize_tick
def dontexit_protection_met(state, activeTrade: Trade, data, direction: TradeDirection):
def dontexit_protection_met(state, data, direction: TradeDirection):
if direction == TradeDirection.LONG:
smer = "long"
else:
@ -24,64 +24,58 @@ def dontexit_protection_met(state, activeTrade: Trade, data, direction: TradeDir
#vyreseno pri kazde aktivaci se vyplni flag already_activated
#pri naslednem false podminky se v pripade, ze je aktivovany flag posle True -
#take se vyrusi v closu
def process_result(result, account):
def process_result(result):
if result:
state.account_variables[account.name].dont_exit_already_activated = True
state.dont_exit_already_activated = True
return True
else:
return False
def evaluate_result():
mother_signal = activeTrade.generated_by
dont_exit_already_activated = state.account_variables[activeTrade.account.name].dont_exit_already_activated
mother_signal = state.vars.activeTrade.generated_by
if mother_signal is not None:
#TESTING DONT_EXIT_
cond_dict = state.vars.conditions[KW.dont_exit][mother_signal][smer]
#OR
result, conditions_met = evaluate_directive_conditions(state, cond_dict, "OR")
state.ilog(lvl=1,e=f"DONT_EXIT {mother_signal} {smer} =OR= {result}", **conditions_met, cond_dict=cond_dict, already_activated=str(dont_exit_already_activated))
state.ilog(lvl=1,e=f"DONT_EXIT {mother_signal} {smer} =OR= {result}", **conditions_met, cond_dict=cond_dict, already_activated=str(state.dont_exit_already_activated))
if result:
return True
#OR did not pass, test AND
result, conditions_met = evaluate_directive_conditions(state, cond_dict, "AND")
state.ilog(lvl=1,e=f"DONT_EXIT {mother_signal} {smer} =AND= {result}", **conditions_met, cond_dict=cond_dict, already_activated=str(dont_exit_already_activated))
state.ilog(lvl=1,e=f"DONT_EXIT {mother_signal} {smer} =AND= {result}", **conditions_met, cond_dict=cond_dict, already_activated=str(state.dont_exit_already_activated))
if result:
return True
cond_dict = state.vars.conditions[KW.dont_exit]["common"][smer]
#OR
result, conditions_met = evaluate_directive_conditions(state, cond_dict, "OR")
state.ilog(lvl=1,e=f"DONT_EXIT common {smer} =OR= {result}", **conditions_met, cond_dict=cond_dict, already_activated=str(dont_exit_already_activated))
state.ilog(lvl=1,e=f"DONT_EXIT common {smer} =OR= {result}", **conditions_met, cond_dict=cond_dict, already_activated=str(state.dont_exit_already_activated))
if result:
return True
#OR did not pass, test AND
result, conditions_met = evaluate_directive_conditions(state, cond_dict, "AND")
state.ilog(lvl=1,e=f"DONT_EXIT common {smer} =AND= {result}", **conditions_met, cond_dict=cond_dict, already_activated=str(dont_exit_already_activated))
state.ilog(lvl=1,e=f"DONT_EXIT common {smer} =AND= {result}", **conditions_met, cond_dict=cond_dict, already_activated=str(state.dont_exit_already_activated))
return result
#first evaluate all conditions
result = evaluate_result()
#then evaluate the result and return it
return process_result(result, activeTrade.account)
return process_result(result)
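# Descriptive note on the evaluation order above: the dont-exit protection first evaluates
# the OR-joined conditions of the signal that generated the active trade, then its AND-joined
# conditions, and finally falls back to the "common" section (again OR, then AND). Any hit sets
# dont_exit_already_activated, which the close logic later uses to trigger an
# "EXIT PROTECTION BOUNCE" exit once the protection stops holding.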
def exit_conditions_met(state: StrategyState, activeTrade: Trade, data, direction: TradeDirection):
def exit_conditions_met(state, data, direction: TradeDirection):
if direction == TradeDirection.LONG:
smer = "long"
else:
smer = "short"
signal_name = activeTrade.generated_by
last_entry_index = state.account_variables[activeTrade.account.name].last_entry_index
avgp = state.account_variables[activeTrade.account.name].avgp
positions = state.account_variables[activeTrade.account.name].positions
directive_name = "exit_cond_only_on_confirmed"
exit_cond_only_on_confirmed = get_signal_section_directive(state, signal_name=signal_name, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
exit_cond_only_on_confirmed = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
if exit_cond_only_on_confirmed and data['confirmed'] == 0:
state.ilog(lvl=0,e="EXIT COND ONLY ON CONFIRMED BAR")
@ -89,20 +83,20 @@ def exit_conditions_met(state: StrategyState, activeTrade: Trade, data, directio
## minimum number of bars since entry
directive_name = "exit_cond_req_bars"
exit_cond_req_bars = get_signal_section_directive(state, signal_name=signal_name, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, 1))
exit_cond_req_bars = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, 1))
if last_entry_index is not None:
index_to_compare = int(last_entry_index)+int(exit_cond_req_bars)
if state.vars.last_in_index is not None:
index_to_compare = int(state.vars.last_in_index)+int(exit_cond_req_bars)
if int(data["index"]) < index_to_compare:
state.ilog(lvl=1,e=f"EXIT COND WAITING on required bars from IN {exit_cond_req_bars} TOO SOON", currindex=data["index"], index_to_compare=index_to_compare, last_entry_index=last_entry_index)
state.ilog(lvl=1,e=f"EXIT COND WAITING on required bars from IN {exit_cond_req_bars} TOO SOON", currindex=data["index"], index_to_compare=index_to_compare, last_in_index=state.vars.last_in_index)
return False
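# Example (hypothetical values) of the exit_cond_req_bars guard: with last_in_index = 120 and
# exit_cond_req_bars = 3, exit conditions are only evaluated from bar index 123 onward; earlier
# bars log "TOO SOON" and return False.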
#IF MIN PROFIT is set, check it first and only then let the CONDITIONS through
directive_name = "exit_cond_min_profit"
exit_cond_min_profit_nodir = get_signal_section_directive(state, signal_name=signal_name, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
exit_cond_min_profit_nodir = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
directive_name = "exit_cond_min_profit_" + str(smer)
exit_cond_min_profit = get_signal_section_directive(state, signal_name=signal_name,directive_name=directive_name, default_value=exit_cond_min_profit_nodir)
exit_cond_min_profit = get_override_for_active_trade(state, directive_name=directive_name, default_value=exit_cond_min_profit_nodir)
#exit_cond_min_profit is set
@ -111,10 +105,10 @@ def exit_conditions_met(state: StrategyState, activeTrade: Trade, data, directio
if exit_cond_min_profit is not None:
exit_cond_min_profit_normalized = normalize_tick(state, data, float(exit_cond_min_profit))
exit_cond_goal_price = price2dec(float(avgp)+exit_cond_min_profit_normalized,3) if int(positions) > 0 else price2dec(float(avgp)-exit_cond_min_profit_normalized,3)
exit_cond_goal_price = price2dec(float(state.avgp)+exit_cond_min_profit_normalized,3) if int(state.positions) > 0 else price2dec(float(state.avgp)-exit_cond_min_profit_normalized,3)
curr_price = float(data["close"])
state.ilog(lvl=1,e=f"EXIT COND min profit {exit_cond_goal_price=} {exit_cond_min_profit=} {exit_cond_min_profit_normalized=} {curr_price=}")
if (int(positions) < 0 and curr_price<=exit_cond_goal_price) or (int(positions) > 0 and curr_price>=exit_cond_goal_price):
if (int(state.positions) < 0 and curr_price<=exit_cond_goal_price) or (int(state.positions) > 0 and curr_price>=exit_cond_goal_price):
state.ilog(lvl=1,e=f"EXIT COND min profit PASS - POKRACUJEME")
else:
state.ilog(lvl=1,e=f"EXIT COND min profit NOT PASS")
@ -143,10 +137,10 @@ def exit_conditions_met(state: StrategyState, activeTrade: Trade, data, directio
#use the exit condition of the signal that generated activeTrade + fallback to the general one
state.ilog(lvl=0,e=f"EXIT CONDITIONS ENTRY {smer}", conditions=state.vars.conditions[KW.exit])
mother_signal = signal_name
mother_signal = state.vars.activeTrade.generated_by
if mother_signal is not None:
cond_dict = state.vars.conditions[KW.exit][signal_name][smer]
cond_dict = state.vars.conditions[KW.exit][state.vars.activeTrade.generated_by][smer]
result, conditions_met = evaluate_directive_conditions(state, cond_dict, "OR")
state.ilog(lvl=1,e=f"EXIT CONDITIONS of {mother_signal} =OR= {result}", **conditions_met, cond_dict=cond_dict)
if result:

View File

@ -1,7 +1,7 @@
from v2realbot.strategy.base import StrategyState
from v2realbot.strategy.StrategyOrderLimitVykladaciNormalizedMYSELL import StrategyOrderLimitVykladaciNormalizedMYSELL
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, Followup
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get, is_still, is_window_open, eval_cond_dict, crossed_down, crossed_up, crossed, is_pivot, json_serial, pct_diff, create_new_bars, slice_dict_lists
from v2realbot.utils.directive_utils import get_conditions_from_configuration
from v2realbot.common.model import SLHistory
@ -18,10 +18,10 @@ import os
from traceback import format_exc
from v2realbot.strategyblocks.activetrade.helpers import insert_SL_history
from v2realbot.strategyblocks.activetrade.close.conditions import dontexit_protection_met, exit_conditions_met
from v2realbot.strategyblocks.activetrade.helpers import get_max_profit_price, get_profit_target_price, get_signal_section_directive
from v2realbot.strategyblocks.activetrade.helpers import get_max_profit_price, get_profit_target_price, get_override_for_active_trade, keyword_conditions_met
def eod_exit_activated(state: StrategyState, activeTrade: Trade, data, direction: TradeDirection):
def eod_exit_activated(state: StrategyState, data, direction: TradeDirection):
"""
Function responsible for end of day management
@ -38,10 +38,8 @@ def eod_exit_activated(state: StrategyState, activeTrade: Trade, data, direction
- 1 min forced immediate
"""
avgp = state.account_variables[activeTrade.account.name].avgp
directive_name = "forced_exit_window_start"
forced_exit_window_start = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
forced_exit_window_start = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
if forced_exit_window_start is None:
state.ilog(lvl=0,e="Forced exit not required.")
@ -49,7 +47,7 @@ def eod_exit_activated(state: StrategyState, activeTrade: Trade, data, direction
directive_name = "forced_exit_window_end"
forced_exit_window_end = get_signal_section_directive(state, signal_name=activeTrade.generated_by,directive_name=directive_name, default_value=safe_get(state.vars, directive_name, 389))
forced_exit_window_end = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, 389))
if forced_exit_window_start>389:
state.ilog(lvl=0,e="Forced exit window end max is 389")
@ -62,7 +60,7 @@ def eod_exit_activated(state: StrategyState, activeTrade: Trade, data, direction
# #until when the decreasing-profit window ends (the rest is breakeven and the last minute is forced) - default is half of the window
# directive_name = "forced_exit_decreasing_profit_window_end"
# forced_exit_decreasing_profit_window_end = get_signal_section_directive(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, (forced_exit_window_end-forced_exit_window_end)/2))
# forced_exit_decreasing_profit_window_end = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, (forced_exit_window_end-forced_exit_window_end)/2))
# if forced_exit_decreasing_profit_window_end > forced_exit_window_end-1:
# state.ilog(lvl=0,e="Decreasing profit window must be less than window end -1.")
@ -74,7 +72,7 @@ def eod_exit_activated(state: StrategyState, activeTrade: Trade, data, direction
state.ilog(lvl=1,e=f"Forced Exit Window OPEN - breakeven check", msg=f"{forced_exit_window_start=} {forced_exit_window_end=} ", time=str(datetime.fromtimestamp(data['updated']).astimezone(zoneNY)))
directive_name = "forced_exit_breakeven_period"
forced_exit_breakeven_period = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, True))
forced_exit_breakeven_period = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, True))
if forced_exit_breakeven_period is False:
return False
@ -82,11 +80,11 @@ def eod_exit_activated(state: StrategyState, activeTrade: Trade, data, direction
#for now, except for the last minute, we wait at least for breakeven
curr_price = float(data['close'])
#short direction
if direction == TradeDirection.SHORT and curr_price<=float(avgp):
if direction == TradeDirection.SHORT and curr_price<=float(state.avgp):
state.ilog(lvl=1,e=f"Forced Exit - price better than avgp, dir SHORT")
return True
if direction == TradeDirection.LONG and curr_price>=float(avgp):
if direction == TradeDirection.LONG and curr_price>=float(state.avgp):
state.ilog(lvl=1,e=f"Forced Exit - price better than avgp, dir LONG")
return True
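# Sketch of the breakeven rule above (hypothetical numbers): inside the forced-exit window a
# SHORT trade with avgp = 30.00 exits as soon as close <= 30.00 and a LONG trade with
# avgp = 30.00 exits as soon as close >= 30.00; otherwise the position is held until the end
# of the window, whose last minute forces an immediate exit per the docstring above.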

View File

@ -1,207 +1,198 @@
from v2realbot.strategyblocks.activetrade.close.close_position import close_position, close_position_partial
from v2realbot.strategy.base import StrategyState
from v2realbot.enums.enums import Followup
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import safe_get
from v2realbot.config import KW
#from icecream import install, ic
from rich import print as printanyway
from threading import Event
#import gaka
from v2realbot.utils.utils import gaka
import os
from traceback import format_exc
from v2realbot.strategyblocks.activetrade.close.eod_exit import eod_exit_activated
from v2realbot.strategyblocks.activetrade.close.conditions import dontexit_protection_met, exit_conditions_met
from v2realbot.strategyblocks.activetrade.helpers import get_max_profit_price, get_profit_target_price, get_signal_section_directive, keyword_conditions_met
from v2realbot.strategyblocks.activetrade.helpers import get_max_profit_price, get_profit_target_price, get_override_for_active_trade, keyword_conditions_met
from v2realbot.strategyblocks.activetrade.sl.optimsl import SLOptimizer
#TODO continue from here
def eval_close_position(state: StrategyState, accountsWithActiveTrade, data):
def eval_close_position(state: StrategyState, data):
curr_price = float(data['close'])
state.ilog(lvl=0,e="Eval CLOSE", price=curr_price, pos=gaka(state.account_variables, "positions"), avgp=gaka(state.account_variables, "avgp"), pending=gaka(state.account_variables, "pending"), activeTrade=str(gaka(state.account_variables, "activeTrade")))
state.ilog(lvl=0,e="Eval CLOSE", price=curr_price, pos=state.positions, avgp=state.avgp, pending=state.vars.pending, activeTrade=str(state.vars.activeTrade))
#iterate over accountsWithActiveTrade
for account_str, activeTrade in accountsWithActiveTrade.items():
positions = state.account_variables[account_str].positions
avgp = state.account_variables[account_str].avgp
pending = state.account_variables[account_str].pending
if int(positions) != 0 and float(avgp)>0 and pending is None:
if int(state.positions) != 0 and float(state.avgp)>0 and state.vars.pending is None:
#close position handling
#TBD add POSITION OPTIMIZATION - EXIT 1/2
#we have short positions - (IDEA: distinguish based on the active trade - would allow running even with simultaneous long positions)
if int(state.positions) < 0:
#get TARGET PRICE for the given direction and signal
#if it exists we take it from the trade settings, otherwise from the default
if state.vars.activeTrade.goal_price is not None:
goal_price = state.vars.activeTrade.goal_price
else:
goal_price = get_profit_target_price(state, data, TradeDirection.SHORT)
max_price = get_max_profit_price(state, data, TradeDirection.SHORT)
state.ilog(lvl=1,e=f"Def Goal price {str(TradeDirection.SHORT)} {goal_price} max price {max_price}")
#close position handling
#TBD add POSITION OPTIMIZATION - EXIT 1/2
#SL OPTIMIZATION - PARTIAL EXIT
level_met, exit_adjustment = state.sl_optimizer_short.eval_position(state, data)
if level_met is not None and exit_adjustment is not None:
position = state.positions * exit_adjustment
state.ilog(lvl=1,e=f"SL OPTIMIZATION ENGAGED {str(TradeDirection.SHORT)} {position=} {level_met=} {exit_adjustment}", initial_levels=str(state.sl_optimizer_short.get_initial_abs_levels(state)), rem_levels=str(state.sl_optimizer_short.get_remaining_abs_levels(state)), exit_levels=str(state.sl_optimizer_short.exit_levels), exit_sizes=str(state.sl_optimizer_short.exit_sizes))
printanyway(f"SL OPTIMIZATION ENGAGED {str(TradeDirection.SHORT)} {position=} {level_met=} {exit_adjustment}")
close_position_partial(state=state, data=data, direction=TradeDirection.SHORT, reason=F"SL OPT LEVEL {level_met} REACHED", size=exit_adjustment)
return
#FULL SL reached - execution
if curr_price > state.vars.activeTrade.stoploss_value:
#we have short positions - (IDEA: distinguish based on the active trade - would allow running even with simultaneous long positions)
if int(positions) < 0:
#get TARGET PRICE for the given direction and signal
directive_name = 'reverse_for_SL_exit_short'
reverse_for_SL_exit = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, "no"))
#if it exists we take it from the trade settings, otherwise from the default
if activeTrade.goal_price is not None:
goal_price = activeTrade.goal_price
if reverse_for_SL_exit == "always":
followup_action = Followup.REVERSE
elif reverse_for_SL_exit == "cond":
followup_action = Followup.REVERSE if keyword_conditions_met(state, data, direction=TradeDirection.SHORT, keyword=KW.slreverseonly, skip_conf_validation=True) else None
else:
goal_price = get_profit_target_price(state, data, activeTrade, TradeDirection.SHORT)
max_price = get_max_profit_price(state, activeTrade, data, TradeDirection.SHORT)
state.ilog(lvl=1,e=f"Def Goal price {str(TradeDirection.SHORT)} {goal_price} max price {max_price}")
followup_action = None
close_position(state=state, data=data, direction=TradeDirection.SHORT, reason="SL REACHED", followup=followup_action)
return
#SL OPTIMIZATION - PARTIAL EXIT
level_met, exit_adjustment = state.sl_optimizer_short.eval_position(state, data, activeTrade)
if level_met is not None and exit_adjustment is not None:
position = positions * exit_adjustment
state.ilog(lvl=1,e=f"SL OPTIMIZATION ENGAGED {str(TradeDirection.SHORT)} {position=} {level_met=} {exit_adjustment}", initial_levels=str(state.sl_optimizer_short.get_initial_abs_levels(state, activeTrade)), rem_levels=str(state.sl_optimizer_short.get_remaining_abs_levels(state, activeTrade)), exit_levels=str(state.sl_optimizer_short.exit_levels), exit_sizes=str(state.sl_optimizer_short.exit_sizes))
printanyway(f"SL OPTIMIZATION ENGAGED {str(TradeDirection.SHORT)} {position=} {level_met=} {exit_adjustment}")
close_position_partial(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.SHORT, reason=F"SL OPT LEVEL {level_met} REACHED", size=exit_adjustment)
return
#FULL SL reached - execution
if curr_price > activeTrade.stoploss_value:
#REVERSE BASED ON REVERSE CONDITIONS
if keyword_conditions_met(state, data, direction=TradeDirection.SHORT, keyword=KW.reverse):
close_position(state=state, data=data, direction=TradeDirection.SHORT, reason="REVERSE COND MET", followup=Followup.REVERSE)
return
directive_name = 'reverse_for_SL_exit_short'
reverse_for_SL_exit = get_signal_section_directive(state=state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, "no"))
#EXIT ADD CONDITIONS MET (exit and add)
if keyword_conditions_met(state, data, direction=TradeDirection.SHORT, keyword=KW.exitadd):
close_position(state=state, data=data, direction=TradeDirection.SHORT, reason="EXITADD COND MET", followup=Followup.ADD)
return
if reverse_for_SL_exit == "always":
#CLOSING BASED ON EXIT CONDITIONS
if exit_conditions_met(state, data, TradeDirection.SHORT):
directive_name = 'reverse_for_cond_exit_short'
reverse_for_cond_exit_short = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
directive_name = 'add_for_cond_exit_short'
add_for_cond_exit_short = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
if reverse_for_cond_exit_short:
followup_action = Followup.REVERSE
elif reverse_for_SL_exit == "cond":
followup_action = Followup.REVERSE if keyword_conditions_met(state, data=data, activeTrade=activeTrade, direction=TradeDirection.SHORT, keyword=KW.slreverseonly, skip_conf_validation=True) else None
elif add_for_cond_exit_short:
followup_action = Followup.ADD
else:
followup_action = None
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.SHORT, reason="SL REACHED", followup=followup_action)
close_position(state=state, data=data, direction=TradeDirection.SHORT, reason="EXIT COND MET", followup=followup_action)
return
#PROFIT
if curr_price<=goal_price:
#TODO wait until the slope stops rising intensively, do not wait for the decline
#TODO maybe wait for some RSI signal
#TODO alternatively, if TGTBB is reached, sell immediately
max_price_signal = curr_price<=max_price
#OPTIMIZATION with a rising angle
if max_price_signal or dontexit_protection_met(state=state, data=data,direction=TradeDirection.SHORT) is False:
close_position(state=state, data=data, direction=TradeDirection.SHORT, reason=f"PROFIT or MAXPROFIT REACHED {max_price_signal=}")
return
#REVERSE BASED ON REVERSE CONDITIONS
if keyword_conditions_met(state, data, activeTrade=activeTrade, direction=TradeDirection.SHORT, keyword=KW.reverse):
close_position(state=state, activeTrade=activeTrade,data=data, direction=TradeDirection.SHORT, reason="REVERSE COND MET", followup=Followup.REVERSE)
return
#if the price is worse but dont exit was already activated - then we also sell
elif state.dont_exit_already_activated == True:
#TODO maybe make this a directive too; this way we do not sell while the trend continues - EXIT_PROT_BOUNCE_IMMEDIATE
#if dontexit_protection_met(state=state, data=data,direction=TradeDirection.SHORT) is False:
close_position(state=state, data=data, direction=TradeDirection.SHORT, reason=f"EXIT PROTECTION BOUNCE {state.dont_exit_already_activated=}")
state.dont_exit_already_activated = False
return
#EXIT ADD CONDITIONS MET (exit and add)
if keyword_conditions_met(state, data, activeTrade=activeTrade, direction=TradeDirection.SHORT, keyword=KW.exitadd):
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.SHORT, reason="EXITADD COND MET", followup=Followup.ADD)
return
#FORCED EXIT AT END OF DAY
if eod_exit_activated(state, data, TradeDirection.SHORT):
close_position(state=state, data=data, direction=TradeDirection.SHORT, reason="EOD EXIT ACTIVATED")
return
#we have a long position
elif int(state.positions) > 0:
#CLOSING BASED ON EXIT CONDITIONS
if exit_conditions_met(state, activeTrade, data, TradeDirection.SHORT):
directive_name = 'reverse_for_cond_exit_short'
reverse_for_cond_exit_short = get_signal_section_directive(state=state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
directive_name = 'add_for_cond_exit_short'
add_for_cond_exit_short = get_signal_section_directive(state=state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
if reverse_for_cond_exit_short:
followup_action = Followup.REVERSE
elif add_for_cond_exit_short:
followup_action = Followup.ADD
else:
followup_action = None
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.SHORT, reason="EXIT COND MET", followup=followup_action)
return
#get TARGET PRICE for the given direction and signal
#if it exists we take it from the trade settings, otherwise from the default
if state.vars.activeTrade.goal_price is not None:
goal_price = state.vars.activeTrade.goal_price
else:
goal_price = get_profit_target_price(state, data, TradeDirection.LONG)
max_price = get_max_profit_price(state, data, TradeDirection.LONG)
state.ilog(lvl=1,e=f"Goal price {str(TradeDirection.LONG)} {goal_price} max price {max_price}")
#PROFIT
if curr_price<=goal_price:
#TODO wait until the slope stops rising intensively, do not wait for the decline
#TODO maybe wait for some RSI signal
#TODO alternatively, if TGTBB is reached, sell immediately
max_price_signal = curr_price<=max_price
#OPTIMIZATION with a rising angle
if max_price_signal or dontexit_protection_met(state=state, activeTrade=activeTrade, data=data,direction=TradeDirection.SHORT) is False:
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.SHORT, reason=f"PROFIT or MAXPROFIT REACHED {max_price_signal=}")
return
#if the price is worse but dont exit was already activated - then we also sell
elif state.account_variables[activeTrade.account.name].dont_exit_already_activated == True:
#TODO maybe make this a directive too; this way we do not sell while the trend continues - EXIT_PROT_BOUNCE_IMMEDIATE
#if dontexit_protection_met(state=state, data=data,direction=TradeDirection.SHORT) is False:
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.SHORT, reason=f"EXIT PROTECTION BOUNCE {state.account_variables[activeTrade.account.name].dont_exit_already_activated=}")
state.account_variables[activeTrade.account.name].dont_exit_already_activated = False
return
#SL OPTIMIZATION - PARTIAL EXIT
level_met, exit_adjustment = state.sl_optimizer_long.eval_position(state, data)
if level_met is not None and exit_adjustment is not None:
position = state.positions * exit_adjustment
state.ilog(lvl=1,e=f"SL OPTIMIZATION ENGAGED {str(TradeDirection.LONG)} {position=} {level_met=} {exit_adjustment}", initial_levels=str(state.sl_optimizer_long.get_initial_abs_levels(state)), rem_levels=str(state.sl_optimizer_long.get_remaining_abs_levels(state)), exit_levels=str(state.sl_optimizer_long.exit_levels), exit_sizes=str(state.sl_optimizer_long.exit_sizes))
printanyway(f"SL OPTIMIZATION ENGAGED {str(TradeDirection.LONG)} {position=} {level_met=} {exit_adjustment}")
close_position_partial(state=state, data=data, direction=TradeDirection.LONG, reason=f"SL OPT LEVEL {level_met} REACHED", size=exit_adjustment)
return
#FORCED EXIT AT END OF DAY
if eod_exit_activated(state, activeTrade=activeTrade, data=data, direction=TradeDirection.SHORT):
close_position(state=state, activeTrade=activeTrade,data=data, direction=TradeDirection.SHORT, reason="EOD EXIT ACTIVATED")
return
#we have a long position
elif int(positions) > 0:
#SL FULL execution
if curr_price < state.vars.activeTrade.stoploss_value:
directive_name = 'reverse_for_SL_exit_long'
reverse_for_SL_exit = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, "no"))
#get TARGET PRICE for the given direction and signal
#if it exists we take it from the trade settings, otherwise from the default
if activeTrade.goal_price is not None:
goal_price = activeTrade.goal_price
state.ilog(lvl=1, e=f"reverse_for_SL_exit {reverse_for_SL_exit}")
if reverse_for_SL_exit == "always":
followup_action = Followup.REVERSE
elif reverse_for_SL_exit == "cond":
followup_action = Followup.REVERSE if keyword_conditions_met(state, data, direction=TradeDirection.LONG, keyword=KW.slreverseonly, skip_conf_validation=True) else None
else:
goal_price = get_profit_target_price(state, data, activeTrade, TradeDirection.LONG)
followup_action = None
close_position(state=state, data=data, direction=TradeDirection.LONG, reason="SL REACHED", followup=followup_action)
return
max_price = get_max_profit_price(state, activeTrade, data, TradeDirection.LONG)
state.ilog(lvl=1,e=f"Goal price {str(TradeDirection.LONG)} {goal_price} max price {max_price}")
#SL OPTIMIZATION - PARTIAL EXIT
level_met, exit_adjustment = state.sl_optimizer_long.eval_position(state, data, activeTrade)
if level_met is not None and exit_adjustment is not None:
position = positions * exit_adjustment
state.ilog(lvl=1,e=f"SL OPTIMIZATION ENGAGED {str(TradeDirection.LONG)} {position=} {level_met=} {exit_adjustment}", initial_levels=str(state.sl_optimizer_long.get_initial_abs_levels(state, activeTrade)), rem_levels=str(state.sl_optimizer_long.get_remaining_abs_levels(state, activeTrade)), exit_levels=str(state.sl_optimizer_long.exit_levels), exit_sizes=str(state.sl_optimizer_long.exit_sizes))
printanyway(f"SL OPTIMIZATION ENGAGED {str(TradeDirection.LONG)} {position=} {level_met=} {exit_adjustment}")
close_position_partial(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.LONG, reason=f"SL OPT LEVEL {level_met} REACHED", size=exit_adjustment)
return
#REVERSE BASED ON REVERSE CONDITIONS
if keyword_conditions_met(state, data,TradeDirection.LONG, KW.reverse):
close_position(state=state, data=data, direction=TradeDirection.LONG, reason="REVERSE COND MET", followup=Followup.REVERSE)
return
#SL FULL execution
if curr_price < activeTrade.stoploss_value:
directive_name = 'reverse_for_SL_exit_long'
reverse_for_SL_exit = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, "no"))
#EXIT ADD CONDITIONS MET (exit and add)
if keyword_conditions_met(state, data, TradeDirection.LONG, KW.exitadd):
close_position(state=state, data=data, direction=TradeDirection.LONG, reason="EXITADD COND MET", followup=Followup.ADD)
return
state.ilog(lvl=1, e=f"reverse_for_SL_exit {reverse_for_SL_exit}")
if reverse_for_SL_exit == "always":
#EXIT CONDITIONS
if exit_conditions_met(state, data, TradeDirection.LONG):
directive_name = 'reverse_for_cond_exit_long'
reverse_for_cond_exit_long = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
directive_name = 'add_for_cond_exit_long'
add_for_cond_exit_long = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
if reverse_for_cond_exit_long:
followup_action = Followup.REVERSE
elif reverse_for_SL_exit == "cond":
followup_action = Followup.REVERSE if keyword_conditions_met(state, data, activeTrade, direction=TradeDirection.LONG, keyword=KW.slreverseonly, skip_conf_validation=True) else None
elif add_for_cond_exit_long:
followup_action = Followup.ADD
else:
followup_action = None
close_position(state=state, data=data, direction=TradeDirection.LONG, reason="EXIT CONDS MET", followup=followup_action)
return
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.LONG, reason="SL REACHED", followup=followup_action)
#PROFIT
if curr_price>=goal_price:
#TODO wait until the slope stops rising intensively, do not wait for the decline
#TODO maybe wait for some RSI signal
#TODO alternatively, if TGTBB is reached, sell immediately
max_price_signal = curr_price>=max_price
#OPTIMIZATION with a rising angle
if max_price_signal or dontexit_protection_met(state, data, direction=TradeDirection.LONG) is False:
close_position(state=state, data=data, direction=TradeDirection.LONG, reason=f"PROFIT or MAXPROFIT REACHED {max_price_signal=}")
return
#REVERSE BASED ON REVERSE CONDITIONS
if keyword_conditions_met(state, data, activeTrade, TradeDirection.LONG, KW.reverse):
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.LONG, reason="REVERSE COND MET", followup=Followup.REVERSE)
return
#EXIT ADD CONDITIONS MET (exit and add)
if keyword_conditions_met(state, data, activeTrade, TradeDirection.LONG, KW.exitadd):
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.LONG, reason="EXITADD COND MET", followup=Followup.ADD)
return
#EXIT CONDITIONS
if exit_conditions_met(state, activeTrade, data, TradeDirection.LONG):
directive_name = 'reverse_for_cond_exit_long'
reverse_for_cond_exit_long = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
directive_name = 'add_for_cond_exit_long'
add_for_cond_exit_long = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
if reverse_for_cond_exit_long:
followup_action = Followup.REVERSE
elif add_for_cond_exit_long:
followup_action = Followup.ADD
else:
followup_action = None
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.LONG, reason="EXIT CONDS MET", followup=followup_action)
return
#PROFIT
if curr_price>=goal_price:
#TODO wait until the slope stops rising intensively, do not wait for the decline
#TODO maybe wait for some RSI signal
#TODO alternatively, if TGTBB is reached, sell immediately
max_price_signal = curr_price>=max_price
#OPTIMIZATION with a rising angle
if max_price_signal or dontexit_protection_met(state, activeTrade=activeTrade, data=data, direction=TradeDirection.LONG) is False:
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.LONG, reason=f"PROFIT or MAXPROFIT REACHED {max_price_signal=}")
return
#if the price is worse but dont exit was already activated - then we also sell
elif state.account_variables[activeTrade.account.name].dont_exit_already_activated == True:
#TODO maybe make this a directive too; this way we do not sell while the trend continues - EXIT_PROT_BOUNCE_IMMEDIATE
# if dontexit_protection_met(state=state, data=data,direction=TradeDirection.LONG) is False:
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.LONG, reason=f"EXIT PROTECTION BOUNCE {state.account_variables[activeTrade.account.name].dont_exit_already_activated=}")
state.account_variables[activeTrade.account.name].dont_exit_already_activated = False
return
#FORCED EXIT AT END OF DAY
if eod_exit_activated(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.LONG):
close_position(state=state, activeTrade=activeTrade, data=data, direction=TradeDirection.LONG, reason="EOD EXIT ACTIVATED")
return
#if the price is worse but dont exit was already activated - then we also sell
elif state.dont_exit_already_activated == True:
#TODO maybe make this a directive too; this way we do not sell while the trend continues - EXIT_PROT_BOUNCE_IMMEDIATE
# if dontexit_protection_met(state=state, data=data,direction=TradeDirection.LONG) is False:
close_position(state=state, data=data, direction=TradeDirection.LONG, reason=f"EXIT PROTECTION BOUNCE {state.dont_exit_already_activated=}")
state.dont_exit_already_activated = False
return
#FORCED EXIT AT END OF DAY
if eod_exit_activated(state, data, TradeDirection.LONG):
close_position(state=state, data=data, direction=TradeDirection.LONG, reason="EOD EXIT ACTIVATED")
return
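# Summary of the close-evaluation order sketched above (both variants apply roughly the same
# sequence per direction): partial exit via the SL-optimizer levels, full stop-loss hit
# (optionally followed by a REVERSE or ADD follow-up), reverse conditions, exit-and-add
# conditions, configured exit conditions, profit / max-profit target guarded by the dont-exit
# protection, and finally the end-of-day forced exit.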

View File

@ -1,5 +1,5 @@
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, Followup
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get, is_still, is_window_open, eval_cond_dict, crossed_down, crossed_up, crossed, is_pivot, json_serial, pct_diff, create_new_bars, slice_dict_lists
from v2realbot.utils.directive_utils import get_conditions_from_configuration
from v2realbot.common.model import SLHistory
@ -18,20 +18,17 @@ from traceback import format_exc
from v2realbot.strategyblocks.helpers import normalize_tick
from v2realbot.strategyblocks.indicators.helpers import evaluate_directive_conditions
#TODO finish this here, see get_signal_section_directive below, then continue in close positions
#tests keyword conditions (e.g. reverse_if or exitadd_if)
def keyword_conditions_met(state, data, activeTrade: Trade, direction: TradeDirection, keyword: KW, skip_conf_validation: bool = False):
def keyword_conditions_met(state, data, direction: TradeDirection, keyword: KW, skip_conf_validation: bool = False):
action = str(keyword).upper()
if direction == TradeDirection.LONG:
smer = "long"
else:
smer = "short"
mother_signal = activeTrade.generated_by
if skip_conf_validation is False:
directive_name = "exit_cond_only_on_confirmed"
exit_cond_only_on_confirmed = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
exit_cond_only_on_confirmed = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, False))
if exit_cond_only_on_confirmed and data['confirmed'] == 0:
state.ilog(lvl=0,e=f"{action} CHECK COND ONLY ON CONFIRMED BAR")
@ -40,7 +37,7 @@ def keyword_conditions_met(state, data, activeTrade: Trade, direction: TradeDire
#we do not handle THIS for REVERSE yet
# #IF MIN PROFIT is set, check it first and only then let the CONDITIONS through
# directive_name = "exit_cond_min_profit"
# exit_cond_min_profit = get_signal_section_directive(directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
# exit_cond_min_profit = get_override_for_active_trade(directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
# #exit_cond_min_profit is set
# # check whether we are at the given profit and possibly do not let it continue
@ -80,6 +77,8 @@ def keyword_conditions_met(state, data, activeTrade: Trade, direction: TradeDire
#use the exit condition of the signal that generated activeTrade + fallback to the general one
state.ilog(lvl=0,e=f"{action} CONDITIONS ENTRY {smer}", conditions=state.vars.conditions[KW.reverse])
mother_signal = state.vars.activeTrade.generated_by
if mother_signal is not None:
cond_dict = state.vars.conditions[keyword][mother_signal][smer]
result, conditions_met = evaluate_directive_conditions(state, cond_dict, "OR")
@ -109,12 +108,12 @@ def keyword_conditions_met(state, data, activeTrade: Trade, direction: TradeDire
#maybe move this into SL helpers
def insert_SL_history(state, activeTrade: Trade):
def insert_SL_history(state):
#insert stoploss history as key sl_history into runner archive extended data
state.extData["sl_history"].append(SLHistory(id=activeTrade.id, time=state.time, sl_val=activeTrade.stoploss_value, direction=activeTrade.direction, account=activeTrade.account))
state.extData["sl_history"].append(SLHistory(id=state.vars.activeTrade.id, time=state.time, sl_val=state.vars.activeTrade.stoploss_value))
def get_default_sl_value(state, signal_name, direction: TradeDirection):
def get_default_sl_value(state, direction: TradeDirection):
if direction == TradeDirection.LONG:
smer = "long"
@ -129,16 +128,15 @@ def get_default_sl_value(state, signal_name, direction: TradeDirection):
state.ilog(lvl=1,e="No options for exit in stratvars. Fallback.")
return 0.01
directive_name = 'SL_defval_'+str(smer)
val = get_signal_section_directive(state, signal_name, directive_name=directive_name, default_value=safe_get(options, directive_name, 0.01))
val = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(options, directive_name, 0.01))
return val
#function for directives that can be overridden in the signal section
#this function looks up the signal section of the active trade and tries to find the given directive there,
#if it does not find it, it returns the default that was provided
#TODO rename this to get_override_for_directive_section (the input can be just signal_name)
def get_signal_section_directive(state, signal_name: str, directive_name: str, default_value: str):
def get_override_for_active_trade(state, directive_name: str, default_value: str):
val = default_value
override = "NO"
mother_signal = signal_name
mother_signal = state.vars.activeTrade.generated_by
if mother_signal is not None:
override = "YES "+mother_signal
@ -147,30 +145,30 @@ def get_signal_section_directive(state, signal_name: str, directive_name: str, d
state.ilog(lvl=0,e=f"{directive_name} OVERRIDE {override} NEWVAL:{val} ORIGINAL:{default_value} {mother_signal}", mother_signal=mother_signal,default_value=default_value)
return val
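# Directive-resolution sketch (hypothetical stratvars layout, the actual section names may
# differ): a directive such as "profit_long" is first looked up in the section of the signal
# that generated the active trade, e.g.
#   [stratvars.signals.mysignal]
#   profit_long = 0.4
# and only if it is missing there does the provided default (typically the top-level
# state.vars value) apply.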
def get_profit_target_price(state, data, activeTrade, direction: TradeDirection):
def get_profit_target_price(state, data, direction: TradeDirection):
if direction == TradeDirection.LONG:
smer = "long"
else:
smer = "short"
directive_name = "profit"
def_profit_both_directions = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, 0.50))
def_profit_both_directions = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, 0.50))
#profit for the given direction
directive_name = 'profit_'+str(smer)
def_profit = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=def_profit_both_directions)
def_profit = get_override_for_active_trade(state, directive_name=directive_name, default_value=def_profit_both_directions)
#the directive contains ticks
if isinstance(def_profit, (float, int)):
to_return = get_normalized_profitprice_from_tick(state, data, def_profit, activeTrade.account, direction)
to_return = get_normalized_profitprice_from_tick(state, data, def_profit, direction)
#the directive contains an indicator
elif isinstance(def_profit, str):
to_return = float(value_or_indicator(state, def_profit))
#min profit (protection against extreme indicator values)
directive_name = 'profit_min_ind_tick_value'
profit_min_ind_tick_value = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=def_profit_both_directions)
profit_min_ind_price_value = get_normalized_profitprice_from_tick(state, data, profit_min_ind_tick_value, activeTrade.account, direction)
profit_min_ind_tick_value = get_override_for_active_trade(state, directive_name=directive_name, default_value=def_profit_both_directions)
profit_min_ind_price_value = get_normalized_profitprice_from_tick(state, data, profit_min_ind_tick_value, direction)
#protection when the profit is set too low
if direction == TradeDirection.LONG and to_return < profit_min_ind_price_value or direction == TradeDirection.SHORT and to_return > profit_min_ind_price_value:
@ -181,32 +179,28 @@ def get_profit_target_price(state, data, activeTrade, direction: TradeDirection)
return to_return
##based on tick and direction, returns the normalized profit price (LONG = avgp (or currprice) + norm.tick, SHORT = avgp (or currprice) - norm.tick)
def get_normalized_profitprice_from_tick(state, data, tick, account: Account, direction: TradeDirection):
avgp = state.account_variables[account.name].avgp
def get_normalized_profitprice_from_tick(state, data, tick, direction: TradeDirection):
normalized_tick = normalize_tick(state, data, float(tick))
base_price = avgp if avgp != 0 else data["close"]
base_price = state.avgp if state.avgp != 0 else data["close"]
returned_price = price2dec(float(base_price)+normalized_tick,3) if direction == TradeDirection.LONG else price2dec(float(base_price)-normalized_tick,3)
state.ilog(lvl=0,e=f"NORMALIZED TICK {tick=} {normalized_tick=} NORM.PRICE {returned_price}")
return returned_price
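# Example (assumed values): with avgp = 29.40 and a tick normalized to 0.05, the returned
# price is 29.45 for a LONG trade and 29.35 for a SHORT trade; if avgp is 0 the current close
# is used as the base price instead.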
def get_max_profit_price(state, activeTrade: Trade, data, direction: TradeDirection):
def get_max_profit_price(state, data, direction: TradeDirection):
if direction == TradeDirection.LONG:
smer = "long"
else:
smer = "short"
directive_name = "max_profit"
max_profit_both_directions = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, 0.35))
avgp = state.account_variables[activeTrade.account.name].avgp
positions = state.account_variables[activeTrade.account.name].positions
max_profit_both_directions = get_override_for_active_trade(state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, 0.35))
#max profit for the given direction, with a fallback to the directionless value
directive_name = 'max_profit_'+str(smer)
max_profit = get_signal_section_directive(state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=max_profit_both_directions)
max_profit = get_override_for_active_trade(state, directive_name=directive_name, default_value=max_profit_both_directions)
normalized_max_profit = normalize_tick(state,data,float(max_profit))
state.ilog(lvl=0,e=f"MAX PROFIT {max_profit=} {normalized_max_profit=}")
return price2dec(float(avgp)+normalized_max_profit,3) if int(positions) > 0 else price2dec(float(avgp)-normalized_max_profit,3)
return price2dec(float(state.avgp)+normalized_max_profit,3) if int(state.positions) > 0 else price2dec(float(state.avgp)-normalized_max_profit,3)

View File

@ -1,8 +1,8 @@
import numpy as np
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from typing import Tuple
from copy import deepcopy
from v2realbot.strategyblocks.activetrade.helpers import get_signal_section_directive
from v2realbot.strategyblocks.activetrade.helpers import get_override_for_active_trade
from v2realbot.utils.utils import safe_get
# FIBONACCI FOR PROFIT AND SL
@ -49,23 +49,23 @@ class SLOptimizer:
# self.exit_levels = self.init_exit_levels
# self.exit_sizes = self.init_exit_sizes
def get_trade_details(self, state, activeTrade):
trade: Trade = activeTrade
def get_trade_details(self, state):
trade: Trade = state.vars.activeTrade
#this is a new trade - reset the levels
if trade.id != self.last_trade:
#initialize and clear any previous ones
if self.initialize_levels(state, activeTrade) is False:
if self.initialize_levels(state) is False:
return None, None
self.last_trade = trade.id
#return cost_price, sl_price
return state.account_variables[trade.account.name].avgp, trade.stoploss_value
return state.avgp, trade.stoploss_value
def initialize_levels(self, state, activeTrade):
def initialize_levels(self, state):
directive_name = 'SL_opt_exit_levels_'+str(self.direction.value)
SL_opt_exit_levels = get_signal_section_directive(state=state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
SL_opt_exit_levels = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
directive_name = 'SL_opt_exit_sizes_'+str(self.direction.value)
SL_opt_exit_sizes = get_signal_section_directive(state=state, signal_name=activeTrade.generated_by, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
SL_opt_exit_sizes = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(state.vars, directive_name, None))
if SL_opt_exit_levels is None or SL_opt_exit_sizes is None:
#print("no directives found: SL_opt_exit_levels/SL_opt_exit_sizes")
@ -83,11 +83,11 @@ class SLOptimizer:
print(f"new levels initialized {self.exit_levels=} {self.exit_sizes=}")
return True
def get_initial_abs_levels(self, state, activeTrade):
def get_initial_abs_levels(self, state):
"""
Returns price levels corresponding to initial setting of exit_levels
"""
cost_price, sl_price = self.get_trade_details(state, activeTrade)
cost_price, sl_price = self.get_trade_details(state)
if cost_price is None or sl_price is None:
return []
curr_sl_distance = np.abs(cost_price - sl_price)
@ -96,11 +96,11 @@ class SLOptimizer:
else:
return [cost_price - exit_level * curr_sl_distance for exit_level in self.init_exit_levels]
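# Example (assumed numbers) for the branch shown above: with cost_price = 30.00,
# sl_price = 29.70 and init_exit_levels = [0.5, 1.0], curr_sl_distance is 0.30 and the
# absolute levels are [29.85, 29.70]; the opposite direction presumably mirrors this by
# adding the scaled distance to cost_price instead of subtracting it.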
def get_remaining_abs_levels(self, state, activeTrade):
def get_remaining_abs_levels(self, state):
"""
Returns price levels corresponding to remaining exit_levels for current trade
"""
cost_price, sl_price = self.get_trade_details(state, activeTrade)
cost_price, sl_price = self.get_trade_details(state)
if cost_price is None or sl_price is None:
return []
curr_sl_distance = np.abs(cost_price - sl_price)
@ -109,11 +109,11 @@ class SLOptimizer:
else:
return [cost_price - exit_level * curr_sl_distance for exit_level in self.exit_levels]
def eval_position(self, state, data, activeTrade) -> Tuple[float, float]:
def eval_position(self, state, data) -> Tuple[float, float]:
"""Evaluates optimalization for current position and returns if the given level was
met and how to adjust exit position.
"""
cost_price, sl_price = self.get_trade_details(state, activeTrade)
cost_price, sl_price = self.get_trade_details(state)
if cost_price is None or sl_price is None:
#print("no settings found")
return (None, None)

View File

@ -1,12 +1,13 @@
from v2realbot.strategy.base import StrategyState
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import gaka, isrising, isfalling,zoneNY, price2dec, print, safe_get
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get
#from icecream import install, ic
from rich import print as printanyway
from threading import Event
import os
from traceback import format_exc
from v2realbot.strategyblocks.activetrade.helpers import get_signal_section_directive, normalize_tick, insert_SL_history
from v2realbot.strategyblocks.activetrade.helpers import get_override_for_active_trade, normalize_tick, insert_SL_history
#if the price moves in our direction by at least (0.05) above (SL + 0.09val), we move the SL by the offset
#+ variant - stop at breakeven
@ -24,75 +25,68 @@ from v2realbot.strategyblocks.activetrade.helpers import get_signal_section_dire
# SL_trailing_stop_at_breakeven_short = true
# SL_trailing_stop_at_breakeven_long = true
def trail_SL_management(state: StrategyState, accountsWithActiveTrade, data):
#iterate over accountsWithActiveTrade
for account_str, activeTrade in accountsWithActiveTrade.items():
positions = state.account_variables[account_str].positions
avgp = state.account_variables[account_str].avgp
pending = state.account_variables[account_str].pending
signal_name = activeTrade.generated_by
last_entry_index = state.account_variables[account_str].last_entry_index
if int(positions) != 0 and float(avgp)>0 and pending is None:
def trail_SL_management(state: StrategyState, data):
if int(state.positions) != 0 and float(state.avgp)>0 and state.vars.pending is None:
if int(positions) < 0:
direction = TradeDirection.SHORT
smer = "short"
else:
direction = TradeDirection.LONG
smer = "long"
# for now the SL settings apply to all signals - make per-signal in the future - add a section
options = safe_get(state.vars, 'exit', None)
if options is None:
state.ilog(lvl=1,e="Trail SL. No options for exit conditions in stratvars.")
return
directive_name = 'SL_trailing_enabled_'+str(smer)
sl_trailing_enabled = get_signal_section_directive(state=state, signal_name=signal_name, directive_name=directive_name, default_value=safe_get(options, directive_name, False))
if int(state.positions) < 0:
direction = TradeDirection.SHORT
smer = "short"
else:
direction = TradeDirection.LONG
smer = "long"
# for now the SL settings apply to all signals - make per-signal in the future - add a section
#SL_trailing_protection_window_short
directive_name = 'SL_trailing_protection_window_'+str(smer)
SL_trailing_protection_window = get_signal_section_directive(state=state, signal_name=signal_name, directive_name=directive_name, default_value=safe_get(options, directive_name, 0))
index_to_compare = int(last_entry_index)+int(SL_trailing_protection_window)
if index_to_compare > int(data["index"]):
state.ilog(lvl=1,e=f"SL trail PROTECTION WINDOW {SL_trailing_protection_window} - TOO SOON", currindex=data["index"], index_to_compare=index_to_compare, last_entry_index=last_entry_index)
options = safe_get(state.vars, 'exit', None)
if options is None:
state.ilog(lvl=1,e="Trail SL. No options for exit conditions in stratvars.")
return
directive_name = 'SL_trailing_enabled_'+str(smer)
sl_trailing_enabled = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(options, directive_name, False))
#SL_trailing_protection_window_short
directive_name = 'SL_trailing_protection_window_'+str(smer)
SL_trailing_protection_window = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(options, directive_name, 0))
index_to_compare = int(state.vars.last_in_index)+int(SL_trailing_protection_window)
if index_to_compare > int(data["index"]):
state.ilog(lvl=1,e=f"SL trail PROTECTION WINDOW {SL_trailing_protection_window} - TOO SOON", currindex=data["index"], index_to_compare=index_to_compare, last_in_index=state.vars.last_in_index)
return
if sl_trailing_enabled is True:
directive_name = 'SL_trailing_stop_at_breakeven_'+str(smer)
stop_breakeven = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(options, directive_name, False))
directive_name = 'SL_defval_'+str(smer)
def_SL = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(options, directive_name, 0.01))
directive_name = "SL_trailing_offset_"+str(smer)
offset = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(options, directive_name, 0.01))
directive_name = "SL_trailing_step_"+str(smer)
step = get_override_for_active_trade(state=state, directive_name=directive_name, default_value=safe_get(options, directive_name, offset))
#if trailing is requested only up to breakeven and it has already been exceeded
if (direction == TradeDirection.LONG and stop_breakeven and state.vars.activeTrade.stoploss_value >= float(state.avgp)) or (direction == TradeDirection.SHORT and stop_breakeven and state.vars.activeTrade.stoploss_value <= float(state.avgp)):
state.ilog(lvl=1,e=f"SL trail STOP at breakeven {str(smer)} SL:{state.vars.activeTrade.stoploss_value} UNCHANGED", stop_breakeven=stop_breakeven)
return
if sl_trailing_enabled is True:
directive_name = 'SL_trailing_stop_at_breakeven_'+str(smer)
stop_breakeven = get_signal_section_directive(state=state, signal_name=signal_name, directive_name=directive_name, default_value=safe_get(options, directive_name, False))
directive_name = 'SL_defval_'+str(smer)
def_SL = get_signal_section_directive(state=state, signal_name=signal_name, directive_name=directive_name, default_value=safe_get(options, directive_name, 0.01))
directive_name = "SL_trailing_offset_"+str(smer)
offset = get_signal_section_directive(state=state, signal_name=signal_name, directive_name=directive_name, default_value=safe_get(options, directive_name, 0.01))
directive_name = "SL_trailing_step_"+str(smer)
step = get_signal_section_directive(state=state, signal_name=signal_name, directive_name=directive_name, default_value=safe_get(options, directive_name, offset))
#SL is activated once price climbs to "offset", and is then moved by "step"
#if trailing is requested only up to breakeven and it has already been exceeded
if (direction == TradeDirection.LONG and stop_breakeven and activeTrade.stoploss_value >= float(avgp)) or (direction == TradeDirection.SHORT and stop_breakeven and activeTrade.stoploss_value <= float(avgp)):
state.ilog(lvl=1,e=f"SL trail STOP at breakeven {str(smer)} SL:{activeTrade.stoploss_value} UNCHANGED", stop_breakeven=stop_breakeven)
return
#SL is activated once price climbs to "offset", and is then moved by "step"
offset_normalized = normalize_tick(state, data, offset) #to ticks and from options
step_normalized = normalize_tick(state, data, step)
def_SL_normalized = normalize_tick(state, data, def_SL)
if direction == TradeDirection.LONG:
move_SL_threshold = activeTrade.stoploss_value + offset_normalized + def_SL_normalized
state.ilog(lvl=1,e=f"SL TRAIL EVAL {smer} SL:{round(activeTrade.stoploss_value,3)} TRAILGOAL:{move_SL_threshold}", def_SL=def_SL, offset=offset, offset_normalized=offset_normalized, step_normalized=step_normalized, def_SL_normalized=def_SL_normalized)
if (move_SL_threshold) < data['close']:
activeTrade.stoploss_value += step_normalized
insert_SL_history(state, activeTrade)
state.ilog(lvl=1,e=f"SL TRAIL TH {smer} reached {move_SL_threshold} SL moved to {activeTrade.stoploss_value}", offset_normalized=offset_normalized, step_normalized=step_normalized, def_SL_normalized=def_SL_normalized)
elif direction == TradeDirection.SHORT:
move_SL_threshold = activeTrade.stoploss_value - offset_normalized - def_SL_normalized
state.ilog(lvl=0,e=f"SL TRAIL EVAL {smer} SL:{round(activeTrade.stoploss_value,3)} TRAILGOAL:{move_SL_threshold}", def_SL=def_SL, offset=offset, offset_normalized=offset_normalized, step_normalized=step_normalized, def_SL_normalized=def_SL_normalized)
if (move_SL_threshold) > data['close']:
activeTrade.stoploss_value -= step_normalized
insert_SL_history(state, activeTrade)
state.ilog(lvl=1,e=f"SL TRAIL GOAL {smer} reached {move_SL_threshold} SL moved to {activeTrade.stoploss_value}", offset_normalized=offset_normalized, step_normalized=step_normalized, def_SL_normalized=def_SL_normalized)
offset_normalized = normalize_tick(state, data, offset) #to ticks and from options
step_normalized = normalize_tick(state, data, step)
def_SL_normalized = normalize_tick(state, data, def_SL)
if direction == TradeDirection.LONG:
move_SL_threshold = state.vars.activeTrade.stoploss_value + offset_normalized + def_SL_normalized
state.ilog(lvl=1,e=f"SL TRAIL EVAL {smer} SL:{round(state.vars.activeTrade.stoploss_value,3)} TRAILGOAL:{move_SL_threshold}", def_SL=def_SL, offset=offset, offset_normalized=offset_normalized, step_normalized=step_normalized, def_SL_normalized=def_SL_normalized)
if (move_SL_threshold) < data['close']:
state.vars.activeTrade.stoploss_value += step_normalized
insert_SL_history(state)
state.ilog(lvl=1,e=f"SL TRAIL TH {smer} reached {move_SL_threshold} SL moved to {state.vars.activeTrade.stoploss_value}", offset_normalized=offset_normalized, step_normalized=step_normalized, def_SL_normalized=def_SL_normalized)
elif direction == TradeDirection.SHORT:
move_SL_threshold = state.vars.activeTrade.stoploss_value - offset_normalized - def_SL_normalized
state.ilog(lvl=0,e=f"SL TRAIL EVAL {smer} SL:{round(state.vars.activeTrade.stoploss_value,3)} TRAILGOAL:{move_SL_threshold}", def_SL=def_SL, offset=offset, offset_normalized=offset_normalized, step_normalized=step_normalized, def_SL_normalized=def_SL_normalized)
if (move_SL_threshold) > data['close']:
state.vars.activeTrade.stoploss_value -= step_normalized
insert_SL_history(state)
state.ilog(lvl=1,e=f"SL TRAIL GOAL {smer} reached {move_SL_threshold} SL moved to {state.vars.activeTrade.stoploss_value}", offset_normalized=offset_normalized, step_normalized=step_normalized, def_SL_normalized=def_SL_normalized)

View File

@ -55,11 +55,7 @@ def populate_all_indicators(data, state: StrategyState):
#TODO this log belongs rather in the next() of the classic SL - it is specific to the strategy type
#TODO review this, there is too much serializing to JSON and back -
#PERF PROBLEM
positions = state.account_variables[state.account].positions
avgp = state.account_variables[state.account].avgp
#state.ilog(lvl=1,e="ENTRY", msg=f"LP:{lp} P:{positions}/{round(float(avgp),3)} SL:{state.vars.activeTrade.stoploss_value if state.vars.activeTrade is not None else None} GP:{state.vars.activeTrade.goal_price if state.vars.activeTrade is not None else None} profit:{round(float(state.profit),2)} profit_rel:{round(np.sum(state.rel_profit_cum),6) if len(state.rel_profit_cum)>0 else 0} Trades:{len(state.tradeList)} pend:{state.vars.pending}", rel_profit_cum=str(state.rel_profit_cum), activeTrade=transform_data(state.vars.activeTrade, json_serial), prescribedTrades=transform_data(state.vars.prescribedTrades, json_serial), pending=str(state.vars.pending))
state.ilog(lvl=1,e="ENTRY", msg=f"LP:{lp} ", accountVars=transform_data(state.account_variables, json_serial), prescribedTrades=transform_data(state.vars.prescribedTrades, json_serial))
state.ilog(lvl=1,e="ENTRY", msg=f"LP:{lp} P:{state.positions}/{round(float(state.avgp),3)} SL:{state.vars.activeTrade.stoploss_value if state.vars.activeTrade is not None else None} GP:{state.vars.activeTrade.goal_price if state.vars.activeTrade is not None else None} profit:{round(float(state.profit),2)} profit_rel:{round(np.sum(state.rel_profit_cum),6) if len(state.rel_profit_cum)>0 else 0} Trades:{len(state.tradeList)} pend:{state.vars.pending}", rel_profit_cum=str(state.rel_profit_cum), activeTrade=transform_data(state.vars.activeTrade, json_serial), prescribedTrades=transform_data(state.vars.prescribedTrades, json_serial), pending=str(state.vars.pending))
#kroky pro CONFIRMED BAR only
if conf_bar == 1:

View File

@ -46,10 +46,6 @@ def populate_dynamic_slopeLP_indicator(data, state: StrategyState, name):
#type of the left point [lastbuy - price of the last buy, baropen - open price of the bar]
leftpoint = safe_get(options, 'leftpoint', "lastbuy")
#REFACTOR multiaccount
#avgp is taken from the primary account (state.account)
avgp = state.account_variables[state.account].avgp
#lookback has to be even
if lookback_offset % 2 != 0:
lookback_offset += 1
@ -69,8 +65,8 @@ def populate_dynamic_slopeLP_indicator(data, state: StrategyState, name):
#if we have active positions, set lookbackprice and time based on the last trade
#but if there has been no buy for a long time (more than back_to_standard_after bars since the last buy), fall back to the average
if avgp > 0 and state.bars.index[-1] < int(state.vars.last_entry_index)+back_to_standard_after:
lb_index = -1 - (state.bars.index[-1] - int(state.vars.last_entry_index))
if state.avgp > 0 and state.bars.index[-1] < int(state.vars.last_buy_index)+back_to_standard_after:
lb_index = -1 - (state.bars.index[-1] - int(state.vars.last_buy_index))
lookbackprice = state.bars.vwap[lb_index]
state.ilog(lvl=0,e=f"IND {name} slope {leftpoint}- LEFT POINT OVERRIDE bereme ajko cenu lastbuy {lookbackprice=} {lookbacktime=} {lb_index=}")
else:

View File

@ -5,108 +5,99 @@ from v2realbot.utils.utils import slice_dict_lists,zoneUTC,safe_get, AttributeDi
#id = "b11c66d9-a9b6-475a-9ac1-28b11e1b4edf"
#state = AttributeDict(vars={})
from rich import print
from traceback import format_exc
def attach_previous_data(state):
"""""
Attaches data from previous runner of the same batch.
"""""
print("ATTACHING PREVIOUS DATA")
try:
runner : Runner
#get batch_id of current runner
res, runner = cs.get_runner(state.runner_id)
if res < 0:
if runner.batch_id is None:
print(f"No batch_id found for runner {runner.id}")
else:
print(f"Couldnt get previous runner {state.runner_id} error: {runner}")
return None
batch_id = runner.batch_id
#batch_id = "6a6b0bcf"
res, runner_ids =cs.get_archived_runnerslist_byBatchID(batch_id, "desc")
if res < 0:
msg = f"error whne fetching runners of batch {batch_id} {runner_ids}"
print(msg)
return None
if runner_ids is None or len(runner_ids) == 0:
print(f"NO runners found for batch {batch_id} {runner_ids}")
return None
last_runner = runner_ids[0]
print("Previous runner identified:", last_runner)
#get archived header - to get transferables
runner_header : RunArchive = None
res, runner_header = cs.get_archived_runner_header_byID(last_runner)
if res < 0:
print(f"Error when fetching runner header {last_runner}")
return None
state.vars["transferables"] = runner_header.transferables
print("INITIALIZED transferables", state.vars["transferables"])
#get details from the runner
print(f"Fetching runner details of {last_runner}")
res, val = cs.get_archived_runner_details_byID(last_runner)
if res < 0:
print(f"no archived runner {last_runner}")
return None
detail = RunArchiveDetail(**val)
#print("toto jsme si dotahnuli", detail.bars)
if len(detail.bars["time"]) == 0:
print(f"no bars for runner {last_runner}")
return None
# from stratvars directives
attach_previous_bar_data = safe_get(state.vars, "attach_previous_bar_data", 50)
attach_previous_tick_data = safe_get(state.vars, "attach_previous_tick_data", None)
#indicators datetime utc
indicators = slice_dict_lists(d=detail.indicators[0],last_item=attach_previous_bar_data, time_to_datetime=True)
#time -datetime utc, updated - timestamp float
bars = slice_dict_lists(d=detail.bars, last_item=attach_previous_bar_data, time_to_datetime=True)
cbar_ids = {}
#align the tick data with the bar data
if attach_previous_tick_data is None:
oldest_timestamp = bars["updated"][0]
#returns only values older than oldest_timestamp
cbar_inds = filter_timeseries_by_timestamp(detail.indicators[1], oldest_timestamp)
runner : Runner
#get batch_id of current runner
res, runner = cs.get_runner(state.runner_id)
if res < 0:
if runner.batch_id is None:
print(f"No batch_id found for runner {runner.id}")
else:
cbar_inds = slice_dict_lists(d=detail.indicators[1],last_item=attach_previous_tick_data)
#USE these as INITs - TODO: STOP HERE AND COMPARE
#print("state.indicatorsL", state.indicators, "NEW:", indicators)
state.indicators = AttributeDict(**indicators)
print("transfered indicators:", len(state.indicators["time"]))
#print("state.bars", state.bars, "NEW:", bars)
state.bars = AttributeDict(bars)
print("transfered bars:", len(state.bars["time"]))
#print("state.cbar_indicators", state.cbar_indicators, "NEW:", cbar_inds)
state.cbar_indicators = AttributeDict(cbar_inds)
print("transfered ticks:", len(state.cbar_indicators["time"]))
print("TRANSFERABLEs INITIALIZED")
#bars
#transferable_state_vars = ["martingale", "batch_profit"]
#1. at init these keys from state.vars are mapped into ext_data: ext_data["transferables"]["martingale"] = state.vars["martingale"]
#2. on transfer everything from ext_data["transferables"] goes into the same-named state.vars["martingale"]
#3. at the end of the day it is stored in the transferables column of RunArchive
#add dailyBars from extData
# if hasattr(detail, "ext_data") and "dailyBars" in detail.ext_data:
# state.dailyBars = detail.ext_data["dailyBars"]
return
except Exception as e:
print(str(e)+format_exc())
print(f"Couldnt get previous runner {state.runner_id} error: {runner}")
return None
batch_id = runner.batch_id
#batch_id = "6a6b0bcf"
res, runner_ids =cs.get_archived_runnerslist_byBatchID(batch_id, "desc")
if res < 0:
msg = f"error whne fetching runners of batch {batch_id} {runner_ids}"
print(msg)
return None
if runner_ids is None or len(runner_ids) == 0:
print(f"NO runners found for batch {batch_id} {runner_ids}")
return None
last_runner = runner_ids[0]
print("Previous runner identified:", last_runner)
#get archived header - to get transferables
runner_header : RunArchive = None
res, runner_header = cs.get_archived_runner_header_byID(last_runner)
if res < 0:
print(f"Error when fetching runner header {last_runner}")
return None
state.vars["transferables"] = runner_header.transferables
print("INITIALIZED transferables", state.vars["transferables"])
#get details from the runner
print(f"Fetching runner details of {last_runner}")
res, val = cs.get_archived_runner_details_byID(last_runner)
if res < 0:
print(f"no archived runner {last_runner}")
return None
detail = RunArchiveDetail(**val)
#print("toto jsme si dotahnuli", detail.bars)
# from stratvars directives
attach_previous_bar_data = safe_get(state.vars, "attach_previous_bar_data", 50)
attach_previous_tick_data = safe_get(state.vars, "attach_previous_tick_data", None)
#indicators datetime utc
indicators = slice_dict_lists(d=detail.indicators[0],last_item=attach_previous_bar_data, time_to_datetime=True)
#time -datetime utc, updated - timestamp float
bars = slice_dict_lists(d=detail.bars, last_item=attach_previous_bar_data, time_to_datetime=True)
#align the tick data with the bar data
if attach_previous_tick_data is None:
oldest_timestamp = bars["updated"][0]
#returns only values older than oldest_timestamp
cbar_inds = filter_timeseries_by_timestamp(detail.indicators[1], oldest_timestamp)
else:
cbar_inds = slice_dict_lists(d=detail.indicators[1],last_item=attach_previous_tick_data)
#USE these as INITs - TODO: STOP HERE AND COMPARE
#print("state.indicatorsL", state.indicators, "NEW:", indicators)
state.indicators = AttributeDict(**indicators)
print("transfered indicators:", len(state.indicators["time"]))
#print("state.bars", state.bars, "NEW:", bars)
state.bars = AttributeDict(bars)
print("transfered bars:", len(state.bars["time"]))
#print("state.cbar_indicators", state.cbar_indicators, "NEW:", cbar_inds)
state.cbar_indicators = AttributeDict(cbar_inds)
print("transfered ticks:", len(state.cbar_indicators["time"]))
print("TRANSFERABLEs INITIALIZED")
#bars
#transferable_state_vars = ["martingale", "batch_profit"]
#1. at init these keys from state.vars are mapped into ext_data: ext_data["transferables"]["martingale"] = state.vars["martingale"]
#2. on transfer everything from ext_data["transferables"] goes into the same-named state.vars["martingale"]
#3. at the end of the day it is stored in the transferables column of RunArchive
#add dailyBars from extData
# if hasattr(detail, "ext_data") and "dailyBars" in detail.ext_data:
# state.dailyBars = detail.ext_data["dailyBars"]
return
# if __name__ == "__main__":
# attach_previous_data(state)
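attach_previous_data above seeds a new runner with the tail of the previous runner of the same batch: it copies transferables from the archived header, takes the last attach_previous_bar_data bars and bar indicators with slice_dict_lists, and aligns the tick (cbar) indicators with filter_timeseries_by_timestamp. A simplified sketch of what those two helpers are assumed to do follows; the *_sketch functions are illustrative stand-ins, not the implementations in v2realbot.utils.utils.

# assumed semantics of the helpers used above (stand-ins, not the project code)
def slice_dict_lists_sketch(d: dict, last_item: int) -> dict:
    # keep only the last `last_item` entries of every list in a dict-of-lists
    return {key: values[-last_item:] for key, values in d.items()}

def filter_timeseries_by_timestamp_sketch(d: dict, oldest_timestamp: float) -> dict:
    # keep only entries strictly older than `oldest_timestamp`; lists are aligned on d["time"]
    keep = [i for i, t in enumerate(d["time"]) if t < oldest_timestamp]
    return {key: [values[i] for i in keep] for key, values in d.items()}

bars = {"time": [1.0, 2.0, 3.0, 4.0], "close": [10, 11, 12, 13]}
print(slice_dict_lists_sketch(bars, last_item=2))         # {'time': [3.0, 4.0], 'close': [12, 13]}
print(filter_timeseries_by_timestamp_sketch(bars, 3.0))   # {'time': [1.0, 2.0], 'close': [10, 11]}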

View File

@ -1,7 +1,7 @@
from v2realbot.strategy.base import StrategyState
from v2realbot.strategy.StrategyOrderLimitVykladaciNormalizedMYSELL import StrategyOrderLimitVykladaciNormalizedMYSELL
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, Followup
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get, is_still, is_window_open, eval_cond_dict, crossed_down, crossed_up, crossed, is_pivot, json_serial, pct_diff, create_new_bars, slice_dict_lists
from v2realbot.utils.directive_utils import get_conditions_from_configuration
from v2realbot.common.model import SLHistory

View File

@ -1,7 +1,7 @@
from v2realbot.strategy.base import StrategyState
from v2realbot.strategy.StrategyOrderLimitVykladaciNormalizedMYSELL import StrategyOrderLimitVykladaciNormalizedMYSELL
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, Followup
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get, is_still, is_window_open, eval_cond_dict, crossed_down, crossed_up, crossed, is_pivot, json_serial, pct_diff, create_new_bars, slice_dict_lists
from v2realbot.utils.directive_utils import get_conditions_from_configuration
#import mlroom.utils.mlutils as ml

View File

@ -1,7 +1,7 @@
from v2realbot.strategy.base import StrategyState
from v2realbot.strategy.StrategyOrderLimitVykladaciNormalizedMYSELL import StrategyOrderLimitVykladaciNormalizedMYSELL
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, Followup
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get, is_still, is_window_open, eval_cond_dict, crossed_down, crossed_up, crossed, is_pivot, json_serial, pct_diff, create_new_bars, slice_dict_lists
from v2realbot.utils.directive_utils import get_conditions_from_configuration
from v2realbot.common.model import SLHistory

View File

@ -1,6 +1,6 @@
from v2realbot.strategy.base import StrategyState
from v2realbot.common.model import TradeDirection, TradeStatus
from v2realbot.utils.utils import zoneNY, json_serial,transform_data, gaka
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus
from v2realbot.utils.utils import zoneNY, json_serial,transform_data
from datetime import datetime
#import random
import orjson
@ -10,109 +10,97 @@ from v2realbot.strategyblocks.indicators.helpers import value_or_indicator
#TODO nad prescribed trades postavit vstupni funkce
def execute_prescribed_trades(state: StrategyState, data):
##evaluate prescribed trade, prvni eligible presuneme do activeTrade, zmenime stav and vytvorime objednavky
#for a multiaccount setup we check whether each account already has an active trade
if len(state.vars.prescribedTrades) == 0 :
if state.vars.activeTrade is not None or len(state.vars.prescribedTrades) == 0:
return
accountsWithNoActiveTrade = gaka(state.account_variables, "activeTrade", None, lambda x: x is None)
if len(accountsWithNoActiveTrade.values()) == 0:
#print("active trades on all accounts")
return
#returns true if all values are not None
#all(v is not None for v in d.values())
#evaluate long (price/market)
#support multiaccount trades
state.ilog(lvl=1,e="evaluating prescr trades", trades=transform_data(state.vars.prescribedTrades, json_serial))
for trade in state.vars.prescribedTrades:
if trade.account.name not in accountsWithNoActiveTrade.keys() or state.account_variables[trade.account.name].pending is not None: #availability or pending
continue
if trade.status == TradeStatus.READY and trade.direction == TradeDirection.LONG and (trade.entry_price is None or trade.entry_price >= data['close']):
trade.status = TradeStatus.ACTIVATED
trade.last_update = datetime.fromtimestamp(state.time).astimezone(zoneNY)
state.ilog(lvl=1,e=f"evaluated LONG", trade=transform_data(trade, json_serial), prescrTrades=transform_data(state.vars.prescribedTrades, json_serial))
execute_trade(state, data, trade) #TBD ERROR HANDLING
del accountsWithNoActiveTrade[trade.account.name] #to avoid other entries on the same account
continue
state.vars.activeTrade = trade
state.vars.last_buy_index = data["index"]
state.vars.last_in_index = data["index"]
break
#evaluate shorts
if trade.status == TradeStatus.READY and trade.direction == TradeDirection.SHORT and (trade.entry_price is None or trade.entry_price <= data['close']):
state.ilog(lvl=1,e=f"evaluaed SHORT", trade=transform_data(trade, json_serial), prescrTrades=transform_data(state.vars.prescribedTrades, json_serial))
trade.status = TradeStatus.ACTIVATED
trade.last_update = datetime.fromtimestamp(state.time).astimezone(zoneNY)
execute_trade(state, data, trade) #TBD ERROR HANDLING
del accountsWithNoActiveTrade[trade.account.name] #to avoid other entries on the same account
continue
if not state.vars.activeTrade:
for trade in state.vars.prescribedTrades:
if trade.status == TradeStatus.READY and trade.direction == TradeDirection.SHORT and (trade.entry_price is None or trade.entry_price <= data['close']):
state.ilog(lvl=1,e=f"evaluaed SHORT", trade=transform_data(trade, json_serial), prescrTrades=transform_data(state.vars.prescribedTrades, json_serial))
trade.status = TradeStatus.ACTIVATED
trade.last_update = datetime.fromtimestamp(state.time).astimezone(zoneNY)
state.vars.activeTrade = trade
state.vars.last_buy_index = data["index"]
state.vars.last_in_index = data["index"]
break
#send the ORDER + SET THE STOPLOSS (hardcoded for now)
if state.vars.activeTrade:
if state.vars.activeTrade.direction == TradeDirection.LONG:
state.ilog(lvl=1,e="sending LONG ORDER", trade=transform_data(state.vars.activeTrade, json_serial))
if state.vars.activeTrade.size is not None:
size = state.vars.activeTrade.size
else:
size = state.vars.chunk
res = state.buy(size=size)
if isinstance(res, int) and res < 0:
raise Exception(f"error in required operation LONG {res}")
#TODO consolidate the code below into shared code for short and long
#send the ORDER + SET THE STOPLOSS (hardcoded for now)
#TODO add error management
def execute_trade(state, data, trade):
if trade.direction == TradeDirection.LONG:
state.ilog(lvl=1,e="sending LONG ORDER", trade=transform_data(trade, json_serial))
size = trade.size if trade.size is not None else state.vars.chunk
res = state.buy(size=size, account=trade.account)
#TODO do we store the order ID anywhere? it is already returned here in res
#TODO error handling
if isinstance(res, int) and res < 0:
raise Exception(f"error in required operation LONG {res}")
#TODO error handling
#the default goal price is set later, in the notification, if needed
state.account_variables[trade.account.name].activeTrade = trade
#the default goal price is set later, in the notification, if needed
#TODO set the SL only in the notification, once it is known
#if no SL is set in the prescribed trade, use the default from stratvars
if trade.stoploss_value is None:
sl_defvalue = get_default_sl_value(state=state, signal_name=trade.generated_by, direction=trade.direction)
#TODO set the SL only in the notification, once it is known
#if no SL is set in the prescribed trade, use the default from stratvars
if state.vars.activeTrade.stoploss_value is None:
sl_defvalue = get_default_sl_value(state, direction=state.vars.activeTrade.direction)
if isinstance(sl_defvalue, (float, int)):
#normalize relative to the current price
sl_defvalue_normalized = normalize_tick(state, data,sl_defvalue)
state.account_variables[trade.account.name].activeTrade.stoploss_value = float(data['close']) - sl_defvalue_normalized
state.ilog(lvl=1,e=f"SL set to {sl_defvalue}, price normalized: {sl_defvalue_normalized} price: {state.account_variables[trade.account.name].activeTrade.stoploss_value }")
elif isinstance(sl_defvalue, str):
#from indicator
ind = sl_defvalue
sl_defvalue_abs = float(value_or_indicator(state, sl_defvalue))
if sl_defvalue_abs >= float(data['close']):
raise Exception(f"error in stoploss {ind} {sl_defvalue_abs} >= curr price")
state.account_variables[trade.account.name].activeTrade.stoploss_value = sl_defvalue_abs
state.ilog(lvl=1,e=f"Nastaveno SL na {sl_defvalue_abs} dle indikatoru {ind}")
insert_SL_history(state, state.account_variables[trade.account.name].activeTrade)
elif trade.direction == TradeDirection.SHORT:
state.ilog(lvl=1,e="odesilame SHORT ORDER", trade=transform_data(trade, json_serial))
size = trade.size if trade.size is not None else state.vars.chunk
res = state.sell(size=size, account=trade.account)
if isinstance(res, int) and res < 0:
print(f"error in required operation SHORT {res}")
raise Exception(f"error in required operation SHORT {res}")
#the default goal price is set later, in the notification
if isinstance(sl_defvalue, (float, int)):
#normalize relative to the current price
sl_defvalue_normalized = normalize_tick(state, data,sl_defvalue)
state.vars.activeTrade.stoploss_value = float(data['close']) - sl_defvalue_normalized
state.ilog(lvl=1,e=f"SL set to {sl_defvalue}, price normalized: {sl_defvalue_normalized} price: {state.vars.activeTrade.stoploss_value }")
elif isinstance(sl_defvalue, str):
#from indicator
ind = sl_defvalue
sl_defvalue_abs = float(value_or_indicator(state, sl_defvalue))
if sl_defvalue_abs >= float(data['close']):
raise Exception(f"error in stoploss {ind} {sl_defvalue_abs} >= curr price")
state.vars.activeTrade.stoploss_value = sl_defvalue_abs
state.ilog(lvl=1,e=f"Nastaveno SL na {sl_defvalue_abs} dle indikatoru {ind}")
insert_SL_history(state)
state.vars.pending = state.vars.activeTrade.id
elif state.vars.activeTrade.direction == TradeDirection.SHORT:
state.ilog(lvl=1,e="odesilame SHORT ORDER", trade=transform_data(state.vars.activeTrade, json_serial))
if state.vars.activeTrade.size is not None:
size = state.vars.activeTrade.size
else:
size = state.vars.chunk
res = state.sell(size=size)
if isinstance(res, int) and res < 0:
print(f"error in required operation SHORT {res}")
raise Exception(f"error in required operation SHORT {res}")
#the default goal price is set later, in the notification
state.account_variables[trade.account.name].activeTrade = trade
#if no SL is set in the prescribed trade, use the default from stratvars
if trade.stoploss_value is None:
sl_defvalue = get_default_sl_value(state, signal_name=trade.generated_by,direction=trade.direction)
#if no SL is set in the prescribed trade, use the default from stratvars
if state.vars.activeTrade.stoploss_value is None:
sl_defvalue = get_default_sl_value(state, direction=state.vars.activeTrade.direction)
if isinstance(sl_defvalue, (float, int)):
#normalize relative to the current price
sl_defvalue_normalized = normalize_tick(state, data,sl_defvalue)
state.vars.activeTrade.stoploss_value = float(data['close']) + sl_defvalue_normalized
state.ilog(lvl=1,e=f"SL set to {sl_defvalue}, price normalized: {sl_defvalue_normalized} price: {state.vars.activeTrade.stoploss_value }")
elif isinstance(sl_defvalue, str):
#from indicator
ind = sl_defvalue
sl_defvalue_abs = float(value_or_indicator(state, sl_defvalue))
if sl_defvalue_abs <= float(data['close']):
raise Exception(f"error in stoploss {ind} {sl_defvalue_abs} <= curr price")
state.account_variables[trade.account.name].activeTrade.stoploss_value = sl_defvalue_abs
state.ilog(lvl=1,e=f"Nastaveno SL na {sl_defvalue_abs} dle indikatoru {ind}")
insert_SL_history(state, state.account_variables[trade.account.name].activeTrade)
state.account_variables[trade.account.name].pending = trade.id
state.account_variables[trade.account.name].activeTrade = trade
state.account_variables[trade.account.name].last_entry_index =data["index"] #last_entry_index per account
state.vars.last_entry_index = data["index"] #spolecne pro vsechny accounty
if isinstance(sl_defvalue, (float, int)):
#normalizuji dle aktualni ceny
sl_defvalue_normalized = normalize_tick(state, data,sl_defvalue)
state.vars.activeTrade.stoploss_value = float(data['close']) + sl_defvalue_normalized
state.ilog(lvl=1,e=f"Nastaveno SL na {sl_defvalue}, priced normalized: {sl_defvalue_normalized} price: {state.vars.activeTrade.stoploss_value }")
elif isinstance(sl_defvalue, str):
#from indicator
ind = sl_defvalue
sl_defvalue_abs = float(value_or_indicator(state, sl_defvalue))
if sl_defvalue_abs <= float(data['close']):
raise Exception(f"error in stoploss {ind} {sl_defvalue_abs} <= curr price")
state.vars.activeTrade.stoploss_value = sl_defvalue_abs
state.ilog(lvl=1,e=f"Nastaveno SL na {sl_defvalue_abs} dle indikatoru {ind}")
insert_SL_history(state)
state.vars.pending = state.vars.activeTrade.id
else:
state.ilog(lvl=1,e="unknow direction")
state.vars.activeTrade = None
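In the multiaccount version above, execute_prescribed_trades dispatches at most one prescribed trade per account, and only to accounts that have neither an active trade nor a pending order; gaka appears to collect an attribute across account_variables filtered by a predicate. The sketch below reproduces just that selection logic with plain dicts; accounts_without_active_trade and dispatch are illustrative stand-ins, and the assumed gaka semantics may differ from the real helper.

# illustrative stand-ins for the per-account dispatch logic above
def accounts_without_active_trade(account_variables: dict) -> set:
    # mimics the assumed gaka(...) call: accounts whose activeTrade is None
    return {name for name, vars_ in account_variables.items() if vars_.get("activeTrade") is None}

def dispatch(prescribed_trades: list, account_variables: dict) -> list:
    free_accounts = accounts_without_active_trade(account_variables)
    executed = []
    for trade in prescribed_trades:
        account = trade["account"]
        if account not in free_accounts or account_variables[account].get("pending") is not None:
            continue                        # account busy or an order is still pending
        executed.append(trade)
        free_accounts.discard(account)      # at most one new trade per account per iteration
    return executed

accounts = {"ACCOUNT1": {"activeTrade": None, "pending": None},
            "ACCOUNT2": {"activeTrade": "t-1", "pending": None}}
trades = [{"account": "ACCOUNT2", "id": 1}, {"account": "ACCOUNT1", "id": 2}, {"account": "ACCOUNT1", "id": 3}]
print([t["id"] for t in dispatch(trades, accounts)])    # [2] - ACCOUNT2 is busy, ACCOUNT1 takes one trade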

View File

@ -1,6 +1,6 @@
from v2realbot.strategy.base import StrategyState
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get, gaka
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get
from v2realbot.config import KW
from uuid import uuid4
from datetime import datetime
@ -31,12 +31,6 @@ def signal_search(state: StrategyState, data):
# slope10.out_short_if_above = 0
# ema.AND.short_if_below = 28
accountsWithNoActiveTrade = gaka(state.account_variables, "activeTrade", None, lambda x: x is None)
if len(accountsWithNoActiveTrade.values()) == 0:
#print("active trades on all accounts")
return
for signalname, signalsettings in state.vars.signals.items():
execute_signal_generator(state, data, signalname)
@ -44,21 +38,14 @@ def signal_search(state: StrategyState, data):
# if it has a price, we wait for that price; if immediate, it is executed right away
# all of this assuming there is no active trade
def execute_signal_generator(state: StrategyState, data, name):
def execute_signal_generator(state, data, name):
state.ilog(lvl=1,e=f"SIGNAL SEARCH for {name}", cond_go=state.vars.conditions[KW.go][name], cond_dontgo=state.vars.conditions[KW.dont_go][name], cond_activate=state.vars.conditions[KW.activate][name] )
options = safe_get(state.vars.signals, name, None)
#add account from stratvars (if there) or default to self.state.account
if options is None:
state.ilog(lvl=1,e=f"No options for {name} in stratvars")
return
#get account of the signal, fallback to default
account = safe_get(options, "account", state.account)
account_long = safe_get(options, "account_long", account)
account_short = safe_get(options, "account_short", account)
if common_go_preconditions_check(state, data, signalname=name, options=options) is False:
return
@ -84,35 +71,29 @@ def execute_signal_generator(state: StrategyState, data, name):
state.ilog(lvl=1,e=f"{name} SHORT DISABLED")
if long_enabled is False:
state.ilog(lvl=1,e=f"{name} LONG DISABLED")
trade_made = None
#pre-check that the account has no pending order and no active trade
if state.account_variables[account_long].pending is None and state.account_variables[account_long].activeTrade is None and long_enabled and go_conditions_met(state, data,signalname=name, direction=TradeDirection.LONG):
if long_enabled and go_conditions_met(state, data,signalname=name, direction=TradeDirection.LONG):
multiplier = get_multiplier(state, data, options, TradeDirection.LONG)
state.vars.prescribedTrades.append(Trade(
account = account_long,
id=uuid4(),
last_update=datetime.fromtimestamp(state.time).astimezone(zoneNY),
status=TradeStatus.READY,
generated_by=name,
size=int(multiplier*state.vars.chunk),
size=multiplier*state.vars.chunk,
size_multiplier = multiplier,
direction=TradeDirection.LONG,
entry_price=None,
stoploss_value = None))
trade_made = account_long
#with multiaccount we can create several trades in one iteration, but always on different accounts
if (trade_made is None or trade_made != account_short) and state.account_variables[account_short].pending is None and state.account_variables[account_short].activeTrade is None and short_enabled and go_conditions_met(state, data, signalname=name, direction=TradeDirection.SHORT):
elif short_enabled and go_conditions_met(state, data, signalname=name, direction=TradeDirection.SHORT):
multiplier = get_multiplier(state, data, options, TradeDirection.SHORT)
state.vars.prescribedTrades.append(Trade(
account=account_short,
id=uuid4(),
last_update=datetime.fromtimestamp(state.time).astimezone(zoneNY),
status=TradeStatus.READY,
generated_by=name,
size=int(multiplier*state.vars.chunk),
size=multiplier*state.vars.chunk,
size_multiplier = multiplier,
direction=TradeDirection.SHORT,
entry_price=None,
stoploss_value = None))
return
state.ilog(lvl=0,e=f"{name} NO SIGNAL")
else:
state.ilog(lvl=0,e=f"{name} NO SIGNAL")

View File

@ -1,5 +1,5 @@
from v2realbot.strategy.base import StrategyState
from v2realbot.common.model import Trade, TradeDirection, TradeStatus
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
import v2realbot.utils.utils as utls
from v2realbot.config import KW
from uuid import uuid4
@ -99,8 +99,7 @@ def get_multiplier(state: StrategyState, data, signaloptions: dict, direction: T
if probe_enabled:
#for now only probe number 1 is hardcoded, i.e. there must be no prior trade for activation
#for now this works only for the primary account
if state.account_variables[state.account].last_entry_index is None:
if state.vars.last_in_index is None:
#probe_number = utls.safe_get(options, "probe_number",1)
probe_size = float(utls.safe_get(options, "probe_size", 0.1))
state.ilog(lvl=1,e=f"SIZER - PROBE - setting multiplier to {probe_size}", options=options)
@ -152,17 +151,13 @@ def get_multiplier(state: StrategyState, data, signaloptions: dict, direction: T
#the number of losing trades in a row determines the multiplier (0 -> 1x, 1 loss -> 2x, 2 in a row -> 4x etc.)
if martingale_enabled:
#martingale base - the base of the exponentiation - classically 2
base = float(utls.safe_get(options, "martingale_base", 2))
#current number of consecutive losses
cont_loss_series_cnt = state.vars["transferables"]["martingale"]["cont_loss_series_cnt"]
if cont_loss_series_cnt == 0:
multiplier = 1
else:
multiplier = base ** cont_loss_series_cnt
multiplier = 2 ** cont_loss_series_cnt
state.ilog(lvl=1,e=f"SIZER - MARTINGALE {multiplier}", options=options, time=state.time, cont_loss_series_cnt=cont_loss_series_cnt)
if (martingale_enabled is False and multiplier > 1) or multiplier <= 0:
state.ilog(lvl=1,e=f"SIZER - Mame nekde problem MULTIPLIER mimo RANGE ERROR {multiplier}", options=options, time=state.time)
multiplier = 1
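With martingale sizing enabled, the multiplier above is base ** cont_loss_series_cnt (reset to 1 when the loss streak is zero), and the streak itself is carried between days via state.vars["transferables"]["martingale"]. A short worked example with an assumed base chunk of 100 shares:

# worked example of the martingale sizing above: multiplier = base ** consecutive losses
def martingale_multiplier(cont_loss_series_cnt: int, base: float = 2.0) -> float:
    return 1.0 if cont_loss_series_cnt == 0 else base ** cont_loss_series_cnt

chunk = 100                      # assumed base order size ("chunk")
for losses in range(4):
    print(losses, int(martingale_multiplier(losses) * chunk))
# prints: 0 100, 1 200, 2 400, 3 800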
