79 Commits

Author SHA1 Message Date
f04c3e9f12 Merge branch 'master' of https://github.com/drew2323/v2trading 2024-10-08 15:16:02 +02:00
faaaa65081 removed doc 2024-10-08 14:20:45 +02:00
773f5a3ef8 Update README.md (#246) 2024-08-31 06:09:17 +02:00
9d1ef83733 Update README.md (#245) 2024-08-27 14:12:05 +02:00
ceb69d696b Update README.md (#242) 2024-08-26 19:59:00 +02:00
badd5b87dd remove keys (#225) 2024-07-25 11:33:11 +02:00
8e5a56a28c Update requirements_newest.txt (#222) 2024-07-18 21:14:09 +02:00
591a9643eb Update requirements_newest.txt (#221) 2024-07-18 21:12:44 +02:00
41b3c0d839 Update requirements_newest.txt (#220) 2024-07-18 21:07:52 +02:00
dff6680098 Update requirements_newest.txt (#219) 2024-07-18 21:05:50 +02:00
88dd8e84dd Create requirements_newest.txt (#218) 2024-07-18 20:45:43 +02:00
45f022c16b Update README.md (#217) 2024-07-18 14:42:50 +02:00
9fb6794997 Update README.md (#216) 2024-07-16 07:04:49 +02:00
20e38fe223 Update README.md (#215) 2024-07-16 07:03:02 +02:00
f2ab00559a Update README.md (#214) 2024-07-15 11:20:56 +02:00
fc10cf3907 Feature/cal days from pandas (#213)
* new fetch_calendar_data function

* new class Calendar

* Revert common/model.py to the state before the last commit

* dataframe transformation makes Timestamp objects in the open and close columns naive before converting to a dictionary.

* typing added for function arguments. The function returns a list of Calendar objects whose properties are defined as strings.

* else condition fixed

* else condition returns an empty list directly, without declaring a list variable
2024-07-15 09:46:17 +02:00
2f15b0b2a7 system info fixes (#212) 2024-06-20 22:27:28 +02:00
0bda14409d new version vbt (#211) 2024-06-20 22:15:05 +02:00
d7bde54533 Feature/disk space (#199)
* new backend API to get disk info from psutil

* Disk info div + disk space gauge div

* styling for git disk space gauge

* initial commit - jQuery request to the system-info endpoint

* div disk_info created

* get_system_info function is invoked once the DOM is fully loaded.

* styling for disk-gauge-bar added

* get_system_info endpoint additionally returns information about network, cpu_time and memory

* new <div> for graphical output of system info

* increased width for disk-gauge-container

* if condition tests an index of the response and renders output within the div for graphical output

* div deleted
2024-06-14 12:45:34 +02:00
ad45b424f7 init fixes2 (#209) 2024-06-13 11:52:14 +02:00
ee5c1ebae1 fix module inits (#208)
* research added

* fix module inits
2024-06-13 11:47:05 +02:00
702328a242 research added (#207) 2024-06-13 11:02:41 +02:00
132391c915 Create testdoc.md (#204) 2024-06-04 14:07:44 +02:00
15948ea863 static site pwd protected, load dotenv moved to config, aggregator vectorized change (#203) 2024-06-04 12:49:32 +02:00
63c2f7e748 vectorized aggregator, minor changes (#198) 2024-05-17 14:09:42 +02:00
031b2427b9 Feature/dotenv (#195)
* load_dotenv from python-dotenv library imported

* WEB_API_KEY is read as virtual environment variable specified in .env file

* env file referenced by variable imported from config.py

* env file directory and env file variables defined

* bash script to create env file

* Delete env_migration.sh

---------

Co-authored-by: David Brazda <davidbrazda61@gmail.com>
2024-05-09 12:47:32 +02:00
6b2a4bb066 update of vbt doc 2024-04-25 06:24:51 +02:00
c3d22e439f fix 2024-04-17 13:04:57 +02:00
8f87764fc9 Feature/market attribute (#185)
* RunManagerRecord class has a new attribute market. Market enum is imported.

* row_to_runmanager function considers market column

* add_run_manager_record and update_run_manager_record functions are changed. fetch_all_markets_in_run_manager is new.

* new Market enumeration class is defined

* market_value used for job scheduling. start and stop functions have modified market parameter input

* new is_market_day function + modifications of get_todays_market_times function

* market attribute default set to US

* row_to_runmanager function has no string formatter for market attribute

* add_run_manager_record function and update_run_manager_record function update the DB column market based on record.market data

* start_runman_record and stop_runman_record no longer have a market parameter

* get_todays_market_times function is changed

* default value for market attribute is Market.US

* update_run_manager_record function has no if condition for market key

* market_value deleted, used enumeration value Market.US instead of string US

* get_todays_market_times has a new if condition for Market.CRYPTO

* update includes market column in the run_manager table

* market attribute in Run Manager record has value given by enumeration as Market.US

* documentation of changes made in the branch

* remove README_feature_market.md

* back to original state

* Delete README_feature_market.md

* _start_runman_record has an additional else condition

* is_market_day renamed to is_US_market_day

* transferables column added into runner_header table
2024-04-17 12:14:01 +02:00
074b6feaf8 vectorbtdoc 2024-04-16 15:53:51 +02:00
919ddf2238 bugfix (#181) 2024-03-18 18:42:09 +01:00
dfbda326ea hard stop / soft stop for cutoff (#177) martingale base (#178) 2024-03-15 13:36:28 +01:00
eff4770692 highlight logs on gui (#176) 2024-03-15 11:06:18 +01:00
e54683c69f archrunner db query searches for symbol, name (#175) 2024-03-15 10:04:46 +01:00
db22d47f72 toml validation to frontend (#174) 2024-03-14 17:39:52 +01:00
fb75ed2c35 #163 transferables (#172) 2024-03-14 14:16:01 +01:00
878092fe93 #168 #166 and additional fixes (#169) 2024-03-13 12:31:06 +01:00
801ce61c9d run update 2024-03-07 14:07:46 +01:00
0d49327cca bugfix - the maxloss check is performed only on the FILL event, when the total amount is known 2024-03-06 15:50:16 +01:00
f92d8c2f5e #148 #158 config refactoring to support profiles/reloading (#165) 2024-03-06 14:30:24 +01:00
ce6dc58764 #155 + moved row_to from db.py to transform.py 2024-03-06 13:31:09 +01:00
b4ac17585b Merge pull request #161 from drew2323/local
Minor changes for installation on Windows
2024-03-04 17:03:50 +01:00
0f65ce3dc3 Delete run.sh 2024-03-04 17:01:47 +01:00
d3236d27a6 primary live account api and secret changed 2024-03-04 16:57:10 +01:00
5136279eb5 line 29: integrity and crossorigin values deleted 2024-02-28 08:08:21 +01:00
d63a6b7897 user_data_dir function has a second parameter author; ACCOUNT1_LIVE still has PAPER_API_KEY and SECRET_KEY 2024-02-28 08:04:02 +01:00
a9db7e087f changed VIRTUAL_ENV_DIR and PYTHON_TO_USE 2024-02-27 18:15:35 +01:00
a96cf19fd7 #135 -> BT same period button 2024-02-27 12:03:57 +07:00
17cb63f792 all dates in gui are in market time zone (even start/stop) 2024-02-27 10:53:30 +07:00
ca1172c61c batchprofit/batchcount columns hidden from archiverunners gui 2024-02-27 08:15:07 +07:00
f884c16f07 #149 2024-02-26 22:42:03 +07:00
d0920daa16 moved config related services into separated package 2024-02-26 19:35:19 +07:00
884f377ebc #147 2024-02-26 11:30:13 +07:00
a16b3c1571 debug condition back 2024-02-24 21:23:17 +07:00
d15581e35c temporary disable for testing 2024-02-24 21:17:10 +07:00
ca3565132d #143 2024-02-24 20:32:01 +07:00
73fef65309 live_data_feed stored in runner_archive 2024-02-23 21:20:07 +07:00
3494177ac5 bugfix 2024-02-23 21:04:23 +07:00
855e4379a3 #139 LIVE_DATA_FEED configuration 2024-02-23 12:35:02 +07:00
0d65ae6ea1 #136 bugfix properly closing ws 2024-02-23 10:30:12 +07:00
67aab2a1be fix 2024-02-22 23:23:20 +07:00
2ba42430a3 fix 2024-02-22 23:20:54 +07:00
d3cb2fa760 Scheduler support #24sched 2024-02-22 23:05:49 +07:00
ed6285dcf5 unknown symbol msg 2024-02-12 10:45:23 +07:00
7eadf6c165 bugfix create batch image (check for None from Alpaca) 2024-02-11 15:26:15 +07:00
04cf2e2ba2 createbatch image tool + send-to-Telegram enrichment 2024-02-11 12:37:19 +07:00
2ba492ead2 updated requirements.txt 2024-02-10 21:35:53 +07:00
a3b182fd45 keys to env variables, optimizations 2024-02-10 21:02:00 +07:00
6e30ee92a0 Merge branch 'master' of https://github.com/drew2323/v2trading 2024-02-06 11:16:58 +07:00
576b2445f8 ok 2024-02-06 11:16:09 +07:00
da34775708 calendar wrapper with retry, histo bars with retry 2024-02-06 11:14:38 +07:00
90afa29f34 Update README.md 2024-02-06 09:52:53 +07:00
8991733278 Update README.md 2024-02-06 09:34:33 +07:00
c213342353 Update README.md 2024-02-06 09:30:56 +07:00
a3cab14bdd bugfix None in trade response 2024-02-05 10:22:20 +07:00
f3d2b403bd fixes 2024-02-04 17:55:43 +07:00
32e77a4cb9 Merge branch 'master' of https://github.com/drew2323/v2trading 2024-02-04 17:54:09 +07:00
14e6501ac8 Update README.md 2024-01-31 13:39:33 +07:00
c03cf054e8 Create README.md 2024-01-31 13:37:45 +07:00
99 changed files with 1838757 additions and 662 deletions

1
CODEOWNERS Normal file

@@ -0,0 +1 @@
* @drew2323

127
README.md Normal file

@@ -0,0 +1,127 @@
# V2TRADING - Algorithmic Trading Platform with Frontend
## Overview
A custom-built algorithmic trading platform for research, backtesting and live trading. The trading engine processes tick data, provides custom aggregation, manages trades, and supports backtesting in a highly accurate and efficient manner.
## Key Features
- **Trading Engine**: Processes tick data in real time, aggregating data and managing trade execution.
- **Backtesting**: Tick-by-tick backtesting, down to millisecond accuracy, mirrors live trading environments and is vital for developing and testing high(er)-frequency trading strategies.
- **Configuration**: robust configuration via TOML
- **Frontend**: Frontend supporting the research-to-backtesting-to-paper-trading workflow, including lightweight charts.
- **Custom Data Aggregation:** Custom time-based, volume-based, dollar-based and renko bar aggregators built on tick-by-tick data (see the sketch below).
- **Indicators:** Built-in [tulipy](https://tulipindicators.org/list) and [ta-lib](https://ta-lib.github.io/ta-lib-python/) indicators, plus templates for custom multi-output stateful indicators.
- **Machine Learning Integration:** Includes modules for both training and inference, supporting the complete ML lifecycle.
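For illustration, a minimal sketch of the dollar-bar idea from the aggregation bullet above, assuming a trades dataframe with `price` and `size` columns; this is not the platform's actual `aggregate_trades` implementation (see `v2realbot.loader.aggregator_vectorized`):
```
import pandas as pd

def dollar_bars(trades: pd.DataFrame, dollar_threshold: float) -> pd.DataFrame:
    """Illustrative dollar-bar aggregation: close a bar each time the
    cumulative traded dollar value crosses another multiple of the threshold."""
    dollars = trades["price"] * trades["size"]          # dollar value per tick
    bar_id = (dollars.cumsum() // dollar_threshold).astype(int)
    return trades.groupby(bar_id).agg(
        open=("price", "first"),
        high=("price", "max"),
        low=("price", "min"),
        close=("price", "last"),
        volume=("size", "sum"),
    )
```
Time and volume bars follow the same pattern with a different cumulative key.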
**GUI examples**
<p align="center">
Main screen with entry/exit points and stoploss lines<br>
<img width="700" alt="Main screen with entry/exit points and stoploss lines" src="https://github.com/drew2323/v2trading/assets/28433232/751d5b0e-ef64-453f-8e76-89a39db679c5">
</p>
<p align="center">
Main screen with tick based indicators<br>
<img width="700" alt="Main screen with tick based indicators" src="https://github.com/drew2323/v2trading/assets/28433232/4bf6128c-9b36-4e88-9da1-5a33319976a1">
</p>
<p align="center">
Indicator editor<br>
<img width="700" alt="Indicator editor" src="https://github.com/drew2323/v2trading/assets/28433232/cc417393-7b88-4eea-afcb-3a00402d0a8d">
</p>
<p align="center">
Strategy editor<br>
<img width="700" alt="Strategy editor" src="https://github.com/drew2323/v2trading/assets/28433232/74f67e7a-1efc-4f63-b763-7827b2337b6a">
</p>
<p align="center">
Strategy analytical tools<br>
<img width="700" alt="Strategy analytical tools" src="https://github.com/drew2323/v2trading/assets/28433232/4bf8b3c3-e430-4250-831a-e5876bb6b743">
</p>
**Backend and API:** The backbone of the platform is built with Python, utilizing libraries such as FastAPI, NumPy, Keras, and JAX, ensuring high performance and scalability.
**Frontend:** The client-side is developed with Vanilla JavaScript and jQuery, employing LightweightCharts for charting purposes. Additional modules enhance the platform's functionality. The frontend is slated for a future refactoring to modern frameworks like Vue.js and Vuetify for a more robust user interface.
**Documentation:** Public docs are in progress. Some can be found in the [knowledge base](trading.mujdenik.eu), but please request access first. Some analysis documents can be found in the [shared Google Docs folder](https://drive.google.com/drive/folders/1WmYG8oDGXO-lVTLVs9knAmMTmQL4dZt6?usp=drive_link).
# Installation Instructions
This document outlines the steps for installing and setting up the necessary environment for the application. These instructions are applicable for both Windows and Linux operating systems. Please follow the steps carefully to ensure a smooth setup.
## Prerequisites
Before beginning the installation process, ensure the following prerequisites are met:
- TA-Lib Library:
- Windows: Download and build the TA-Lib library. Install Visual Studio Community with the Visual C++ feature. Navigate to `C:\ta-lib\c\make\cdr\win32\msvc` in the command prompt and build the library using the available makefile.
- Linux: Install TA-Lib using your distribution's package manager or compile from source following the instructions available on the TA-Lib GitHub repository.
- Alpaca Paper Trading Account: Create an account at [Alpaca Markets](https://alpaca.markets/) and generate `API_KEY` and `SECRET_KEY` for your paper trading account.
## Installation Steps
**Clone the Repository:** Clone the remote repository to your local machine.
`git clone git@github.com:drew2323/v2trading.git <name_of_local_folder>`
**Install Python:** Ensure Python 3.10.11 is installed on your system.
**Create a Virtual Environment:** Set up a Python virtual environment.
`python -m venv <path_to_venv_folder>`
**Activate Virtual Environment:**
- Windows: `source ./<venv_folder>/Scripts/activate`
- Linux: `source ./<venv_folder>/bin/activate`
**Install Dependencies:** Install the program requirements.
`pip install -r requirements.txt`
Note: It's permissible to comment out references to `keras` and `tensorflow` modules, as well as the `ml-room` repository in `requirements.txt`.
**Environment Variables:** In `run.sh`, modify the `VIRTUAL_ENV_DIR` and `PYTHON_TO_USE` variables as necessary.
**Data Directory:** Navigate to `DATA_DIR` and create folders: `aggcache`, `tradecache`, and `models`.
**Media and Static Folders:** Create `media` and `static` folders one level above the repository directory. Also create the `.env` file there.
**Database Setup:** Create the `v2trading.db` file using SQL commands from `v2trading_create_db.sql`.
```
import sqlite3

# Read the schema and execute it against a fresh database file
with open("v2trading_create_db.sql", "r") as f:
    sql_statements = f.read()

conn = sqlite3.connect('v2trading.db')
cursor = conn.cursor()
cursor.executescript(sql_statements)
conn.commit()
conn.close()
```
Ensure the `config_table` is not empty by making an initial entry.
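To verify, a minimal sanity check (a sketch; the actual columns of `config_table` are defined in `v2trading_create_db.sql`):
```
import sqlite3

# Verify config_table has at least one row before starting the application
conn = sqlite3.connect('v2trading.db')
count = conn.execute("SELECT COUNT(*) FROM config_table").fetchone()[0]
print(f"config_table rows: {count}")  # should be >= 1
conn.close()
```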
**Start the Application:** Run `main.py` in VSCode to start the application.
**Accessing the Application:** If the uvicorn server runs successfully at `http://0.0.0.0:8000`, access the application at `http://localhost:8000/static/`.
**Database Configuration:** Add dynamic button and JS configurations to the `config_table` in `v2trading.db` via the "Config" section on the main page.
Please replace placeholders (e.g., `<name_of_local_folder>`, `<path_to_venv_folder>`) with your actual paths and details. Follow these instructions to ensure the application is set up correctly and ready for use.
## Environment variables
The trading platform can support N different accounts. Their API keys are stored as environment variables in the `.env` file located in the root directory.
The account for the trading API is selected when each strategy is run. However, for realtime websocket data, ACCOUNT1 is always used for all strategies. The data feed selection (iex vs sip) is set by the LIVE_DATA_FEED environment variable.
The `.env` file should contain:
```
ACCOUNT1_LIVE_API_KEY=<ACCOUNT1_LIVE_API_KEY>
ACCOUNT1_LIVE_SECRET_KEY=<ACCOUNT1_LIVE_SECRET_KEY>
ACCOUNT1_LIVE_FEED=sip
ACCOUNT1_PAPER_API_KEY=<ACCOUNT1_PAPER_API_KEY>
ACCOUNT1_PAPER_SECRET_KEY=<ACCOUNT1_PAPER_SECRET_KEY>
ACCOUNT1_PAPER_FEED=sip
ACCOUNT2_PAPER_API_KEY=<ACCOUNT2_PAPER_API_KEY>
ACCOUNT2_PAPER_SECRET_KEY=<ACCOUNT2_PAPER_SECRET_KEY>
ACCOUNT2_PAPER_FEED=iex
WEB_API_KEY=<pass-for-webapi>
```
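For illustration, a minimal sketch of reading these keys with `python-dotenv` (already in `requirements.txt`); the `account_keys` helper and its arguments are hypothetical, not the project's actual config API:
```
import os
from dotenv import load_dotenv  # python-dotenv, listed in requirements.txt

load_dotenv()  # loads the .env file into environment variables

def account_keys(account="ACCOUNT1", mode="PAPER"):
    """Hypothetical helper: derive variable names from the
    ACCOUNTn_MODE_API_KEY / ACCOUNTn_MODE_SECRET_KEY convention above."""
    prefix = f"{account}_{mode}"
    return os.environ[f"{prefix}_API_KEY"], os.environ[f"{prefix}_SECRET_KEY"]

api_key, secret_key = account_keys("ACCOUNT1", "PAPER")
```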

51
_run_scheduler.sh Executable file

@@ -0,0 +1,51 @@
#!/bin/bash
# Approach: (https://chat.openai.com/c/43be8685-b27b-4e3b-bd18-0856f8d23d7e)
# cron runs this script every minute in the 9:20 - 16:20 New York (US Eastern) time range.
# This script also writes a "heartbeat" message to the log file, so the user knows
# that cron is running.
# Installation steps required:
# chmod +x run_scheduler.sh
# install tzdata package: sudo apt-get install tzdata
# crontab -e
# CRON_TZ=America/New_York
# * 9-16 * * 1-5 /home/david/v2trading/run_scheduler.sh
#
# (the cron expression runs every minute during hours 9-16, Monday to Friday, US Eastern time)
# Path to the Python script
PYTHON_SCRIPT="v2realbot/scheduler/scheduler.py"
# Log file path
LOG_FILE="job.log"
# Timezone for New York
TZ='America/New_York'
NY_DATE_TIME=$(TZ=$TZ date +'%Y-%m-%d %H:%M:%S')
echo "NY_DATE_TIME: $NY_DATE_TIME"
# Check if log file exists, create it if it doesn't
if [ ! -f "$LOG_FILE" ]; then
touch "$LOG_FILE"
fi
# Check the last line of the log file
LAST_LINE=$(tail -n 1 "$LOG_FILE")
# Cron trigger message
CRON_TRIGGER="Cron trigger: $NY_DATE_TIME"
# Update the log
if [[ "$LAST_LINE" =~ "Cron trigger:".* ]]; then
# Replace the last line with the new trigger message
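# NOTE: the empty '' after -i below is BSD/macOS sed syntax;
# GNU sed on Linux would be: sed -i '$ d' "$LOG_FILE"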
sed -i '' '$ d' "$LOG_FILE"
echo "$CRON_TRIGGER" >> "$LOG_FILE"
else
# Append a new cron trigger message
echo "$CRON_TRIGGER" >> "$LOG_FILE"
fi
# FOR DEBUG - Run the Python script and append output to log file
python3 "$PYTHON_SCRIPT" >> "$LOG_FILE" 2>&1

7
deployall.sh Executable file

@@ -0,0 +1,7 @@
#!/bin/bash
# Navigate to your git repository directory
# Execute git commands
git push deploytest master
git push deploy master

1251
job.log Normal file

File diff suppressed because it is too large

1
jobs.log Normal file

@@ -0,0 +1 @@
Current 0 scheduled jobs: []


@@ -30,10 +30,12 @@ dateparser==1.1.8
decorator==5.1.1
defusedxml==0.7.1
dill==0.3.7
dm-tree==0.1.8
entrypoints==0.4
exceptiongroup==1.1.3
executing==1.2.0
fastapi==0.95.0
filelock==3.13.1
Flask==2.2.3
flatbuffers==23.5.26
fonttools==4.39.0
@@ -46,7 +48,7 @@ google-auth-oauthlib==1.0.0
google-pasta==0.2.0
grpcio==1.58.0
h11==0.14.0
h5py==3.9.0
h5py==3.10.0
icecream==2.1.3
idna==3.4
imageio==2.31.6
@@ -54,12 +56,18 @@ importlib-metadata==6.1.0
ipython==8.17.2
ipywidgets==8.1.1
itsdangerous==2.1.2
jax==0.4.23
jaxlib==0.4.23
jedi==0.19.1
Jinja2==3.1.2
joblib==1.3.2
jsonschema==4.17.3
jupyterlab-widgets==3.0.9
keras
keras==3.0.2
keras-core==0.1.7
keras-nightly==3.0.3.dev2024010203
keras-nlp-nightly==0.7.0.dev2024010203
keras-tcn @ git+https://github.com/drew2323/keras-tcn.git@4bddb17a02cb2f31c9fe2e8f616b357b1ddb0e11
kiwisolver==1.4.4
libclang==16.0.6
llvmlite==0.39.1
@@ -69,12 +77,12 @@ MarkupSafe==2.1.2
matplotlib==3.8.2
matplotlib-inline==0.1.6
mdurl==0.1.2
ml-dtypes==0.2.0
mlroom @ git+https://github.com/drew2323/mlroom.git
keras-tcn @ git+https://github.com/drew2323/keras-tcn.git
ml-dtypes==0.3.1
mlroom @ git+https://github.com/drew2323/mlroom.git@692900e274c4e0542d945d231645c270fc508437
mplfinance==0.12.10b0
msgpack==1.0.4
mypy-extensions==1.0.0
namex==0.0.7
newtulipy==0.4.6
numba==0.56.4
numpy==1.23.5
@@ -85,6 +93,7 @@ packaging==23.0
pandas==1.5.3
param==1.13.0
parso==0.8.3
patsy==0.5.6
pexpect==4.8.0
Pillow==9.4.0
plotly==5.13.1
@@ -111,6 +120,7 @@ python-multipart==0.0.6
pytz==2022.7.1
pytz-deprecation-shim==0.1.0.post0
pyviz-comms==2.2.1
PyWavelets==1.5.0
PyYAML==6.0
regex==2023.10.3
requests==2.31.0
@@ -128,11 +138,21 @@ sniffio==1.3.0
sseclient-py==1.7.2
stack-data==0.6.3
starlette==0.26.1
statsmodels==0.14.1
streamlit==1.20.0
structlog==23.1.0
TA-Lib==0.4.28
tb-nightly==2.16.0a20240102
tenacity==8.2.2
tensorboard==2.15.1
tensorboard-data-server==0.7.1
tensorflow-addons==0.23.0
tensorflow-estimator==2.15.0
tensorflow-io-gcs-filesystem==0.34.0
termcolor==2.3.0
tf-estimator-nightly==2.14.0.dev2023080308
tf-nightly==2.16.0.dev20240101
tf_keras-nightly==2.16.0.dev2023123010
threadpoolctl==3.2.0
tinydb==4.7.1
tinydb-serialization==2.1.0
@@ -143,12 +163,13 @@ toolz==0.12.0
tornado==6.2
tqdm==4.65.0
traitlets==5.13.0
typeguard==2.13.3
typing_extensions==4.5.0
tzdata==2023.2
tzlocal==4.3
urllib3==1.26.14
uvicorn==0.21.1
-e git+https://github.com/drew2323/v2trading.git@523905ece6d99bf48a8952d39ced6a13f3b9a84e#egg=v2realbot
-e git+https://github.com/drew2323/v2trading.git@b58639454be921f9f0c9dd1880491cfcfdfdf3b7#egg=v2realbot
validators==0.20.0
wcwidth==0.2.9
webencodings==0.5.1


@@ -1,21 +1,34 @@
absl-py==2.0.0
alpaca==1.0.0
alpaca-py==0.7.1
alpaca-py==0.18.1
altair==4.2.2
annotated-types==0.6.0
anyio==3.6.2
appdirs==1.4.4
appnope==0.1.3
APScheduler==3.10.4
argon2-cffi==23.1.0
argon2-cffi-bindings==21.2.0
arrow==1.3.0
asttokens==2.2.1
astunparse==1.6.3
async-lru==2.0.4
attrs==22.2.0
Babel==2.15.0
beautifulsoup4==4.12.3
better-exceptions==0.3.3
bleach==6.0.0
blinker==1.5
bottle==0.12.25
cachetools==5.3.0
CD==1.1.0
certifi==2022.12.7
cffi==1.16.0
chardet==5.1.0
charset-normalizer==3.0.1
click==8.1.3
colorama==0.4.6
comm==0.1.4
contourpy==1.0.7
cycler==0.11.0
dash==2.9.1
@@ -23,90 +36,189 @@ dash-bootstrap-components==1.4.1
dash-core-components==2.0.0
dash-html-components==2.0.0
dash-table==5.0.0
dateparser==1.1.8
debugpy==1.8.1
decorator==5.1.1
defusedxml==0.7.1
dill==0.3.7
dm-tree==0.1.8
entrypoints==0.4
exceptiongroup==1.1.3
executing==1.2.0
fastapi==0.95.0
fastapi==0.109.2
fastjsonschema==2.19.1
filelock==3.13.1
Flask==2.2.3
flatbuffers==23.5.26
fonttools==4.39.0
fpdf2==2.7.6
fqdn==1.5.1
gast==0.4.0
gitdb==4.0.10
GitPython==3.1.31
google-auth==2.23.0
google-auth-oauthlib==1.0.0
google-pasta==0.2.0
greenlet==3.0.3
grpcio==1.58.0
h11==0.14.0
h5py==3.9.0
h5py==3.10.0
html2text==2024.2.26
httpcore==1.0.5
httpx==0.27.0
humanize==4.9.0
icecream==2.1.3
idna==3.4
imageio==2.31.6
importlib-metadata==6.1.0
ipykernel==6.29.4
ipython==8.17.2
ipywidgets==8.1.1
isoduration==20.11.0
itables==2.0.1
itsdangerous==2.1.2
jax==0.4.23
jaxlib==0.4.23
jedi==0.19.1
Jinja2==3.1.2
joblib==1.3.2
jsonschema==4.17.3
keras==2.13.1
json5==0.9.25
jsonpointer==2.4
jsonschema==4.22.0
jsonschema-specifications==2023.12.1
jupyter-events==0.10.0
jupyter-lsp==2.2.5
jupyter_client==8.6.1
jupyter_core==5.7.2
jupyter_server==2.14.0
jupyter_server_terminals==0.5.3
jupyterlab==4.1.8
jupyterlab-widgets==3.0.9
jupyterlab_pygments==0.3.0
jupyterlab_server==2.27.1
kaleido==0.2.1
keras==3.0.2
keras-core==0.1.7
keras-nightly==3.0.3.dev2024010203
keras-nlp-nightly==0.7.0.dev2024010203
keras-tcn @ git+https://github.com/drew2323/keras-tcn.git@4bddb17a02cb2f31c9fe2e8f616b357b1ddb0e11
kiwisolver==1.4.4
libclang==16.0.6
lightweight-charts @ git+https://github.com/drew2323/lightweight-charts-python@10fd42f785182edfbf6b46a19a4ef66e85985a23
llvmlite==0.39.1
Markdown==3.4.3
markdown-it-py==2.2.0
MarkupSafe==2.1.2
matplotlib==3.7.1
matplotlib==3.8.2
matplotlib-inline==0.1.6
mdurl==0.1.2
mistune==3.0.2
ml-dtypes==0.3.1
mlroom @ git+https://github.com/drew2323/mlroom.git@692900e274c4e0542d945d231645c270fc508437
mplfinance==0.12.10b0
msgpack==1.0.4
mypy-extensions==1.0.0
namex==0.0.7
nbclient==0.10.0
nbconvert==7.16.4
nbformat==5.10.4
nest-asyncio==1.6.0
newtulipy==0.4.6
numpy==1.24.2
notebook_shim==0.2.4
numba==0.56.4
numpy==1.23.5
oauthlib==3.2.2
opt-einsum==3.3.0
orjson==3.9.10
overrides==7.7.0
packaging==23.0
pandas==1.5.3
pandas==2.2.1
pandocfilters==1.5.1
param==1.13.0
parso==0.8.3
patsy==0.5.6
pexpect==4.8.0
Pillow==9.4.0
plotly==5.13.1
platformdirs==4.2.0
plotly==5.22.0
prometheus_client==0.20.0
prompt-toolkit==3.0.39
proto-plus==1.22.2
protobuf==3.20.3
proxy-tools==0.1.0
psutil==5.9.8
ptyprocess==0.7.0
pure-eval==0.2.2
pyarrow==11.0.0
pyasn1==0.4.8
pyasn1-modules==0.2.8
pycparser==2.22
pyct==0.5.0
pydantic==1.10.5
pydantic==2.6.4
pydantic_core==2.16.3
pydeck==0.8.0
Pygments==2.14.0
pyinstrument==4.5.3
Pympler==1.0.1
pyobjc-core==10.3
pyobjc-framework-Cocoa==10.3
pyobjc-framework-Security==10.3
pyobjc-framework-WebKit==10.3
pyparsing==3.0.9
pyrsistent==0.19.3
pysos==1.3.0
python-dateutil==2.8.2
python-dotenv==1.0.0
python-json-logger==2.0.7
python-multipart==0.0.6
pytz==2022.7.1
pytz-deprecation-shim==0.1.0.post0
pyviz-comms==2.2.1
PyWavelets==1.5.0
pywebview==5.1
PyYAML==6.0
pyzmq==25.1.2
referencing==0.35.1
regex==2023.10.3
requests==2.31.0
requests-oauthlib==1.3.1
rfc3339-validator==0.1.4
rfc3986-validator==0.1.1
rich==13.3.1
rpds-py==0.18.0
rsa==4.9
scikit-learn==1.3.1
schedule==1.2.1
scikit-learn==1.3.2
scipy==1.11.2
seaborn==0.12.2
semver==2.13.0
Send2Trash==1.8.3
six==1.16.0
smmap==5.0.0
sniffio==1.3.0
soupsieve==2.5
SQLAlchemy==2.0.27
sseclient-py==1.7.2
starlette==0.26.1
stack-data==0.6.3
starlette==0.36.3
statsmodels==0.14.1
streamlit==1.20.0
structlog==23.1.0
TA-Lib==0.4.28
tb-nightly==2.16.0a20240102
tenacity==8.2.2
tensorboard==2.13.0
tensorboard==2.15.1
tensorboard-data-server==0.7.1
tensorflow==2.13.0
tensorflow-estimator==2.13.0
tensorflow-addons==0.23.0
tensorflow-estimator==2.15.0
tensorflow-io-gcs-filesystem==0.34.0
termcolor==2.3.0
terminado==0.18.1
tf-estimator-nightly==2.14.0.dev2023080308
tf-nightly==2.16.0.dev20240101
tf_keras-nightly==2.16.0.dev2023123010
threadpoolctl==3.2.0
tinycss2==1.3.0
tinydb==4.7.1
tinydb-serialization==2.1.0
tinyflux==0.4.0
@@ -115,15 +227,24 @@ tomli==2.0.1
toolz==0.12.0
tornado==6.2
tqdm==4.65.0
typing_extensions==4.5.0
traitlets==5.13.0
typeguard==2.13.3
types-python-dateutil==2.9.0.20240316
typing_extensions==4.9.0
tzdata==2023.2
tzlocal==4.3
uri-template==1.3.0
urllib3==1.26.14
uvicorn==0.21.1
#-e git+https://github.com/drew2323/v2trading.git@940348412f67ecd551ef8d0aaedf84452abf1320#egg=v2realbot
-e git+https://github.com/drew2323/v2trading.git@1f85b271dba2b9baf2c61b591a08849e9d684374#egg=v2realbot
validators==0.20.0
vectorbtpro @ file:///Users/davidbrazda/Downloads/vectorbt.pro-2024.2.22
wcwidth==0.2.9
webcolors==1.13
webencodings==0.5.1
websockets==10.4
websocket-client==1.7.0
websockets==11.0.3
Werkzeug==2.2.3
wrapt==1.15.0
widgetsnbextension==4.0.9
wrapt==1.14.1
zipp==3.15.0

243
requirements_newest.txt Normal file

@@ -0,0 +1,243 @@
absl-py
alpaca
alpaca-py
altair
annotated-types
anyio
appdirs
appnope
APScheduler
argon2-cffi
argon2-cffi-bindings
arrow
asttokens
astunparse
async-lru
attrs
Babel
beautifulsoup4
better-exceptions
bleach
blinker
bottle
cachetools
CD
certifi
cffi
chardet
charset-normalizer
click
colorama
comm
contourpy
cycler
dash
dash-bootstrap-components
dash-core-components
dash-html-components
dash-table
dateparser
debugpy
decorator
defusedxml
dill
dm-tree
entrypoints
exceptiongroup
executing
fastapi
fastjsonschema
filelock
Flask
flatbuffers
fonttools
fpdf2
fqdn
gast
gitdb
GitPython
google-auth
google-auth-oauthlib
google-pasta
greenlet
grpcio
h11
h5py
html2text
httpcore
httpx
humanize
icecream
idna
imageio
importlib-metadata
ipykernel
ipython
ipywidgets
isoduration
itables
itsdangerous
jax
jaxlib
jedi
Jinja2
joblib
json5
jsonpointer
jsonschema
jsonschema-specifications
jupyter-events
jupyter-lsp
jupyter_client
jupyter_core
jupyter_server
jupyter_server_terminals
jupyterlab
jupyterlab-widgets
jupyterlab_pygments
jupyterlab_server
kaleido
keras
keras-core
keras-nightly
keras-nlp-nightly
keras-tcn @ git+https://github.com/drew2323/keras-tcn.git
kiwisolver
libclang
lightweight-charts @ git+https://github.com/drew2323/lightweight-charts-python.git
llvmlite
Markdown
markdown-it-py
MarkupSafe
matplotlib
matplotlib-inline
mdurl
mistune
ml-dtypes
mlroom @ git+https://github.com/drew2323/mlroom.git
mplfinance
msgpack
mypy-extensions
namex
nbclient
nbconvert
nbformat
nest-asyncio
newtulipy
notebook_shim
numba
numpy
oauthlib
opt-einsum
orjson
overrides
packaging
pandas
pandocfilters
param
parso
patsy
pexpect
Pillow
platformdirs
plotly
prometheus_client
prompt-toolkit
proto-plus
protobuf
proxy-tools
psutil
ptyprocess
pure-eval
pyarrow
pyasn1
pyasn1-modules
pycparser
pyct
pydantic
pydantic_core
pydeck
Pygments
pyinstrument
pyparsing
pyrsistent
pysos
python-dateutil
python-dotenv
python-json-logger
python-multipart
pytz
pytz-deprecation-shim
pyviz-comms
PyWavelets
pywebview
PyYAML
pyzmq
referencing
regex
requests
requests-oauthlib
rfc3339-validator
rfc3986-validator
rich
rpds-py
rsa
schedule
scikit-learn
scipy
seaborn
semver
Send2Trash
six
smmap
sniffio
soupsieve
SQLAlchemy
sseclient-py
stack-data
starlette
statsmodels
streamlit
structlog
TA-Lib
tb-nightly
tenacity
tensorboard
tensorboard-data-server
tensorflow-addons
tensorflow-estimator
tensorflow-io-gcs-filesystem
termcolor
terminado
tf-estimator-nightly
tf-nightly
tf_keras-nightly
threadpoolctl
tinycss2
tinydb
tinydb-serialization
tinyflux
toml
tomli
toolz
tornado
tqdm
traitlets
typeguard
types-python-dateutil
typing_extensions
tzdata
tzlocal
uri-template
urllib3
uvicorn
validators
wcwidth
webcolors
webencodings
websocket-client
websockets
Werkzeug
widgetsnbextension
wrapt
zipp

104044
research/basic.ipynb Normal file

File diff suppressed because it is too large


@@ -0,0 +1,410 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Loading trades and vectorized aggregation\n",
"Describes how to fetch trades (remote/cached) and use new vectorized aggregation to aggregate bars of given type (time, volume, dollar) and resolution\n",
"\n",
"`fetch_trades_parallel` enables to fetch trades of given symbol and interval, also can filter conditions and minimum size. return `trades_df`\n",
"`aggregate_trades` acceptss `trades_df` and ressolution and type of bars (VOLUME, TIME, DOLLAR) and return aggregated ohlcv dataframe `ohlcv_df`"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">Activating profile profile1\n",
"</pre>\n"
],
"text/plain": [
"Activating profile profile1\n"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"trades_df-BAC-2024-01-11T09:30:00-2024-01-12T16:00:00.parquet\n",
"trades_df-SPY-2024-01-01T09:30:00-2024-05-14T16:00:00.parquet\n",
"ohlcv_df-BAC-2024-01-11T09:30:00-2024-01-12T16:00:00.parquet\n",
"ohlcv_df-SPY-2024-01-01T09:30:00-2024-05-14T16:00:00.parquet\n"
]
}
],
"source": [
"import pandas as pd\n",
"import numpy as np\n",
"from numba import jit\n",
"from alpaca.data.historical import StockHistoricalDataClient\n",
"from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR\n",
"from alpaca.data.requests import StockTradesRequest\n",
"from v2realbot.enums.enums import BarType\n",
"import time\n",
"from datetime import datetime\n",
"from v2realbot.utils.utils import parse_alpaca_timestamp, ltp, zoneNY, send_to_telegram, fetch_calendar_data\n",
"import pyarrow\n",
"from v2realbot.loader.aggregator_vectorized import fetch_daily_stock_trades, fetch_trades_parallel, generate_time_bars_nb, aggregate_trades\n",
"import vectorbtpro as vbt\n",
"import v2realbot.utils.config_handler as cfh\n",
"\n",
"vbt.settings.set_theme(\"dark\")\n",
"vbt.settings['plotting']['layout']['width'] = 1280\n",
"vbt.settings.plotting.auto_rangebreaks = True\n",
"# Set the option to display with pagination\n",
"pd.set_option('display.notebook_repr_html', True)\n",
"pd.set_option('display.max_rows', 20) # Number of rows per page\n",
"# pd.set_option('display.float_format', '{:.9f}'.format)\n",
"\n",
"\n",
"#trade filtering\n",
"exclude_conditions = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES') #standard ['C','O','4','B','7','V','P','W','U','Z','F']\n",
"minsize = 100\n",
"\n",
"symbol = \"SPY\"\n",
"#datetime in zoneNY \n",
"day_start = datetime(2024, 1, 1, 9, 30, 0)\n",
"day_stop = datetime(2024, 1, 14, 16, 00, 0)\n",
"day_start = zoneNY.localize(day_start)\n",
"day_stop = zoneNY.localize(day_stop)\n",
"#filename of trades_df parquet, date are in isoformat but without time zone part\n",
"dir = DATA_DIR + \"/notebooks/\"\n",
"#parquet interval cache contains exclude conditions and minsize filtering\n",
"file_trades = dir + f\"trades_df-{symbol}-{day_start.strftime('%Y-%m-%dT%H:%M:%S')}-{day_stop.strftime('%Y-%m-%dT%H:%M:%S')}-{exclude_conditions}-{minsize}.parquet\"\n",
"#file_trades = dir + f\"trades_df-{symbol}-{day_start.strftime('%Y-%m-%dT%H:%M:%S')}-{day_stop.strftime('%Y-%m-%dT%H:%M:%S')}.parquet\"\n",
"file_ohlcv = dir + f\"ohlcv_df-{symbol}-{day_start.strftime('%Y-%m-%dT%H:%M:%S')}-{day_stop.strftime('%Y-%m-%dT%H:%M:%S')}-{exclude_conditions}-{minsize}.parquet\"\n",
"\n",
"#PRINT all parquet in directory\n",
"import os\n",
"files = [f for f in os.listdir(dir) if f.endswith(\".parquet\")]\n",
"for f in files:\n",
" print(f)"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"NOT FOUND. Fetching from remote\n"
]
},
{
"ename": "KeyboardInterrupt",
"evalue": "",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn[2], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m trades_df \u001b[38;5;241m=\u001b[39m \u001b[43mfetch_daily_stock_trades\u001b[49m\u001b[43m(\u001b[49m\u001b[43msymbol\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mday_start\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mday_stop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mexclude_conditions\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mexclude_conditions\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mminsize\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mminsize\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mforce_remote\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mmax_retries\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;241;43m5\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mbackoff_factor\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;241;43m1\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[1;32m 2\u001b[0m trades_df\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/v2realbot/loader/aggregator_vectorized.py:200\u001b[0m, in \u001b[0;36mfetch_daily_stock_trades\u001b[0;34m(symbol, start, end, exclude_conditions, minsize, force_remote, max_retries, backoff_factor)\u001b[0m\n\u001b[1;32m 198\u001b[0m \u001b[38;5;28;01mfor\u001b[39;00m attempt \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mrange\u001b[39m(max_retries):\n\u001b[1;32m 199\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 200\u001b[0m tradesResponse \u001b[38;5;241m=\u001b[39m \u001b[43mclient\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget_stock_trades\u001b[49m\u001b[43m(\u001b[49m\u001b[43mstockTradeRequest\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 201\u001b[0m is_empty \u001b[38;5;241m=\u001b[39m \u001b[38;5;129;01mnot\u001b[39;00m tradesResponse[symbol]\n\u001b[1;32m 202\u001b[0m \u001b[38;5;28mprint\u001b[39m(\u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mRemote fetched: \u001b[39m\u001b[38;5;132;01m{\u001b[39;00mis_empty\u001b[38;5;132;01m=}\u001b[39;00m\u001b[38;5;124m\"\u001b[39m, start, end)\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/alpaca/data/historical/stock.py:144\u001b[0m, in \u001b[0;36mStockHistoricalDataClient.get_stock_trades\u001b[0;34m(self, request_params)\u001b[0m\n\u001b[1;32m 141\u001b[0m params \u001b[38;5;241m=\u001b[39m request_params\u001b[38;5;241m.\u001b[39mto_request_fields()\n\u001b[1;32m 143\u001b[0m \u001b[38;5;66;03m# paginated get request for market data api\u001b[39;00m\n\u001b[0;32m--> 144\u001b[0m raw_trades \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_data_get\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 145\u001b[0m \u001b[43m \u001b[49m\u001b[43mendpoint_data_type\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mtrades\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[1;32m 146\u001b[0m \u001b[43m \u001b[49m\u001b[43mendpoint_asset_class\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mstocks\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[1;32m 147\u001b[0m \u001b[43m \u001b[49m\u001b[43mapi_version\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mv2\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[1;32m 148\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mparams\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 149\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 151\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_use_raw_data:\n\u001b[1;32m 152\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m raw_trades\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/alpaca/data/historical/stock.py:338\u001b[0m, in \u001b[0;36mStockHistoricalDataClient._data_get\u001b[0;34m(self, endpoint_asset_class, endpoint_data_type, api_version, symbol_or_symbols, limit, page_limit, extension, **kwargs)\u001b[0m\n\u001b[1;32m 335\u001b[0m params[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mlimit\u001b[39m\u001b[38;5;124m\"\u001b[39m] \u001b[38;5;241m=\u001b[39m actual_limit\n\u001b[1;32m 336\u001b[0m params[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mpage_token\u001b[39m\u001b[38;5;124m\"\u001b[39m] \u001b[38;5;241m=\u001b[39m page_token\n\u001b[0;32m--> 338\u001b[0m response \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[43mpath\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mpath\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mdata\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mparams\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mapi_version\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mapi_version\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 340\u001b[0m \u001b[38;5;66;03m# TODO: Merge parsing if possible\u001b[39;00m\n\u001b[1;32m 341\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m extension \u001b[38;5;241m==\u001b[39m DataExtensionType\u001b[38;5;241m.\u001b[39mSNAPSHOT:\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/alpaca/common/rest.py:221\u001b[0m, in \u001b[0;36mRESTClient.get\u001b[0;34m(self, path, data, **kwargs)\u001b[0m\n\u001b[1;32m 210\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mget\u001b[39m(\u001b[38;5;28mself\u001b[39m, path: \u001b[38;5;28mstr\u001b[39m, data: Union[\u001b[38;5;28mdict\u001b[39m, \u001b[38;5;28mstr\u001b[39m] \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mNone\u001b[39;00m, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m HTTPResult:\n\u001b[1;32m 211\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Performs a single GET request\u001b[39;00m\n\u001b[1;32m 212\u001b[0m \n\u001b[1;32m 213\u001b[0m \u001b[38;5;124;03m Args:\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 219\u001b[0m \u001b[38;5;124;03m dict: The response\u001b[39;00m\n\u001b[1;32m 220\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 221\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_request\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mGET\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mpath\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mdata\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/alpaca/common/rest.py:129\u001b[0m, in \u001b[0;36mRESTClient._request\u001b[0;34m(self, method, path, data, base_url, api_version)\u001b[0m\n\u001b[1;32m 127\u001b[0m \u001b[38;5;28;01mwhile\u001b[39;00m retry \u001b[38;5;241m>\u001b[39m\u001b[38;5;241m=\u001b[39m \u001b[38;5;241m0\u001b[39m:\n\u001b[1;32m 128\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 129\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_one_request\u001b[49m\u001b[43m(\u001b[49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mopts\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mretry\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 130\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m RetryException:\n\u001b[1;32m 131\u001b[0m time\u001b[38;5;241m.\u001b[39msleep(\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_retry_wait)\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/alpaca/common/rest.py:193\u001b[0m, in \u001b[0;36mRESTClient._one_request\u001b[0;34m(self, method, url, opts, retry)\u001b[0m\n\u001b[1;32m 174\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_one_request\u001b[39m(\u001b[38;5;28mself\u001b[39m, method: \u001b[38;5;28mstr\u001b[39m, url: \u001b[38;5;28mstr\u001b[39m, opts: \u001b[38;5;28mdict\u001b[39m, retry: \u001b[38;5;28mint\u001b[39m) \u001b[38;5;241m-\u001b[39m\u001b[38;5;241m>\u001b[39m \u001b[38;5;28mdict\u001b[39m:\n\u001b[1;32m 175\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124;03m\"\"\"Perform one request, possibly raising RetryException in the case\u001b[39;00m\n\u001b[1;32m 176\u001b[0m \u001b[38;5;124;03m the response is 429. Otherwise, if error text contain \"code\" string,\u001b[39;00m\n\u001b[1;32m 177\u001b[0m \u001b[38;5;124;03m then it decodes to json object and returns APIError.\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 191\u001b[0m \u001b[38;5;124;03m dict: The response data\u001b[39;00m\n\u001b[1;32m 192\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 193\u001b[0m response \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_session\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[43m(\u001b[49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mopts\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 195\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 196\u001b[0m response\u001b[38;5;241m.\u001b[39mraise_for_status()\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/requests/sessions.py:589\u001b[0m, in \u001b[0;36mSession.request\u001b[0;34m(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)\u001b[0m\n\u001b[1;32m 584\u001b[0m send_kwargs \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 585\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mtimeout\u001b[39m\u001b[38;5;124m\"\u001b[39m: timeout,\n\u001b[1;32m 586\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mallow_redirects\u001b[39m\u001b[38;5;124m\"\u001b[39m: allow_redirects,\n\u001b[1;32m 587\u001b[0m }\n\u001b[1;32m 588\u001b[0m send_kwargs\u001b[38;5;241m.\u001b[39mupdate(settings)\n\u001b[0;32m--> 589\u001b[0m resp \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43msend\u001b[49m\u001b[43m(\u001b[49m\u001b[43mprep\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43msend_kwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 591\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m resp\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/requests/sessions.py:703\u001b[0m, in \u001b[0;36mSession.send\u001b[0;34m(self, request, **kwargs)\u001b[0m\n\u001b[1;32m 700\u001b[0m start \u001b[38;5;241m=\u001b[39m preferred_clock()\n\u001b[1;32m 702\u001b[0m \u001b[38;5;66;03m# Send the request\u001b[39;00m\n\u001b[0;32m--> 703\u001b[0m r \u001b[38;5;241m=\u001b[39m \u001b[43madapter\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43msend\u001b[49m\u001b[43m(\u001b[49m\u001b[43mrequest\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 705\u001b[0m \u001b[38;5;66;03m# Total elapsed time of the request (approximately)\u001b[39;00m\n\u001b[1;32m 706\u001b[0m elapsed \u001b[38;5;241m=\u001b[39m preferred_clock() \u001b[38;5;241m-\u001b[39m start\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/requests/adapters.py:486\u001b[0m, in \u001b[0;36mHTTPAdapter.send\u001b[0;34m(self, request, stream, timeout, verify, cert, proxies)\u001b[0m\n\u001b[1;32m 483\u001b[0m timeout \u001b[38;5;241m=\u001b[39m TimeoutSauce(connect\u001b[38;5;241m=\u001b[39mtimeout, read\u001b[38;5;241m=\u001b[39mtimeout)\n\u001b[1;32m 485\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 486\u001b[0m resp \u001b[38;5;241m=\u001b[39m \u001b[43mconn\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43murlopen\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 487\u001b[0m \u001b[43m \u001b[49m\u001b[43mmethod\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 488\u001b[0m \u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 489\u001b[0m \u001b[43m \u001b[49m\u001b[43mbody\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbody\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 490\u001b[0m \u001b[43m \u001b[49m\u001b[43mheaders\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mheaders\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 491\u001b[0m \u001b[43m \u001b[49m\u001b[43mredirect\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 492\u001b[0m \u001b[43m \u001b[49m\u001b[43massert_same_host\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 493\u001b[0m \u001b[43m \u001b[49m\u001b[43mpreload_content\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 494\u001b[0m \u001b[43m \u001b[49m\u001b[43mdecode_content\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 495\u001b[0m \u001b[43m \u001b[49m\u001b[43mretries\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mmax_retries\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 496\u001b[0m \u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mtimeout\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 497\u001b[0m \u001b[43m \u001b[49m\u001b[43mchunked\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mchunked\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 498\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 500\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m (ProtocolError, \u001b[38;5;167;01mOSError\u001b[39;00m) \u001b[38;5;28;01mas\u001b[39;00m err:\n\u001b[1;32m 501\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mConnectionError\u001b[39;00m(err, request\u001b[38;5;241m=\u001b[39mrequest)\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/urllib3/connectionpool.py:703\u001b[0m, in \u001b[0;36mHTTPConnectionPool.urlopen\u001b[0;34m(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)\u001b[0m\n\u001b[1;32m 700\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_prepare_proxy(conn)\n\u001b[1;32m 702\u001b[0m \u001b[38;5;66;03m# Make the request on the httplib connection object.\u001b[39;00m\n\u001b[0;32m--> 703\u001b[0m httplib_response \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_make_request\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 704\u001b[0m \u001b[43m \u001b[49m\u001b[43mconn\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 705\u001b[0m \u001b[43m \u001b[49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 706\u001b[0m \u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 707\u001b[0m \u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mtimeout_obj\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 708\u001b[0m \u001b[43m \u001b[49m\u001b[43mbody\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mbody\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 709\u001b[0m \u001b[43m \u001b[49m\u001b[43mheaders\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mheaders\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 710\u001b[0m \u001b[43m \u001b[49m\u001b[43mchunked\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mchunked\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 711\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 713\u001b[0m \u001b[38;5;66;03m# If we're going to release the connection in ``finally:``, then\u001b[39;00m\n\u001b[1;32m 714\u001b[0m \u001b[38;5;66;03m# the response doesn't need to know about the connection. Otherwise\u001b[39;00m\n\u001b[1;32m 715\u001b[0m \u001b[38;5;66;03m# it will also try to release it and we'll have a double-release\u001b[39;00m\n\u001b[1;32m 716\u001b[0m \u001b[38;5;66;03m# mess.\u001b[39;00m\n\u001b[1;32m 717\u001b[0m response_conn \u001b[38;5;241m=\u001b[39m conn \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m release_conn \u001b[38;5;28;01melse\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/urllib3/connectionpool.py:449\u001b[0m, in \u001b[0;36mHTTPConnectionPool._make_request\u001b[0;34m(self, conn, method, url, timeout, chunked, **httplib_request_kw)\u001b[0m\n\u001b[1;32m 444\u001b[0m httplib_response \u001b[38;5;241m=\u001b[39m conn\u001b[38;5;241m.\u001b[39mgetresponse()\n\u001b[1;32m 445\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 446\u001b[0m \u001b[38;5;66;03m# Remove the TypeError from the exception chain in\u001b[39;00m\n\u001b[1;32m 447\u001b[0m \u001b[38;5;66;03m# Python 3 (including for exceptions like SystemExit).\u001b[39;00m\n\u001b[1;32m 448\u001b[0m \u001b[38;5;66;03m# Otherwise it looks like a bug in the code.\u001b[39;00m\n\u001b[0;32m--> 449\u001b[0m \u001b[43msix\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mraise_from\u001b[49m\u001b[43m(\u001b[49m\u001b[43me\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m)\u001b[49m\n\u001b[1;32m 450\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m (SocketTimeout, BaseSSLError, SocketError) \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 451\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_raise_timeout(err\u001b[38;5;241m=\u001b[39me, url\u001b[38;5;241m=\u001b[39murl, timeout_value\u001b[38;5;241m=\u001b[39mread_timeout)\n",
"File \u001b[0;32m<string>:3\u001b[0m, in \u001b[0;36mraise_from\u001b[0;34m(value, from_value)\u001b[0m\n",
"File \u001b[0;32m~/Documents/Development/python/v2trading/.venv/lib/python3.10/site-packages/urllib3/connectionpool.py:444\u001b[0m, in \u001b[0;36mHTTPConnectionPool._make_request\u001b[0;34m(self, conn, method, url, timeout, chunked, **httplib_request_kw)\u001b[0m\n\u001b[1;32m 441\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mTypeError\u001b[39;00m:\n\u001b[1;32m 442\u001b[0m \u001b[38;5;66;03m# Python 3\u001b[39;00m\n\u001b[1;32m 443\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 444\u001b[0m httplib_response \u001b[38;5;241m=\u001b[39m \u001b[43mconn\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mgetresponse\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 445\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 446\u001b[0m \u001b[38;5;66;03m# Remove the TypeError from the exception chain in\u001b[39;00m\n\u001b[1;32m 447\u001b[0m \u001b[38;5;66;03m# Python 3 (including for exceptions like SystemExit).\u001b[39;00m\n\u001b[1;32m 448\u001b[0m \u001b[38;5;66;03m# Otherwise it looks like a bug in the code.\u001b[39;00m\n\u001b[1;32m 449\u001b[0m six\u001b[38;5;241m.\u001b[39mraise_from(e, \u001b[38;5;28;01mNone\u001b[39;00m)\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py:1375\u001b[0m, in \u001b[0;36mHTTPConnection.getresponse\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 1373\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 1374\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m-> 1375\u001b[0m \u001b[43mresponse\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbegin\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1376\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mConnectionError\u001b[39;00m:\n\u001b[1;32m 1377\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mclose()\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py:318\u001b[0m, in \u001b[0;36mHTTPResponse.begin\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 316\u001b[0m \u001b[38;5;66;03m# read until we get a non-100 response\u001b[39;00m\n\u001b[1;32m 317\u001b[0m \u001b[38;5;28;01mwhile\u001b[39;00m \u001b[38;5;28;01mTrue\u001b[39;00m:\n\u001b[0;32m--> 318\u001b[0m version, status, reason \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_read_status\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 319\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m status \u001b[38;5;241m!=\u001b[39m CONTINUE:\n\u001b[1;32m 320\u001b[0m \u001b[38;5;28;01mbreak\u001b[39;00m\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py:279\u001b[0m, in \u001b[0;36mHTTPResponse._read_status\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 278\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_read_status\u001b[39m(\u001b[38;5;28mself\u001b[39m):\n\u001b[0;32m--> 279\u001b[0m line \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mstr\u001b[39m(\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mfp\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mreadline\u001b[49m\u001b[43m(\u001b[49m\u001b[43m_MAXLINE\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m+\u001b[39;49m\u001b[43m \u001b[49m\u001b[38;5;241;43m1\u001b[39;49m\u001b[43m)\u001b[49m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124miso-8859-1\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[1;32m 280\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mlen\u001b[39m(line) \u001b[38;5;241m>\u001b[39m _MAXLINE:\n\u001b[1;32m 281\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m LineTooLong(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mstatus line\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/socket.py:705\u001b[0m, in \u001b[0;36mSocketIO.readinto\u001b[0;34m(self, b)\u001b[0m\n\u001b[1;32m 703\u001b[0m \u001b[38;5;28;01mwhile\u001b[39;00m \u001b[38;5;28;01mTrue\u001b[39;00m:\n\u001b[1;32m 704\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 705\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_sock\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrecv_into\u001b[49m\u001b[43m(\u001b[49m\u001b[43mb\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 706\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m timeout:\n\u001b[1;32m 707\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_timeout_occurred \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mTrue\u001b[39;00m\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/ssl.py:1274\u001b[0m, in \u001b[0;36mSSLSocket.recv_into\u001b[0;34m(self, buffer, nbytes, flags)\u001b[0m\n\u001b[1;32m 1270\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m flags \u001b[38;5;241m!=\u001b[39m \u001b[38;5;241m0\u001b[39m:\n\u001b[1;32m 1271\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[1;32m 1272\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mnon-zero flags not allowed in calls to recv_into() on \u001b[39m\u001b[38;5;132;01m%s\u001b[39;00m\u001b[38;5;124m\"\u001b[39m \u001b[38;5;241m%\u001b[39m\n\u001b[1;32m 1273\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__class__\u001b[39m)\n\u001b[0;32m-> 1274\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mread\u001b[49m\u001b[43m(\u001b[49m\u001b[43mnbytes\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mbuffer\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1275\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 1276\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28msuper\u001b[39m()\u001b[38;5;241m.\u001b[39mrecv_into(buffer, nbytes, flags)\n",
"File \u001b[0;32m/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/ssl.py:1130\u001b[0m, in \u001b[0;36mSSLSocket.read\u001b[0;34m(self, len, buffer)\u001b[0m\n\u001b[1;32m 1128\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 1129\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m buffer \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[0;32m-> 1130\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_sslobj\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mread\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mlen\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mbuffer\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1131\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 1132\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_sslobj\u001b[38;5;241m.\u001b[39mread(\u001b[38;5;28mlen\u001b[39m)\n",
"\u001b[0;31mKeyboardInterrupt\u001b[0m: "
]
}
],
"source": [
"trades_df = fetch_daily_stock_trades(symbol, day_start, day_stop, exclude_conditions=exclude_conditions, minsize=minsize, force_remote=False, max_retries=5, backoff_factor=1)\n",
"trades_df"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"#Either load trades or ohlcv from parquet if exists\n",
"\n",
"#trades_df = fetch_trades_parallel(symbol, day_start, day_stop, exclude_conditions=exclude_conditions, minsize=50, max_workers=20) #exclude_conditions=['C','O','4','B','7','V','P','W','U','Z','F'])\n",
"# trades_df.to_parquet(file_trades, engine='pyarrow', compression='gzip')\n",
"\n",
"trades_df = pd.read_parquet(file_trades,engine='pyarrow')\n",
"ohlcv_df = aggregate_trades(symbol=symbol, trades_df=trades_df, resolution=1, type=BarType.TIME)\n",
"ohlcv_df.to_parquet(file_ohlcv, engine='pyarrow', compression='gzip')\n",
"\n",
"# ohlcv_df = pd.read_parquet(file_ohlcv,engine='pyarrow')\n",
"# trades_df = pd.read_parquet(file_trades,engine='pyarrow')\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#list all files is dir directory with parquet extension\n",
"dir = DATA_DIR + \"/notebooks/\"\n",
"import os\n",
"files = [f for f in os.listdir(dir) if f.endswith(\".parquet\")]\n",
"file_name = \"\"\n",
"ohlcv_df = pd.read_parquet(file_ohlcv,engine='pyarrow')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ohlcv_df"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"import seaborn as sns\n",
"# Calculate daily returns\n",
"ohlcv_df['returns'] = ohlcv_df['close'].pct_change().dropna()\n",
"#same as above but pct_change is from 3 datapoints back, but only if it is the same date, else na\n",
"\n",
"\n",
"# Plot the probability distribution curve\n",
"plt.figure(figsize=(10, 6))\n",
"sns.histplot(df['returns'].dropna(), kde=True, stat='probability', bins=30)\n",
"plt.title('Probability Distribution of Daily Returns')\n",
"plt.xlabel('Daily Returns')\n",
"plt.ylabel('Probability')\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"import numpy as np\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.preprocessing import StandardScaler\n",
"from sklearn.linear_model import LogisticRegression\n",
"from sklearn.metrics import accuracy_score\n",
"\n",
"# Define the intervals from 5 to 20 s, returns for each interval\n",
"#maybe use rolling window?\n",
"intervals = range(5, 21, 5)\n",
"\n",
"# Create columns for percentage returns\n",
"rolling_window = 50\n",
"\n",
"# Normalize the returns using rolling mean and std\n",
"for N in intervals:\n",
" column_name = f'returns_{N}'\n",
" rolling_mean = ohlcv_df[column_name].rolling(window=rolling_window).mean()\n",
" rolling_std = ohlcv_df[column_name].rolling(window=rolling_window).std()\n",
" ohlcv_df[f'norm_{column_name}'] = (ohlcv_df[column_name] - rolling_mean) / rolling_std\n",
"\n",
"# Display the dataframe with normalized return columns\n",
"ohlcv_df\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Calculate the sum of the normalized return columns for each row\n",
"ohlcv_df['sum_norm_returns'] = ohlcv_df[[f'norm_returns_{N}' for N in intervals]].sum(axis=1)\n",
"\n",
"# Sort the DataFrame based on the sum of normalized returns in descending order\n",
"df_sorted = ohlcv_df.sort_values(by='sum_norm_returns', ascending=False)\n",
"\n",
"# Display the top rows with the highest sum of normalized returns\n",
"df_sorted\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Drop initial rows with NaN values due to pct_change\n",
"ohlcv_df.dropna(inplace=True)\n",
"\n",
"# Plotting the probability distribution curves\n",
"plt.figure(figsize=(14, 8))\n",
"for N in intervals:\n",
" sns.kdeplot(ohlcv_df[f'returns_{N}'].dropna(), label=f'Returns {N}', fill=True)\n",
"\n",
"plt.title('Probability Distribution of Percentage Returns')\n",
"plt.xlabel('Percentage Return')\n",
"plt.ylabel('Density')\n",
"plt.legend()\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"import seaborn as sns\n",
"# Plot the probability distribution curve\n",
"plt.figure(figsize=(10, 6))\n",
"sns.histplot(ohlcv_df['returns'].dropna(), kde=True, stat='probability', bins=30)\n",
"plt.title('Probability Distribution of Daily Returns')\n",
"plt.xlabel('Daily Returns')\n",
"plt.ylabel('Probability')\n",
"plt.show()\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#show only rows from ohlcv_df where returns > 0.005\n",
"ohlcv_df[ohlcv_df['returns'] > 0.0005]\n",
"\n",
"#ohlcv_df[ohlcv_df['returns'] < -0.005]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#ohlcv where index = date 2024-03-13 and between hour 12\n",
"\n",
"a = ohlcv_df.loc['2024-03-13 12:00:00':'2024-03-13 13:00:00']\n",
"a"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ohlcv_df"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"trades_df"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ohlcv_df.info()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"trades_df.to_parquet(\"trades_df-spy-0111-0111.parquett\", engine='pyarrow', compression='gzip')\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"trades_df.to_parquet(\"trades_df-spy-111-0516.parquett\", engine='pyarrow', compression='gzip', allow_truncated_timestamps=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ohlcv_df.to_parquet(\"ohlcv_df-spy-111-0516.parquett\", engine='pyarrow', compression='gzip')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"basic_data = vbt.Data.from_data(vbt.symbol_dict({symbol: ohlcv_df}), tz_convert=zoneNY)\n",
"vbt.settings['plotting']['auto_rangebreaks'] = True\n",
"basic_data.ohlcv.plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#access just BCA\n",
"#df_filtered = df.loc[\"BAC\"]"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.10"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large

26673
research/rsi_alpaca.ipynb Normal file

File diff suppressed because one or more lines are too long

23637
research/test.ipynb Normal file

File diff suppressed because it is too large

105
research/test1.ipynb Normal file

File diff suppressed because one or more lines are too long

421
research/test1sbars.ipynb Normal file

@ -0,0 +1,421 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from v2realbot.tools.loadbatch import load_batch\n",
"from v2realbot.utils.utils import zoneNY\n",
"import pandas as pd\n",
"import numpy as np\n",
"import vectorbtpro as vbt\n",
"from itables import init_notebook_mode, show\n",
"\n",
"init_notebook_mode(all_interactive=True)\n",
"\n",
"vbt.settings.set_theme(\"dark\")\n",
"vbt.settings['plotting']['layout']['width'] = 1280\n",
"vbt.settings.plotting.auto_rangebreaks = True\n",
"# Set the option to display with pagination\n",
"pd.set_option('display.notebook_repr_html', True)\n",
"pd.set_option('display.max_rows', 10) # Number of rows per page\n",
"\n",
"res, df = load_batch(batch_id=\"0fb5043a\", #46 days 1.3 - 6.5.\n",
" space_resolution_evenly=False,\n",
" indicators_columns=[\"Rsi14\"],\n",
" main_session_only=True,\n",
" verbose = False)\n",
"if res < 0:\n",
" print(\"Error\" + str(res) + str(df))\n",
"df = df[\"bars\"]\n",
"\n",
"df"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# filter dates"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#filter na dny\n",
"# dates_of_interest = pd.to_datetime(['2024-04-22', '2024-04-23']).tz_localize('US/Eastern')\n",
"# filtered_df = df.loc[df.index.normalize().isin(dates_of_interest)]\n",
"\n",
"# df = filtered_df\n",
"# df.info()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import plotly.io as pio\n",
"pio.renderers.default = 'notebook'\n",
"\n",
"#naloadujeme do vbt symbol as column\n",
"basic_data = vbt.Data.from_data({\"BAC\": df}, tz_convert=zoneNY)\n",
"start_date = pd.Timestamp('2024-03-12 09:30', tz=zoneNY)\n",
"end_date = pd.Timestamp('2024-03-13 16:00', tz=zoneNY)\n",
"\n",
"#basic_data = basic_data.transform(lambda df: df[df.index.date == start_date.date()])\n",
"#basic_data = basic_data.transform(lambda df: df[(df.index >= start_date) & (df.index <= end_date)])\n",
"#basic_data.data[\"BAC\"].info()\n",
"\n",
"# fig = basic_data.plot(plot_volume=False)\n",
"# pivot_info = basic_data.run(\"pivotinfo\", up_th=0.003, down_th=0.002)\n",
"# #pivot_info.plot()\n",
"# pivot_info.plot(fig=fig, conf_value_trace_kwargs=dict(visible=True))\n",
"# fig.show()\n",
"\n",
"\n",
"# rsi14 = basic_data.data[\"BAC\"][\"Rsi14\"].rename(\"Rsi14\")\n",
"\n",
"# rsi14.vbt.plot().show()\n",
"#basic_data.xloc[\"09:30\":\"10:00\"].data[\"BAC\"].vbt.ohlcv.plot().show()\n",
"\n",
"vbt.settings.plotting.auto_rangebreaks = True\n",
"#basic_data.data[\"BAC\"].vbt.ohlcv.plot()\n",
"\n",
"#basic_data.data[\"BAC\"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"m1_data = basic_data[['Open', 'High', 'Low', 'Close', 'Volume']]\n",
"\n",
"m1_data.data[\"BAC\"]\n",
"#m5_data = m1_data.resample(\"5T\")\n",
"\n",
"#m5_data.data[\"BAC\"].head(10)\n",
"\n",
"# m15_data = m1_data.resample(\"15T\")\n",
"\n",
"# m15 = m15_data.data[\"BAC\"]\n",
"\n",
"# m15.vbt.ohlcv.plot()\n",
"\n",
"# m1_data.wrapper.index\n",
"\n",
"# m1_resampler = m1_data.wrapper.get_resampler(\"1T\")\n",
"# m1_resampler.index_difference(reverse=True)\n",
"\n",
"\n",
"# m5_resampler.prettify()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# defining ENTRY WINDOW and forced EXIT window"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#m1_data.data[\"BAC\"].info()\n",
"import datetime\n",
"# Define the market open and close times\n",
"market_open = datetime.time(9, 30)\n",
"market_close = datetime.time(16, 0)\n",
"entry_window_opens = 1\n",
"entry_window_closes = 350\n",
"\n",
"forced_exit_start = 380\n",
"forced_exit_end = 390\n",
"\n",
"forced_exit = m1_data.symbol_wrapper.fill(False)\n",
"entry_window_open= m1_data.symbol_wrapper.fill(False)\n",
"\n",
"# Calculate the time difference in minutes from market open for each timestamp\n",
"elapsed_min_from_open = (forced_exit.index.hour - market_open.hour) * 60 + (forced_exit.index.minute - market_open.minute)\n",
"\n",
"entry_window_open[(elapsed_min_from_open >= entry_window_opens) & (elapsed_min_from_open < entry_window_closes)] = True\n",
"forced_exit[(elapsed_min_from_open >= forced_exit_start) & (elapsed_min_from_open < forced_exit_end)] = True\n",
"\n",
"#entry_window_open.info()\n",
"# forced_exit.tail(100)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"close = m1_data.close\n",
"\n",
"rsi = vbt.RSI.run(close, window=14)\n",
"\n",
"long_entries = (rsi.rsi.vbt.crossed_below(20) & entry_window_open)\n",
"long_exits = (rsi.rsi.vbt.crossed_above(70) | forced_exit)\n",
"#long_entries.info()\n",
"#number of trues and falses in long_entries\n",
"long_entries.value_counts()\n",
"#long_exits.value_counts()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def plot_rsi(rsi, close, entries, exits):\n",
" fig = vbt.make_subplots(rows=1, cols=1, shared_xaxes=True, specs=[[{\"secondary_y\": True}]], vertical_spacing=0.02, subplot_titles=(\"RSI\", \"Price\" ))\n",
" close.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=True))\n",
" rsi.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False))\n",
" entries.vbt.signals.plot_as_entries(rsi.rsi, fig=fig, add_trace_kwargs=dict(secondary_y=False)) \n",
" exits.vbt.signals.plot_as_exits(rsi.rsi, fig=fig, add_trace_kwargs=dict(secondary_y=False)) \n",
" return fig\n",
"\n",
"plot_rsi(rsi, close, long_entries, long_exits)\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"vbt.phelp(vbt.Portfolio.from_signals)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sl_stop = np.arange(0.03/100, 0.2/100, 0.02/100).tolist()\n",
"# Using the round function\n",
"sl_stop = [round(val, 4) for val in sl_stop]\n",
"print(sl_stop)\n",
"sl_stop = vbt.Param(sl_stop) #np.nan mean s no stoploss\n",
"\n",
"pf = vbt.Portfolio.from_signals(close=close, entries=long_entries, sl_stop=sl_stop, tp_stop = sl_stop, exits=long_exits,fees=0.0167/100, freq=\"1s\") #sl_stop=sl_stop, tp_stop = sl_stop, \n",
"\n",
"#pf.stats()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf.plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf[(0.0015,0.0013)].plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf[0.03].plot_trade_signals()\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# pristup k pf jako multi index"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#pf[0.03].plot()\n",
"#pf.order_records\n",
"pf[(0.03)].stats()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#zgrupovane statistiky\n",
"stats_df = pf.stats([\n",
" 'total_return',\n",
" 'total_trades',\n",
" 'win_rate',\n",
" 'expectancy'\n",
"], agg_func=None)\n",
"stats_df\n",
"\n",
"\n",
"stats_df.nlargest(50, 'Total Return [%]')\n",
"#stats_df.info()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pf[(0.0011,0.0013)].plot()\n",
"\n",
"#pf[(0.0011,0.0013000000000000002)].plot()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pandas.tseries.offsets import DateOffset\n",
"\n",
"temp_data = basic_data['2024-4-22']\n",
"temp_data\n",
"res1m = temp_data[[\"Open\", \"High\", \"Low\", \"Close\", \"Volume\"]]\n",
"\n",
"# Define a custom date offset that starts at 9:30 AM and spans 4 hours\n",
"custom_offset = DateOffset(hours=4, minutes=30)\n",
"\n",
"# res1m = res1m.get().resample(\"4H\").agg({ \n",
"# \"Open\": \"first\",\n",
"# \"High\": \"max\",\n",
"# \"Low\": \"min\",\n",
"# \"Close\": \"last\",\n",
"# \"Volume\": \"sum\"\n",
"# })\n",
"\n",
"res4h = res1m.resample(\"1h\", resample_kwargs=dict(origin=\"start\"))\n",
"\n",
"res4h.data\n",
"\n",
"res15m = res1m.resample(\"15T\", resample_kwargs=dict(origin=\"start\"))\n",
"\n",
"res15m.data[\"BAC\"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"@vbt.njit\n",
"def long_entry_place_func_nb(c, low, close, time_in_ns, rsi14, window_open, window_close):\n",
" market_open_minutes = 570 # 9 hours * 60 minutes + 30 minutes\n",
"\n",
" for out_i in range(len(c.out)):\n",
" i = c.from_i + out_i\n",
"\n",
" current_minutes = vbt.dt_nb.hour_nb(time_in_ns[i]) * 60 + vbt.dt_nb.minute_nb(time_in_ns[i])\n",
" #print(\"current_minutes\", current_minutes)\n",
" # Calculate elapsed minutes since market open at 9:30 AM\n",
" elapsed_from_open = current_minutes - market_open_minutes\n",
" elapsed_from_open = elapsed_from_open if elapsed_from_open >= 0 else 0\n",
" #print( \"elapsed_from_open\", elapsed_from_open)\n",
"\n",
" #elapsed_from_open = elapsed_minutes_from_open_nb(time_in_ns) \n",
" in_window = elapsed_from_open > window_open and elapsed_from_open < window_close\n",
" #print(\"in_window\", in_window)\n",
" # if in_window:\n",
" # print(\"in window\")\n",
"\n",
" if in_window and rsi14[i] > 60: # and low[i, c.col] <= hit_price: # and hour == 9: # (4)!\n",
" return out_i\n",
" return -1\n",
"\n",
"@vbt.njit\n",
"def long_exit_place_func_nb(c, high, close, time_index, tp, sl): # (5)!\n",
" entry_i = c.from_i - c.wait\n",
" entry_price = close[entry_i, c.col]\n",
" hit_price = entry_price * (1 + tp)\n",
" stop_price = entry_price * (1 - sl)\n",
" for out_i in range(len(c.out)):\n",
" i = c.from_i + out_i\n",
" last_bar_of_day = vbt.dt_nb.day_changed_nb(time_index[i], time_index[i + 1])\n",
"\n",
" #print(next_day)\n",
" if last_bar_of_day: #pokud je dalsi next day, tak zavirame posledni\n",
" print(\"ted\",out_i)\n",
" return out_i\n",
" if close[i, c.col] >= hit_price or close[i, c.col] <= stop_price :\n",
" return out_i\n",
" return -1\n",
"\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"df = pd.DataFrame(np.random.random(size=(5, 10)), columns=list('abcdefghij'))\n",
"\n",
"df"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"df.sum()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.11"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

File diff suppressed because one or more lines are too long

45
restart.sh Executable file

@ -0,0 +1,45 @@
#!/bin/bash
# file: restart.sh
# Usage: ./restart.sh [test|prod|all]
# Define server addresses
TEST_SERVER="david@142.132.188.109"
PROD_SERVER="david@5.161.179.223"
# Define the remote directory where the script is located
REMOTE_DIR="v2trading"
# Check for argument
if [ "$#" -ne 1 ]; then
echo "Usage: $0 [test|prod|all]"
exit 1
fi
# Function to restart a server
restart_server() {
local server=$1
echo "Connecting to $server to restart the Python app..."
ssh -t $server "cd $REMOTE_DIR && . ~/.bashrc && ./run.sh restart" # Sourcing .bashrc here
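# -t allocates a remote TTY so the restart behaves like an interactive session; .bashrc is sourced so PATH and env vars match a login shell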
echo "Operation completed on $server."
}
# Select the server based on the input argument
case $1 in
test)
restart_server $TEST_SERVER
;;
prod)
restart_server $PROD_SERVER
;;
all)
restart_server $TEST_SERVER
restart_server $PROD_SERVER
;;
*)
echo "Invalid argument: $1. Use 'test', 'prod', or 'all'."
exit 1
;;
esac

107
testdoc.md Normal file

@ -0,0 +1,107 @@
# Plotly
* MAKE_SUBPLOTS defines the layout (required if more than a 1x1 grid or a secondary y-axis is needed)
```python
fig = vbt.make_subplots(rows=2, cols=1, shared_xaxes=True,
specs=[[{"secondary_y": True}], [{"secondary_y": False}]],
vertical_spacing=0.02, subplot_titles=("Row 1 title", "Row 2 title"))
```
Then the individual [sr/df generic accessors](http://5.161.179.223:8000/static/js/vbt/api/generic/accessors/index.html#vectorbtpro.generic.accessors.GenericAccessor.areaplot) are added with ADD_TRACE_KWARGS and TRACE_KWARGS. Other plot types are available in the [plotting module](http://5.161.179.223:8000/static/js/vbt/api/generic/plotting/index.html).
```python
#using accessor
close.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False,row=1, col=1), trace_kwargs=dict(line=dict(color="blue")))
indvolume.vbt.barplot(fig=fig, add_trace_kwargs=dict(secondary_y=False, row=2, col=1))
#using plotting module
vbt.Bar(indvolume, fig=fig, add_trace_kwargs=dict(secondary_y=False, row=2, col=1))
```
* ADD_TRACE_KWARGS - determines positioning within the subplot
```python
add_trace_kwargs=dict(secondary_y=False,row=1, col=1)
```
* TRACE_KWARGS - other styling of the trace
```python
trace_kwargs=dict(name="LONGS",
line=dict(color="#ffe476"),
marker=dict(color="limegreen"),
fill=None,
connectgaps=True)
```
## Example
```python
fig = vbt.make_subplots(rows=2, cols=1, shared_xaxes=True,
specs=[[{"secondary_y": True}], [{"secondary_y": False}]],
vertical_spacing=0.02, subplot_titles=("Price and Indicators", "Volume"))
# Plotting the close price
close.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False,row=1, col=1), trace_kwargs=dict(line=dict(color="blue")))
```
# Data
## Resampling
```python
t1data = basic_data[['open', 'high', 'low', 'close', 'volume','vwap','buyvolume','sellvolume']].resample("1T")
t1data = t1data.transform(lambda df: df.between_time('09:30', '16:00').dropna()) #main session data only, no nans
t5data = basic_data[['open', 'high', 'low', 'close', 'volume','vwap','buyvolume','sellvolume']].resample("5T")
t5data = t5data.transform(lambda df: df.between_time('09:30', '16:00').dropna())
dailydata = basic_data[['open', 'high', 'low', 'close', 'volume', 'vwap']].resample("D").dropna()
#realign 5min close to 1min so it can be compared with 1min
t5data_close_realigned = t5data.close.vbt.realign_closing("1T").between_time('09:30', '16:00').dropna()
#same with open
t5data.open.vbt.realign_opening("1h")
```
### Define resample function for custom column
Example of a custom feature config: [Binance Data](http://5.161.179.223:8000/static/js/vbt/api/data/custom/binance/index.html#vectorbtpro.data.custom.binance.BinanceData.feature_config).
Other [reduce functions are available](http://5.161.179.223:8000/static/js/vbt/api/generic/nb/apply_reduce/index.html) (mean, min, max, median, nth, ...).
```python
from vectorbtpro.utils.config import merge_dicts, Config, HybridConfig
from vectorbtpro import _typing as tp
from vectorbtpro.generic import nb as generic_nb
_feature_config: tp.ClassVar[Config] = HybridConfig(
{
"buyvolume": dict(
resample_func=lambda self, obj, resampler: obj.vbt.resample_apply(
resampler,
generic_nb.sum_reduce_nb,
)
),
"sellvolume": dict(
resample_func=lambda self, obj, resampler: obj.vbt.resample_apply(
resampler,
generic_nb.sum_reduce_nb,
)
)
}
)
basic_data._feature_config = _feature_config
```
### Validate resample
```python
t2dataclose = t2data.close.rename("15MIN - realigned").vbt.realign_closing("1T")
fig = t1data.close.rename("1MIN").vbt.plot()
t2data.close.rename("15MIN").vbt.plot(fig=fig)
t2dataclose.vbt.plot(fig=fig)
```
## Persisting
```python
basic_data.to_parquet(partition_by="day", compression="gzip")
day_data = vbt.ParquetData.pull("BAC", filters=[("group", "==", "2024-05-03")])
vbt.print_dir_tree("BTC-USD")#overeni directory structure
```
# Discover
```python
vbt.phelp(vbt.talib("atr").run) #parameters it accepts
vbt.pdir(pf) #get available properties and methods
vbt.pprint(basic_data) #shape and other info about the instance
```


@ -23,12 +23,12 @@ clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY,
#get previous days bar
datetime_object_from = datetime.datetime(2023, 10, 11, 4, 0, 00, tzinfo=datetime.timezone.utc)
datetime_object_to = datetime.datetime(2023, 10, 16, 16, 1, 00, tzinfo=datetime.timezone.utc)
calendar_request = GetCalendarRequest(start=datetime_object_from,end=datetime_object_to)
cal_dates = clientTrading.get_calendar(calendar_request)
print(cal_dates)
bar_request = StockBarsRequest(symbol_or_symbols="BAC",timeframe=TimeFrame.Day, start=datetime_object_from, end=datetime_object_to, feed=DataFeed.SIP)
datetime_object_from = datetime.datetime(2024, 3, 9, 13, 29, 00, tzinfo=datetime.timezone.utc)
datetime_object_to = datetime.datetime(2024, 3, 11, 20, 1, 00, tzinfo=datetime.timezone.utc)
# calendar_request = GetCalendarRequest(start=datetime_object_from,end=datetime_object_to)
# cal_dates = clientTrading.get_calendar(calendar_request)
# print(cal_dates)
bar_request = StockBarsRequest(symbol_or_symbols="BAC",timeframe=TimeFrame.Minute, start=datetime_object_from, end=datetime_object_to, feed=DataFeed.SIP)
# bars = client.get_stock_bars(bar_request).df


@ -1,3 +1,3 @@
API_KEY = 'PKGGEWIEYZOVQFDRY70L'
SECRET_KEY = 'O5Kt8X4RLceIOvM98i5LdbalItsX7hVZlbPYHy8Y'
API_KEY = ''
SECRET_KEY = ''
MAX_BATCH_SIZE = 1

18
testy/createbatchimage.py Normal file

@ -0,0 +1,18 @@
import argparse
import v2realbot.reporting.metricstoolsimage as mt
# Parse the command-line arguments
# parser = argparse.ArgumentParser(description="Generate trading report image with batch ID")
# parser.add_argument("batch_id", type=str, help="The batch ID for the report")
# args = parser.parse_args()
# batch_id = args.batch_id
# Generate the report image
res, val = mt.generate_trading_report_image(batch_id="4d7dc163")
# Print the result
if res == 0:
print("BATCH REPORT CREATED")
else:
print(f"BATCH REPORT ERROR - {val}")


@ -1,7 +1,9 @@
import os,sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
print(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from alpaca.data.historical import CryptoHistoricalDataClient, StockHistoricalDataClient
import pandas as pd
import numpy as np
from alpaca.data.historical import StockHistoricalDataClient
from alpaca.data.requests import CryptoLatestTradeRequest, StockLatestTradeRequest, StockLatestBarRequest, StockTradesRequest
from alpaca.data.enums import DataFeed
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY

89
testy/getrunnerdetail.py Normal file

@ -0,0 +1,89 @@
from v2realbot.common.model import RunDay, StrategyInstance, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest
import v2realbot.controller.services as cs
from v2realbot.utils.utils import slice_dict_lists,zoneUTC,safe_get, AttributeDict
id = "b11c66d9-a9b6-475a-9ac1-28b11e1b4edf"
state = AttributeDict(vars={})
## basis for init_attached_data in strategy.init
# def get_previous_runner(state):
# runner : Runner
# res, runner = cs.get_runner(state.runner_id)
# if res < 0:
# print(f"Not running {id}")
# return 0, None
# return 0, runner.batch_id
def attach_previous_data(state):
runner : Runner
#get batch_id of the current runner
res, runner = cs.get_runner(state.runner_id)
if res < 0 or runner.batch_id is None:
print(f"Couldnt get previous runner {val}")
return None
batch_id = runner.batch_id
#batch_id = "6a6b0bcf"
res, runner_ids =cs.get_archived_runnerslist_byBatchID(batch_id, "desc")
if res < 0:
msg = f"error whne fetching runners of batch {batch_id} {runner_ids}"
print(msg)
return None
if runner_ids is None or len(runner_ids) == 0:
print(f"no runners found for batch {batch_id} {runner_ids}")
return None
last_runner = runner_ids[0]
print("Previous runner identified:", last_runner)
#get details from the runner
res, val = cs.get_archived_runner_details_byID(last_runner)
if res < 0:
    print(f"no archived runner {last_runner}")
    return None
detail = RunArchiveDetail(**val)
#print("toto jsme si dotahnuli", detail.bars)
# from stratvars directives
attach_previous_bars_indicators = safe_get(state.vars, "attach_previous_bars_indicators", 50)
attach_previous_cbar_indicators = safe_get(state.vars, "attach_previous_cbar_indicators", 50)
# [stratvars]
# attach_previous_bars_indicators = 50
# attach_previous_cbar_indicators = 50
#indicators datetime utc
indicators = slice_dict_lists(d=detail.indicators[0],last_item=attach_previous_bars_indicators, time_to_datetime=True)
#time -datetime utc, updated - timestamp float
bars = slice_dict_lists(d=detail.bars, last_item=attach_previous_bars_indicators, time_to_datetime=True)
#cbar_indicators (float timestamps)
cbar_inds = slice_dict_lists(d=detail.indicators[1],last_item=attach_previous_cbar_indicators)
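#slice_dict_lists is assumed to trim every list in the given dict to its last_item entries
#(optionally converting the time axis to datetime), so only a tail of history gets attached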
#USE these as INITs - TODO: pause here and compare first
print(f"{state.indicators=} NEW:{indicators=}")
state.indicators = indicators
print(f"{state.bars=} NEW:{bars=}")
state.bars = bars
print(f"{state.cbar_indicators=} NEW:{cbar_inds=}")
state.cbar_indicators = cbar_inds
print("BARS and INDS INITIALIZED")
#bars
#any further initializations from ext_data go here
print("EXT_DATA", detail.ext_data)
#depending on a certain setting, e.g. in the configuration, specific variables are used
#adding dailyBars from ext_data
# if hasattr(detail, "ext_data") and "dailyBars" in detail.ext_data:
# state.dailyBars = detail.ext_data["dailyBars"]
if __name__ == "__main__":
attach_previous_data(state)

74
testy/tablesizes.py Normal file

@ -0,0 +1,74 @@
import queue
import sqlite3
import threading
from appdirs import user_data_dir
DATA_DIR = user_data_dir("v2realbot")
sqlite_db_file = DATA_DIR + "/v2trading.db"
class ConnectionPool:
def __init__(self, max_connections):
self.max_connections = max_connections
self.connections = queue.Queue(max_connections)
self.lock = threading.Lock()
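#queue.Queue is thread-safe on its own; the lock makes the empty() check and get() below
#one atomic step (note: connections created on a miss are not counted, so the pool can
#still grow past max_connections)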
def get_connection(self):
with self.lock:
if self.connections.empty():
return self.create_connection()
else:
return self.connections.get()
def release_connection(self, connection):
with self.lock:
self.connections.put(connection)
def create_connection(self):
connection = sqlite3.connect(sqlite_db_file, check_same_thread=False)
return connection
pool = ConnectionPool(10)
def get_table_sizes_in_mb():
# Connect to the SQLite database
conn = pool.get_connection()
cursor = conn.cursor()
# Get the list of tables
cursor.execute("SELECT name FROM sqlite_master WHERE type='table';")
tables = cursor.fetchall()
# Dictionary to store table sizes
table_sizes = {}
for table in tables:
table_name = table[0]
# Get total number of rows in the table
cursor.execute(f"SELECT COUNT(*) FROM {table_name};")
row_count = cursor.fetchone()[0]
if row_count > 0:
# Sample a few rows (e.g., 10 rows) and calculate average row size
cursor.execute(f"SELECT * FROM {table_name} LIMIT 10;")
sample_rows = cursor.fetchall()
total_sample_size = sum(sum(len(str(cell)) for cell in row) for row in sample_rows)
avg_row_size = total_sample_size / len(sample_rows)
# Estimate table size in megabytes
size_in_mb = (avg_row_size * row_count) / (1024 * 1024)
else:
size_in_mb = 0
table_sizes[table_name] = {'size_mb': size_in_mb, 'rows': row_count}
pool.release_connection(conn) #return the connection to the pool instead of closing it
return table_sizes
# Usage example
table_sizes = get_table_sizes_in_mb()
for table, info in table_sizes.items():
print(f"Table: {table}, Size: {info['size_mb']} MB, Rows: {info['rows']}")


@ -0,0 +1,66 @@
import os
from bs4 import BeautifulSoup
import html2text
def convert_html_to_markdown(html_content, link_mapping):
h = html2text.HTML2Text()
h.ignore_links = False
# Update internal links to point to the relevant sections in the Markdown
soup = BeautifulSoup(html_content, 'html.parser')
for a in soup.find_all('a', href=True):
href = a['href']
if href in link_mapping:
a['href'] = f"#{link_mapping[href]}"
return h.handle(str(soup))
def create_link_mapping(root_dir):
link_mapping = {}
for subdir, _, files in os.walk(root_dir):
for file in files:
if file == "index.html":
relative_path = os.path.relpath(os.path.join(subdir, file), root_dir)
chapter_id = relative_path.replace(os.sep, '-').replace('index.html', '')
link_mapping[relative_path] = chapter_id
link_mapping[relative_path.replace(os.sep, '/')] = chapter_id # for URLs with slashes
return link_mapping
def read_html_files(root_dir, link_mapping):
markdown_content = []
for subdir, _, files in os.walk(root_dir):
relative_path = os.path.relpath(subdir, root_dir)
if files and any(file == "index.html" for file in files):
# Add directory as a heading based on its depth
heading_level = relative_path.count(os.sep) + 1
markdown_content.append(f"{'#' * heading_level} {relative_path}\n")
for file in files:
if file == "index.html":
file_path = os.path.join(subdir, file)
with open(file_path, 'r', encoding='utf-8') as f:
html_content = f.read()
soup = BeautifulSoup(html_content, 'html.parser')
title = soup.title.string if soup.title else "No Title"
chapter_id = os.path.relpath(file_path, root_dir).replace(os.sep, '-').replace('index.html', '')
markdown_content.append(f"<a id='{chapter_id}'></a>\n")
markdown_content.append(f"{'#' * (heading_level + 1)} {title}\n")
markdown_content.append(convert_html_to_markdown(html_content, link_mapping))
return "\n".join(markdown_content)
def save_to_markdown_file(content, output_file):
with open(output_file, 'w', encoding='utf-8') as f:
f.write(content)
def main():
root_dir = "./v2realbot/static/js/vbt/"
output_file = "output.md"
link_mapping = create_link_mapping(root_dir)
markdown_content = read_html_files(root_dir, link_mapping)
save_to_markdown_file(markdown_content, output_file)
print(f"Markdown document created at {output_file}")
if __name__ == "__main__":
main()


@ -3,7 +3,7 @@ sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from v2realbot.strategy.base import StrategyState
from v2realbot.strategy.StrategyOrderLimitVykladaciNormalizedMYSELL import StrategyOrderLimitVykladaciNormalizedMYSELL
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account
from v2realbot.utils.utils import zoneNY, print
from v2realbot.utils.utils import zoneNY, print, fetch_calendar_data, send_to_telegram
from v2realbot.utils.historicals import get_historical_bars
from datetime import datetime, timedelta
from rich import print as printanyway
@ -16,9 +16,9 @@ from v2realbot.strategyblocks.newtrade.signals import signal_search
from v2realbot.strategyblocks.activetrade.activetrade_hub import manage_active_trade
from v2realbot.strategyblocks.inits.init_indicators import initialize_dynamic_indicators
from v2realbot.strategyblocks.inits.init_directives import intialize_directive_conditions
from alpaca.trading.requests import GetCalendarRequest
from v2realbot.strategyblocks.inits.init_attached_data import attach_previous_data
from alpaca.trading.client import TradingClient
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR, OFFLINE_MODE
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR
from alpaca.trading.models import Calendar
from v2realbot.indicators.oscillators import rsi
from v2realbot.indicators.moving_averages import sma
@ -116,6 +116,10 @@ def init(state: StrategyState):
#models
state.vars.loaded_models = {}
#state attributes for martingale sizing mngmt
state.vars["transferables"] = {}
state.vars["transferables"]["martingale"] = dict(cont_loss_series_cnt=0)
#INITIALIZE CBAR INDICATORS - move into its own function
#state.cbar_indicators['ivwap'] = []
state.vars.last_tick_price = 0
@ -129,6 +133,9 @@ def init(state: StrategyState):
initialize_dynamic_indicators(state)
intialize_directive_conditions(state)
#attach part of yesterday's data: bars, indicators, cbar_indicators
attach_previous_data(state)
#initialize indicator mapping (for use in operation) - maybe move into a separate function or into base once it proves itself
local_dict_cbar_inds = {key: state.cbar_indicators[key] for key in state.cbar_indicators.keys() if key != "time"}
local_dict_inds = {key: state.indicators[key] for key in state.indicators.keys() if key != "time"}
@ -167,10 +174,13 @@ def init(state: StrategyState):
today = time_to.date()
several_days_ago = today - timedelta(days=60)
#printanyway(f"{today=}",f"{several_days_ago=}")
clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False)
#clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False)
#get all market days from here to 40days ago
calendar_request = GetCalendarRequest(start=several_days_ago,end=today)
cal_dates = clientTrading.get_calendar(calendar_request)
#calendar_request = GetCalendarRequest(start=several_days_ago,end=today)
cal_dates = fetch_calendar_data(several_days_ago, today)
#cal_dates = clientTrading.get_calendar(calendar_request)
#find the first market day - 40days ago
#history_datetime_from = zoneNY.localize(cal_dates[0].open)
@ -197,6 +207,11 @@ def init(state: StrategyState):
#NOTE: for now, additional indicators are added into bars this way
#TO BE REWORKED - as part of custom resolutions and static indicators
if state.dailyBars is None:
print("Nepodařilo se načíst denní bary")
err_msg = f"Nepodařilo se načíst denní bary (get_historical_bars) pro {state.symbol} od {history_datetime_from} do {history_datetime_to} ve strat.init. Probably wrong symbol?"
send_to_telegram(err_msg)
raise Exception(err_msg)
#RSI is returned for the whole series + prepended with zeros; the first N values (per RSI length) are not computed
rsi_calculated = rsi(state.dailyBars["vwap"], 14).tolist()


@ -40,10 +40,10 @@
from uuid import UUID, uuid4
from alpaca.trading.enums import OrderSide, OrderStatus, TradeEvent, OrderType
from v2realbot.common.model import TradeUpdate, Order
#from rich import print
from rich import print as printanyway
import threading
import asyncio
from v2realbot.config import BT_DELAYS, DATA_DIR, BT_FILL_CONDITION_BUY_LIMIT, BT_FILL_CONDITION_SELL_LIMIT, BT_FILL_LOG_SURROUNDING_TRADES, BT_FILL_CONS_TRADES_REQUIRED,BT_FILL_PRICE_MARKET_ORDER_PREMIUM
from v2realbot.config import DATA_DIR
from v2realbot.utils.utils import AttributeDict, ltp, zoneNY, trunc, count_decimals, print
from v2realbot.utils.tlog import tlog
from v2realbot.enums.enums import FillCondition
@ -60,6 +60,7 @@ from v2realbot.utils.dash_save_html import make_static
import dash_bootstrap_components as dbc
from dash.dependencies import Input, Output
from dash import dcc, html, dash_table, Dash
import v2realbot.utils.config_handler as cfh
""""
LATENCY DELAYS
.000 trigger - last_trade_time (.4246266)
@ -171,7 +172,7 @@ class Backtester:
todel.append(order)
elif not self.symbol or order.symbol == self.symbol:
#add the minimal latency from submit to fill
if order.submitted_at.timestamp() + BT_DELAYS.sub_to_fill > float(intime):
if order.submitted_at.timestamp() + cfh.config_handler.get_val('BT_DELAYS','sub_to_fill') > float(intime):
print(f"too soon for {order.id}")
#try to execute
else:
@ -197,7 +198,7 @@ class Backtester:
#We delete, otherwise it gets unwieldy
#we keep the trailing trades that may be needed for the consecutive-fills rule
#and handle the case of too few trades, where trimming would go negative
del_to_index = index_end-2-BT_FILL_CONS_TRADES_REQUIRED
del_to_index = index_end-2-cfh.config_handler.get_val('BT_FILL_CONS_TRADES_REQUIRED')
del_to_index = del_to_index if del_to_index > 0 else 0
del self.btdata[0:del_to_index]
##ic("after delete",len(self.btdata[0:index_end]))
@ -218,7 +219,7 @@ class Backtester:
fill_time = None
fill_price = None
order_min_fill_time = o.submitted_at.timestamp() + BT_DELAYS.sub_to_fill
order_min_fill_time = o.submitted_at.timestamp() + cfh.config_handler.get_val('BT_DELAYS','sub_to_fill')
#ic(order_min_fill_time)
#ic(len(work_range))
@ -240,17 +241,18 @@ class Backtester:
#FILL CONDITION SETUP
fast_fill_condition = i[1] <= o.limit_price
slow_fill_condition = i[1] < o.limit_price
if BT_FILL_CONDITION_BUY_LIMIT == FillCondition.FAST:
fill_cond_buy_limit = cfh.config_handler.get_val('BT_FILL_CONDITION_BUY_LIMIT')
if fill_cond_buy_limit == FillCondition.FAST:
fill_condition = fast_fill_condition
elif BT_FILL_CONDITION_BUY_LIMIT == FillCondition.SLOW:
elif fill_cond_buy_limit == FillCondition.SLOW:
fill_condition = slow_fill_condition
else:
print("unknow fill condition")
return -1
if float(i[0]) > float(order_min_fill_time+BT_DELAYS.limit_order_offset) and fill_condition:
if float(i[0]) > float(order_min_fill_time+cfh.config_handler.get_val('BT_DELAYS','limit_order_offset')) and fill_condition:
consec_cnt += 1
if consec_cnt == BT_FILL_CONS_TRADES_REQUIRED:
if consec_cnt == cfh.config_handler.get_val('BT_FILL_CONS_TRADES_REQUIRED'):
#(1679081919.381649, 27.88)
#ic(i)
@ -261,10 +263,10 @@ class Backtester:
#fill_price = i[1]
print("FILL LIMIT BUY at", fill_time, datetime.fromtimestamp(fill_time).astimezone(zoneNY), "at",i[1])
if BT_FILL_LOG_SURROUNDING_TRADES != 0:
if cfh.config_handler.get_val('BT_FILL_LOG_SURROUNDING_TRADES') != 0:
#TODO loguru
print("FILL SURR TRADES: before",work_range[index-BT_FILL_LOG_SURROUNDING_TRADES:index])
print("FILL SURR TRADES: fill and after",work_range[index:index+BT_FILL_LOG_SURROUNDING_TRADES])
print("FILL SURR TRADES: before",work_range[index-cfh.config_handler.get_val('BT_FILL_LOG_SURROUNDING_TRADES'):index])
print("FILL SURR TRADES: fill and after",work_range[index:index+cfh.config_handler.get_val('BT_FILL_LOG_SURROUNDING_TRADES')])
break
else:
consec_cnt = 0
@ -275,17 +277,18 @@ class Backtester:
#FILL CONDITION SETUP
fast_fill_condition = i[1] >= o.limit_price
slow_fill_condition = i[1] > o.limit_price
if BT_FILL_CONDITION_SELL_LIMIT == FillCondition.FAST:
fill_conf_sell_cfg = cfh.config_handler.get_val('BT_FILL_CONDITION_SELL_LIMIT')
if fill_conf_sell_cfg == FillCondition.FAST:
fill_condition = fast_fill_condition
elif BT_FILL_CONDITION_SELL_LIMIT == FillCondition.SLOW:
elif fill_conf_sell_cfg == FillCondition.SLOW:
fill_condition = slow_fill_condition
else:
print("unknown fill condition")
return -1
if float(i[0]) > float(order_min_fill_time+BT_DELAYS.limit_order_offset) and fill_condition:
if float(i[0]) > float(order_min_fill_time+cfh.config_handler.get_val('BT_DELAYS','limit_order_offset')) and fill_condition:
consec_cnt += 1
if consec_cnt == BT_FILL_CONS_TRADES_REQUIRED:
if consec_cnt == cfh.config_handler.get_val('BT_FILL_CONS_TRADES_REQUIRED'):
#(1679081919.381649, 27.88)
#ic(i)
fill_time = i[0]
@ -297,10 +300,11 @@ class Backtester:
#fill_price = i[1]
print("FILL LIMIT SELL at", fill_time, datetime.fromtimestamp(fill_time).astimezone(zoneNY), "at",i[1])
if BT_FILL_LOG_SURROUNDING_TRADES != 0:
surr_trades_cfg = cfh.config_handler.get_val('BT_FILL_LOG_SURROUNDING_TRADES')
if surr_trades_cfg != 0:
#TODO loguru
print("FILL SELL SURR TRADES: before",work_range[index-BT_FILL_LOG_SURROUNDING_TRADES:index])
print("FILL SELL SURR TRADES: fill and after",work_range[index:index+BT_FILL_LOG_SURROUNDING_TRADES])
print("FILL SELL SURR TRADES: before",work_range[index-surr_trades_cfg:index])
print("FILL SELL SURR TRADES: fill and after",work_range[index:index+surr_trades_cfg])
break
else:
consec_cnt = 0
@ -314,11 +318,16 @@ class Backtester:
#ic(i)
fill_time = i[0]
fill_price = i[1]
#add the MARKET PREMIUM from configuration (in the future possibly different for BUY/SELL, or from per-symbol configuration)
#add the MARKET PREMIUM from configuration (given in pct or abs) (in the future possibly different for BUY/SELL, or from per-symbol configuration)
cfg_premium = cfh.config_handler.get_val('BT_FILL_PRICE_MARKET_ORDER_PREMIUM')
if cfg_premium < 0: #configured as percentage
premium = abs(cfg_premium) * fill_price / 100.0
else: #configured as absolute value
premium = cfg_premium
if o.side == OrderSide.BUY:
fill_price = fill_price + BT_FILL_PRICE_MARKET_ORDER_PREMIUM
fill_price = fill_price + premium
elif o.side == OrderSide.SELL:
fill_price = fill_price - BT_FILL_PRICE_MARKET_ORDER_PREMIUM
fill_price = fill_price - premium
print("FILL ",o.side,"MARKET at", fill_time, datetime.fromtimestamp(fill_time).astimezone(zoneNY), "cena", i[1])
break
@ -367,7 +376,7 @@ class Backtester:
def _do_notification_with_callbacks(self, tradeupdate: TradeUpdate, time: float):
#the fill time (including notification latency) must be propagated into the callback so that any actions in the callback use this time
self.time = time + float(BT_DELAYS.fill_to_not)
self.time = time + float(cfh.config_handler.get_val('BT_DELAYS','fill_to_not'))
print("current bt.time",self.time)
#print("FILL NOTIFICATION: ", tradeupdate)
res = asyncio.run(self.order_fill_callback(tradeupdate))
@ -470,11 +479,11 @@ class Backtester:
print("BT: submit order entry")
if not time or time < 0:
print("time musi byt vyplneny")
printanyway("time musi byt vyplneny")
return -1
if not size or int(size) < 0:
print("size musi byt vetsi nez 0")
printanyway("size musi byt vetsi nez 0")
return -1
if (order_type != OrderType.MARKET) and (order_type != OrderType.LIMIT):
@ -482,11 +491,11 @@ class Backtester:
return -1
if not side == OrderSide.BUY and not side == OrderSide.SELL:
print("side buy/sell required")
printanyway("side buy/sell required")
return -1
if order_type == OrderType.LIMIT and count_decimals(price) > 2:
print("only 2 decimals supported", price)
printanyway("only 2 decimals supported", price)
return -1
#if the key does not exist in the account, create it
@ -508,14 +517,14 @@ class Backtester:
actual_minus_reserved = int(self.account[symbol][0]) - reserved
if actual_minus_reserved > 0 and actual_minus_reserved - int(size) < 0:
print("not enough shares available to sell or shorting while long position",self.account[symbol][0],"reserved",reserved,"available",int(self.account[symbol][0]) - reserved,"selling",size)
printanyway("not enough shares available to sell or shorting while long position",self.account[symbol][0],"reserved",reserved,"available",int(self.account[symbol][0]) - reserved,"selling",size)
return -1
#if is shorting - check available cash to short
if actual_minus_reserved <= 0:
cena = price if price else self.get_last_price(time, self.symbol)
if (self.cash - reserved_price - float(int(size)*float(cena))) < 0:
print("not enough cash for shorting. cash",self.cash,"reserved",reserved,"available",self.cash-reserved,"needed",float(int(size)*float(cena)))
printanyway("ERROR: not enough cash for shorting. cash",self.cash,"reserved",reserved,"available",self.cash-reserved,"needed",float(int(size)*float(cena)))
return -1
#check for available cash
@ -534,14 +543,14 @@ class Backtester:
#this is closing a short position
if actual_plus_reserved_qty < 0 and (actual_plus_reserved_qty + int(size)) > 0:
print("nejprve je treba uzavrit short pozici pro buy res_qty, size", actual_plus_reserved_qty, size)
printanyway("nejprve je treba uzavrit short pozici pro buy res_qty, size", actual_plus_reserved_qty, size)
return -1
#this is a standard long, checking cash
if actual_plus_reserved_qty >= 0:
cena = price if price else self.get_last_price(time, self.symbol)
if (self.cash - reserved_price - float(int(size)*float(cena))) < 0:
print("not enough cash to buy long. cash",self.cash,"reserved_qty",reserved_qty,"reserved_price",reserved_price, "available",self.cash-reserved_price,"needed",float(int(size)*float(cena)))
printanyway("ERROR: not enough cash to buy long. cash",self.cash,"reserved_qty",reserved_qty,"reserved_price",reserved_price, "available",self.cash-reserved_price,"needed",float(int(size)*float(cena)))
return -1
id = str(uuid4())
@ -568,11 +577,11 @@ class Backtester:
print("BT: replace order entry",id,size,price)
if not price and not size:
print("size or price required")
printanyway("size or price required")
return -1
if len(self.open_orders) == 0:
print("BT: order doesnt exist")
printanyway("BT: order doesnt exist")
return 0
#with lock:
for o in self.open_orders:
@ -600,7 +609,7 @@ class Backtester:
"""
print("BT: cancel order entry",id)
if len(self.open_orders) == 0:
print("BTC: order doesnt exist")
printanyway("BTC: order doesnt exist")
return 0
#with lock:
for o in self.open_orders:
@ -820,10 +829,10 @@ class Backtester:
Trades:''' + str(len(self.trades)))
textik8 = html.Div('''
Profit:''' + str(state.profit))
textik9 = html.Div(f"{BT_FILL_CONS_TRADES_REQUIRED=}")
textik10 = html.Div(f"{BT_FILL_LOG_SURROUNDING_TRADES=}")
textik11 = html.Div(f"{BT_FILL_CONDITION_BUY_LIMIT=}")
textik12 = html.Div(f"{BT_FILL_CONDITION_SELL_LIMIT=}")
textik9 = html.Div(f"{cfh.config_handler.get_val('BT_FILL_CONS_TRADES_REQUIRED')=}")
textik10 = html.Div(f"{cfh.config_handler.get_val('BT_FILL_LOG_SURROUNDING_TRADES')=}")
textik11 = html.Div(f"{cfh.config_handler.get_val('BT_FILL_CONDITION_BUY_LIMIT')=}")
textik12 = html.Div(f"{cfh.config_handler.get_val('BT_FILL_CONDITION_SELL_LIMIT')=}")
orders_title = dcc.Markdown('## Open orders')
trades_title = dcc.Markdown('## Trades')


@ -1,11 +1,8 @@
from v2realbot.config import DATA_DIR
import sqlite3
import queue
import threading
import time
from v2realbot.common.model import RunArchive, RunArchiveView
from datetime import datetime
import orjson
from v2realbot.config import DATA_DIR
sqlite_db_file = DATA_DIR + "/v2trading.db"
# Define the connection pool
@ -31,7 +28,7 @@ class ConnectionPool:
return connection
def execute_with_retry(cursor: sqlite3.Cursor, statement: str, params = None, retry_interval: int = 1) -> sqlite3.Cursor:
def execute_with_retry(cursor: sqlite3.Cursor, statement: str, params = None, retry_interval: int = 2) -> sqlite3.Cursor:
"""get connection from pool and execute SQL statement with retry logic if required.
Args:
@ -61,52 +58,3 @@ pool = ConnectionPool(10)
#for one shared connection (used for writes only in WAL mode)
insert_conn = sqlite3.connect(sqlite_db_file, check_same_thread=False)
insert_queue = queue.Queue()
#converts a row dict back into an object, including re-typing
def row_to_runarchiveview(row: dict) -> RunArchiveView:
return RunArchive(
id=row['runner_id'],
strat_id=row['strat_id'],
batch_id=row['batch_id'],
symbol=row['symbol'],
name=row['name'],
note=row['note'],
started=datetime.fromisoformat(row['started']) if row['started'] else None,
stopped=datetime.fromisoformat(row['stopped']) if row['stopped'] else None,
mode=row['mode'],
account=row['account'],
bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
ilog_save=bool(row['ilog_save']),
profit=float(row['profit']),
trade_count=int(row['trade_count']),
end_positions=int(row['end_positions']),
end_positions_avgp=float(row['end_positions_avgp']),
metrics=orjson.loads(row['metrics']) if row['metrics'] else None
)
#converts a row dict back into an object, including re-typing
def row_to_runarchive(row: dict) -> RunArchive:
return RunArchive(
id=row['runner_id'],
strat_id=row['strat_id'],
batch_id=row['batch_id'],
symbol=row['symbol'],
name=row['name'],
note=row['note'],
started=datetime.fromisoformat(row['started']) if row['started'] else None,
stopped=datetime.fromisoformat(row['stopped']) if row['stopped'] else None,
mode=row['mode'],
account=row['account'],
bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
strat_json=orjson.loads(row['strat_json']),
settings=orjson.loads(row['settings']),
ilog_save=bool(row['ilog_save']),
profit=float(row['profit']),
trade_count=int(row['trade_count']),
end_positions=int(row['end_positions']),
end_positions_avgp=float(row['end_positions_avgp']),
metrics=orjson.loads(row['metrics']),
stratvars_toml=row['stratvars_toml']
)


@ -1,13 +1,16 @@
from uuid import UUID
from uuid import UUID, uuid4
from alpaca.trading.enums import OrderSide, OrderStatus, TradeEvent,OrderType
#from utils import AttributeDict
from rich import print
from typing import Any, Optional, List, Union
from datetime import datetime, date
from pydantic import BaseModel
from v2realbot.enums.enums import Mode, Account
from pydantic import BaseModel, Field
from v2realbot.enums.enums import Mode, Account, SchedulerStatus, Moddus, Market
from alpaca.data.enums import Exchange
#models for server side datatables
# Model for individual column data
class ColumnData(BaseModel):
@ -91,12 +94,12 @@ class TestList(BaseModel):
class Trade(BaseModel):
symbol: str
timestamp: datetime
exchange: Optional[Union[Exchange, str]]
exchange: Optional[Union[Exchange, str]] = None
price: float
size: float
id: int
conditions: Optional[List[str]]
tape: Optional[str]
conditions: Optional[List[str]] = None
tape: Optional[str] = None
#persisted object in pickle
@ -111,8 +114,20 @@ class StrategyInstance(BaseModel):
close_rush: int = 0
stratvars_conf: str
add_data_conf: str
note: Optional[str]
history: Optional[str]
note: Optional[str] = None
history: Optional[str] = None
def __setstate__(self, state: dict[Any, Any]) -> None:
"""
Hack to allow unpickling models stored from pydantic V1
"""
state.setdefault("__pydantic_extra__", {})
state.setdefault("__pydantic_private__", {})
if "__pydantic_fields_set__" not in state:
state["__pydantic_fields_set__"] = state.get("__fields_set__")
super().__setstate__(state)
class RunRequest(BaseModel):
id: UUID
@ -122,8 +137,8 @@ class RunRequest(BaseModel):
debug: bool = False
strat_json: Optional[str] = None
ilog_save: bool = False
bt_from: datetime = None
bt_to: datetime = None
bt_from: Optional[datetime] = None
bt_to: Optional[datetime] = None
#weekdays filter
#if provided, we filter to these days
weekdays_filter: Optional[list] = None
@ -134,7 +149,34 @@ class RunRequest(BaseModel):
cash: int = 100000
skip_cache: Optional[bool] = False
#Class extending RunRequest, used by the scheduler; it only adds a few extra fields
class RunManagerRecord(BaseModel):
moddus: Moddus
id: UUID = Field(default_factory=uuid4)
strat_id: UUID
symbol: Optional[str] = None
account: Account
mode: Mode
note: Optional[str] = None
ilog_save: bool = False
market: Optional[Market] = Market.US
bt_from: Optional[datetime] = None
bt_to: Optional[datetime] = None
#weekdays filter
#if provided, we filter to these days
weekdays_filter: Optional[list] = None #list of strings 0-6 representing days to run
#ID GENERATED per run; it ties together all runners in a batch run
batch_id: Optional[str] = None
testlist_id: Optional[str] = None
start_time: str #time (HH:MM) that start function is called
stop_time: Optional[str] = None #time (HH:MM) that stop function is called
status: SchedulerStatus
last_processed: Optional[datetime] = None
history: Optional[str] = None
valid_from: Optional[datetime] = None # US East time zone datetime
valid_to: Optional[datetime] = None # US East time zone datetime
runner_id: Optional[UUID] = None #last runner_id from the scheduler after the strategy is started
strat_running: Optional[bool] = None #automatically updated field based on the status of runner_id above; it is added by row_to_runmanager
class RunnerView(BaseModel):
id: UUID
strat_id: UUID
@ -164,10 +206,10 @@ class Runner(BaseModel):
run_name: Optional[str] = None
run_note: Optional[str] = None
run_ilog_save: Optional[bool] = False
run_trade_count: Optional[int]
run_profit: Optional[float]
run_positions: Optional[int]
run_avgp: Optional[float]
run_trade_count: Optional[int] = None
run_profit: Optional[float] = None
run_positions: Optional[int] = None
run_avgp: Optional[float] = None
run_strat_json: Optional[str] = None
run_stopped: Optional[datetime] = None
run_paused: Optional[datetime] = None
@ -201,41 +243,41 @@ class Bar(BaseModel):
low: float
close: float
volume: float
trade_count: Optional[float]
vwap: Optional[float]
trade_count: Optional[float] = 0
vwap: Optional[float] = 0
class Order(BaseModel):
id: UUID
submitted_at: datetime
filled_at: Optional[datetime]
canceled_at: Optional[datetime]
filled_at: Optional[datetime] = None
canceled_at: Optional[datetime] = None
symbol: str
qty: int
status: OrderStatus
order_type: OrderType
filled_qty: Optional[int]
filled_avg_price: Optional[float]
filled_qty: Optional[int] = None
filled_avg_price: Optional[float] = None
side: OrderSide
limit_price: Optional[float]
limit_price: Optional[float] = None
#entita pro kazdy kompletni FILL, je navazana na prescribed_trade
class TradeUpdate(BaseModel):
event: Union[TradeEvent, str]
execution_id: Optional[UUID]
execution_id: Optional[UUID] = None
order: Order
timestamp: datetime
position_qty: Optional[float]
price: Optional[float]
qty: Optional[float]
value: Optional[float]
cash: Optional[float]
pos_avg_price: Optional[float]
profit: Optional[float]
profit_sum: Optional[float]
rel_profit: Optional[float]
rel_profit_cum: Optional[float]
signal_name: Optional[str]
prescribed_trade_id: Optional[str]
position_qty: Optional[float] = None
price: Optional[float] = None
qty: Optional[float] = None
value: Optional[float] = None
cash: Optional[float] = None
pos_avg_price: Optional[float] = None
profit: Optional[float] = None
profit_sum: Optional[float] = None
rel_profit: Optional[float] = None
rel_profit_cum: Optional[float] = None
signal_name: Optional[str] = None
prescribed_trade_id: Optional[str] = None
class RunArchiveChange(BaseModel):
@ -260,8 +302,7 @@ class RunArchive(BaseModel):
bt_from: Optional[datetime] = None
bt_to: Optional[datetime] = None
strat_json: Optional[str] = None
##to be decommissioned, replaced by stratvars_toml
stratvars: Optional[dict] = None
transferables: Optional[dict] = None #variables that are transferable to the next run
settings: Optional[dict] = None
ilog_save: Optional[bool] = False
profit: float = 0
@ -291,6 +332,8 @@ class RunArchiveView(BaseModel):
end_positions: int = 0
end_positions_avgp: float = 0
metrics: Union[dict, str] = None
batch_profit: float = 0 # Total profit for the batch - now calculated during query
batch_count: int = 0 # Count of runs in the batch - now calculated during query
#same but with pagination
class RunArchiveViewPagination(BaseModel):
@ -301,7 +344,7 @@ class RunArchiveViewPagination(BaseModel):
#trida pro ukladani historie stoplossy do ext_data
class SLHistory(BaseModel):
id: Optional[UUID]
id: Optional[UUID] = None
time: datetime
sl_val: float
@ -314,7 +357,7 @@ class RunArchiveDetail(BaseModel):
indicators: List[dict]
statinds: dict
trades: List[TradeUpdate]
ext_data: Optional[dict]
ext_data: Optional[dict] = None
class InstantIndicator(BaseModel):

View File

@ -0,0 +1,87 @@
from v2realbot.common.model import RunArchive, RunArchiveView, RunManagerRecord
from datetime import datetime
import orjson
import v2realbot.controller.services as cs
#converts a row dict back into an object, including re-typing
def row_to_runmanager(row: dict) -> RunManagerRecord:
is_running = cs.is_runner_running(row['runner_id']) if row['runner_id'] else False
res = RunManagerRecord(
moddus=row['moddus'],
id=row['id'],
strat_id=row['strat_id'],
symbol=row['symbol'],
mode=row['mode'],
account=row['account'],
note=row['note'],
ilog_save=bool(row['ilog_save']),
market=row['market'],
bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
weekdays_filter=[int(x) for x in row['weekdays_filter'].split(',')] if row['weekdays_filter'] else [],
batch_id=row['batch_id'],
testlist_id=row['testlist_id'],
start_time=row['start_time'],
stop_time=row['stop_time'],
status=row['status'],
#last_started=zoneNY.localize(datetime.fromisoformat(row['last_started'])) if row['last_started'] else None,
last_processed=datetime.fromisoformat(row['last_processed']) if row['last_processed'] else None,
history=row['history'],
valid_from=datetime.fromisoformat(row['valid_from']) if row['valid_from'] else None,
valid_to=datetime.fromisoformat(row['valid_to']) if row['valid_to'] else None,
runner_id = row['runner_id'] if row['runner_id'] and is_running else None, #runner_id is only present if it is running
strat_running = is_running) #cannot be relied on when called from a separate process, as it may not be current
return res
#converts a row dict back into an object, including re-typing
def row_to_runarchiveview(row: dict) -> RunArchiveView:
a = RunArchiveView(
id=row['runner_id'],
strat_id=row['strat_id'],
batch_id=row['batch_id'],
symbol=row['symbol'],
name=row['name'],
note=row['note'],
started=datetime.fromisoformat(row['started']) if row['started'] else None,
stopped=datetime.fromisoformat(row['stopped']) if row['stopped'] else None,
mode=row['mode'],
account=row['account'],
bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
ilog_save=bool(row['ilog_save']),
profit=float(row['profit']),
trade_count=int(row['trade_count']),
end_positions=int(row['end_positions']),
end_positions_avgp=float(row['end_positions_avgp']),
metrics=orjson.loads(row['metrics']) if row['metrics'] else None,
batch_profit=float(row['batch_profit']) if row['batch_profit'] and row['batch_id'] else 0,
batch_count=int(row['batch_count']) if row['batch_count'] and row['batch_id'] else 0,
)
return a
#converts a row dict back into an object, including re-typing
def row_to_runarchive(row: dict) -> RunArchive:
return RunArchive(
id=row['runner_id'],
strat_id=row['strat_id'],
batch_id=row['batch_id'],
symbol=row['symbol'],
name=row['name'],
note=row['note'],
started=datetime.fromisoformat(row['started']) if row['started'] else None,
stopped=datetime.fromisoformat(row['stopped']) if row['stopped'] else None,
mode=row['mode'],
account=row['account'],
bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
strat_json=orjson.loads(row['strat_json']),
settings=orjson.loads(row['settings']),
ilog_save=bool(row['ilog_save']),
profit=float(row['profit']),
trade_count=int(row['trade_count']),
end_positions=int(row['end_positions']),
end_positions_avgp=float(row['end_positions_avgp']),
metrics=orjson.loads(row['metrics']),
stratvars_toml=row['stratvars_toml'],
transferables=orjson.loads(row['transferables']) if row['transferables'] else None
)
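One pattern worth calling out in the converters above: orjson.loads raises on None, so nullable JSON TEXT columns need the guard that 'metrics' and 'transferables' get. A tiny hypothetical helper (the name is illustrative, not part of the codebase) capturing it:

import orjson

def loads_or_none(raw):
    # guard nullable JSON columns the way 'transferables' is guarded above
    return orjson.loads(raw) if raw else None

print(loads_or_none(None))         # -> None
print(loads_or_none('{"a": 1}'))   # -> {'a': 1}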

View File

@ -2,66 +2,42 @@ from alpaca.data.enums import DataFeed
from v2realbot.enums.enums import Mode, Account, FillCondition
from appdirs import user_data_dir
from pathlib import Path
import os
from collections import defaultdict
from dotenv import load_dotenv
# Global flag to track if the ml module has been imported (solution for long import times of tensorflow)
#the first occurrence of its use will load it globally
_ml_module_loaded = False
#directory for generated images and basic reports
MEDIA_DIRECTORY = Path(__file__).parent.parent.parent / "media"
VBT_DOC_DIRECTORY = Path(__file__).parent.parent.parent / "vbt-doc" #directory for vbt doc
RUNNER_DETAIL_DIRECTORY = Path(__file__).parent.parent.parent / "runner_detail"
#location of strat.log - fetched by the GUI
LOG_PATH = Path(__file__).parent.parent
LOG_FILE = Path(__file__).parent.parent / "strat.log"
JOB_LOG_FILE = Path(__file__).parent.parent / "job.log"
DOTENV_DIRECTORY = Path(__file__).parent.parent.parent
ENV_FILE = DOTENV_DIRECTORY / '.env'
#'0.0.0.0',
#currently only the prod server has access to LIVE
PROD_SERVER_HOSTNAMES = ['tradingeastcoast','David-MacBook-Pro.local'] #,'David-MacBook-Pro.local'
TEST_SERVER_HOSTNAMES = ['tradingtest']
#TODO move selected values into the config db and manage them via the GUI
#DEFAULT AGGREGATOR filter trades
#NOTE F added - Intermarket Sweep Order - it occasionally created spikes
AGG_EXCLUDED_TRADES = ['C','O','4','B','7','V','P','W','U','Z','F']
OFFLINE_MODE = False
# ilog lvls = 0,1 - 0 debug, 1 info
ILOG_SAVE_LEVEL_FROM = 1
#minimum distance between trades that the aggregator lets through for CBAR (0.001 - blocks anything under 1ms)
GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN = 0.003
#normalized price for tick 0.01
NORMALIZED_TICK_BASE_PRICE = 30.00
LOG_RUNNER_EVENTS = False
#no print in console
QUIET_MODE = True
#how many consecutive trades with the fill price are necessary for LIMIT fill to happen in backtesting
#0 - optimistic, every wick high will fill the order
#N - N consecutive trades required
#not implemented yet
#minimum is 1; on Alpaca live it usually comes out to 7-8 for BAC, very close to the time before the price moves past half a cent, i.e. 7-8 or FillCondition.SLOW
BT_FILL_CONS_TRADES_REQUIRED = 2
#during bt trade execution logs X-surrounding trades of the one that triggers the fill
BT_FILL_LOG_SURROUNDING_TRADES = 10
#fill condition for limit order in bt
# fast - price has to be equal or bigger <=
# slow - price has to be bigger <
BT_FILL_CONDITION_BUY_LIMIT = FillCondition.SLOW
BT_FILL_CONDITION_SELL_LIMIT = FillCondition.SLOW
#TBD TODO not implemented yet
BT_FILL_PRICE_MARKET_ORDER_PREMIUM = 0.005
#backend counter of api requests
COUNT_API_REQUESTS = False
#stratvars that cannot be changed in gui
STRATVARS_UNCHANGEABLES = ['pendingbuys', 'blockbuy', 'jevylozeno', 'limitka']
DATA_DIR = user_data_dir("v2realbot")
DATA_DIR = user_data_dir("v2realbot", False)
MODEL_DIR = Path(DATA_DIR)/"models"
#BT DELAYS
#profiling
PROFILING_NEXT_ENABLED = False
PROFILING_OUTPUT_DIR = DATA_DIR
#FILL CONFIGURATION CLASS FOR BACKTESTING
#LOAD DOTENV ENV VARIABLES
if load_dotenv(ENV_FILE, verbose=True) is False:
print(f"Error loading.env file {ENV_FILE}. Now depending on ENV VARIABLES set externally.")
else:
print(f"Loaded env variables from file {ENV_FILE}")
#WIP
#WIP - FILL CONFIGURATION CLASS FOR BACKTESTING
class BT_FILL_CONF:
""""
Trida pro konfiguraci backtesting fillu pro dany symbol, pokud neexistuje tak fallback na obecny viz vyse-
@ -75,24 +51,6 @@ class BT_FILL_CONF:
self.BT_FILL_CONDITION_SELL_LIMIT=BT_FILL_CONDITION_SELL_LIMIT
self.BT_FILL_PRICE_MARKET_ORDER_PREMIUM=BT_FILL_PRICE_MARKET_ORDER_PREMIUM
""""
LATENCY DELAYS for LIVE eastcoast
.000 trigger - last_trade_time (.4246266)
+.020 entry into strategy and BUY (.444606)
+.023 submitted (.469198)
+.008 filled (.476695552)
+.023 fill notification (.499888)
"""
#TODO rename the delay variables to something more descriptive and generic
class BT_DELAYS:
trigger_to_strat: float = 0.020
strat_to_sub: float = 0.023
sub_to_fill: float = 0.008
fill_to_not: float = 0.023
#fill in according to live
limit_order_offset: float = 0
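Summing the per-hop delays above gives the full simulated trigger-to-notification round trip, a quick sanity check against the measured .4246266 -> .499888 span (~0.075s):

# total simulated latency from trigger to fill notification
total = 0.020 + 0.023 + 0.008 + 0.023
print(f"{total:.3f}s")  # -> 0.074s, close to the ~0.075s measured live above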
class Keys:
def __init__(self, api_key, secret_key, paper, feed) -> None:
self.API_KEY = api_key
@ -101,7 +59,8 @@ class Keys:
self.FEED = feed
# returns an object based on mode (PAPER, LIVE)
# containing the keys for connecting to Alpaca
# containing the keys for connecting to Alpaca - used for the Trading API and order-update websockets (credentials relevant per strategy)
#for real-time data, LIVE_DATA_API_KEY, LIVE_DATA_SECRET_KEY and LIVE_DATA_FEED below are used - since that is a server-wide setting
def get_key(mode: Mode, account: Account):
if mode not in [Mode.PAPER, Mode.LIVE]:
print("has to be LIVE or PAPER only")
@ -120,36 +79,85 @@ def get_key(mode: Mode, account: Account):
#strategy instance main loop heartbeat
HEARTBEAT_TIMEOUT=5
WEB_API_KEY="david"
WEB_API_KEY=os.environ.get('WEB_API_KEY')
#PRIMARY PAPER
ACCOUNT1_PAPER_API_KEY = 'PKGGEWIEYZOVQFDRY70L'
ACCOUNT1_PAPER_SECRET_KEY = 'O5Kt8X4RLceIOvM98i5LdbalItsX7hVZlbPYHy8Y'
ACCOUNT1_PAPER_API_KEY = os.environ.get('ACCOUNT1_PAPER_API_KEY')
ACCOUNT1_PAPER_SECRET_KEY = os.environ.get('ACCOUNT1_PAPER_SECRET_KEY')
ACCOUNT1_PAPER_MAX_BATCH_SIZE = 1
ACCOUNT1_PAPER_PAPER = True
ACCOUNT1_PAPER_FEED = DataFeed.SIP
#ACCOUNT1_PAPER_FEED = DataFeed.SIP
# Load the data feed type from environment variable
data_feed_type_str = os.environ.get('ACCOUNT1_PAPER_FEED', 'iex') # Default to 'iex' if not set
# Convert the string to DataFeed enum
try:
ACCOUNT1_PAPER_FEED = DataFeed(data_feed_type_str)
except ValueError:
# Handle the case where the environment variable does not match any enum member
print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT1_PAPER_FEED defaulting to 'iex'")
ACCOUNT1_PAPER_FEED = DataFeed.SIP
#PRIMARY LIVE
ACCOUNT1_LIVE_API_KEY = 'AKB5HD32LPDZC9TPUWJT'
ACCOUNT1_LIVE_SECRET_KEY = 'Xq1wPSNOtwmlMTAd4cEmdKvNDgfcUYfrOaCccaAs'
ACCOUNT1_LIVE_API_KEY = os.environ.get('ACCOUNT1_LIVE_API_KEY')
ACCOUNT1_LIVE_SECRET_KEY = os.environ.get('ACCOUNT1_LIVE_SECRET_KEY')
ACCOUNT1_LIVE_MAX_BATCH_SIZE = 1
ACCOUNT1_LIVE_PAPER = False
ACCOUNT1_LIVE_FEED = DataFeed.SIP
#ACCOUNT1_LIVE_FEED = DataFeed.SIP
# Load the data feed type from environment variable
data_feed_type_str = os.environ.get('ACCOUNT1_LIVE_FEED', 'iex') # Default to 'iex' if not set
# Convert the string to DataFeed enum
try:
ACCOUNT1_LIVE_FEED = DataFeed(data_feed_type_str)
except ValueError:
# Handle the case where the environment variable does not match any enum member
print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT1_LIVE_FEED defaulting to 'iex'")
ACCOUNT1_LIVE_FEED = DataFeed.IEX
#SECONDARY PAPER - Martin
ACCOUNT2_PAPER_API_KEY = 'PKPDTCQLNHCBC2D9GQFB'
ACCOUNT2_PAPER_SECRET_KEY = 'c1Z2V0gBleQmwHYCreqqTs45Jy33RqPGrofuSayz'
ACCOUNT2_PAPER_API_KEY = os.environ.get('ACCOUNT2_PAPER_API_KEY')
ACCOUNT2_PAPER_SECRET_KEY = os.environ.get('ACCOUNT2_PAPER_SECRET_KEY')
ACCOUNT2_PAPER_MAX_BATCH_SIZE = 1
ACCOUNT2_PAPER_PAPER = True
ACCOUNT2_PAPER_FEED = DataFeed.IEX
#ACCOUNT2_PAPER_FEED = DataFeed.IEX
# #SECONDARY PAPER
# ACCOUNT2_PAPER_API_KEY = 'PK0OQHZG03PUZ1SC560V'
# ACCOUNT2_PAPER_SECRET_KEY = 'cTglhm7kwRcZfFT27fQWz31sXaxadzQApFDW6Lat'
# ACCOUNT2_PAPER_MAX_BATCH_SIZE = 1
# ACCOUNT2_PAPER_PAPER = True
# ACCOUNT2_PAPER_FEED = DataFeed.IEX
# Load the data feed type from environment variable
data_feed_type_str = os.environ.get('ACCOUNT2_PAPER_FEED', 'iex') # Default to 'iex' if not set
# Convert the string to DataFeed enum
try:
ACCOUNT2_PAPER_FEED = DataFeed(data_feed_type_str)
except ValueError:
# Handle the case where the environment variable does not match any enum member
print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT2_PAPER_FEED defaulting to 'iex'")
ACCOUNT2_PAPER_FEED = DataFeed.IEX
#SECONDARY LIVE - Martin
# ACCOUNT2_LIVE_API_KEY = os.environ.get('ACCOUNT2_LIVE_API_KEY')
# ACCOUNT2_LIVE_SECRET_KEY = os.environ.get('ACCOUNT2_LIVE_SECRET_KEY')
# ACCOUNT2_LIVE_MAX_BATCH_SIZE = 1
# ACCOUNT2_LIVE_PAPER = True
# #ACCOUNT2_LIVE_FEED = DataFeed.IEX
# # Load the data feed type from environment variable
# data_feed_type_str = os.environ.get('ACCOUNT2_LIVE_FEED', 'iex') # Default to 'iex' if not set
# # Convert the string to DataFeed enum
# try:
# ACCOUNT2_LIVE_FEED = DataFeed(data_feed_type_str)
# except ValueError:
# # Handle the case where the environment variable does not match any enum member
# print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT2_LIVE_FEED defaulting to 'iex'")
# ACCOUNT2_LIVE_FEED = DataFeed.IEX
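The env-to-DataFeed conversion above is repeated verbatim per account; a hypothetical helper (not part of this commit) that would factor it out:

import os
from alpaca.data.enums import DataFeed

def feed_from_env(var_name: str, default: str = "iex") -> DataFeed:
    # read the env var and fall back to `default` when unset or invalid
    raw = os.environ.get(var_name, default)
    try:
        return DataFeed(raw)
    except ValueError:
        print(f"Invalid data feed type: {raw} in {var_name}, defaulting to '{default}'")
        return DataFeed(default)

# e.g.: ACCOUNT1_PAPER_FEED = feed_from_env('ACCOUNT1_PAPER_FEED')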
#for now LIVE_DATA is set from account1_paper
LIVE_DATA_API_KEY = ACCOUNT1_PAPER_API_KEY
LIVE_DATA_SECRET_KEY = ACCOUNT1_PAPER_SECRET_KEY
#LIVE_DATA_FEED is set in config_handler
class KW:
activate: str = "activate"

View File

@ -0,0 +1,112 @@
import v2realbot.common.db as db
from v2realbot.common.model import ConfigItem
import v2realbot.utils.config_handler as ch
# region CONFIG db services
#TODO create a module for fetching from Python (get_from_config(var_name, def_value)) - same as in js
#TODO consider moving from JSON to TOML
def get_all_config_items():
conn = db.pool.get_connection()
try:
cursor = conn.cursor()
cursor.execute('SELECT id, item_name, json_data FROM config_table')
config_items = [{"id": row[0], "item_name": row[1], "json_data": row[2]} for row in cursor.fetchall()]
finally:
db.pool.release_connection(conn)
return 0, config_items
# Function to get a config item by ID
def get_config_item_by_id(item_id):
conn = db.pool.get_connection()
try:
cursor = conn.cursor()
cursor.execute('SELECT item_name, json_data FROM config_table WHERE id = ?', (item_id,))
row = cursor.fetchone()
finally:
db.pool.release_connection(conn)
if row is None:
return -2, "not found"
else:
return 0, {"item_name": row[0], "json_data": row[1]}
# Function to get a config item by ID
def get_config_item_by_name(item_name):
#print(item_name)
conn = db.pool.get_connection()
try:
cursor = conn.cursor()
query = f"SELECT item_name, json_data FROM config_table WHERE item_name = '{item_name}'"
#print(query)
cursor.execute(query)
row = cursor.fetchone()
#print(row)
finally:
db.pool.release_connection(conn)
if row is None:
return -2, "not found"
else:
return 0, {"item_name": row[0], "json_data": row[1]}
# Function to create a new config item
def create_config_item(config_item: ConfigItem):
conn = db.pool.get_connection()
try:
try:
cursor = conn.cursor()
cursor.execute('INSERT INTO config_table (item_name, json_data) VALUES (?, ?)', (config_item.item_name, config_item.json_data))
item_id = cursor.lastrowid
conn.commit()
print(item_id)
finally:
db.pool.release_connection(conn)
return 0, {"id": item_id, "item_name":config_item.item_name, "json_data":config_item.json_data}
except Exception as e:
return -2, str(e)
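Hypothetical usage of the helper above, following the (status, payload) convention these services share (the item values are examples only):

ret, res = create_config_item(ConfigItem(item_name="active_profile", json_data='{"profile": "default"}'))
if ret < 0:
    print("Error:", res)
else:
    print("Created:", res)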
# Function to update a config item by ID
def update_config_item(item_id, config_item: ConfigItem):
conn = db.pool.get_connection()
try:
try:
cursor = conn.cursor()
cursor.execute('UPDATE config_table SET item_name = ?, json_data = ? WHERE id = ?', (config_item.item_name, config_item.json_data, item_id))
conn.commit()
#refreshing the active item is for now hard-wired like this: on update of the "active_profile" item and at application startup
if config_item.item_name == "active_profile":
ch.config_handler.activate_profile()
finally:
db.pool.release_connection(conn)
return 0, {"id": item_id, **config_item.dict()}
except Exception as e:
return -2, str(e)
# Function to delete a config item by ID
def delete_config_item(item_id):
conn = db.pool.get_connection()
try:
cursor = conn.cursor()
cursor.execute('DELETE FROM config_table WHERE id = ?', (item_id,))
conn.commit()
finally:
db.pool.release_connection(conn)
return 0, {"id": item_id}
# endregion
#Example of using config directive
# config_directive = "overrides"
# ret, res = get_config_item_by_name(config_directive)
# if ret < 0:
# print(f"CONFIG OVERRIDE {config_directive} Error {res}")
# else:
# config = orjson.loads(res["json_data"])
# print("OVERRIDN CFG:", config)
# for key, value in config.items():
# if hasattr(cfg, key):
# print(f"Overriding {key} with {value}")
# setattr(cfg, key, value)
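A runnable version of the override sketch above, using a stand-in namespace instead of the real cfg module:

import orjson
from types import SimpleNamespace

cfg = SimpleNamespace(QUIET_MODE=True, OFFLINE_MODE=False)  # stand-in for the config module
json_data = b'{"QUIET_MODE": false}'                        # stand-in for res["json_data"]
for key, value in orjson.loads(json_data).items():
    if hasattr(cfg, key):
        print(f"Overriding {key} with {value}")
        setattr(cfg, key, value)
print(cfg.QUIET_MODE)  # -> False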

View File

@ -0,0 +1,463 @@
from typing import Any, List, Tuple
from uuid import UUID, uuid4
from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunDay, StrategyInstance, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest
from v2realbot.utils.utils import validate_and_format_time, AttributeDict, zoneNY, zonePRG, safe_get, dict_replace_value, Store, parse_toml_string, json_serial, is_open_hours, send_to_telegram, concatenate_weekdays, transform_data
from v2realbot.utils.ilog import delete_logs
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus, TradeStoplossType
from datetime import datetime
from v2realbot.loader.trade_offline_streamer import Trade_Offline_Streamer
from threading import Thread, current_thread, Event, enumerate
from v2realbot.config import STRATVARS_UNCHANGEABLES, ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, DATA_DIR,MEDIA_DIRECTORY, RUNNER_DETAIL_DIRECTORY
import importlib
from alpaca.trading.requests import GetCalendarRequest
from alpaca.trading.client import TradingClient
#from alpaca.trading.models import Calendar
from queue import Queue
from tinydb import TinyDB, Query, where
from tinydb.operations import set
import orjson
import numpy as np
from rich import print
import pandas as pd
from traceback import format_exc
from datetime import timedelta, time
from threading import Lock
import v2realbot.common.db as db
import v2realbot.common.transform as tr
from sqlite3 import OperationalError, Row
import v2realbot.strategyblocks.indicators.custom as ci
from v2realbot.strategyblocks.inits.init_indicators import initialize_dynamic_indicators
from v2realbot.strategyblocks.indicators.indicators_hub import populate_dynamic_indicators
from v2realbot.interfaces.backtest_interface import BacktestInterface
import os
import v2realbot.reporting.metricstoolsimage as mt
import gzip
import os
import msgpack
import v2realbot.controller.services as cs
import v2realbot.scheduler.ap_scheduler as aps
# Functions for your 'run_manager' table
# CREATE TABLE "run_manager" (
# "moddus" TEXT NOT NULL,
# "id" varchar(32),
# "strat_id" varchar(32) NOT NULL,
# "symbol" TEXT,
# "account" TEXT NOT NULL,
# "mode" TEXT NOT NULL,
# "note" TEXT,
# "ilog_save" BOOLEAN,
# "bt_from" TEXT,
# "bt_to" TEXT,
# "weekdays_filter" TEXT,
# "batch_id" TEXT,
# "start_time" TEXT NOT NULL,
# "stop_time" TEXT NOT NULL,
# "status" TEXT NOT NULL,
# "last_processed" TEXT,
# "history" TEXT,
# "valid_from" TEXT,
# "valid_to" TEXT,
# "testlist_id" TEXT,
# "runner_id" varchar2(32),
# PRIMARY KEY("id")
# )
# CREATE INDEX idx_moddus ON run_manager (moddus);
# CREATE INDEX idx_status ON run_manager (status);
# CREATE INDEX idx_status_moddus ON run_manager (status, moddus);
# CREATE INDEX idx_valid_from_to ON run_manager (valid_from, valid_to);
# CREATE INDEX idx_stopped_batch_id ON runner_header (stopped, batch_id);
# CREATE INDEX idx_search_value ON runner_header (strat_id, batch_id);
##weekdays are stored as comma separated values
# Fetching (assume 'weekdays' field is a comma-separated string)
# weekday_str = record['weekdays']
# weekdays = [int(x) for x in weekday_str.split(',')]
# # ... logic to check whether today's weekday is in 'weekdays'
# # Storing
# weekdays = [1, 2, 5] # Example
# weekday_str = ",".join(str(x) for x in weekdays)
# update_data = {'weekdays': weekday_str}
# # ... use in an SQL UPDATE statement
# for row in records:
# row['weekdays_filter'] = [int(x) for x in row['weekdays_filter'].split(',')] if row['weekdays_filter'] else []
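A runnable version of the round trip sketched in the comments above:

from datetime import datetime

weekdays = [1, 2, 5]
weekday_str = ",".join(str(x) for x in weekdays)                   # stored form: "1,2,5"
parsed = [int(x) for x in weekday_str.split(',')] if weekday_str else []
print(datetime.now().weekday() in parsed)  # is today's weekday in the filter?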
#example: get stratin info and inspect the returned tuple
# strat : StrategyInstance = None
# result, strat = cs.get_stratin("625760ac-6376-47fa-8989-1e6a3f6ab66a")
# if result == 0:
# print(strat)
# else:
# print("Error:", strat)
# Fetch all
#result, records = fetch_all_run_manager_records()
#TODO consider extending the output with strat_status (running/stopped)
def fetch_all_run_manager_records() -> list[RunManagerRecord]:
conn = db.pool.get_connection()
try:
conn.row_factory = Row
cursor = conn.cursor()
cursor.execute('SELECT * FROM run_manager')
rows = cursor.fetchall()
results = []
#Transform row to object
for row in rows:
#add transformed object into result list
results.append(tr.row_to_runmanager(row))
return 0, results
finally:
conn.row_factory = None
db.pool.release_connection(conn)
# Fetch by strategy_id
# result, record = fetch_run_manager_record_by_id('625760ac-6376-47fa-8989-1e6a3f6ab66a')
def fetch_run_manager_record_by_id(strategy_id) -> RunManagerRecord:
conn = db.pool.get_connection()
try:
conn.row_factory = Row
cursor = conn.cursor()
cursor.execute('SELECT * FROM run_manager WHERE id = ?', (str(strategy_id),))
row = cursor.fetchone()
if row is None:
return -2, "not found"
else:
return 0, tr.row_to_runmanager(row)
except Exception as e:
print("ERROR while fetching all records:", str(e) + format_exc())
return -2, str(e) + format_exc()
finally:
conn.row_factory = None
db.pool.release_connection(conn)
def add_run_manager_record(new_record: RunManagerRecord):
#validation/standardization of time
new_record.start_time = validate_and_format_time(new_record.start_time)
if new_record.start_time is None:
return -2, f"Invalid start_time format {new_record.start_time}"
if new_record.stop_time is not None:
new_record.stop_time = validate_and_format_time(new_record.stop_time)
if new_record.stop_time is None:
return -2, f"Invalid stop_time format {new_record.stop_time}"
if new_record.batch_id is None:
new_record.batch_id = str(uuid4())[:8]
conn = db.pool.get_connection()
try:
strat : StrategyInstance = None
result, strat = cs.get_stratin(id=str(new_record.strat_id))
if result == 0:
new_record.symbol = strat.symbol
else:
return -1, f"Strategy {new_record.strat_id} not found"
cursor = conn.cursor()
# Construct a suitable INSERT query based on your RunManagerRecord fields
insert_query = """
INSERT INTO run_manager (moddus, id, strat_id, symbol, account, mode, note, ilog_save,
market, bt_from, bt_to, weekdays_filter, batch_id,
start_time, stop_time, status, last_processed,
history, valid_from, valid_to, testlist_id)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
"""
values = [
new_record.moddus, str(new_record.id), str(new_record.strat_id), new_record.symbol, new_record.account, new_record.mode, new_record.note,
int(new_record.ilog_save), new_record.market,
new_record.bt_from.isoformat() if new_record.bt_from is not None else None,
new_record.bt_to.isoformat() if new_record.bt_to is not None else None,
",".join(str(x) for x in new_record.weekdays_filter) if new_record.weekdays_filter else None,
new_record.batch_id, new_record.start_time,
new_record.stop_time, new_record.status,
new_record.last_processed.isoformat() if new_record.last_processed is not None else None,
new_record.history,
new_record.valid_from.isoformat() if new_record.valid_from is not None else None,
new_record.valid_to.isoformat() if new_record.valid_to is not None else None,
new_record.testlist_id
]
db.execute_with_retry(cursor, insert_query, values)
conn.commit()
#Add APS scheduler job refresh
res, result = aps.initialize_jobs()
if res < 0:
return -2, f"Error initializing jobs: {res} {result}"
return 0, new_record.id # Assuming success, you might return something more descriptive
except Exception as e:
print("ERROR while adding record:", str(e) + format_exc())
return -2, str(e) + format_exc()
finally:
db.pool.release_connection(conn)
# Update (example)
# update_data = {'last_started': '2024-02-13 10:35:00'}
# result, message = update_run_manager_record('625760ac-6376-47fa-8989-1e6a3f6ab66a', update_data)
def update_run_manager_record(record_id, updated_record: RunManagerRecord):
#validation/standardization of time
updated_record.start_time = validate_and_format_time(updated_record.start_time)
if updated_record.start_time is None:
return -2, f"Invalid start_time format {updated_record.start_time}"
if updated_record.stop_time is not None:
updated_record.stop_time = validate_and_format_time(updated_record.stop_time)
if updated_record.stop_time is None:
return -2, f"Invalid stop_time format {updated_record.stop_time}"
conn = db.pool.get_connection()
try:
cursor = conn.cursor()
#strategy lookup check, if strategy still exists
strat : StrategyInstance = None
result, strat = cs.get_stratin(id=str(updated_record.strat_id))
if result == 0:
updated_record.symbol = strat.symbol
else:
return -1, f"Strategy {updated_record.strat_id} not found"
#remove values with None, so they are not updated
#updated_record_dict = updated_record.dict(exclude_none=True)
# Construct update query and handle weekdays conversion
update_query = 'UPDATE run_manager SET '
update_params = []
for key, value in updated_record.dict().items(): # Iterate over model attributes
if key in ['id', 'strat_running']: # Skip the primary key and the derived strat_running field
continue
update_query += f"{key} = ?, "
if key == "ilog_save":
value = int(value)
elif key in ["strat_id", "runner_id"]:
value = str(value) if value else None
elif key == "weekdays_filter":
value = ",".join(str(x) for x in value) if value else None
elif key in ['valid_from', 'valid_to', 'bt_from', 'bt_to', 'last_processed']:
value = value.isoformat() if value else None
update_params.append(value)
# if 'weekdays_filter' in updated_record.dict():
# updated_record.weekdays_filter = ",".join(str(x) for x in updated_record.weekdays_filter)
update_query = update_query[:-2] # Remove trailing comma and space
update_query += ' WHERE id = ?'
update_params.append(str(record_id))
db.execute_with_retry(cursor, update_query, update_params)
#cursor.execute(update_query, update_params)
conn.commit()
#Add APS scheduler job refresh
res, result = aps.initialize_jobs()
if res < 0:
return -2, f"Error initializing jobs: {res} {result}"
except Exception as e:
print("ERROR while updating record:", str(e) + format_exc())
return -2, str(e) + format_exc()
finally:
db.pool.release_connection(conn)
return 0, record_id
# result, message = delete_run_manager_record('625760ac-6376-47fa-8989-1e6a3f6ab66a')
def delete_run_manager_record(record_id):
conn = db.pool.get_connection()
try:
cursor = conn.cursor()
db.execute_with_retry(cursor, 'DELETE FROM run_manager WHERE id = ?', (str(record_id),))
#cursor.execute('DELETE FROM run_manager WHERE id = ?', (str(strategy_id),))
conn.commit()
except Exception as e:
print("ERROR while deleting record:", str(e) + format_exc())
return -2, str(e) + format_exc()
finally:
db.pool.release_connection(conn)
return 0, record_id
def fetch_scheduled_candidates_for_start_and_stop(market_datetime_now, market) -> tuple[int, dict]:
"""
Fetches all active records from the 'run_manager' table where moddus is 'schedule'. It checks if the current
time in the America/New_York timezone is within the operational intervals specified by 'start_time' and 'stop_time'
for each record. This function is designed to correctly handle scenarios where the operational interval crosses
midnight, as well as intervals contained within a single day.
The function localizes 'valid_from', 'valid_to', 'start_time', and 'stop_time' using the 'zoneNY' timezone object
for accurate comparison with the current time.
Parameters:
market_datetime_now (datetime): The current date and time in the America/New_York timezone.
market (str): The market identifier.
Returns:
Tuple[int, dict]: A tuple where the first element is a status code (0 for success, -2 for error), and the
second element is a dictionary. This dictionary has keys 'start' and 'stop', each containing a list of
RunManagerRecord objects meeting the respective criteria. If an error occurs, the second element is a
descriptive error message.
Note:
- This function assumes that the 'zoneNY' pytz timezone object is properly defined and configured to represent
the America/New York timezone.
- It also assumes that the 'run_manager' table exists in the database with the required columns.
- 'start_time' and 'stop_time' are expected to be strings representing times in 24-hour format.
- If 'valid_from', 'valid_to', 'start_time', or 'stop_time' are NULL in the database, they are considered as
having unlimited boundaries.
Note: there is one more edge case where this might not work: if the times are set for a strategy
run that crosses midnight, but the strategy is switched on only later, after midnight
(https://chat.openai.com/c/3c77674a-8a2c-45aa-afbd-ab140f473e07)
"""
conn = db.pool.get_connection()
try:
conn.row_factory = Row
cursor = conn.cursor()
# Get current datetime in America/New York timezone
market_datetime_now_str = market_datetime_now.strftime('%Y-%m-%d %H:%M:%S')
current_time_str = market_datetime_now.strftime('%H:%M')
print("current_market_datetime_str:", market_datetime_now_str)
print("current_time_str:", current_time_str)
# Select also supports scenarios where strategy runs overnight
# SQL query to fetch records with active status and date constraints for both start and stop times
query = """
SELECT *,
CASE
WHEN start_time <= stop_time AND (? >= start_time AND ? < stop_time) OR
start_time > stop_time AND (? >= start_time OR ? < stop_time) THEN 1
ELSE 0
END as is_start_time,
CASE
WHEN start_time <= stop_time AND (? >= stop_time OR ? < start_time) OR
start_time > stop_time AND (? >= stop_time AND ? < start_time) THEN 1
ELSE 0
END as is_stop_time
FROM run_manager
WHERE status = 'active' AND moddus = 'schedule' AND
((valid_from IS NULL OR strftime('%Y-%m-%d %H:%M:%S', valid_from) <= ?) AND
(valid_to IS NULL OR strftime('%Y-%m-%d %H:%M:%S', valid_to) >= ?))
"""
cursor.execute(query, (current_time_str, current_time_str, current_time_str, current_time_str,
current_time_str, current_time_str, current_time_str, current_time_str,
market_datetime_now_str, market_datetime_now_str))
rows = cursor.fetchall()
start_candidates = []
stop_candidates = []
for row in rows:
run_manager_record = tr.row_to_runmanager(row)
if row['is_start_time']:
start_candidates.append(run_manager_record)
if row['is_stop_time']:
stop_candidates.append(run_manager_record)
results = {'start': start_candidates, 'stop': stop_candidates}
return 0, results
except Exception as e:
msg_err = f"ERROR while fetching records for start and stop times with datetime {market_datetime_now_str}: {str(e)} {format_exc()}"
print(msg_err)
return -2, msg_err
finally:
conn.row_factory = None
db.pool.release_connection(conn)
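A plain-Python mirror of the SQL CASE logic above (a sketch for clarity only; zero-padded HH:MM strings compare correctly lexicographically):

def in_start_window(now_hhmm: str, start: str, stop: str) -> bool:
    if start <= stop:  # same-day interval, e.g. 09:30-16:00
        return start <= now_hhmm < stop
    return now_hhmm >= start or now_hhmm < stop  # crosses midnight, e.g. 22:00-04:00

print(in_start_window("23:30", "22:00", "04:00"))  # True  (overnight window)
print(in_start_window("10:00", "09:30", "16:00"))  # True  (same-day window)
print(in_start_window("08:00", "09:30", "16:00"))  # False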
def fetch_startstop_scheduled_candidates(market_datetime_now, time_check, market = "US") -> tuple[int, list[RunManagerRecord]]:
"""
Fetches all active records from the 'run_manager' table where moddus is schedule, the current date and time
in the America/New_York timezone falls between the 'valid_from' and 'valid_to' datetime
fields, and either 'start_time' or 'stop_time' matches the specified condition with the current time.
If 'valid_from', 'valid_to', or the time column ('start_time'/'stop_time') are NULL, they are considered
as having unlimited boundaries.
The function localizes the 'valid_from', 'valid_to', and the time column times using the 'zoneNY'
timezone object for accurate comparison with the current time.
Parameters:
market_datetime_now (datetime): Current datetime in the market timezone.
time_check (str): Either 'start' or 'stop', indicating which time condition to check.
market (str): The market for which to fetch candidates.
Returns:
Tuple[int, list[RunManagerRecord]]: A tuple where the first element is a status code
(0 for success, -2 for error), and the second element is a list of RunManagerRecord
objects meeting the criteria. If an error occurs, the second element is a descriptive
error message.
Note:
This function assumes that the 'zoneNY' pytz timezone object is properly defined and
configured to represent the America/New York timezone. It also assumes that the
'run_manager' table exists in the database with the columns as described in the
provided schema.
"""
if time_check not in ['start', 'stop']:
return -2, "Invalid time_check parameter. Must be 'start' or 'stop'."
conn = db.pool.get_connection()
try:
conn.row_factory = Row
cursor = conn.cursor()
# Get current datetime in America/New York timezone
market_datetime_now_str = market_datetime_now.strftime('%Y-%m-%d %H:%M:%S')
current_time_str = market_datetime_now.strftime('%H:%M')
print("current_market_datetime_str:", market_datetime_now_str)
print("current_time_str:", current_time_str)
# SQL query to fetch records with active status, date constraints, and time condition
time_column = 'start_time' if time_check == 'start' else 'stop_time'
query = f"""
SELECT * FROM run_manager
WHERE status = 'active' AND moddus = 'schedule' AND
((valid_from IS NULL OR strftime('%Y-%m-%d %H:%M:%S', valid_from) <= ?) AND
(valid_to IS NULL OR strftime('%Y-%m-%d %H:%M:%S', valid_to) >= ?)) AND
({time_column} IS NULL OR {time_column} <= ?)
"""
cursor.execute(query, (market_datetime_now_str, market_datetime_now_str, current_time_str))
rows = cursor.fetchall()
results = [tr.row_to_runmanager(row) for row in rows]
return 0, results
except Exception as e:
msg_err = f"ERROR while fetching records based on {time_check} time with datetime {market_datetime_now_str}: {str(e)} {format_exc()}"
print(msg_err)
return -2, msg_err
finally:
conn.row_factory = None
db.pool.release_connection(conn)
if __name__ == "__main__":
res, sada = fetch_startstop_scheduled_candidates(datetime.now().astimezone(zoneNY), "start")
if res == 0:
print(sada)
else:
print("Error:", sada)
# from apscheduler.schedulers.background import BackgroundScheduler
# import time
# def print_hello():
# print("Hello")
# def schedule_job():
# scheduler = BackgroundScheduler()
# scheduler.add_job(print_hello, 'interval', seconds=10)
# scheduler.start()
# schedule_job()

View File

@ -14,7 +14,7 @@ from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeSt
from datetime import datetime
from v2realbot.loader.trade_offline_streamer import Trade_Offline_Streamer
from threading import Thread, current_thread, Event, enumerate
from v2realbot.config import STRATVARS_UNCHANGEABLES, ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, DATA_DIR,BT_FILL_CONS_TRADES_REQUIRED,BT_FILL_LOG_SURROUNDING_TRADES,BT_FILL_CONDITION_BUY_LIMIT,BT_FILL_CONDITION_SELL_LIMIT, GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN, MEDIA_DIRECTORY, RUNNER_DETAIL_DIRECTORY, OFFLINE_MODE
from v2realbot.config import STRATVARS_UNCHANGEABLES, ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_PAPER_FEED, ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, ACCOUNT1_LIVE_FEED, DATA_DIR, MEDIA_DIRECTORY, RUNNER_DETAIL_DIRECTORY
import importlib
from alpaca.trading.requests import GetCalendarRequest
from alpaca.trading.client import TradingClient
@ -24,23 +24,25 @@ from tinydb import TinyDB, Query, where
from tinydb.operations import set
import orjson
import numpy as np
from numpy import ndarray
from rich import print
import pandas as pd
from traceback import format_exc
from datetime import timedelta, time
from threading import Lock
from v2realbot.common.db import pool, execute_with_retry, row_to_runarchive, row_to_runarchiveview
from v2realbot.common.db import pool, execute_with_retry
import v2realbot.common.transform as tr
from sqlite3 import OperationalError, Row
import v2realbot.strategyblocks.indicators.custom as ci
from v2realbot.strategyblocks.inits.init_indicators import initialize_dynamic_indicators
from v2realbot.strategyblocks.indicators.indicators_hub import populate_dynamic_indicators
from v2realbot.strategyblocks.inits.init_attached_data import attach_previous_data
from v2realbot.interfaces.backtest_interface import BacktestInterface
import os
from v2realbot.reporting.metricstoolsimage import generate_trading_report_image
import msgpack
import v2realbot.reporting.metricstoolsimage as mt
import gzip
import os
import msgpack
import v2realbot.utils.config_handler as cfh
#import gc
#from pyinstrument import Profiler
#adding lock to ensure thread safety of TinyDB (in future will be migrated to proper db)
@ -81,7 +83,7 @@ def get_all_stratins():
else:
return (0, [])
def get_stratin(id: UUID):
def get_stratin(id: UUID) -> Tuple[int, StrategyInstance]:
for i in db.stratins:
if str(i.id) == str(id):
return (0, i)
@ -101,12 +103,12 @@ def create_stratin(si: StrategyInstance):
#validate toml
res, stp = parse_toml_string(si.stratvars_conf)
if res < 0:
return (-1,"stratvars invalid")
return (-1,f"stratvars invalid: {stp}")
res, adp = parse_toml_string(si.add_data_conf)
if res < 0:
return (-1, "None")
return (-1, f"add data conf invalid {adp}")
si.id = uuid4()
print(si)
#print(si)
db.stratins.append(si)
db.save()
#print(db.stratins)
@ -118,10 +120,10 @@ def modify_stratin(si: StrategyInstance, id: UUID):
return (-1, "strat is running, use modify_stratin_running")
res, stp = parse_toml_string(si.stratvars_conf)
if res < 0:
return (-1, "stratvars invalid")
return (-1, f"stratvars invalid {stp}")
res, adp = parse_toml_string(si.add_data_conf)
if res < 0:
return (-1, "add data conf invalid")
return (-1, f"add data conf invalid {adp}")
for i in db.stratins:
if str(i.id) == str(id):
#print("removing",i)
@ -179,14 +181,14 @@ def modify_stratin_running(si: StrategyInstance, id: UUID):
#validate toml
res,stp = parse_toml_string(si.stratvars_conf)
if res < 0:
return (-1, "new stratvars format invalid")
return (-1, f"new stratvars format invalid {stp}")
for i in db.stratins:
if str(i.id) == str(id):
if not is_stratin_running(id=str(id)):
return (-1, "not running")
res,stp_old = parse_toml_string(i.stratvars_conf)
if res < 0:
return (-1, "current stratin stratvars invalid")
return (-1, f"current stratin stratvars invalid {stp_old}")
#TODO reload running strat
#print(stp)
#print("starting injection", stp)
@ -243,13 +245,14 @@ def pause_runner(id: UUID):
return (0, "paused runner " + str(i.id))
print("no ID found")
return (-1, "not running instance found")
def stop_runner(id: UUID = None):
#allows stopping a runner by runner_id, by strat_id, or all of them (when both are None)
#a strat_id value passed in id is supported as well
def stop_runner(id: UUID = None, strat_id: UUID = None):
chng = []
try:
for i in db.runners:
#print(i['id'])
if id is None or str(i.id) == id:
if (id is None and strat_id is None) or str(i.id) == str(id) or str(i.strat_id) == str(strat_id) or str(i.strat_id) == str(id):
chng.append(i.id)
print("Sending STOP signal to Runner", i.id)
#just sending the signal, update is done in stop after plugin
@ -349,13 +352,28 @@ def capsule(target: object, db: object, inter_batch_params: dict = None):
db.runners.remove(i)
#create the report image for the RUNNER
try:
res, val = generate_trading_report_image(runner_ids=[str(i.id)])
res, val = mt.generate_trading_report_image(runner_ids=[str(i.id)])
if res == 0:
print("DAILY REPORT IMAGE CREATED")
else:
print(f"Daily report ERROR - {val}")
except Exception as e:
print("Nepodarilo se vytvorit report image", str(e)+format_exc())
err_msg = "Nepodarilo se vytvorit daily report image" + str(e)+format_exc()
send_to_telegram(err_msg)
print(err_msg)
#FOR LIVE and PAPER with batch_id set, the batch file is created here (for BT it is driven by batch_manager)
if i.run_mode in [Mode.LIVE, Mode.PAPER] and i.batch_id is not None:
try:
res, val = mt.generate_trading_report_image(batch_id=i.batch_id)
if res == 0:
print("BATCH REPORT CREATED")
else:
print(f"BATCH REPORT ERROR - {val}")
except Exception as e:
err_msg = f"Nepodarilo se vytvorit batchj report image pro {i.strat_id} a batch{i.batch_id}" + str(e)+format_exc()
send_to_telegram(err_msg)
print(err_msg)
target.release()
print("Runner STOPPED")
@ -395,7 +413,7 @@ def run_batch_stratin(id: UUID, runReq: RunRequest):
def get_market_days_in_interval(datefrom, dateto, note = None, id = None):
#getting dates from the calendar
clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False, paper=True)
calendar_request = GetCalendarRequest(start=datefrom,end=dateto)
calendar_request = GetCalendarRequest(start=datefrom.date(),end=dateto.date())
cal_dates = clientTrading.get_calendar(calendar_request)
#list(Calendar)
# Calendar
@ -429,7 +447,7 @@ def run_batch_stratin(id: UUID, runReq: RunRequest):
cal_list.append(RunDay(start = start_time, end = end_time, note = note, id = id))
print(f"Getting interval dates from - to - RESULT ({len(cal_list)}):")
print(cal_list)
#print(cal_list)
return cal_list
#getting days to run into RunDays format
@ -476,6 +494,9 @@ def run_batch_stratin(id: UUID, runReq: RunRequest):
# bud ceka na dokonceni v runners nebo to bude ridit jinak a bude mit jednoho runnera?
# nejak vymyslet.
# logovani zatim jen do print
##OFFLINE BATCH RUN MANAGER (generates batch_id, manages the data linkage between runners (inter_batch_data) and generates the batch report
## and, of course, launches the individual days)
def batch_run_manager(id: UUID, runReq: RunRequest, rundays: list[RunDay]):
#here we can iterate over the intervals
#wait until one interval finishes, then start the next
@ -567,14 +588,16 @@ def batch_run_manager(id: UUID, runReq: RunRequest, rundays: list[RunDay]):
runReq = None
#create the report image for the batch
try:
res, val = generate_trading_report_image(batch_id=batch_id)
res, val = mt.generate_trading_report_image(batch_id=batch_id)
if res == 0:
print("BATCH REPORT CREATED")
else:
print(f"BATCH REPORT ERROR - {val}")
except Exception as e:
print("Nepodarilo se vytvorit report image", str(e)+format_exc())
err_msg = "Nepodarilo se vytvorit batch report image" + str(e)+format_exc()
send_to_telegram(err_msg)
print(err_msg)
#gc.collect()
@ -596,10 +619,10 @@ def run_stratin(id: UUID, runReq: RunRequest, synchronous: bool = False, inter_b
#validate toml
res, stp = parse_toml_string(i.stratvars_conf)
if res < 0:
return (-1, "stratvars invalid")
return (-1, f"stratvars invalid {stp}")
res, adp = parse_toml_string(i.add_data_conf)
if res < 0:
return (-1, "add data conf invalid")
return (-1, f"add data conf invalid {adp}")
id = uuid4()
print(f"RUN {id} INITIATED")
name = i.name
@ -697,7 +720,7 @@ def get_trade_history(symbol: str, timestamp_from: float, timestamp_to:float):
#datetime_object_from = datetime(2023, 4, 14, 15, 51, 38, tzinfo=zoneNY)
#datetime_object_to = datetime(2023, 4, 14, 15, 51, 39, tzinfo=zoneNY)
client = StockHistoricalDataClient(ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, raw_data=False)
trades_request = StockTradesRequest(symbol_or_symbols=symbol, feed = DataFeed.SIP, start=datetime_object_from, end=datetime_object_to)
trades_request = StockTradesRequest(symbol_or_symbols=symbol, feed = ACCOUNT1_LIVE_FEED, start=datetime_object_from, end=datetime_object_to)
all_trades = client.get_stock_trades(trades_request)
#print(all_trades[symbol])
return 0, all_trades[symbol]
@ -866,11 +889,7 @@ def archive_runner(runner: Runner, strat: StrategyInstance, inter_batch_params:
rectype=strat.state.rectype,
cache_used=strat.dataloader.cache_used if isinstance(strat.dataloader, Trade_Offline_Streamer) else None,
configs=dict(
GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN=GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN,
BT_FILL_CONS_TRADES_REQUIRED=BT_FILL_CONS_TRADES_REQUIRED,
BT_FILL_LOG_SURROUNDING_TRADES=BT_FILL_LOG_SURROUNDING_TRADES,
BT_FILL_CONDITION_BUY_LIMIT=BT_FILL_CONDITION_BUY_LIMIT,
BT_FILL_CONDITION_SELL_LIMIT=BT_FILL_CONDITION_SELL_LIMIT))
CONFIG_HANDLER=dict(profile=cfh.config_handler.active_profile, values=cfh.config_handler.active_config)))
#add profit of this batch iteration to batch_sum_profit
@ -907,7 +926,8 @@ def archive_runner(runner: Runner, strat: StrategyInstance, inter_batch_params:
end_positions=strat.state.positions,
end_positions_avgp=round(float(strat.state.avgp),3),
metrics=results_metrics,
stratvars_toml=runner.run_stratvars_toml
stratvars_toml=runner.run_stratvars_toml,
transferables=strat.state.vars["transferables"]
)
#flatten indicators from numpy array
@ -915,7 +935,7 @@ def archive_runner(runner: Runner, strat: StrategyInstance, inter_batch_params:
#array of indicators, each with its own time axis
flattened_indicators_list = []
for key, value in strat.state.indicators.items():
if isinstance(value, ndarray):
if isinstance(value, np.ndarray):
#print("is numpy", key,value)
flattened_indicators[key]= value.tolist()
#print("changed numpy:",value.tolist())
@ -925,7 +945,7 @@ def archive_runner(runner: Runner, strat: StrategyInstance, inter_batch_params:
flattened_indicators_list.append(flattened_indicators)
flattened_indicators = {}
for key, value in strat.state.cbar_indicators.items():
if isinstance(value, ndarray):
if isinstance(value, np.ndarray):
#print("is numpy", key,value)
flattened_indicators[key]= value.tolist()
#print("changed numpy:",value.tolist())
@ -988,7 +1008,7 @@ def get_all_archived_runners() -> list[RunArchiveView]:
rows = c.fetchall()
results = []
for row in rows:
results.append(row_to_runarchiveview(row))
results.append(tr.row_to_runarchiveview(row))
finally:
conn.row_factory = None
pool.release_connection(conn)
@ -1018,7 +1038,7 @@ def get_all_archived_runners() -> list[RunArchiveView]:
# c.execute(paginated_query)
# rows = c.fetchall()
# results = [row_to_runarchiveview(row) for row in rows]
# results = [tr.row_to_runarchiveview(row) for row in rows]
# finally:
# conn.row_factory = None
@ -1032,7 +1052,7 @@ def get_all_archived_runners() -> list[RunArchiveView]:
#new version to support search and ordering
#TODO do I have an index on strat_id and batch_id?
def get_all_archived_runners_p(request: DataTablesRequest) -> Tuple[int, RunArchiveViewPagination]:
def get_all_archived_runners_p_original(request: DataTablesRequest) -> Tuple[int, RunArchiveViewPagination]:
conn = pool.get_connection()
search_value = request.search.value # Extract the search value from the request
try:
@ -1068,7 +1088,80 @@ def get_all_archived_runners_p(request: DataTablesRequest) -> Tuple[int, RunArch
c.execute(filtered_count_query, {'search_value': f'%{search_value}%'})
filtered_count = c.fetchone()[0]
results = [row_to_runarchiveview(row) for row in rows]
results = [tr.row_to_runarchiveview(row) for row in rows]
finally:
conn.row_factory = None
pool.release_connection(conn)
try:
obj = RunArchiveViewPagination(draw=request.draw, recordsTotal=total_count, recordsFiltered=filtered_count, data=results)
return 0, obj
except Exception as e:
return -2, str(e) + format_exc()
#new version with batch_id asc sorting https://chat.openai.com/c/64511445-5181-411b-b9d0-51d16930bf71
#This version correctly groups records with the same batch_id (by the batch maximum) and interleaves non-batch records among them by their stopped date - a record is placed after or before a given group (according to the group's max date)
#thanks to this, batches and non-batches sort correctly, and when a record is added to a batch, the batch appears at the top
def get_all_archived_runners_p(request: DataTablesRequest) -> Tuple[int, RunArchiveViewPagination]:
conn = pool.get_connection()
search_value = request.search.value # Extract the search value from the request
try:
conn.row_factory = Row
c = conn.cursor()
# Total count query
total_count_query = """
SELECT COUNT(*) FROM runner_header
WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value OR symbol like :search_value OR name like :search_value)
"""
c.execute(total_count_query, {'search_value': f'%{search_value}%'})
total_count = c.fetchone()[0]
# Paginated query with advanced sorting logic
paginated_query = f"""
WITH GroupedData AS (
SELECT runner_id, strat_id, batch_id, symbol, name, note, started,
stopped, mode, account, bt_from, bt_to, ilog_save, profit,
trade_count, end_positions, end_positions_avgp, metrics,
MAX(stopped) OVER (PARTITION BY batch_id) AS max_stopped,
SUM(profit) OVER (PARTITION BY batch_id) AS batch_profit,
COUNT(*) OVER (PARTITION BY batch_id) AS batch_count
FROM runner_header
WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value OR symbol like :search_value OR name like :search_value)
),
InterleavedGroups AS (
SELECT *,
CASE
WHEN batch_id IS NOT NULL THEN max_stopped
ELSE stopped
END AS sort_key
FROM GroupedData
)
SELECT runner_id, strat_id, batch_id, symbol, name, note, started,
stopped, mode, account, bt_from, bt_to, ilog_save, profit,
trade_count, end_positions, end_positions_avgp, metrics,
batch_profit, batch_count
FROM InterleavedGroups
ORDER BY
sort_key DESC,
CASE WHEN batch_id IS NOT NULL THEN 0 ELSE 1 END,
stopped DESC
LIMIT {request.length} OFFSET {request.start}
"""
c.execute(paginated_query, {'search_value': f'%{search_value}%'})
rows = c.fetchall()
# Filtered count query
filtered_count_query = """
SELECT COUNT(*) FROM runner_header
WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value OR symbol like :search_value OR name like :search_value)
"""
c.execute(filtered_count_query, {'search_value': f'%{search_value}%'})
filtered_count = c.fetchone()[0]
results = [tr.row_to_runarchiveview(row) for row in rows]
finally:
conn.row_factory = None
@ -1103,7 +1196,7 @@ def get_archived_runner_header_byID(id: UUID) -> RunArchive:
row = c.fetchone()
if row:
return 0, row_to_runarchive(row)
return 0, tr.row_to_runarchive(row)
else:
return -2, "not found"
@ -1129,12 +1222,38 @@ def get_archived_runner_header_byID(id: UUID) -> RunArchive:
# else:
# return 0, res
#returns the list of runners with the given batch_id
def get_archived_runnerslist_byBatchID(batch_id: str):
# #returns the list of runners with the given batch_id
# def get_archived_runnerslist_byBatchID(batch_id: str):
# conn = pool.get_connection()
# try:
# cursor = conn.cursor()
# cursor.execute(f"SELECT runner_id FROM runner_header WHERE batch_id='{str(batch_id)}'")
# runner_list = [row[0] for row in cursor.fetchall()]
# finally:
# pool.release_connection(conn)
# return 0, runner_list
#updated version that allows sorting
def get_archived_runnerslist_byBatchID(batch_id: str, sort_order: str = "asc"):
"""
Fetches all runner records by batch_id, sorted by the 'started' column.
:param batch_id: The batch ID to filter runners by.
:param sort_order: The sort order of the 'started' column. Defaults to 'asc'.
Accepts 'asc' for ascending or 'desc' for descending order.
:return: A tuple with the first element being a status code and the second being the list of runner_ids.
"""
# Validate sort_order
if sort_order.lower() not in ['asc', 'desc']:
return -1, [] # Returning an error code and an empty list in case of invalid sort_order
conn = pool.get_connection()
try:
cursor = conn.cursor()
cursor.execute(f"SELECT runner_id FROM runner_header WHERE batch_id='{str(batch_id)}'")
query = f"""SELECT runner_id FROM runner_header
WHERE batch_id=?
ORDER BY datetime(started) {sort_order.upper()}"""
cursor.execute(query, (batch_id,))
runner_list = [row[0] for row in cursor.fetchall()]
finally:
pool.release_connection(conn)
@ -1148,11 +1267,11 @@ def insert_archive_header(archeader: RunArchive):
res = c.execute("""
INSERT INTO runner_header
(runner_id, strat_id, batch_id, symbol, name, note, started, stopped, mode, account, bt_from, bt_to, strat_json, settings, ilog_save, profit, trade_count, end_positions, end_positions_avgp, metrics, stratvars_toml)
(runner_id, strat_id, batch_id, symbol, name, note, started, stopped, mode, account, bt_from, bt_to, strat_json, settings, ilog_save, profit, trade_count, end_positions, end_positions_avgp, metrics, stratvars_toml, transferables)
VALUES
(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
""",
(str(archeader.id), str(archeader.strat_id), archeader.batch_id, archeader.symbol, archeader.name, archeader.note, archeader.started, archeader.stopped, archeader.mode, archeader.account, archeader.bt_from, archeader.bt_to, orjson.dumps(archeader.strat_json).decode('utf-8'), orjson.dumps(archeader.settings).decode('utf-8'), archeader.ilog_save, archeader.profit, archeader.trade_count, archeader.end_positions, archeader.end_positions_avgp, orjson.dumps(archeader.metrics, default=json_serial, option=orjson.OPT_PASSTHROUGH_DATETIME).decode('utf-8'), archeader.stratvars_toml))
(str(archeader.id), str(archeader.strat_id), archeader.batch_id, archeader.symbol, archeader.name, archeader.note, archeader.started, archeader.stopped, archeader.mode, archeader.account, archeader.bt_from, archeader.bt_to, orjson.dumps(archeader.strat_json).decode('utf-8'), orjson.dumps(archeader.settings).decode('utf-8'), archeader.ilog_save, archeader.profit, archeader.trade_count, archeader.end_positions, archeader.end_positions_avgp, orjson.dumps(archeader.metrics, default=json_serial, option=orjson.OPT_PASSTHROUGH_DATETIME).decode('utf-8'), archeader.stratvars_toml, orjson.dumps(archeader.transferables).decode('utf-8')))
#retry not yet supported for statement format above
#res = execute_with_retry(c,statement)
@ -1476,7 +1595,7 @@ def preview_indicator_byTOML(id: UUID, indicator: InstantIndicator, save: bool =
# print(row)
res, toml_parsed = parse_toml_string(tomlino)
if res < 0:
return (-2, "toml invalid")
return (-2, f"toml invalid: {toml_parsed}")
#print("parsed toml", toml_parsed)
@ -1571,9 +1690,17 @@ def preview_indicator_byTOML(id: UUID, indicator: InstantIndicator, save: bool =
state.ind_mapping = {**local_dict_inds, **local_dict_bars, **local_dict_cbar_inds}
#print("IND MAPPING DONE:", state.ind_mapping)
##initialize required vars from strat init
state.vars["loaded_models"] = {}
#state attributes for martingale sizing management
state.vars["transferables"] = {}
state.vars["transferables"]["martingale"] = dict(cont_loss_series_cnt=0)
##initialize dynamic indicators
initialize_dynamic_indicators(state)
#TODO weigh attached data (we only need transferables from it, i.e. somehow find the previous runner and carry its transferables over from the start)
#probably adjust attach_previous_data, or create a special version
#attach_previous_data(state)
# print("subtype")
# function = "ci."+subtype+"."+subtype
@ -1714,10 +1841,10 @@ def preview_indicator_byTOML(id: UUID, indicator: InstantIndicator, save: bool =
#we return a list where position 0 is bar indicators and position 1 is tick indicators
if output == "bar":
return 0, [output_dict, []]
return 0, [output_dict, {}]
#return 0, [new_inds[indicator.name], []]
else:
return 0, [[], output_dict]
return 0, [{}, output_dict]
#return 0, [[], new_tick_inds[indicator.name]]
except Exception as e:
@ -1757,107 +1884,18 @@ def delete_indicator_byName(id: UUID, indicator: InstantIndicator):
print(str(e) + format_exc())
return -2, str(e)
# region CONFIG db services
#TODO create a module for fetching from Python (get_from_config(var_name, def_value)) - same as in js
#TODO consider moving from JSON to TOML
def get_all_config_items():
conn = pool.get_connection()
try:
cursor = conn.cursor()
cursor.execute('SELECT id, item_name, json_data FROM config_table')
config_items = [{"id": row[0], "item_name": row[1], "json_data": row[2]} for row in cursor.fetchall()]
finally:
pool.release_connection(conn)
return 0, config_items
# Function to get a config item by ID
def get_config_item_by_id(item_id):
conn = pool.get_connection()
try:
cursor = conn.cursor()
cursor.execute('SELECT item_name, json_data FROM config_table WHERE id = ?', (item_id,))
row = cursor.fetchone()
finally:
pool.release_connection(conn)
if row is None:
return -2, "not found"
else:
return 0, {"item_name": row[0], "json_data": row[1]}
# Function to get a config item by name
def get_config_item_by_name(item_name):
#print(item_name)
conn = pool.get_connection()
try:
cursor = conn.cursor()
query = f"SELECT item_name, json_data FROM config_table WHERE item_name = '{item_name}'"
#print(query)
cursor.execute(query)
row = cursor.fetchone()
#print(row)
finally:
pool.release_connection(conn)
if row is None:
return -2, "not found"
else:
return 0, {"item_name": row[0], "json_data": row[1]}
# Function to create a new config item
def create_config_item(config_item: ConfigItem):
conn = pool.get_connection()
try:
try:
cursor = conn.cursor()
cursor.execute('INSERT INTO config_table (item_name, json_data) VALUES (?, ?)', (config_item.item_name, config_item.json_data))
item_id = cursor.lastrowid
conn.commit()
print(item_id)
finally:
pool.release_connection(conn)
return 0, {"id": item_id, "item_name":config_item.item_name, "json_data":config_item.json_data}
except Exception as e:
return -2, str(e)
# Function to update a config item by ID
def update_config_item(item_id, config_item: ConfigItem):
conn = pool.get_connection()
try:
try:
cursor = conn.cursor()
cursor.execute('UPDATE config_table SET item_name = ?, json_data = ? WHERE id = ?', (config_item.item_name, config_item.json_data, item_id))
conn.commit()
finally:
pool.release_connection(conn)
return 0, {"id": item_id, **config_item.dict()}
except Exception as e:
return -2, str(e)
# Function to delete a config item by ID
def delete_config_item(item_id):
conn = pool.get_connection()
try:
cursor = conn.cursor()
cursor.execute('DELETE FROM config_table WHERE id = ?', (item_id,))
conn.commit()
finally:
pool.release_connection(conn)
return 0, {"id": item_id}
# endregion
#returns bars
def get_alpaca_history_bars(symbol: str, datetime_object_from: datetime, datetime_object_to: datetime, timeframe: TimeFrame):
"""Returns Bar object
"""
try:
result = []
client = StockHistoricalDataClient(ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, raw_data=False)
#datetime_object_from = datetime(2023, 2, 27, 18, 51, 38, tzinfo=datetime.timezone.utc)
#datetime_object_to = datetime(2023, 2, 27, 21, 51, 39, tzinfo=datetime.timezone.utc)
bar_request = StockBarsRequest(symbol_or_symbols=symbol,timeframe=timeframe, start=datetime_object_from, end=datetime_object_to, feed=DataFeed.SIP)
bar_request = StockBarsRequest(symbol_or_symbols=symbol,timeframe=timeframe, start=datetime_object_from, end=datetime_object_to, feed=ACCOUNT1_LIVE_FEED)
#print("before df")
bars = client.get_stock_bars(bar_request)
result = []
##to be safe we add a minute on both sides, for the frontend
business_hours = {
# monday = 0, tuesday = 1, ... same pattern as date.weekday()
@ -1888,12 +1926,25 @@ def get_alpaca_history_bars(symbol: str, datetime_object_from: datetime, datetim
#bars.data[symbol]
return 0, result
except Exception as e:
print(str(e) + format_exc())
if OFFLINE_MODE:
print("OFFLINE MODE ENABLED")
return 0, []
return -2, str(e)
# Workaround for the case when no data is found: an AttributeError is raised with the specific message below
if isinstance(e, AttributeError) and str(e) == "'NoneType' object has no attribute 'items'":
print("Caught the specific AttributeError: 'NoneType' object has no attribute 'items' means NO DATA FOUND")
print(str(e) + format_exc())
return 0, result
else:
print(str(e) + format_exc())
if cfh.config_handler.get_val('OFFLINE_MODE'):
print("OFFLINE MODE ENABLED")
return 0, []
return -2, str(e)
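#Illustrative usage (sketch; the symbol, dates and timezone import are example assumptions):
#ret, bars = get_alpaca_history_bars("BAC", datetime(2024, 2, 5, 14, 30, tzinfo=timezone.utc),
#                                    datetime(2024, 2, 5, 21, 0, tzinfo=timezone.utc), TimeFrame.Minute)
#ret == 0 means success (or OFFLINE_MODE with an empty list); ret == -2 carries the error text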
# change_archived_runner
# delete_archived_runner_details
#Example of using config directive
# config_directive = "python"
# ret, res = get_config_item_by_name(config_directive)
# if ret < 0:
# print(f"Error {res}")
# else:
# config = orjson.loads(res["json_data"])
# print(config)


@ -1,6 +1,11 @@
from enum import Enum
from alpaca.trading.enums import OrderSide, OrderStatus, OrderType
class BarType(str, Enum):
TIME = "time"
VOLUME = "volume"
DOLLAR = "dollar"
class Env(str, Enum):
PROD = "prod"
TEST = "test"
@ -52,6 +57,16 @@ class Account(str, Enum):
"""
ACCOUNT1 = "ACCOUNT1"
ACCOUNT2 = "ACCOUNT2"
class Moddus(str, Enum):
"""
Moddus for RunManager record
schedule - scheduled record
queue - queued record
"""
SCHEDULE = "schedule"
QUEUE = "queue"
class RecordType(str, Enum):
"""
Represents output of aggregator
@ -64,6 +79,15 @@ class RecordType(str, Enum):
CBARRENKO = "cbarrenko"
TRADE = "trade"
class SchedulerStatus(str, Enum):
"""
ACTIVE - active scheduling
SUSPENDED - suspended for scheduling
"""
ACTIVE = "active"
SUSPENDED = "suspended"
class Mode(str, Enum):
"""
LIVE - live on production
@ -77,7 +101,6 @@ class Mode(str, Enum):
BT = "backtest"
PREP = "prep"
class StartBarAlign(str, Enum):
"""
Represents first bar start time alignment according to timeframe
@ -86,3 +109,9 @@ class StartBarAlign(str, Enum):
"""
ROUND = "round"
RANDOM = "random"
class Market(str, Enum):
US = "US"
CRYPTO = "CRYPTO"


@ -2,9 +2,9 @@ from alpaca.trading.enums import OrderSide, OrderType
from threading import Lock
from v2realbot.interfaces.general_interface import GeneralInterface
from v2realbot.backtesting.backtester import Backtester
from v2realbot.config import BT_DELAYS, COUNT_API_REQUESTS
from datetime import datetime
from v2realbot.utils.utils import zoneNY
import v2realbot.utils.config_handler as cfh
""""
backtester methods can be called
@ -19,7 +19,7 @@ class BacktestInterface(GeneralInterface):
def __init__(self, symbol, bt: Backtester) -> None:
self.symbol = symbol
self.bt = bt
self.count_api_requests = COUNT_API_REQUESTS
self.count_api_requests = cfh.config_handler.get_val('COUNT_API_REQUESTS')
self.mincnt = list([dict(minute=0,count=0)])
#TODO the time in the API can probably be dropped and BT will take it directly from self.time (don't forget the + BT_DELAYS)
# self.time = self.bt.time
@ -43,33 +43,33 @@ class BacktestInterface(GeneralInterface):
def buy(self, size = 1, repeat: bool = False):
self.count()
#add REST API latency
return self.bt.submit_order(time=self.bt.time + BT_DELAYS.strat_to_sub,symbol=self.symbol,side=OrderSide.BUY,size=size,order_type = OrderType.MARKET)
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.BUY,size=size,order_type = OrderType.MARKET)
"""buy limit"""
def buy_l(self, price: float, size: int = 1, repeat: bool = False, force: int = 0):
self.count()
return self.bt.submit_order(time=self.bt.time + BT_DELAYS.strat_to_sub,symbol=self.symbol,side=OrderSide.BUY,size=size,price=price,order_type = OrderType.LIMIT)
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.BUY,size=size,price=price,order_type = OrderType.LIMIT)
"""sell market"""
def sell(self, size = 1, repeat: bool = False):
self.count()
return self.bt.submit_order(time=self.bt.time + BT_DELAYS.strat_to_sub,symbol=self.symbol,side=OrderSide.SELL,size=size,order_type = OrderType.MARKET)
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.SELL,size=size,order_type = OrderType.MARKET)
"""sell limit"""
async def sell_l(self, price: float, size = 1, repeat: bool = False):
self.count()
return self.bt.submit_order(time=self.bt.time + BT_DELAYS.strat_to_sub,symbol=self.symbol,side=OrderSide.SELL,size=size,price=price,order_type = OrderType.LIMIT)
return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.SELL,size=size,price=price,order_type = OrderType.LIMIT)
"""replace order"""
async def repl(self, orderid: str, price: float = None, size: int = None, repeat: bool = False):
self.count()
return self.bt.replace_order(time=self.bt.time + BT_DELAYS.strat_to_sub,id=orderid,size=size,price=price)
return self.bt.replace_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),id=orderid,size=size,price=price)
"""cancel order"""
#TBD exec beforehand?
def cancel(self, orderid: str):
self.count()
return self.bt.cancel_order(time=self.bt.time + BT_DELAYS.strat_to_sub, id=orderid)
return self.bt.cancel_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'), id=orderid)
"""get positions ->(size,avgp)"""
#TBD exec beforehand?
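#The edits above replace module-level config constants with config_handler lookups.
#Both call shapes used in this diff, for reference (values illustrative):
#offline = cfh.config_handler.get_val('OFFLINE_MODE')            #top-level key
#delay = cfh.config_handler.get_val('BT_DELAYS', 'strat_to_sub') #nested key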


@ -40,7 +40,9 @@ class LiveInterface(GeneralInterface):
return market_order.id
except Exception as e:
print("Nepodarilo se odeslat buy", str(e))
reason = "Nepodarilo se market buy:" + str(e) + format_exc()
print(reason)
send_to_telegram(reason)
return -1
"""buy limit"""
@ -65,7 +67,9 @@ class LiveInterface(GeneralInterface):
return limit_order.id
except Exception as e:
print("Nepodarilo se odeslat limitku", str(e))
reason = "Nepodarilo se odeslat buy limitku:" + str(e) + format_exc()
print(reason)
send_to_telegram(reason)
return -1
"""sell market"""
@ -87,7 +91,9 @@ class LiveInterface(GeneralInterface):
return market_order.id
except Exception as e:
print("Nepodarilo se odeslat sell", str(e))
reason = "Nepodarilo se odeslat sell:" + str(e) + format_exc()
print(reason)
send_to_telegram(reason)
return -1
"""sell limit"""
@ -112,8 +118,9 @@ class LiveInterface(GeneralInterface):
return limit_order.id
except Exception as e:
print("Nepodarilo se odeslat sell_l", str(e))
#raise Exception(e)
reason = "Nepodarilo se odeslat sell limitku:" + str(e) + format_exc()
print(reason)
send_to_telegram(reason)
return -1
"""order replace"""
@ -136,7 +143,9 @@ class LiveInterface(GeneralInterface):
if e.code == 42210000: return orderid
else:
##maybe just always return ok here
print("Could not replace the profit order. Problem",str(e))
reason = "Could not replace the profit order. Problem:" + str(e) + format_exc()
print(reason)
send_to_telegram(reason)
return -1
#raise Exception(e)
@ -150,7 +159,9 @@ class LiveInterface(GeneralInterface):
#order doesnt exist
if e.code == 40410000: return 0
else:
print("nepovedlo se zrusit objednavku", str(e))
reason = "Nepovedlo se zrusit objednavku:" + str(e) + format_exc()
print(reason)
send_to_telegram(reason)
#raise Exception(e)
return -1
@ -178,7 +189,9 @@ class LiveInterface(GeneralInterface):
#list of Orders (orderlist[0].id)
return orderlist
except Exception as e:
print("Chyba pri dotazeni objednávek.", str(e))
reason = "Chyba pri dotazeni objednávek:" + str(e) + format_exc()
print(reason)
send_to_telegram(reason)
#raise Exception (e)
return -1

File diff suppressed because it is too large


@ -11,10 +11,10 @@ import threading
from copy import deepcopy
from msgpack import unpackb
import os
from v2realbot.config import DATA_DIR, GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN, AGG_EXCLUDED_TRADES
import pickle
from v2realbot.config import DATA_DIR
import dill
import gzip
import v2realbot.utils.config_handler as cfh
class TradeAggregator:
def __init__(self,
@ -25,7 +25,7 @@ class TradeAggregator:
align: StartBarAlign = StartBarAlign.ROUND,
mintick: int = 0,
exthours: bool = False,
excludes: list = AGG_EXCLUDED_TRADES,
excludes: list = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES'),
skip_cache: bool = False):
"""
UPDATED VERSION - returns multiple records
@ -48,7 +48,7 @@ class TradeAggregator:
self.excludes = excludes
self.skip_cache = skip_cache
if mintick >= resolution:
if resolution > 0 and mintick >= resolution:
print("Mintick musi byt mensi nez resolution")
raise Exception
@ -293,7 +293,7 @@ class TradeAggregator:
self.diff_price = True
self.last_price = data['p']
if float(data['t']) - float(self.lasttimestamp) < GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN:
if float(data['t']) - float(self.lasttimestamp) < cfh.config_handler.get_val('GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN'):
self.trades_too_close = True
else:
self.trades_too_close = False
@ -320,13 +320,13 @@ class TradeAggregator:
#TODO: in the future, figure out how to always place bars into the correct interval when there are fewer trades
#we align the time of the first bar to the timeframe it belongs to (e.g. 5, 10, 15 ...) (ROUND)
if self.align == StartBarAlign.ROUND and self.bar_start == 0:
t = datetime.fromtimestamp(data['t'])
t = datetime.fromtimestamp(data['t'], tz=zoneUTC)
t = t - timedelta(seconds=t.second % self.resolution,microseconds=t.microsecond)
self.bar_start = datetime.timestamp(t)
#or we use the trade's timestamp rounded to seconds (RANDOM)
else:
#we store its timestamp (resolution is measured from it)
t = datetime.fromtimestamp(int(data['t']))
t = datetime.fromtimestamp(int(data['t']), tz=zoneUTC)
#timestamp
self.bar_start = int(data['t'])
@ -376,7 +376,7 @@ class TradeAggregator:
if self.mintick != 0 and self.lastBarConfirmed:
#x seconds must elapse from the start of a new bar before we send an update
#the start of the new bar + Xs must be later than the current trade
if (self.newBar['time'] + timedelta(seconds=self.mintick)) > datetime.fromtimestamp(data['t']):
if (self.newBar['time'] + timedelta(seconds=self.mintick)) > datetime.fromtimestamp(data['t'], tz=zoneUTC):
#print("waiting for mintick")
return []
else:
@ -540,7 +540,7 @@ class TradeAggregator:
self.diff_price = True
self.last_price = data['p']
if float(data['t']) - float(self.lasttimestamp) < GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN:
if float(data['t']) - float(self.lasttimestamp) < cfh.config_handler.get_val('GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN'):
self.trades_too_close = True
else:
self.trades_too_close = False
@ -712,7 +712,7 @@ class TradeAggregator:
self.diff_price = True
self.last_price = data['p']
if float(data['t']) - float(self.lasttimestamp) < GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN:
if float(data['t']) - float(self.lasttimestamp) < cfh.config_handler.get_val('GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN'):
self.trades_too_close = True
else:
self.trades_too_close = False
@ -756,8 +756,14 @@ class TradeAggregator:
In the strategy, keep in mind that the open of an unconfirmed bar is not final.
"""
if self.resolution < 0: # Treat as percentage
reference_price = self.lastConfirmedBar['close'] if self.lastConfirmedBar is not None else float(data['p'])
brick_size = abs(self.resolution) * reference_price / 100.0
else: # Treat as absolute value pocet ticku
brick_size = self.resolution
#number of ticks, e.g. 10 ticks; possibly percentages later
brick_size = self.resolution
#brick_size = self.resolution
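#Worked example of the sizing above: with resolution = -1 (i.e. 1%) and a last confirmed close of 200.0,
#brick_size = abs(-1) * 200.0 / 100.0 = 2.0; with resolution = 10 the brick is a fixed 10 price units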
#confirmed bars ready to be returned
confirmedBars = []
#confirms the existing bar and queues it for return
@ -866,7 +872,7 @@ class TradeAggregator:
self.diff_price = True
self.last_price = data['p']
if float(data['t']) - float(self.lasttimestamp) < GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN:
if float(data['t']) - float(self.lasttimestamp) < cfh.config_handler.get_val('GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN'):
self.trades_too_close = True
else:
self.trades_too_close = False
@ -962,7 +968,7 @@ class TradeAggregator2Queue(TradeAggregator):
Child of TradeAggregator - sends items to given queue
In the future others will be added - TradeAggToTxT etc.
"""
def __init__(self, symbol: str, queue: Queue, rectype: RecordType = RecordType.BAR, resolution: int = 5, minsize: int = 100, update_ltp: bool = False, align: StartBarAlign = StartBarAlign.ROUND, mintick: int = 0, exthours: bool = False, excludes: list = AGG_EXCLUDED_TRADES, skip_cache: bool = False):
def __init__(self, symbol: str, queue: Queue, rectype: RecordType = RecordType.BAR, resolution: int = 5, minsize: int = 100, update_ltp: bool = False, align: StartBarAlign = StartBarAlign.ROUND, mintick: int = 0, exthours: bool = False, excludes: list = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES'), skip_cache: bool = False):
super().__init__(rectype=rectype, resolution=resolution, minsize=minsize, update_ltp=update_ltp, align=align, mintick=mintick, exthours=exthours, excludes=excludes, skip_cache=skip_cache)
self.queue = queue
self.symbol = symbol
@ -1007,7 +1013,7 @@ class TradeAggregator2List(TradeAggregator):
""""
stores records to the list
"""
def __init__(self, symbol: str, btdata: list, rectype: RecordType = RecordType.BAR, resolution: int = 5, minsize: int = 100, update_ltp: bool = False, align: StartBarAlign = StartBarAlign.ROUND, mintick: int = 0, exthours: bool = False, excludes: list = AGG_EXCLUDED_TRADES, skip_cache: bool = False):
def __init__(self, symbol: str, btdata: list, rectype: RecordType = RecordType.BAR, resolution: int = 5, minsize: int = 100, update_ltp: bool = False, align: StartBarAlign = StartBarAlign.ROUND, mintick: int = 0, exthours: bool = False, excludes: list = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES'), skip_cache: bool = False):
super().__init__(rectype=rectype, resolution=resolution, minsize=minsize, update_ltp=update_ltp, align=align, mintick=mintick, exthours=exthours, excludes=excludes, skip_cache=skip_cache)
self.btdata = btdata
self.symbol = symbol


@ -0,0 +1,570 @@
import pandas as pd
import numpy as np
from numba import jit
from alpaca.data.historical import StockHistoricalDataClient
from sqlalchemy import column
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR
from alpaca.data.requests import StockTradesRequest
import time as time_module
from v2realbot.utils.utils import parse_alpaca_timestamp, ltp, zoneNY, send_to_telegram, fetch_calendar_data
import pyarrow
from traceback import format_exc
from datetime import timedelta, datetime, time
from concurrent.futures import ThreadPoolExecutor
import os
import gzip
import pickle
import random
from alpaca.data.models import BarSet, QuoteSet, TradeSet
import v2realbot.utils.config_handler as cfh
from v2realbot.enums.enums import BarType
from tqdm import tqdm
""""
Module used for vectorized aggregation of trades.
Includes fetch (remote/cached) methods and numba aggregator function for TIME BASED, VOLUME BASED and DOLLAR BARS
"""""
def aggregate_trades(symbol: str, trades_df: pd.DataFrame, resolution: int, type: BarType = BarType.TIME):
""""
Accepts dataframe with trades keyed by symbol. Preparess dataframe to
numpy and calls Numba optimized aggregator for given bar type. (time/volume/dollar)
"""""
trades_df = trades_df.loc[symbol]
trades_df= trades_df.reset_index()
ticks = trades_df[['timestamp', 'price', 'size']].to_numpy()
# Extract the timestamps column (assuming it's the first column)
timestamps = ticks[:, 0]
# Convert the timestamps to Unix timestamps in seconds with microsecond precision
unix_timestamps_s = np.array([ts.timestamp() for ts in timestamps], dtype='float64')
# Replace the original timestamps in the NumPy array with the converted Unix timestamps
ticks[:, 0] = unix_timestamps_s
ticks = ticks.astype(np.float64)
#based on type, specific aggregator function is called
match type:
case BarType.TIME:
ohlcv_bars = generate_time_bars_nb(ticks, resolution)
case BarType.VOLUME:
ohlcv_bars = generate_volume_bars_nb(ticks, resolution)
case BarType.DOLLAR:
ohlcv_bars = generate_dollar_bars_nb(ticks, resolution)
case _:
raise ValueError("Invalid bar type. Supported types are 'time', 'volume' and 'dollar'.")
# Convert the resulting array back to a DataFrame
columns = ['time', 'open', 'high', 'low', 'close', 'volume', 'trades']
if type == BarType.DOLLAR:
columns.append('amount')
columns.append('updated')
if type == BarType.TIME:
columns.append('vwap')
columns.append('buyvolume')
columns.append('sellvolume')
if type == BarType.VOLUME:
columns.append('buyvolume')
columns.append('sellvolume')
ohlcv_df = pd.DataFrame(ohlcv_bars, columns=columns)
ohlcv_df['time'] = pd.to_datetime(ohlcv_df['time'], unit='s').dt.tz_localize('UTC').dt.tz_convert(zoneNY)
#print(ohlcv_df['updated'])
ohlcv_df['updated'] = pd.to_datetime(ohlcv_df['updated'], unit="s").dt.tz_localize('UTC').dt.tz_convert(zoneNY)
# Round to microseconds to maintain six decimal places
ohlcv_df['updated'] = ohlcv_df['updated'].dt.round('us')
ohlcv_df.set_index('time', inplace=True)
#ohlcv_df.index = ohlcv_df.index.tz_localize('UTC').tz_convert(zoneNY)
return ohlcv_df
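#Illustrative usage (sketch; the symbol and resolutions are example values):
#time_bars = aggregate_trades("BAC", trades_df, resolution=60, type=BarType.TIME)            #60s bars
#dollar_bars = aggregate_trades("BAC", trades_df, resolution=1_000_000, type=BarType.DOLLAR) #$1M bars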
# Function to ensure fractional seconds are present
def ensure_fractional_seconds(timestamp):
if '.' not in timestamp:
# Inserting .000000 before the timezone indicator 'Z'
return timestamp.replace('Z', '.000000Z')
else:
return timestamp
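#Example: ensure_fractional_seconds('2024-05-03T13:30:00Z') -> '2024-05-03T13:30:00.000000Z'
#(timestamps that already contain '.' pass through unchanged)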
def convert_dict_to_multiindex_df(tradesResponse):
""""
Converts dictionary from cache or from remote (raw input) to multiindex dataframe.
with microsecond precision (from nanoseconds in the raw data)
"""""
# Create a DataFrame for each key and add the key as part of the MultiIndex
dfs = []
for key, values in tradesResponse.items():
df = pd.DataFrame(values)
# Rename columns
# Select and order columns explicitly
#print(df)
df = df[['t', 'x', 'p', 's', 'i', 'c','z']]
df.rename(columns={'t': 'timestamp', 'c': 'conditions', 'p': 'price', 's': 'size', 'x': 'exchange', 'z':'tape', 'i':'id'}, inplace=True)
df['symbol'] = key # Add ticker as a column
# Apply the function to ensure all timestamps have fractional seconds
#consider whether to keep this, or apply it only on a specific to_datetime error
#possibly add a more efficient approach later, i.e. replacing NaT - https://chatgpt.com/c/d2be6f87-b38f-4050-a1c6-541d100b1474
df['timestamp'] = df['timestamp'].apply(ensure_fractional_seconds)
df['timestamp'] = pd.to_datetime(df['timestamp'], errors='coerce') # Convert 't' from string to datetime before setting it as an index
#Adjust to microsecond precision
df.loc[df['timestamp'].notna(), 'timestamp'] = df['timestamp'].dt.floor('us')
df.set_index(['symbol', 'timestamp'], inplace=True) # Set the multi-level index using both 'ticker' and 't'
df = df.tz_convert(zoneNY, level='timestamp')
dfs.append(df)
# Concatenate all DataFrames into a single DataFrame with MultiIndex
final_df = pd.concat(dfs)
return final_df
def dict_to_df(tradesResponse, start, end, exclude_conditions = None, minsize = None):
""""
Transforms dict to Tradeset, then df and to zone aware
Also filters to start and end if necessary (ex. 9:30 to 15:40 is required only)
NOTE: prepodkladame, ze tradesResponse je dict from Raw data (cached/remote)
"""""
df = convert_dict_to_multiindex_df(tradesResponse)
#REQUIRED FILTERING
#if the requested start is later or the end earlier, we trim accordingly
if (start.time() > time(9, 30) or end.time() < time(16, 0)):
print(f"filtrujeme {start.time()} {end.time()}")
# Define the time range
# start_time = pd.Timestamp(start.time(), tz=zoneNY).time()
# end_time = pd.Timestamp(end.time(), tz=zoneNY).time()
# Create a mask to filter rows within the specified time range
mask = (df.index.get_level_values('timestamp') >= start) & \
(df.index.get_level_values('timestamp') <= end)
# Apply the mask to the DataFrame
df = df[mask]
if exclude_conditions is not None:
print(f"excluding conditions {exclude_conditions}")
# Create a mask to exclude rows with any of the specified conditions
mask = df['conditions'].apply(lambda x: any(cond in exclude_conditions for cond in x))
# Filter out the rows with specified conditions
df = df[~mask]
if minsize is not None:
print(f"minsize {minsize}")
#filter by minimum trade size
df = df[df['size'] >= minsize]
return df
def fetch_daily_stock_trades(symbol, start, end, exclude_conditions=None, minsize=None, force_remote=False, max_retries=5, backoff_factor=1):
"""
Attempts to fetch stock trades either from cache or remote. When remote, it uses a retry mechanism with exponential backoff.
It also stores the data to cache if it is not already there.
force_remote forces using remote data always, thus refreshing the cache for these dates.
Attributes:
:param symbol: The stock symbol to fetch trades for.
:param start: The start time for the trade data.
:param end: The end time for the trade data.
:param exclude_conditions: list of string conditions to exclude from the data
:param minsize: minimum size of trade to be included in the data
:param force_remote: always use remote data and refresh the cache
:param max_retries: Maximum number of retries.
:param backoff_factor: Factor to determine the next sleep time.
:return: TradesResponse object.
:raises: ConnectionError if all retries fail.
We use the trade cache only for main session requests = 9:30 to 16:00.
In the future, store the whole day as BAC-20240203.cache.gz and filter either the main session or extended hours from it.
For now only the main session is stored, in BAC-{open timestamp}-{close timestamp}.cache.gz.
"""
is_same_day = start.date() == end.date()
# Determine if the requested times fall within the main session
in_main_session = (time(9, 30) <= start.time() < time(16, 0)) and (time(9, 30) <= end.time() <= time(16, 0))
file_path = ''
if in_main_session:
filename_start = zoneNY.localize(datetime.combine(start.date(), time(9, 30)))
filename_end = zoneNY.localize(datetime.combine(end.date(), time(16, 0)))
daily_file = f"{symbol}-{int(filename_start.timestamp())}-{int(filename_end.timestamp())}.cache.gz"
file_path = f"{DATA_DIR}/tradecache/{daily_file}"
if not force_remote and os.path.exists(file_path):
print(f"Searching {str(start.date())} cache: " + daily_file)
with gzip.open(file_path, 'rb') as fp:
tradesResponse = pickle.load(fp)
print("FOUND in CACHE", daily_file)
return dict_to_df(tradesResponse, start, end, exclude_conditions, minsize)
print("NOT FOUND. Fetching from remote")
client = StockHistoricalDataClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=True)
stockTradeRequest = StockTradesRequest(symbol_or_symbols=symbol, start=start, end=end)
last_exception = None
for attempt in range(max_retries):
try:
tradesResponse = client.get_stock_trades(stockTradeRequest)
is_empty = not tradesResponse[symbol]
print(f"Remote fetched: {is_empty=}", start, end)
if in_main_session and not is_empty:
current_time = datetime.now().astimezone(zoneNY)
if not (start < current_time < end):
with gzip.open(file_path, 'wb') as fp:
pickle.dump(tradesResponse, fp)
print("Saving to Trade CACHE", file_path)
else: # Don't save the cache if the market is still open
print("Not saving trade cache, market still open today")
return pd.DataFrame() if is_empty else dict_to_df(tradesResponse, start, end, exclude_conditions, minsize)
except Exception as e:
print(f"Attempt {attempt + 1} failed: {e}")
last_exception = e
time_module.sleep(backoff_factor * (2 ** attempt) + random.uniform(0, 1)) # Adding random jitter
print("All attempts to fetch data failed.")
raise ConnectionError(f"Failed to fetch stock trades after {max_retries} retries. Last exception: {str(last_exception)} and {format_exc()}")
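#Illustrative usage (sketch; the symbol, date and filter values are examples):
#start = zoneNY.localize(datetime(2024, 2, 5, 9, 30))
#end = zoneNY.localize(datetime(2024, 2, 5, 16, 0))
#df = fetch_daily_stock_trades("BAC", start, end, minsize=100)  #served from the daily cache when present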
def fetch_trades_parallel(symbol, start_date, end_date, exclude_conditions = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES'), minsize = 100, force_remote = False, max_workers=None):
"""
Fetches trades for each day between start_date and end_date during market hours (9:30-16:00) in parallel and concatenates them into a single DataFrame.
:param symbol: Stock symbol.
:param start_date: Start date as datetime.
:param end_date: End date as datetime.
:return: DataFrame containing all trades from start_date to end_date.
"""
futures = []
results = []
market_open_days = fetch_calendar_data(start_date, end_date)
day_count = len(market_open_days)
print("Contains", day_count, " market days")
max_workers = min(10, max(2, day_count // 2)) if max_workers is None else max_workers # Heuristic: half the number of days to process, but at least 2 and no more than 10
with ThreadPoolExecutor(max_workers=max_workers) as executor:
#for single_date in (start_date + timedelta(days=i) for i in range((end_date - start_date).days + 1)):
for market_day in tqdm(market_open_days, desc="Processing market days"):
#start = datetime.combine(single_date, time(9, 30)) # Market opens at 9:30 AM
#end = datetime.combine(single_date, time(16, 0)) # Market closes at 4:00 PM
interval_from = zoneNY.localize(market_day.open)
interval_to = zoneNY.localize(market_day.close)
#trim if a later start or an earlier end is requested
start = start_date if interval_from < start_date else interval_from
#start = max(start_date, interval_from)
end = end_date if interval_to > end_date else interval_to
#end = min(end_date, interval_to)
future = executor.submit(fetch_daily_stock_trades, symbol, start, end, exclude_conditions, minsize, force_remote)
futures.append(future)
for future in tqdm(futures, desc="Fetching data"):
try:
result = future.result()
results.append(result)
except Exception as e:
print(f"Error fetching data for a day: {e}")
# Batch concatenation to improve speed
batch_size = 10
batches = [results[i:i + batch_size] for i in range(0, len(results), batch_size)]
final_df = pd.concat([pd.concat(batch, ignore_index=False) for batch in batches], ignore_index=False)
return final_df
#original version
#return pd.concat(results, ignore_index=False)
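#Illustrative usage (sketch; dates are example values):
#trades_df = fetch_trades_parallel("BAC", zoneNY.localize(datetime(2024, 2, 1, 9, 30)),
#                                  zoneNY.localize(datetime(2024, 2, 7, 16, 0)), minsize=100)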
@jit(nopython=True)
def generate_dollar_bars_nb(ticks, amount_per_bar):
""""
Generates Dollar based bars from ticks.
There is also simple prevention of aggregation from different days
as described here https://chatgpt.com/c/17804fc1-a7bc-495d-8686-b8392f3640a2
Downside: split days by UTC (which is ok for main session, but when extended hours it should be reworked by preprocessing new column identifying session)
When trade is split into multiple bars it is counted as trade in each of the bars.
Other option: trade count can be proportionally distributed by weight (0.2 to 1st bar, 0.8 to 2nd bar) - but this is not implemented yet
https://chatgpt.com/c/ff4802d9-22a2-4b72-8ab7-97a91e7a515f
"""""
ohlcv_bars = []
remaining_amount = amount_per_bar
# Initialize bar values based on the first tick to avoid uninitialized values
open_price = ticks[0, 1]
high_price = ticks[0, 1]
low_price = ticks[0, 1]
close_price = ticks[0, 1]
volume = 0
trades_count = 0
current_day = np.floor(ticks[0, 0] / 86400) # Calculate the initial day from the first tick timestamp
bar_time = ticks[0, 0] # Initialize bar time with the time of the first tick
for tick in ticks:
tick_time = tick[0]
price = tick[1]
tick_volume = tick[2]
tick_amount = price * tick_volume
tick_day = np.floor(tick_time / 86400) # Calculate the day of the current tick
# Check if the new tick is from a different day, then close the current bar
if tick_day != current_day:
if trades_count > 0:
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, amount_per_bar, tick_time])
# Reset for the new day using the current tick data
open_price = price
high_price = price
low_price = price
close_price = price
volume = 0
trades_count = 0
remaining_amount = amount_per_bar
current_day = tick_day
bar_time = tick_time
# Start new bar if needed because of the dollar value
while tick_amount > 0:
if tick_amount < remaining_amount:
# Add the entire tick to the current bar
high_price = max(high_price, price)
low_price = min(low_price, price)
close_price = price
volume += tick_volume
remaining_amount -= tick_amount
trades_count += 1
tick_amount = 0
else:
# Calculate the amount of volume that fits within the remaining dollar amount
volume_to_add = remaining_amount / price
volume += volume_to_add # Update the volume here before appending and resetting
# Append the partially filled bar to the list
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count + 1, amount_per_bar, tick_time])
# Fill the current bar and continue with a new bar
tick_volume -= volume_to_add
tick_amount -= remaining_amount
# Reset bar values for the new bar using the current tick data
open_price = price
high_price = price
low_price = price
close_price = price
volume = 0 # Reset volume for the new bar
trades_count = 0
remaining_amount = amount_per_bar
# Increment bar time if splitting a trade
if tick_volume > 0: #if the trade has a remainder, set the bar time a microsecond later
bar_time = tick_time + 1e-6
else:
bar_time = tick_time #otherwise set it to the tick time
#bar_time = tick_time
# Add the last bar if it contains any trades
if trades_count > 0:
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, amount_per_bar, tick_time])
return np.array(ohlcv_bars)
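#Worked sketch of the splitting behaviour (example numbers): with amount_per_bar = 1000 and two
#ticks of price 100.0 / size 6 ($600 each), the second tick fills the first bar ($400 left -> 4 shares)
#and its remaining $200 opens a second bar; the split trade is counted in both bars, as noted above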
@jit(nopython=True)
def generate_volume_bars_nb(ticks, volume_per_bar):
""""
Generates Volume based bars from ticks.
NOTE: UTC day split here (doesnt aggregate trades from different days)
but realized from UTC (ok for main session) - but needs rework for extension by preprocessing ticks_df and introduction sesssion column
When trade is split into multiple bars it is counted as trade in each of the bars.
Other option: trade count can be proportionally distributed by weight (0.2 to 1st bar, 0.8 to 2nd bar) - but this is not implemented yet
https://chatgpt.com/c/ff4802d9-22a2-4b72-8ab7-97a91e7a515f
"""""
ohlcv_bars = []
remaining_volume = volume_per_bar
# Initialize bar values based on the first tick to avoid uninitialized values
open_price = ticks[0, 1]
high_price = ticks[0, 1]
low_price = ticks[0, 1]
close_price = ticks[0, 1]
volume = 0
trades_count = 0
current_day = np.floor(ticks[0, 0] / 86400) # Calculate the initial day from the first tick timestamp
bar_time = ticks[0, 0] # Initialize bar time with the time of the first tick
buy_volume = 0 # Volume of buy trades
sell_volume = 0 # Volume of sell trades
prev_price = ticks[0, 1] # Initialize previous price for the first tick
for tick in ticks:
tick_time = tick[0]
price = tick[1]
tick_volume = tick[2]
tick_day = np.floor(tick_time / 86400) # Calculate the day of the current tick
# Check if the new tick is from a different day, then close the current bar
if tick_day != current_day:
if trades_count > 0:
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, tick_time, buy_volume, sell_volume])
# Reset for the new day using the current tick data
open_price = price
high_price = price
low_price = price
close_price = price
volume = 0
trades_count = 0
remaining_volume = volume_per_bar
current_day = tick_day
bar_time = tick_time # Update bar time to the current tick time
buy_volume = 0
sell_volume = 0
# Reset previous tick price (imbalance is calculated from scratch for each day)
prev_price = price
# Start new bar if needed because of the volume
while tick_volume > 0:
if tick_volume < remaining_volume:
# Add the entire tick to the current bar
high_price = max(high_price, price)
low_price = min(low_price, price)
close_price = price
volume += tick_volume
remaining_volume -= tick_volume
trades_count += 1
# Update buy and sell volumes
if price > prev_price:
buy_volume += tick_volume
elif price < prev_price:
sell_volume += tick_volume
tick_volume = 0
else:
# Fill the current bar and continue with a new bar
volume_to_add = remaining_volume
volume += volume_to_add
tick_volume -= volume_to_add
trades_count += 1
# Update buy and sell volumes
if price > prev_price:
buy_volume += volume_to_add
elif price < prev_price:
sell_volume += volume_to_add
# Append the completed bar to the list
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, tick_time, buy_volume, sell_volume])
# Reset bar values for the new bar using the current tick data
open_price = price
high_price = price
low_price = price
close_price = price
volume = 0
trades_count = 0
remaining_volume = volume_per_bar
buy_volume = 0
sell_volume = 0
# Increment bar time if splitting a trade
if tick_volume > 0: # If there's remaining volume in the trade, set bar time slightly later
bar_time = tick_time + 1e-6
else:
bar_time = tick_time # Otherwise, set bar time to the tick time
prev_price = price
# Add the last bar if it contains any trades
if trades_count > 0:
ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, tick_time, buy_volume, sell_volume])
return np.array(ohlcv_bars)
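#Classification sketch for the tick rule above: with prev_price 100.00, a trade at 100.02 adds its
#size to buy_volume, a trade at 99.98 adds to sell_volume, and a trade at exactly 100.00 adds to neither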
@jit(nopython=True)
def generate_time_bars_nb(ticks, resolution):
# Initialize the start and end time
start_time = np.floor(ticks[0, 0] / resolution) * resolution
end_time = np.floor(ticks[-1, 0] / resolution) * resolution
# # Calculate number of bars
# num_bars = int((end_time - start_time) // resolution + 1)
# Using a list to append data only when trades exist
ohlcv_bars = []
# Variables to track the current bar
current_bar_index = -1
open_price = 0
high_price = -np.inf
low_price = np.inf
close_price = 0
volume = 0
trades_count = 0
vwap_cum_volume_price = 0 # Cumulative volume * price
cum_volume = 0 # Cumulative volume for VWAP
buy_volume = 0 # Volume of buy trades
sell_volume = 0 # Volume of sell trades
prev_price = ticks[0, 1] # Initialize previous price for the first tick
prev_day = np.floor(ticks[0, 0] / 86400) # Calculate the initial day from the first tick timestamp
for tick in ticks:
curr_time = tick[0] #updated time
tick_time = np.floor(tick[0] / resolution) * resolution
price = tick[1]
tick_volume = tick[2]
tick_day = np.floor(tick_time / 86400) # Calculate the day of the current tick
#if the new tick is from a new day, reset previous tick price (calculating imbalance starts over)
if tick_day != prev_day:
prev_price = price
prev_day = tick_day
# Check if the tick belongs to a new bar
if tick_time != start_time + current_bar_index * resolution:
if current_bar_index >= 0 and trades_count > 0: # Save the previous bar if trades happened
vwap = vwap_cum_volume_price / cum_volume if cum_volume > 0 else 0
ohlcv_bars.append([start_time + current_bar_index * resolution, open_price, high_price, low_price, close_price, volume, trades_count, curr_time, vwap, buy_volume, sell_volume])
# Reset bar values
current_bar_index = int((tick_time - start_time) / resolution)
open_price = price
high_price = price
low_price = price
volume = 0
trades_count = 0
vwap_cum_volume_price = 0
cum_volume = 0
buy_volume = 0
sell_volume = 0
# Update the OHLCV values for the current bar
high_price = max(high_price, price)
low_price = min(low_price, price)
close_price = price
volume += tick_volume
trades_count += 1
vwap_cum_volume_price += price * tick_volume
cum_volume += tick_volume
# Update buy and sell volumes
if price > prev_price:
buy_volume += tick_volume
elif price < prev_price:
sell_volume += tick_volume
prev_price = price
# Save the last processed bar
if trades_count > 0:
vwap = vwap_cum_volume_price / cum_volume if cum_volume > 0 else 0
ohlcv_bars.append([start_time + current_bar_index * resolution, open_price, high_price, low_price, close_price, volume, trades_count, curr_time, vwap, buy_volume, sell_volume])
return np.array(ohlcv_bars)
# Example usage
if __name__ == '__main__':
pass
#example in agg_vect.ipynb
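#minimal sketch (assumed values; the full example lives in agg_vect.ipynb):
#ticks = np.array([[1714742400.0, 100.0, 50], [1714742401.5, 100.1, 30]], dtype=np.float64)
#bars = generate_time_bars_nb(ticks, resolution=60)  #both ticks land in one 60s bar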


@ -1,14 +1,13 @@
from v2realbot.loader.aggregator import TradeAggregator, TradeAggregator2List, TradeAggregator2Queue
#from v2realbot.loader.cacher import get_cached_agg_data
from alpaca.trading.requests import GetCalendarRequest
from alpaca.trading.client import TradingClient
from alpaca.data.live import StockDataStream
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR, OFFLINE_MODE
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR
from alpaca.data.enums import DataFeed
from alpaca.data.historical import StockHistoricalDataClient
from alpaca.data.requests import StockLatestQuoteRequest, StockBarsRequest, StockTradesRequest
from threading import Thread, current_thread
from v2realbot.utils.utils import parse_alpaca_timestamp, ltp, zoneNY
from v2realbot.utils.utils import parse_alpaca_timestamp, ltp, zoneNY, send_to_telegram, fetch_calendar_data
from v2realbot.utils.tlog import tlog
from datetime import datetime, timedelta, date
from threading import Thread
@ -26,13 +25,15 @@ from tqdm import tqdm
import time
from traceback import format_exc
from collections import defaultdict
import requests
import v2realbot.utils.config_handler as cfh
"""
Trade offline data streamer, based on Alpaca historical data.
"""
class Trade_Offline_Streamer(Thread):
#for BT we always connect to the primary account - we only pull historical data + the calendar
client = StockHistoricalDataClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=True)
clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False)
#clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False)
def __init__(self, time_from: datetime, time_to: datetime, btdata) -> None:
# Call the Thread class's init function
Thread.__init__(self)
@ -64,6 +65,35 @@ class Trade_Offline_Streamer(Thread):
def stop(self):
pass
def fetch_stock_trades(self, symbol, start, end, max_retries=5, backoff_factor=1):
"""
Attempts to fetch stock trades with exponential backoff. Raises an exception if all retries fail.
:param symbol: The stock symbol to fetch trades for.
:param start: The start time for the trade data.
:param end: The end time for the trade data.
:param max_retries: Maximum number of retries.
:param backoff_factor: Factor to determine the next sleep time.
:return: TradesResponse object.
:raises: ConnectionError if all retries fail.
"""
stockTradeRequest = StockTradesRequest(symbol_or_symbols=symbol, start=start, end=end)
last_exception = None
for attempt in range(max_retries):
try:
tradesResponse = self.client.get_stock_trades(stockTradeRequest)
print("Remote Fetch DAY DATA Complete", start, end)
return tradesResponse
except Exception as e:
print(f"Attempt {attempt + 1} failed: {e}")
last_exception = e
time.sleep(backoff_factor * (2 ** attempt))
print("All attempts to fetch data failed.")
send_to_telegram(f"Failed to fetch stock trades after {max_retries} retries. Last exception: {str(last_exception)} and {format_exc()}")
raise ConnectionError(f"Failed to fetch stock trades after {max_retries} retries. Last exception: {str(last_exception)} and {format_exc()}")
# Override the run() function of Thread class
#async removed
def main(self):
@ -74,6 +104,8 @@ class Trade_Offline_Streamer(Thread):
print("call add streams to queue first")
return 0
cfh.config_handler.print_current_config()
#iterate over the streams
for i in self.streams:
self.uniquesymbols.add(i.symbol)
@ -108,24 +140,20 @@ class Trade_Offline_Streamer(Thread):
#REFACTOR STARTS HERE
#print(f"{self.time_from=} {self.time_to=}")
if OFFLINE_MODE:
if cfh.config_handler.get_val('OFFLINE_MODE'):
#just one day - same as time_from
den = str(self.time_to.date())
bt_day = Calendar(date=den,open="9:30",close="16:00")
cal_dates = [bt_day]
else:
calendar_request = GetCalendarRequest(start=self.time_from,end=self.time_to)
#workaround for now - move into a retry function and design exception handling in general, so that I get notified and it is immediately visible in the log and on the frontend
try:
cal_dates = self.clientTrading.get_calendar(calendar_request)
except Exception as e:
print("CHYBA - retrying in 4s: " + str(e) + format_exc())
time.sleep(5)
cal_dates = self.clientTrading.get_calendar(calendar_request)
start_date = self.time_from # Assuming this is your start date
end_date = self.time_to # Assuming this is your end date
cal_dates = fetch_calendar_data(start_date, end_date)
#only the main session is supported for now
live_data_feed = cfh.config_handler.get_val('LIVE_DATA_FEED')
#only 1 symbol is supported for now; rework into a for loop over all symbols in symbpole
#the minimum unit for the CACHE is 1 day - and only market open to market close (extended hours not supported yet)
for day in cal_dates:
@ -168,9 +196,10 @@ class Trade_Offline_Streamer(Thread):
# stream.send_cache_to_output(cache)
# to_rem.append(stream)
#the cache is only used when backtesting a whole day
#the cache is only used when backtesting a whole day and with the SIP data feed (IEX is not cached)
#otherwise we neither read from nor write to the cache
if self.time_to >= day.close and self.time_from <= day.open:
if (self.time_to >= day.close and self.time_from <= day.open) and live_data_feed == DataFeed.SIP:
#this block is skipped when "dont_use_cache" is set
stream_btdata = self.to_run[symbpole[0]][0]
cache_btdata, file_btdata = stream_btdata.get_cache(day.open, day.close)
@ -213,14 +242,22 @@ class Trade_Offline_Streamer(Thread):
print("Loading from Trade CACHE", file_path)
#daily file doesn't exist
else:
# TODO refactor to process multiple symbols at once (multithreaded); for now we assume only 1
stockTradeRequest = StockTradesRequest(symbol_or_symbols=symbpole[0], start=day.open,end=day.close)
tradesResponse = self.client.get_stock_trades(stockTradeRequest)
#implement retry mechanism
symbol = symbpole[0] # Assuming symbpole[0] is your target symbol
day_open = day.open # Assuming day.open is the start time
day_close = day.close # Assuming day.close is the end time
tradesResponse = self.fetch_stock_trades(symbol, day_open, day_close)
# # TODO refactor to process multiple symbols at once (multithreaded); for now we assume only 1
# stockTradeRequest = StockTradesRequest(symbol_or_symbols=symbpole[0], start=day.open,end=day.close)
# tradesResponse = self.client.get_stock_trades(stockTradeRequest)
print("Remote Fetch DAY DATA Complete", day.open, day.close)
#if it's today and the market hasn't closed yet, we don't save the cache
if day.open < datetime.now().astimezone(zoneNY) < day.close:
print("not saving trade cache, market still open today")
#if it's today and the market hasn't closed yet, we don't save the cache; we also don't cache with the IEX feed
if (day.open < datetime.now().astimezone(zoneNY) < day.close) or live_data_feed == DataFeed.IEX:
print("not saving trade cache, market still open today or IEX datapoint")
#ic(datetime.now().astimezone(zoneNY))
#ic(day.open, day.close)
else:
@ -258,7 +295,7 @@ class Trade_Offline_Streamer(Thread):
cnt = 1
for t in tqdm(tradesResponse[symbol]):
for t in tqdm(tradesResponse[symbol], desc="Loading Trades"):
#since the whole day is here, we only pass through the relevant ones
#i.e. if start_time < trade < end_time
@ -271,6 +308,9 @@ class Trade_Offline_Streamer(Thread):
#tmp = to_datetime(t['t'], utc=True).timestamp()
#occasionally a None row appeared in the response
if t is None:
continue
datum = to_datetime(t['t'], utc=True)


@ -4,7 +4,7 @@
"""
from v2realbot.loader.aggregator import TradeAggregator2Queue
from alpaca.data.live import StockDataStream
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_PAPER_FEED
from v2realbot.config import LIVE_DATA_API_KEY, LIVE_DATA_SECRET_KEY
from alpaca.data.historical import StockHistoricalDataClient
from alpaca.data.requests import StockLatestQuoteRequest, StockBarsRequest, StockTradesRequest
from threading import Thread, current_thread
@ -12,6 +12,7 @@ from v2realbot.utils.utils import parse_alpaca_timestamp, ltp
from datetime import datetime, timedelta
from threading import Thread, Lock
from msgpack import packb
import v2realbot.utils.config_handler as cfh
"""
Shared streamer (can be shared amongst concurrently running strategies)
@ -19,9 +20,12 @@ from msgpack import packb
by strategies
"""
class Trade_WS_Streamer(Thread):
live_data_feed = cfh.config_handler.get_val('LIVE_DATA_FEED')
##this ws streamer is a single shared instance for everyone, i.e. we hard-wire the paid data of the primary account (paper or live doesn't matter)
client = StockDataStream(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=True, websocket_params={}, feed=ACCOUNT1_PAPER_FEED)
msg = f"Realtime Websocket connection will use FEED: {live_data_feed} and credential of ACCOUNT1"
print(msg)
#cfh.config_handler.print_current_config()
client = StockDataStream(LIVE_DATA_API_KEY, LIVE_DATA_SECRET_KEY, raw_data=True, websocket_params={}, feed=live_data_feed)
#uniquesymbols = set()
_streams = []
#to_run = dict()
@ -38,10 +42,23 @@ class Trade_WS_Streamer(Thread):
return False
def add_stream(self, obj: TradeAggregator2Queue):
print(Trade_WS_Streamer.msg)
print("stav pred pridavanim", Trade_WS_Streamer._streams)
Trade_WS_Streamer._streams.append(obj)
if Trade_WS_Streamer.client._running is False:
print("websocket zatim nebezi, pouze pridavame do pole")
#zde delame refresh clienta (pokud se zmenilo live_data_feed)
# live_data_feed = cfh.config_handler.get_val('LIVE_DATA_FEED')
# #po otestování přepnout jen pokud se live_data_feed změnil
# #if live_data_feed != Trade_WS_Streamer.live_data_feed:
# # Trade_WS_Streamer.live_data_feed = live_data_feed
# msg = f"REFRESH OF CLIENT! Realtime Websocket connection will use FEED: {live_data_feed} and credential of ACCOUNT1"
# print(msg)
# #cfh.config_handler.print_current_config()
# Trade_WS_Streamer.client = StockDataStream(LIVE_DATA_API_KEY, LIVE_DATA_SECRET_KEY, raw_data=True, websocket_params={}, feed=live_data_feed)
else:
print("websocket client bezi")
if self.symbol_exists(obj.symbol):
@ -59,7 +76,12 @@ class Trade_WS_Streamer(Thread):
#if it is the last item at all, stop the client from running
if len(Trade_WS_Streamer._streams) == 0:
print("removed last item from WS, stopping the client")
Trade_WS_Streamer.client.stop()
#Trade_WS_Streamer.client.stop_ws()
#Trade_WS_Streamer.client.stop()
#try explicitly performing the websocket disconnect steps
if Trade_WS_Streamer.client._stop_stream_queue.empty():
Trade_WS_Streamer.client._stop_stream_queue.put_nowait({"should_stop": True})
Trade_WS_Streamer.client._should_run = False
return
if not self.symbol_exists(obj.symbol):


@ -1,7 +1,7 @@
import os,sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
os.environ["KERAS_BACKEND"] = "jax"
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY, LOG_FILE, MODEL_DIR
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY, LOG_PATH, MODEL_DIR, VBT_DOC_DIRECTORY
from alpaca.data.timeframe import TimeFrame, TimeFrameUnit
from datetime import datetime
from rich import print
@ -9,16 +9,16 @@ from fastapi import FastAPI, Depends, HTTPException, status, File, UploadFile, R
from fastapi.security import APIKeyHeader
import uvicorn
from uuid import UUID
import v2realbot.controller.services as cs
from v2realbot.utils.ilog import get_log_window
from v2realbot.common.model import StrategyInstance, RunnerView, RunRequest, Trade, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, Bar, RunArchiveChange, TestList, ConfigItem, InstantIndicator, DataTablesRequest, AnalyzerInputs
from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, HTTPException, status, WebSocketException, Cookie, Query
from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunnerView, RunRequest, Trade, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, Bar, RunArchiveChange, TestList, ConfigItem, InstantIndicator, DataTablesRequest, AnalyzerInputs
from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, HTTPException, status, WebSocketException, Cookie, Query, Request
from fastapi.responses import FileResponse, StreamingResponse, JSONResponse
from fastapi.staticfiles import StaticFiles
from fastapi.security import HTTPBasic, HTTPBasicCredentials
from v2realbot.enums.enums import Env, Mode
from typing import Annotated
import os
import psutil
import uvicorn
import orjson
from queue import Queue, Empty
@ -36,9 +36,15 @@ from traceback import format_exc
#from v2realbot.reporting.optimizecutoffs import find_optimal_cutoff
import v2realbot.reporting.analyzer as ci
import shutil
from starlette.responses import JSONResponse
from starlette.responses import JSONResponse, HTMLResponse, FileResponse, RedirectResponse
import mlroom
import mlroom.utils.mlutils as ml
from typing import List
import v2realbot.controller.run_manager as rm
import v2realbot.scheduler.ap_scheduler as aps
import re
import v2realbot.controller.configs as cf
import v2realbot.controller.services as cs
#from async io import Queue, QueueEmpty
#
# install()
@ -70,13 +76,67 @@ def api_key_auth(api_key: str = Depends(X_API_KEY)):
detail="Forbidden"
)
def authenticate_user(credentials: HTTPBasicCredentials = Depends(HTTPBasic())):
correct_username = "david"
correct_password = "david"
if credentials.username == correct_username and credentials.password == correct_password:
return True
else:
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Incorrect username or password",
headers={"WWW-Authenticate": "Basic"},
)
app = FastAPI()
root = os.path.dirname(os.path.abspath(__file__))
app.mount("/static", StaticFiles(html=True, directory=os.path.join(root, 'static')), name="static")
#app.mount("/static", StaticFiles(html=True, directory=os.path.join(root, 'static')), name="static")
app.mount("/media", StaticFiles(directory=str(MEDIA_DIRECTORY)), name="media")
#app.mount("/", StaticFiles(html=True, directory=os.path.join(root, 'static')), name="www")
security = HTTPBasic()
@app.get("/static/{path:path}")
async def static_files(request: Request, path: str, authenticated: bool = Depends(authenticate_user)):
root = os.path.dirname(os.path.abspath(__file__))
static_dir = os.path.join(root, 'static')
if not path or path == "/":
file_path = os.path.join(static_dir, 'index.html')
else:
file_path = os.path.join(static_dir, path)
# Check if path is a directory
if os.path.isdir(file_path):
# If it's a directory, try to serve index.html within that directory
index_path = os.path.join(file_path, 'index.html')
if os.path.exists(index_path):
return FileResponse(index_path)
else:
# Optionally, you can return a directory listing or a custom 404 page here
return HTMLResponse("Directory listing not enabled.", status_code=403)
if not os.path.exists(file_path):
raise HTTPException(status_code=404, detail="File not found")
return FileResponse(file_path)
@app.get("/vbt-doc/{file_path:path}")
async def serve_protected_docs(file_path: str, credentials: HTTPBasicCredentials = Depends(authenticate_user)):
file_location = VBT_DOC_DIRECTORY / file_path
if file_location.is_dir(): # If it's a directory, serve index.html
index_file = file_location / "index.html"
if index_file.exists():
return FileResponse(index_file)
else:
raise HTTPException(status_code=404, detail="Index file not found")
elif file_location.exists():
return FileResponse(file_location)
else:
raise HTTPException(status_code=404, detail="File not found")
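#Illustrative check of the basic-auth routes above (sketch; host/port are assumptions, credentials per authenticate_user):
#import requests
#r = requests.get("http://localhost:8000/vbt-doc/index.html", auth=("david", "david"))
#assert r.status_code == 200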
def get_current_username(
credentials: Annotated[HTTPBasicCredentials, Depends(security)]
@ -98,9 +158,9 @@ async def get_api_key(
return session or api_key
#TODO rework from async?
@app.get("/static")
async def get(username: Annotated[str, Depends(get_current_username)]):
return FileResponse("index.html")
# @app.get("/static")
# async def get(username: Annotated[str, Depends(get_current_username)]):
# return FileResponse("index.html")
@app.websocket("/runners/{runner_id}/ws")
async def websocket_endpoint(
@ -249,11 +309,13 @@ def _run_stratin(stratin_id: UUID, runReq: RunRequest):
runReq.bt_to = zoneNY.localize(runReq.bt_to)
#if we run over test intervals or more than one day is requested - run as a batch, day by day
#in the future expose this on the FE as a flag
if runReq.mode != Mode.LIVE and runReq.test_batch_id is not None or (runReq.bt_from.date() != runReq.bt_to.date()):
#print(runReq)
if runReq.mode not in [Mode.LIVE, Mode.PAPER] and (runReq.test_batch_id is not None or (runReq.bt_from is not None and runReq.bt_to is not None and runReq.bt_from.date() != runReq.bt_to.date())):
res, id = cs.run_batch_stratin(id=stratin_id, runReq=runReq)
else:
if runReq.weekdays_filter is not None:
raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Weekday only for backtest mode with batch (not single day)")
#not necessary for live/paper - the weekdays filter is simply ignored; in the future maybe add validation when weekdays are present
#if runReq.weekdays_filter is not None:
# raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Weekday only for backtest mode with batch (not single day)")
res, id = cs.run_stratin(id=stratin_id, runReq=runReq)
if res == 0: return id
elif res < 0:
@@ -552,30 +614,68 @@ def _get_archived_runner_log_byID(runner_id: UUID, timestamp_from: float, timest
else:
raise HTTPException(status_code=404, detail=f"No logs found with id: {runner_id} and between {timestamp_from} and {timestamp_to}")
def remove_ansi_codes(text):
ansi_escape = re.compile(r'\x1B[@-_][0-?]*[ -/]*[@-~]')
return ansi_escape.sub('', text)
# endregion
# A simple function to read the last lines of a file
# def tail(file_path, n=10, buffer_size=1024):
# try:
# with open(file_path, 'rb') as f:
# f.seek(0, 2) # Move to the end of the file
# file_size = f.tell()
# lines = []
# buffer = bytearray()
# for i in range(file_size // buffer_size + 1):
# read_start = max(-buffer_size * (i + 1), -file_size)
# f.seek(read_start, 2)
# read_size = min(buffer_size, file_size - buffer_size * i)
# buffer[0:0] = f.read(read_size) # Prepend to buffer
# if buffer.count(b'\n') >= n + 1:
# break
# lines = buffer.decode(errors='ignore').splitlines()[-n:]
# lines = [remove_ansi_codes(line) for line in lines]
# return lines
# except Exception as e:
# return [str(e) + format_exc()]
#updated version that reads the file backwards, byte by byte, collecting lines
def tail(file_path, n=10):
    try:
        with open(file_path, 'rb') as f:
            f.seek(0, 2)  # Move to the end of the file
            file_size = f.tell()
            if file_size == 0:
                return []
            lines = []
            line = b''
            f.seek(-1, 2)  # Start at the last byte
            while len(lines) < n:
                byte = f.read(1)
                if byte == b'\n':
                    # Decode, remove ANSI codes, and append the line
                    lines.append(remove_ansi_codes(line.decode(errors='ignore')))
                    line = b''
                else:
                    line = byte + line
                if f.tell() < 2:
                    break  # The first byte of the file has just been read
                f.seek(-2, 1)  # Step back two bytes (one before the byte just read)
            if line and len(lines) < n:
                # Append any remaining (first) line after removing ANSI codes
                lines.append(remove_ansi_codes(line.decode(errors='ignore')))
            return lines[::-1]  # Reverse the list to get the lines in correct order
    except Exception as e:
        return [str(e)]
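# For comparison only (hedged sketch): the same "last n lines" result using
# collections.deque, which streams forward through the file and keeps just the
# final n lines - simpler, at the cost of reading the whole file once.
from collections import deque

def tail_deque(file_path, n=10):
    with open(file_path, 'r', errors='ignore') as f:
        return [remove_ansi_codes(line.rstrip('\n')) for line in deque(f, maxlen=n)]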
@app.get("/log", dependencies=[Depends(api_key_auth)])
def read_log(lines: int = 700, logfile: str = "strat.log"):
log_path = LOG_PATH / logfile
return {"lines": tail(log_path, lines)}
#get alpaca history bars
@@ -609,7 +709,7 @@ def _generate_report_image(runner_ids: list[UUID]):
res, stream = generate_trading_report_image(runner_ids=runner_ids,stream=True)
if res == 0: return StreamingResponse(stream, media_type="image/png",headers={"Content-Disposition": "attachment; filename=report.png"})
elif res < 0:
raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error: {res}:{stream}")
except Exception as e:
raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error: {str(e)}" + format_exc())
@@ -645,7 +745,8 @@ def _generate_analysis(analyzerInputs: AnalyzerInputs):
if res == 0: return StreamingResponse(stream, media_type="image/png")
elif res < 0:
print("Error when generating analysis: ",str(stream))
raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error: {res}:{stream}")
except Exception as e:
raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error: {str(e)}" + format_exc())
@@ -674,7 +775,7 @@ def get_testlists():
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"No data found")
# API endpoint to retrieve a single record by ID
@app.get('/testlists/{record_id}', dependencies=[Depends(api_key_auth)])
def get_testlist(record_id: str):
res, testlist = cs.get_testlist_byID(record_id=record_id)
@@ -684,7 +785,7 @@ def get_testlist(record_id: str):
raise HTTPException(status_code=404, detail='Record not found')
# API endpoint to update a record
@app.put('/testlists/{record_id}', dependencies=[Depends(api_key_auth)])
def update_testlist(record_id: str, testlist: TestList):
# Check if the record exists
conn = pool.get_connection()
@@ -704,7 +805,7 @@ def update_testlist(record_id: str, testlist: TestList):
return testlist
# API endpoint to delete a record
@app.delete('/testlists/{record_id}', dependencies=[Depends(api_key_auth)])
def delete_testlist(record_id: str):
# Check if the record exists
conn = pool.get_connection()
@@ -727,7 +828,7 @@ def delete_testlist(record_id: str):
# Get all config items
@app.get("/config-items/", dependencies=[Depends(api_key_auth)])
def get_all_items() -> list[ConfigItem]:
res, sada = cf.get_all_config_items()
if res == 0:
return sada
else:
@@ -737,7 +838,7 @@ def get_all_items() -> list[ConfigItem]:
# Get a config item by ID
@app.get("/config-items/{item_id}", dependencies=[Depends(api_key_auth)])
def get_item(item_id: int)-> ConfigItem:
res, sada = cf.get_config_item_by_id(item_id)
if res == 0:
return sada
else:
@@ -746,7 +847,7 @@ def get_item(item_id: int)-> ConfigItem:
# Get a config item by Name
@app.get("/config-items-by-name/", dependencies=[Depends(api_key_auth)])
def get_item(item_name: str)-> ConfigItem:
res, sada = cf.get_config_item_by_name(item_name)
if res == 0:
return sada
else:
@@ -755,7 +856,7 @@ def get_item(item_name: str)-> ConfigItem:
# Create a new config item
@app.post("/config-items/", dependencies=[Depends(api_key_auth)], status_code=status.HTTP_200_OK)
def create_item(config_item: ConfigItem) -> ConfigItem:
res, sada = cf.create_config_item(config_item)
if res == 0: return sada
else:
raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error not created: {res}:{id} {sada}")
@@ -764,11 +865,11 @@ def create_item(config_item: ConfigItem) -> ConfigItem:
# Update a config item by ID
@app.put("/config-items/{item_id}", dependencies=[Depends(api_key_auth)])
def update_item(item_id: int, config_item: ConfigItem) -> ConfigItem:
res, sada = cf.get_config_item_by_id(item_id)
if res != 0:
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"No data found")
res, sada = cf.update_config_item(item_id, config_item)
if res == 0: return sada
else:
raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error not created: {res}:{id}")
@@ -777,17 +878,77 @@ def update_item(item_id: int, config_item: ConfigItem) -> ConfigItem:
# Delete a config item by ID
@app.delete("/config-items/{item_id}", dependencies=[Depends(api_key_auth)])
def delete_item(item_id: int) -> dict:
res, sada = cf.get_config_item_by_id(item_id)
if res != 0:
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"No data found")
res, sada = cf.delete_config_item(item_id)
if res == 0: return sada
else:
raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error not created: {res}:{id}")
# endregion
# region scheduler
# 1. Fetch All RunManagerRecords
@app.get("/run_manager_records/", dependencies=[Depends(api_key_auth)], response_model=List[RunManagerRecord])
#TODO consider extending the output with strat_status (running/stopped)
def get_all_run_manager_records():
result, records = rm.fetch_all_run_manager_records()
if result != 0:
raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Error fetching records")
return records
# 2. Fetch RunManagerRecord by ID
@app.get("/run_manager_records/{record_id}", dependencies=[Depends(api_key_auth)], response_model=RunManagerRecord)
#TODO consider extending the output with strat_status (running/stopped)
def get_run_manager_record(record_id: UUID):
result, record = rm.fetch_run_manager_record_by_id(record_id)
if result == -2: # Record not found
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Record not found")
elif result != 0:
raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Error fetching record")
return record
# 3. Update RunManagerRecord
@app.patch("/run_manager_records/{record_id}", dependencies=[Depends(api_key_auth)], status_code=status.HTTP_200_OK)
def update_run_manager_record(record_id: UUID, update_data: RunManagerRecord):
#make dates zone aware zoneNY
# if update_data.valid_from is not None:
# update_data.valid_from = zoneNY.localize(update_data.valid_from)
# if update_data.valid_to is not None:
# update_data.valid_to = zoneNY.localize(update_data.valid_to)
result, message = rm.update_run_manager_record(record_id, update_data)
if result == -2: # Update failed
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=message)
elif result != 0:
raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=f"Error during update {result} {message}")
return {"message": "Record updated successfully"}
# 4. Delete RunManagerRecord
@app.delete("/run_manager_records/{record_id}", dependencies=[Depends(api_key_auth)], status_code=status.HTTP_200_OK)
def delete_run_manager_record(record_id: UUID):
result, message = rm.delete_run_manager_record(record_id)
if result == -2: # Delete failed
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=message)
elif result != 0:
raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error during deletion {result} {message}")
return {"message": "Record deleted successfully"}
@app.post("/run_manager_records/", status_code=status.HTTP_201_CREATED)
def create_run_manager_record(new_record: RunManagerRecord, api_key_auth: Depends = Depends(api_key_auth)):
#make date zone aware - convert to zoneNY
# if new_record.valid_from is not None:
# new_record.valid_from = zoneNY.localize(new_record.valid_from)
# if new_record.valid_to is not None:
# new_record.valid_to = zoneNY.localize(new_record.valid_to)
result, record_id = rm.add_run_manager_record(new_record)
if result != 0:
raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error during record creation: {result} {record_id}")
return {"id": record_id}
# endregion
#model section
#UPLOAD MODEL
@app.post("/model/upload_model", dependencies=[Depends(api_key_auth)])
@@ -881,7 +1042,25 @@ def get_metadata(model_name: str):
# "last_modified": os.path.getmtime(model_path),
# # ... other metadata fields ...
# }
@app.get("/system-info")
def get_system_info():
"""Get system info, e.g. disk free space, used percentage ... """
du = psutil.disk_usage('/')
disk_total = round(du.total / 1024**3, 1)
disk_used = round(du.used / 1024**3, 1)
disk_free = round(du.free / 1024**3, 1)
disk_used_percentage = round(du.percent, 1)
# memory_total = round(psutil.virtual_memory().total / 1024**3, 1)
# memory_perc = round(psutil.virtual_memory().percent, 1)
# cpu_time_user = round(psutil.cpu_times().user,1)
# cpu_time_system = round(psutil.cpu_times().system,1)
# cpu_time_idle = round(psutil.cpu_times().idle,1)
# network_sent = round(psutil.net_io_counters().bytes_sent / 1024**3, 6)
# network_recv = round(psutil.net_io_counters().bytes_recv / 1024**3, 6)
return {"disk_space": {"total": disk_total, "used": disk_used, "free" : disk_free, "used_percentage" : disk_used_percentage},
# "memory": {"total": memory_total, "used_percentage": memory_perc},
# "cpu_time" : {"user": cpu_time_user, "system": cpu_time_system, "idle": cpu_time_idle},
# "network": {"sent": network_sent, "received": network_recv}
}
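# Example response shape (values illustrative):
# {"disk_space": {"total": 228.3, "used": 167.9, "free": 60.4, "used_percentage": 73.5}}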
# Thread function to insert data from the queue into the database
def insert_queue2db():
@@ -924,7 +1103,22 @@ if __name__ == "__main__":
insert_thread = Thread(target=insert_queue2db)
insert_thread.start()
#attach debugger to be able to debug scheduler jobs (they run in separate threads)
# debugpy.listen(('localhost', 5678))
# print("Waiting for debugger to attach...")
# debugpy.wait_for_client() # Script will pause here until debugger is attached
#init scheduled tasks from schedule table
#Add APS scheduler job refresh
res, result = aps.initialize_jobs()
if res < 0:
#raise exception
raise Exception(f"Error {res} initializing APS jobs, error {result}")
uvicorn.run("__main__:app", host="0.0.0.0", port=8000, reload=False)
except Exception as e:
print("Error intializing app: " + str(e) + format_exc())
aps.scheduler.shutdown(wait=False)
finally:
print("closing insert_conn connection")
insert_conn.close()



@@ -0,0 +1,310 @@
from uuid import UUID, uuid4
from typing import Any, List, Tuple
from v2realbot.enums.enums import Moddus, SchedulerStatus, RecordType, StartBarAlign, Mode, Account, OrderSide
from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunDay, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest, Market
from v2realbot.utils.utils import validate_and_format_time, AttributeDict, zoneNY, zonePRG, safe_get, dict_replace_value, Store, parse_toml_string, json_serial, is_open_hours, send_to_telegram, concatenate_weekdays, transform_data
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus, TradeStoplossType
from datetime import datetime
from v2realbot.config import JOB_LOG_FILE, STRATVARS_UNCHANGEABLES, ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, DATA_DIR, MEDIA_DIRECTORY, RUNNER_DETAIL_DIRECTORY
import numpy as np
from rich import print as richprint
import v2realbot.controller.services as cs
import v2realbot.controller.run_manager as rm
import v2realbot.scheduler.scheduler as sch
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger
from apscheduler.job import Job
#NOTE running a strategy across midnight is not supported yet - weekday_filter must be resolved first
#there is currently only one filter for both start_time and stop_time - which would not work for
#strategies running across midnight (the stop would fall on the next day and the scheduler would not trigger it)
def format_apscheduler_jobs(jobs: list[Job]) -> list[dict]:
if not jobs:
print("No scheduled jobs.")
return []
jobs_info = []
for job in jobs:
job_info = {
"Job ID": job.id,
"Next Run Time": job.next_run_time,
"Job Function": job.func.__name__,
"Trigger": str(job.trigger),
"Job Args": ', '.join(map(str, job.args)),
"Job Kwargs": ', '.join(f"{k}={v}" for k, v in job.kwargs.items())
}
jobs_info.append(job_info)
return jobs_info
def get_day_of_week(weekdays_filter):
if not weekdays_filter:
return '*' # All days of the week
return ','.join(map(str, weekdays_filter))
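# e.g. get_day_of_week([0, 2, 4]) -> "0,2,4"; None or [] -> "*" (every day),
# matching the day_of_week field accepted by APScheduler's CronTrigger below.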
#initialize_jobs runs
#- on startup
#- triggered from add/update and delete
#full refresh for now; in the future only the changed item will be updated - see
#https://chat.openai.com/c/2a1423ee-59df-47ff-b073-0c49ade51ed7
#helper function returning strat_ids that appear in the scheduler more than once (their logic differs)
def stratin_occurences(all_records: list[RunManagerRecord]):
# Count occurrences
strat_id_counts = {}
for record in all_records:
if record.strat_id in strat_id_counts:
strat_id_counts[record.strat_id] += 1
else:
strat_id_counts[record.strat_id] = 1
# Find strat_id values that appear twice or more
repeated_strat_ids = [strat_id for strat_id, count in strat_id_counts.items() if count >= 2]
return 0, repeated_strat_ids
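# The counting loop above is equivalent to this Counter-based sketch (function
# name is illustrative, not used elsewhere):
from collections import Counter

def stratin_occurences_counter(all_records: list) -> tuple:
    counts = Counter(record.strat_id for record in all_records)
    return 0, [strat_id for strat_id, cnt in counts.items() if cnt >= 2]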
def initialize_jobs(run_manager_records: RunManagerRecord = None):
"""
Initialize all scheduled jobs from RunManagerRecords with moddus = "schedule"
Triggered on app init and on updates of the run manager table.
It deletes all "scheduler_" prefixed jobs and schedules new ones based on the runmanager table.
The "scheduler_" prefix in the APS scheduler distinguishes schedule-type jobs and allows more job categories.
Parameters
----------
run_manager_records : RunManagerRecord, optional
RunManagerRecords to initialize the jobs from, by default None
Returns
-------
Tuple[int, Union[List[dict], str]]
A tuple containing an error code and a message. If there is no error, the
message will contain a list of dictionaries with information about the
scheduled jobs, otherwise it will contain an error message.
"""
if run_manager_records is None:
res, run_manager_records = rm.fetch_all_run_manager_records()
if res < 0:
err_msg= f"Error {res} fetching all runmanager records, error {run_manager_records}"
print(err_msg)
return -2, err_msg
scheduled_jobs = scheduler.get_jobs()
#print(f"Current {len(scheduled_jobs)} scheduled jobs: {str(scheduled_jobs)}")
for job in scheduled_jobs:
if job.id.startswith("scheduler_"):
scheduler.remove_job(job.id)
record : RunManagerRecord = None
for record in run_manager_records:
if record.status == SchedulerStatus.ACTIVE and record.moddus == Moddus.SCHEDULE:
day_of_week = get_day_of_week(record.weekdays_filter)
hour, minute = map(int, record.start_time.split(':'))
start_trigger = CronTrigger(day_of_week=day_of_week, hour=hour, minute=minute,
start_date=record.valid_from, end_date=record.valid_to, timezone=zoneNY)
stop_hour, stop_minute = map(int, record.stop_time.split(':'))
stop_trigger = CronTrigger(day_of_week=day_of_week, hour=stop_hour, minute=stop_minute,
start_date=record.valid_from, end_date=record.valid_to, timezone=zoneNY)
# Schedule new jobs with the 'scheduler_' prefix
scheduler.add_job(start_runman_record, start_trigger, id=f"scheduler_start_{record.id}", args=[record.id])
scheduler.add_job(stop_runman_record, stop_trigger, id=f"scheduler_stop_{record.id}", args=[record.id])
#scheduler.add_job(print_hello, 'interval', seconds=10, id=
# f"scheduler_testinterval")
scheduled_jobs = scheduler.get_jobs()
print(f"APS jobs refreshed ({len(scheduled_jobs)})")
current_jobs_dict = format_apscheduler_jobs(scheduled_jobs)
richprint(current_jobs_dict)
return 0, current_jobs_dict
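# Typical call site (see app startup in main.py): a full job refresh after any
# add/update/delete of run manager records:
# res, jobs_info = initialize_jobs()
# if res < 0: jobs_info holds the error message instead of the job list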
#wrapper function handling error reporting and printing
def start_runman_record(id: UUID, debug_date = None):
record = None
res, record, msg = _start_runman_record(id=id, debug_date=debug_date)
if record is not None:
market_time_now = datetime.now().astimezone(zoneNY) if debug_date is None else debug_date
record.last_processed = market_time_now
formatted_date = market_time_now.strftime("%y.%m.%d %H:%M:%S")
history_string = f"{formatted_date}"
history_string += " STARTED" if res == 0 else "NOTE:" + msg if res == -1 else "ERROR:" + msg
print(history_string)
if record.history is None:
record.history = history_string
else:
record.history += "\n" + history_string
rs, msg_rs = update_runman_record(record)
if rs < 0:
msg_rs = f"Error saving result to history: {msg_rs}"
print(msg_rs)
send_to_telegram(msg_rs)
if res < -1:
msg = f"START JOB: {id} ERROR\n" + msg
send_to_telegram(msg)
print(msg)
else:
print(f"START JOB: {id} FINISHED {res}")
def update_runman_record(record: RunManagerRecord):
#update the record (probably still to adjust - last_run and history)
res, set = rm.update_run_manager_record(record.id, record)
if res == 0:
print(f"Record updated {set}")
return 0, "OK"
else:
err_msg= f"STOP: Error updating {record.id} errir {set} with values {record}"
return -2, err_msg#toto stopne zpracovani dalsich zaznamu pri chybe, zvazit continue
def stop_runman_record(id: UUID, debug_date = None):
res, record, msg = _stop_runman_record(id=id, debug_date=debug_date)
#results : 0 - ok, -1 not running/already running/not specific, -2 error
#we always write the report to history when record is not None; any error then happened after the record was fetched
if record is not None:
market_time_now = datetime.now().astimezone(zoneNY) if debug_date is None else debug_date
record.last_processed = market_time_now
formatted_date = market_time_now.strftime("%y.%m.%d %H:%M:%S")
history_string = f"{formatted_date}"
history_string += " STOPPED" if res == 0 else "NOTE:" + msg if res == -1 else "ERROR:" + msg
print(history_string)
if record.history is None:
record.history = history_string
else:
record.history += "\n" + history_string
rs, msg_rs = update_runman_record(record)
if rs < 0:
msg_rs = f"Error saving result to history: {msg_rs}"
print(msg_rs)
send_to_telegram(msg_rs)
if res < -1:
msg = f"STOP JOB: {id} ERROR\n" + msg
send_to_telegram(msg)
print(msg)
else:
print(f"STOP JOB: {id} FINISHED")
#start function that is called from the job
def _start_runman_record(id: UUID, debug_date = None):
print(f"Start scheduled record {id}")
record : RunManagerRecord = None
res, result = rm.fetch_run_manager_record_by_id(id)
if res < 0:
result = "Error fetching run manager record by id: " + str(id) + " Error: " + str(result)
return res, record, result
record = result
if record.market == Market.US or record.market == Market.CRYPTO:
res, sada = sch.get_todays_market_times(market=record.market, debug_date=debug_date)
if res == 0:
market_time_now, market_open_datetime, market_close_datetime = sada
print(f"OPEN:{market_open_datetime} CLOSE:{market_close_datetime}")
else:
sada = f"Market {record.market} Error getting market times (CLOSED): " + str(sada)
return res, record, sada
else:
print("Market type is unknown.")
if cs.is_stratin_running(record.strat_id):
return -1, record, f"Stratin {record.strat_id} is already running"
res, result = sch.run_scheduled_strategy(record)
if res < 0:
result = "Error running strategy: " + str(result)
return res, record, result
else:
record.runner_id = UUID(result)
return 0, record, record.runner_id
#stop function that is called from the job
def _stop_runman_record(id: UUID, debug_date = None):
record = None
#get all records
print(f"Stopping record {id}")
res, all_records = rm.fetch_all_run_manager_records()
if res < 0:
err_msg= f"Error {res} fetching all runmanager records, error {all_records}"
return -2, record, err_msg
record : RunManagerRecord = None
for rec in all_records:
if rec.id == id:
record = rec
break
if record is None:
return -2, record, f"Record id {id} not found"
#strat_ids that are repeated
res, repeated_strat_ids = stratin_occurences(all_records)
if res < 0:
err_msg= f"Error {res} finding repeated strat_ids, error {repeated_strat_ids}"
return -2, record, err_msg
if record.strat_running is True:
#stop based on record.runner_id
id_to_stop = record.runner_id
#if the same strategy was started manually and there is only one instance - it is unambiguous - stop it
elif cs.is_stratin_running(record.strat_id) and record.strat_id not in repeated_strat_ids:
#stop based on record.strat_id
id_to_stop = record.strat_id
else:
msg = f"strategy {record.strat_id} not RUNNING or not distinctive (manually launched or two strat_ids in scheduler)"
print(msg)
return -1, record, msg
print(f"Requesting STOP {id_to_stop}")
res, msg = cs.stop_runner(id=id_to_stop)
if res < 0:
msg = f"ERROR while STOPPING runner_id/strat_id {id_to_stop} {msg}"
return -2, record, msg
else:
record.runner_id = None
return 0, record, "finished"
# Global scheduler instance
scheduler = BackgroundScheduler(timezone=zoneNY)
scheduler.start()
if __name__ == "__main__":
#use naive datetime
debug_date = None
debug_date = datetime(2024, 2, 16, 9, 37, 0, 0)
#debug_date = datetime(2024, 2, 16, 10, 30, 0, 0)
#debug_date = datetime(2024, 2, 16, 16, 1, 0, 0)
id = UUID("bc4ec7d2-249b-4799-a02f-f1ce66f83d4a")
if debug_date is not None:
# Localize the naive datetime object to the Eastern timezone
debug_date = zoneNY.localize(debug_date)
#debugdate formatted as string in format "23.12.2024 9:30"
formatted_date = debug_date.strftime("%d.%m.%Y %H:%M")
print("Scheduler.py NY time: ", formatted_date)
print("ISoformat", debug_date.isoformat())
# res, result = start_runman_record(id=id, market = "US", debug_date = debug_date)
# print(f"CALL FINISHED, with {debug_date} RESULT: {res}, {result}")
res, result = stop_runman_record(id=id, debug_date = debug_date)
print(f"CALL FINISHED, with {debug_date} RESULT: {res}, {result}")


@@ -0,0 +1,439 @@
import json
import v2realbot.controller.services as cs
import v2realbot.controller.run_manager as rm
from v2realbot.common.model import RunnerView, RunManagerRecord, StrategyInstance, Runner, RunRequest, Trade, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, Bar, RunArchiveChange, TestList, ConfigItem, InstantIndicator, DataTablesRequest, AnalyzerInputs, Market
from uuid import uuid4, UUID
from v2realbot.utils.utils import json_serial, send_to_telegram, zoneNY, zonePRG, zoneUTC, fetch_calendar_data
from datetime import datetime, timedelta, time
from traceback import format_exc
from rich import print
import requests
from v2realbot.config import WEB_API_KEY
#Original variant of the scheduler, which was meant to run at regular intervals
#and start scheduled items in RunManagerRecord.
#It has since been refactored to use apscheduler - a Python library
#for job scheduling; now every scheduled RunManagerRecord
#is planned as a standalone job and triggered only once at the given time for start and stop.
#The new code lives in aps_scheduler.py.
def is_US_market_day(date):
cal_dates = fetch_calendar_data(date, date)
if len(cal_dates) == 0:
print("Today is not a market day.")
return False, cal_dates
else:
print("Market is open")
return True, cal_dates
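# e.g. on a trading day is_US_market_day(date(2024, 2, 16)) returns
# (True, [<calendar entry with .open/.close datetimes>]); on weekends and
# holidays it returns (False, []) - date and shapes illustrative.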
def get_todays_market_times(market, debug_date = None):
try:
if market == Market.US:
#determine all conditions - maybe loop - conditions are on the left
if debug_date is not None:
nowNY = debug_date
else:
nowNY = datetime.now().astimezone(zoneNY)
nowNY_date = nowNY.date()
#is market open - currently US only
stat, calendar_dates = is_US_market_day(nowNY_date)
if stat:
#only the main session is supported for now
market_open_datetime = zoneNY.localize(calendar_dates[0].open)
market_close_datetime = zoneNY.localize(calendar_dates[0].close)
return 0, (nowNY, market_open_datetime, market_close_datetime)
else:
return -1, "Market is closed."
elif market == Market.CRYPTO:
now_market_datetime = datetime.now().astimezone(zoneUTC)
market_open_datetime = datetime.combine(datetime.now(), time.min)
market_close_datetime = datetime.combine(datetime.now(), time.max)
return 0, (now_market_datetime, market_open_datetime, market_close_datetime)
else:
return -1, "Market not supported"
except Exception as e:
err_msg = f"General error in {e} {format_exc()}"
print(err_msg)
return -2, err_msg
def get_running_strategies():
# Construct the URL for the local REST API endpoint on port 8000
api_url = "http://localhost:8000/runners/"
# Headers for the request
headers = {
"X-API-Key": WEB_API_KEY
}
try:
# Make the GET request to the API with the headers
response = requests.get(api_url, headers=headers)
# Check if the request was successful
if response.status_code == 200:
runners = response.json()
print("Successfully fetched runners.")
strat_ids = []
ids = []
for runner_view in runners:
strat_ids.append(UUID(runner_view["strat_id"]))
ids.append(UUID(runner_view["id"]))
return 0, (strat_ids, ids)
else:
err_msg = f"Failed to fetch runners. Status Code: {response.status_code}, Response: {response.text}"
print(err_msg)
return -2, err_msg
except requests.RequestException as e:
err_msg = f"Request failed: {str(e)}"
print(err_msg)
return -2, err_msg
def stop_strategy(runner_id):
# Construct the URL for the local REST API endpoint on port 8000 #option 127.0.0.1
api_url = f"http://localhost:8000/runners/{runner_id}/stop"
# Headers for the request
headers = {
"X-API-Key": WEB_API_KEY
}
try:
# Make the PUT request to the API with the headers
response = requests.put(api_url, headers=headers)
# Check if the request was successful
if response.status_code == 200:
print(f"Runner/strat_id {runner_id} stopped successfully.")
return 0, runner_id
else:
err_msg = f"Failed to stop runner {runner_id}. Status Code: {response.status_code}, Response: {response.text}"
print(err_msg)
return -2, err_msg
except requests.RequestException as e:
err_msg = f"Request failed: {str(e)}"
print(err_msg)
return -2, err_msg
def fetch_stratin(stratin_id):
# Construct the URL for the REST API endpoint
api_url = f"http://localhost:8000/stratins/{stratin_id}"
# Headers for the request
headers = {
"X-API-Key": WEB_API_KEY
}
try:
# Make the GET request to the API with the headers
response = requests.get(api_url, headers=headers)
# Check if the request was successful
if response.status_code == 200:
# Parse the response as a StrategyInstance object
strategy_instance = response.json()
#strategy_instance = response # Assuming the response is in JSON format
print(f"StrategyInstance fetched: {stratin_id}")
return 0, strategy_instance
else:
err_msg = f"Failed to fetch StrategyInstance {stratin_id}. " \
f"Status Code: {response.status_code}, Response: {response.text}"
print(err_msg)
return -1, err_msg
except requests.RequestException as e:
err_msg = f"Request failed: {str(e)}"
print(err_msg)
return -2, err_msg
#return list of strat_ids that are in the scheduled table more than once
#TODO this is a workaround until the candidate logic now in the select is moved to fetch_all_run_manager_records plus Python-side logic
def stratin_occurences():
#get all records
res, all_records = rm.fetch_all_run_manager_records()
if res < 0:
err_msg= f"Error {res} fetching all runmanager records, error {all_records}"
print(err_msg)
return -2, err_msg
# Count occurrences
strat_id_counts = {}
for record in all_records:
if record.strat_id in strat_id_counts:
strat_id_counts[record.strat_id] += 1
else:
strat_id_counts[record.strat_id] = 1
# Find strat_id values that appear twice or more
repeated_strat_ids = [strat_id for strat_id, count in strat_id_counts.items() if count >= 2]
return 0, repeated_strat_ids
# in case debug_date is not provided, it takes current time of the given market
#In the future this will loop over every supported market; currently US only
def startstop_scheduled(debug_date = None, market = "US") -> tuple[int, str]:
res, sada = get_todays_market_times(market=market, debug_date=debug_date)
if res == 0:
market_time_now, market_open_datetime, market_close_datetime = sada
print(f"OPEN:{market_open_datetime} CLOSE:{market_close_datetime}")
else:
return res, sada
#it's a market day
res, candidates = rm.fetch_scheduled_candidates_for_start_and_stop(market_time_now, market)
if res == 0:
print(f"Candidates fetched, start: {len(candidates['start'])} stop: {len(candidates['stop'])}")
else:
return res, candidates
if candidates is None or (len(candidates["start"]) == 0 and len(candidates["stop"]) == 0):
return -1, f"No candidates found for {market_time_now} and {market}"
#in the future, once runners are persisted, each strategy's state will live in RunManagerRecord
#get current runners (possible optimization: fetch separately for the start and stop sections)
res, sada = get_running_strategies()
if res < 0:
err_msg= f"Error fetching running strategies, error {sada}"
print(err_msg)
send_to_telegram(err_msg)
return -2, err_msg
strat_ids_running, runnerids_running = sada
print(f"Currently running: {len(strat_ids_running)}")
#ITERATE over START candidates
record: RunManagerRecord = None
print(f"START - Looping over {len(candidates['start'])} candidates")
for record in candidates['start']:
print("Candidate: ", record)
if record.weekdays_filter is not None and len(record.weekdays_filter) > 0:
curr_weekday = market_time_now.weekday()
if curr_weekday not in record.weekdays_filter:
print(f"Strategy {record.strat_id} not started, today{curr_weekday} not in weekdays filter {record.weekdays_filter}")
continue
#one strat_id can run only once at a time
if record.strat_id in strat_ids_running:
msg = f"strategy {record.strat_id} is already running"
continue
res, result = run_scheduled_strategy(record)
if res < 0:
send_to_telegram(result)
print(result)
else:
record.runner_id = UUID(result)
strat_ids_running.append(record.strat_id)
runnerids_running.append(record.runner_id)
record.last_processed = market_time_now
history_string = f"{market_time_now.isoformat()} strategy STARTED" if res == 0 else "ERROR:" + result
if record.history is None:
record.history = history_string
else:
record.history += "\n" + history_string
#update the record (probably still to adjust - last_run and history)
res, set = rm.update_run_manager_record(record.id, record)
if res == 0:
print(f"Record in db updated {set}")
#return 0, set
else:
err_msg= f"Error updating {record.id} errir {set} with values {record}. Process stopped."
print(err_msg)
send_to_telegram(msg)
return -2, err_msg #toto stopne dalsi zpracovani, zvazit continue
#if stop candidates, then fetch existing runners
stop_candidates_cnt = len(candidates['stop'])
if stop_candidates_cnt > 0:
res, repeated_strat_ids = stratin_occurences()
if res < 0:
err_msg= f"Error {res} in callin stratin_occurences, error {repeated_strat_ids}"
send_to_telegram(err_msg)
return -2, err_msg
#another OPEN ISSUE for STOP:
# should a strategy's STOP_TIME depend on the day of the week? In other words, if a strategy
# is set to 9:30-10 on Monday, can I start it manually on Tuesday without the system taking it down?
# For now it is built so that the schedule defines a window in which the strategy is allowed to run,
# and outside that window it is automatically stopped. The other option is that the scheduler strictly
# watches only the strategies it started itself and ignores the rest. In that case a strategy started
# manually later (e.g. to fix a bug) would be ignored by the scheduler and not shut down even when its
# stop time is set.
# Impacts: weekdays on stop and stratin_occurences
#ITERATE over STOP candidates
record: RunManagerRecord = None
print(f"STOP - Looping over {stop_candidates_cnt} candidates")
for record in candidates['stop']:
print("Candidate: ", record)
#This contraption with stratin_occurences is here only so the scheduler also works for manually started strategies (in most cases)
# When evaluating stop candidates:
# - if there is only one strategy with a given strat_id in the schedules, we can go by strat_id - the running strategy with that strat_id will be stopped (even a manually started one)
# - if there are more of them, we must go by the runner ids stored in the schedules
# (the limitation in this case: a manually started strategy will not be automatically
# stopped - the system does not know which one it is)
#check whether the strategy is running
#the strategy appears in the scheduler only once, so we can use strat_id
if record.strat_id not in repeated_strat_ids:
if record.strat_id not in strat_ids_running:
msg = f"strategy {record.strat_id} NOT RUNNING"
print(msg)
continue
else:
#stop it
id_to_stop = record.strat_id
#strat_id is used in the scheduler more than once, we must use runner_id
elif record.runner_id is not None and record.runner_id in runnerids_running:
#stop it
id_to_stop = record.runner_id
#no distinctive condition
else:
#don't do anything
print(f"strategy {record.strat_id} not RUNNING or not distinctive (manually launched or two strat_ids in scheduler)")
continue
print(f"Requesting STOP {id_to_stop}")
res, msg = stop_strategy(id_to_stop)
if res < 0:
msg = f"ERROR while STOPPING runner_id/strat_id {id_to_stop} {msg}"
send_to_telegram(msg)
else:
if record.strat_id in strat_ids_running:
strat_ids_running.remove(record.strat_id)
if record.runner_id is not None and record.runner_id in runnerids_running:
runnerids_running.remove(record.runner_id)
record.runner_id = None
record.last_processed = market_time_now
history_string = f"{market_time_now.isoformat()} strategy {record.strat_id}" + "STOPPED" if res == 0 else "ERROR:" + msg
if record.history is None:
record.history = history_string
else:
record.history += "\n" + history_string
#update the record (probably still to adjust - last_run and history)
res, set = rm.update_run_manager_record(record.id, record)
if res == 0:
print(f"Record updated {set}")
else:
err_msg= f"Error updating {record.id} errir {set} with values {record}"
print(err_msg)
send_to_telegram(err_msg)
return -2, err_msg#toto stopne zpracovani dalsich zaznamu pri chybe, zvazit continue
return 0, "DONE"
##LIVE or PAPER
#this version uses the REST API; after moving the jobs to apscheduler it could call cs.run_stratin directly
#TODO rework
def run_scheduled_strategy(record: RunManagerRecord):
#get strat_json
sada : StrategyInstance = None
res, sada = fetch_stratin(record.strat_id)
if res == 0:
# #TODO verify that this produces the same output as the JS
# print("Sada", sada)
# #strategy_instance = StrategyInstance(**sada)
strat_json = json.dumps(sada, default=json_serial)
# Replace escaped characters with their unescaped versions so it matches the JS output
#strat_json = strat_json.replace('\\r\\n', '\r\n')
#print(f"Strat_json fetched, {strat_json}")
else:
err_msg= f"Strategy {record.strat_id} not found. ERROR {sada}"
print(err_msg)
return -2, err_msg
#TBD maybe customize NOTE
#if batch_id is missing, generate one and store it in the db
# if record.batch_id is None:
# record.batch_id = str(uuid4())[:8]
api_url = f"http://localhost:8000/stratins/{record.strat_id}/run"
# Initialize RunRequest with record values
runReq = {
"id": str(record.strat_id),
"strat_json": strat_json,
"mode": record.mode,
"account": record.account,
"ilog_save": record.ilog_save,
"weekdays_filter": record.weekdays_filter,
"test_batch_id": record.testlist_id,
"batch_id": record.batch_id or str(uuid4())[:8],
"bt_from": record.bt_from.isoformat() if record.bt_from else None,
"bt_to": record.bt_to.isoformat() if record.bt_to else None,
"note": f"SCHED {record.start_time}-" + record.stop_time if record.stop_time else "" + record.note if record.note is not None else ""
}
# Headers for the request
headers = {
"X-API-Key": WEB_API_KEY
}
try:
# Make the PUT request to the API with the headers
response = requests.put(api_url, json=runReq, headers=headers)
# Check if the request was successful
if response.status_code == 200:
print(f"Strategy {record.strat_id} started successfully.")
return 0, response.json()
else:
err_msg = f"Strategy {record.strat_id} NOT started. Status Code: {response.status_code}, Response: {response.text}"
print(err_msg)
return -2, err_msg
except requests.RequestException as e:
err_msg = f"Request failed: {str(e)}"
print(err_msg)
return -2, err_msg
# #initialize RunRequest with record values
# runReq = RunRequest(id=record.strat_id,
# strat_json=strat_json,
# mode=record.mode,
# account=record.account,
# ilog_save=record.ilog_save,
# weekdays_filter=record.weekdays_filter,
# test_batch_id=record.testlist_id,
# batch_id=record.batch_id,
# bt_from=record.bt_from,
# bt_to=record.bt_to,
# note=record.note)
# #call rest API to start strategy
# #start strategy
# res, sada = cs.run_stratin(id=record.strat_id, runReq=runReq, inter_batch_params=None)
# if res == 0:
# print(f"Strategy {sada} started")
# return 0, sada
# else:
# err_msg= f"Strategy {record.strat_id} NOT started. ERROR {sada}"
# print(err_msg)
# return -2, err_msg
if __name__ == "__main__":
#use naive datetime
debug_date = None
debug_date = datetime(2024, 2, 16, 16, 37, 0, 0)
#debug_date = datetime(2024, 2, 16, 10, 30, 0, 0)
#debug_date = datetime(2024, 2, 16, 16, 1, 0, 0)
if debug_date is not None:
# Localize the naive datetime object to the Eastern timezone
debug_date = zoneNY.localize(debug_date)
#debugdate formatted as string in format "23.12.2024 9:30"
formatted_date = debug_date.strftime("%d.%m.%Y %H:%M")
print("Scheduler.py NY time: ", formatted_date)
print("ISoformat", debug_date.isoformat())
res, msg = startstop_scheduled(debug_date=debug_date, market="US")
print(f"CALL FINISHED, with {debug_date} RESULT: {res}, {msg}")


@@ -26,7 +26,7 @@
<!-- <script src="https://code.jquery.com/jquery-3.6.4.js" integrity="sha256-a9jBBRygX1Bh5lt8GZjXDzyOB+bWve9EiO7tROUtj/E=" crossorigin="anonymous"></script> -->
<script src="/static/js/libs/jquery-3.6.4.js" integrity="sha256-a9jBBRygX1Bh5lt8GZjXDzyOB+bWve9EiO7tROUtj/E=" crossorigin="anonymous"></script>
<script src="/static/js/libs/jquery-3.6.4.js"></script>
<!-- <script src="https://cdn.datatables.net/1.13.4/js/jquery.dataTables.min.js"></script> -->
<script src="/static/js/libs/jquery.dataTables.min.js"></script>
@@ -57,7 +57,7 @@
<!-- <script src="https://code.jquery.com/jquery-3.5.1.js"></script> -->
<link rel="stylesheet" href="/static/main.css?v=1.06">
<link rel="stylesheet" href="/static/main.css?v=1.07">
<!-- <script src="https://cdnjs.cloudflare.com/ajax/libs/mousetrap/1.4.6/mousetrap.min.js"></script> -->
<script src="/static/js/libs/mousetrap.min.js"></script>
@@ -131,9 +131,29 @@
<!-- <script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.41.0/min/vs/editor/editor.main.js"></script> -->
<!-- <script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/monaco-editor/0.41.0/min/vs/loader.min.js"></script> -->
<script src="/static/js/systeminfo.js"> </script>
</head>
<body>
<div id="main" class="mainConteiner flex-container content">
<div id="system-info" class="flex-items">
<label data-bs-toggle="collapse" data-bs-target="#system-info-inner" aria-expanded="true">
<h4>System Info </h4>
</label>
<div id="system-info-inner" class="collapse">
<div id="system-info-output"></div>
<div id="graphical-output">
<div id="disk-gauge-container">
<span id="title"> Disk Space: </span>
<span id="free-space">Free: -- GB</span> |
<span id="total-space">Total: -- GB</span> |
<span id="used-percent">Used: -- %</span>
<div id="disk-gauge">
<div id="disk-gauge-bar"></div>
</div>
</div>
</div>
</div>
</div>
<div id="chartContainer" class="flex-items">
<label data-bs-toggle="collapse" data-bs-target="#chartContainerInner" aria-expanded="true">
<h4>Chart</h4>
@@ -230,6 +250,7 @@
<!-- <table id="trades-data-table" class="dataTable no-footer" style="width: 300px;display: contents;"></table> -->
</div>
</div>
<div id="runner-table" class="flex-items">
<label data-bs-toggle="collapse" data-bs-target="#runner-table-inner">
<h4>Running Strategies</h4>
@@ -298,6 +319,252 @@
</div>
</div>
</div>
</div>
<!-- SCHEDULER -->
<div id="runmanager-table" class="flex-items">
<label data-bs-toggle="collapse" data-bs-target="#runmanager-table-inner">
<h4>Run Manager</h4>
</label>
<div id="runmanager-table-inner" class="collapse show collapsible-section" style="width:58%">
<div id="controls">
<button title="Create new" id="button_add_sched" class="btn btn-outline-success btn-sm">Add</button>
<button title="Edit selected" id="button_edit_sched" class="btn btn-outline-success btn-sm">Edit</button>
<button title="Delete selected" id="button_delete_sched" class="btn btn-outline-success btn-sm">Delete</button>
<button title="History" id="button_history_sched" class="btn btn-outline-success btn-sm">History</button>
<button title="Refresh" id="button_refresh_sched" class="btn btn-outline-success btn-sm">Refresh</button>
<div class="btn-group btn-group-toggle" data-toggle="buttons">
<!-- <input type="radio" class="btn-check" name="filterOptions" id="filterNone" autocomplete="off" checked>
<label class="btn btn-outline-primary" for="filterNone">All</label> -->
<input type="radio" class="btn-check" name="filterOptions" id="filterSchedule" autocomplete="off" checked>
<label class="btn btn-outline-primary" for="filterSchedule">Scheduled</label>
<input type="radio" class="btn-check" name="filterOptions" id="filterQueue" autocomplete="off">
<label class="btn btn-outline-primary" for="filterQueue">Queued</label>
</div>
</div>
<table id="runmanagerTable" class="table-striped table dataTable" style="width:100%; border-color: #dce1dc;">
<thead>
<tr>
<th>Id</th>
<th>Type</th>
<th>Strat_Id</th>
<th>Symbol</th>
<th>Account</th>
<th>Mode</th>
<th>Note</th>
<th>Log</th>
<th>BT_from</th>
<th>BT_to</th>
<th>days</th>
<th>batch_id</th>
<th>start</th>
<th>stop</th>
<th>status</th>
<th>last_processed</th>
<th>history</th>
<th>valid_from</th>
<th>valid_to</th>
<th>testlist_id</th>
<th>Running</th>
<th>RunnerId</th>
<th>Market</th>
</tr>
</thead>
<tbody></tbody>
</table>
</div>
<div id="delModalRunmanager" class="modal fade">
<div class="modal-dialog">
<form method="post" id="delFormRunmanager">
<div class="modal-content">
<div class="modal-header">
<h4 class="modal-title"><i class="fa fa-plus"></i> Delete record</h4>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<div class="modal-body">
<div class="form-group">
<label for="delidrunmanager" class="form-label">Id</label>
<!-- <div id="listofids"></div> -->
<input type="text" class="form-control" id="delidrunmanager" name="id" placeholder="id" readonly>
</div>
</div>
<div class="modal-footer">
<input type="submit" name="delete" id="deleterunmanager" class="btn btn-primary" value="Delete" />
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
</div>
</div>
</form>
</div>
</div>
<div id="addeditModalRunmanager" class="modal fade">
<div class="modal-dialog">
<form method="post" id="addeditFormRunmanager">
<div class="modal-content">
<div class="modal-header">
<h4 class="modal-title_run"><i class="fa fa-plus"></i> Add scheduler record</h4>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<div class="modal-body">
<div class="form-group">
<label for="runmanid" class="form-label">Record Id</label>
<input type="text" class="form-control" id="runmanid" name="id" placeholder="auto generated id" readonly>
</div>
<div class="form-group">
<label for="runmanmoddus" class="form-label">Type</label>
<input type="text" class="form-control" id="runmanmoddus" name="moddus" readonly>
</div>
<div class="form-group">
<label for="runmanstrat_id" class="form-label">StrategyId</label>
<input type="text" class="form-control" id="runmanstrat_id" name="strat_id" placeholder="strategy id">
</div>
<div class="form-group">
<label for="runmode" class="form-label">Mode</label>
<select class="form-control" id="runmanmode" name="mode"><option value="paper">paper</option><option value="live">live</option><option value="backtest">backtest</option><option value="prep">prep</option></select>
</div>
<div class="form-group">
<label for="account" class="form-label">Account</label>
<select class="form-control" id="runmanaccount" name="account"><option value="ACCOUNT1">ACCOUNT1</option><option value="ACCOUNT2">ACCOUNT2</option></select>
</div>
<div class="form-group">
<label for="status" class="form-label">Status</label>
<select class="form-control" id="runmanstatus" name="status"><option value="active">active</option><option value="suspended">suspended</option></select>
</div>
<div class="form-group" id="runmanstart_time_div">
<label for="start" class="form-label">Start Time</label>
<input type="text" class="form-control" id="runmanstart_time" name="start_time" value="9:30" step="1">
</div>
<div class="form-group" id="runmanstop_time_div">
<label for="stop" class="form-label">Stop Time</label>
<input type="text-local" class="form-control" id="runmanstop_time" name="stop_time" value="16:00" step="1">
</div>
<!-- for future backtest queueing -->
<div class="form-group" id="runmanbt_from_div">
<label for="bt_from" class="form-label">bt_from</label>
<input type="datetime-local" class="form-control" id="runmanbt_from" name="bt_from" placeholder="2023-04-06T09:00:00Z" step="1">
</div>
<div class="form-group" id="runmanbt_to_div">
<label for="bt_to" class="form-label">bt_to</label>
<input type="datetime-local" class="form-control" id="runmanbt_to" name="bt_to" placeholder="2023-04-06T09:00:00Z" step="1">
</div>
<div class="form-group" id="runmantestlist_id_div">
<label for="test_batch_id" class="form-label">Test List ID</label>
<input type="text" class="form-control" id="runmantestlist_id" name="testlist_id" placeholder="test intervals ID">
</div>
<!-- for future backtest queueing -->
<!-- Initial Checkbox for Enabling Weekday Selection -->
<div class="form-group">
<div style="display:inline-flex">
<label for="runman_enable_weekdays" class="form-label">Limit to Weekdays</label>
<input type="checkbox" class="form-check" id="runman_enable_weekdays" name="enable_weekdays" aria-label="Enable Weekday Selection">
</div>
</div>
<!-- Weekday Checkboxes -->
<div class="form-group weekday-checkboxes" style="display:none;">
<!-- <label class="form-label">Select Weekdays:</label> -->
<div>
<input type="checkbox" id="monday" name="weekdays" value="monday">
<label for="monday">Monday</label>
</div>
<div>
<input type="checkbox" id="tuesday" name="weekdays" value="tuesday">
<label for="tuesday">Tuesday</label>
</div>
<div>
<input type="checkbox" id="wednesday" name="weekdays" value="wednesday">
<label for="wednesday">Wednesday</label>
</div>
<div>
<input type="checkbox" id="thursday" name="weekdays" value="thursday">
<label for="thursday">Thursday</label>
</div>
<div>
<input type="checkbox" id="friday" name="weekdays" value="friday">
<label for="friday">Friday</label>
</div>
</div>
<div class="form-group" id="runmanvalid_from_div">
<label for="runmanvalid_from" class="form-label">Valid from</label>
<input type="datetime-local" class="form-control" id="runmanvalid_from" name="valid_from" placeholder="2023-04-06T09:00:00Z" step="1">
</div>
<div class="form-group" id="runmanvalid_to_div">
<label for="runmanvalid_to" class="form-label">Valid to</label>
<input type="datetime-local" class="form-control" id="runmanvalid_to" name="valid_to" placeholder="2023-04-06T09:00:00Z" step="1">
</div>
<div class="form-group">
<label for="batch_id" class="form-label">Batch ID</label>
<input type="text" class="form-control" id="runmanbatch_id" name="batch_id" placeholder="batch id">
</div>
<div class="form-group">
<div style="display:inline-flex">
<label for="ilog_save" class="form-label">Enable logs</label>
<input type="checkbox" class="form-check" id="runmanilog_save" name="ilog_save" aria-label="Enable logs">
</div>
</div>
<div class="form-group">
<label for="note" class="form-label">note</label>
<textarea class="form-control" rows="1" id="runmannote" name="note"></textarea>
</div>
</div>
<div class="modal-footer">
<input type="hidden" name="runner_id" id="runmanrunner_id" />
<input type="hidden" name="history" id="runmanhistory" />
<input type="hidden" name="last_processed" id="runmanlast_processed" />
<!--<input type="hidden" name="action" id="action" value="" />-->
<input type="submit" id="runmanagersubmit" class="btn btn-primary" value="Add" />
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
</div>
</div>
</form>
</div>
</div>
<div id="historyModalRunmanager" class="modal fade">
<div class="modal-dialog">
<form method="post" id="historyModalRunmanagerForm">
<div class="modal-content">
<div class="modal-header">
<h4 class="modal-title"><i class="fa fa-plus"></i>View History</h4>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<div class="modal-body">
<div class="form-group">
<label for="RunmanId" class="form-label">Id</label>
<input type="text" class="form-control" id="RunmanId" name="id" placeholder="id" readonly>
</div>
<div class="form-group">
<label for="Runmanlast_processed" class="form-label">Last processed</label>
<input type="text" class="form-control" id="Runmanlast_processed" name="last_processed" readonly>
</div>
<div class="form-group">
<label for="Runmanhistory" class="form-label">History</label>
<textarea class="form-control" rows="8" id="Runmanhistory" name="history" readonly></textarea>
</div>
<!-- <div class="form-group">
<label for="metrics" class="form-label">Metrics</label>
<textarea class="form-control" rows="8" id="metrics" name="metrics"></textarea>
</div>
<div class="form-group">
<label for="stratvars" class="form-label">Stratvars</label>
<textarea class="form-control" rows="8" id="editstratvars" name="stratvars"></textarea>
</div>
<div class="form-group">
<label for="strat_json" class="form-label">Strat JSON</label>
<textarea class="form-control" rows="6" id="editstratjson" name="stratjson"></textarea>
</div> -->
</div>
<div class="modal-footer">
<!-- <input type="submit" name="delete" id="editarchive" class="btn btn-primary" value="Edit" /> -->
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
</div>
</div>
</form>
</div>
</div>
</div>
<div id="archive-table" class="flex-items">
<label data-bs-toggle="collapse" data-bs-target="#archive-table-inner">
@@ -316,6 +583,7 @@
<button id="button_refresh" class="refresh btn btn-outline-success btn-sm">Refresh</button>
<button title="Compare selected days" id="button_compare_arch" class="refresh btn btn-outline-success btn-sm">Compare</button>
<button title="Run selected day" id="button_runagain_arch" class="refresh btn btn-outline-success btn-sm">Run Again(r)</button>
<button title="Runs LIVE/PAPER in BT mode with same dates" id="button_runbt_arch" class="refresh btn btn-outline-success btn-sm">Backtest same period</button>
<button title="Select all days on the page" id="button_selpage" class="btn btn-outline-success btn-sm">Select all</button>
<button title="Export selected days to XML" id="button_export_xml" class="btn btn-outline-success btn-sm">Export xml</button>
<button title="Export selected days to CSV" id="button_export_csv" class="btn btn-outline-success btn-sm">Export csv</button>
@@ -351,6 +619,8 @@
<th>avgp</th>
<th>metrics</th>
<th>batchid</th>
<th>batchprofit</th>
<th>batchcount</th>
</tr>
</thead>
<tbody></tbody>
@@ -403,25 +673,32 @@
</div>
<div id="logModal" class="modal fade" style="--bs-modal-width: 825px;">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h4 class="modal-title"><i class="fa fa-plus"></i>Log</h4>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
<div class="modal-content">
<div class="modal-header">
<h4 class="modal-title"><i class="fa fa-plus"></i>Log</h4>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<div class="modal-body">
<div class="form-group">
<label for="logFileSelect" class="form-label">Select Log File</label>
<select class="form-select" id="logFileSelect" aria-label="Log file select">
<!-- <option selected>Select a log file</option> -->
<option value="strat.log" selected>strat.log</option>
<option value="job.log">job.log</option>
</select>
</div>
<div class="modal-body">
<div class="form-group">
<label for="logHere" class="form-label">Log</label>
<div id="log-container">
<pre id="log-content"></pre>
</div>
<!-- <input type="text" class="form-control" id="delidarchive" name="delidarchive" placeholder="id"> -->
<div class="form-group mt-3">
<label for="logHere" class="form-label">Log</label>
<div id="log-container"style="height:700px;border:1px solid black;">
<!-- <pre id="log-content"></pre> -->
</div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-primary" id="logRefreshButton" value="Refresh">Refresh</button>
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
</div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-primary" id="logRefreshButton" value="Refresh">Refresh</button>
<button type="button" class="btn btn-secondary" id="closeLogModal" data-bs-dismiss="modal">Close</button>
</div>
</div>
</div>
</div>
<div id="editModalArchive" class="modal fade">
@@ -449,6 +726,10 @@
<label for="stratvars" class="form-label">Stratvars</label>
<textarea class="form-control" rows="8" id="editstratvars" name="stratvars"></textarea>
</div>
<div class="form-group">
<label for="stratvars" class="form-label">Transferables</label>
<textarea class="form-control" rows="8" id="edittransferables" name="stratvars"></textarea>
</div>
<div class="form-group">
<label for="strat_json" class="form-label">Strat JSON</label>
<textarea class="form-control" rows="6" id="editstratjson" name="stratjson"></textarea>
@@ -887,39 +1168,43 @@
<BR>
</div>
</div>
<script src="/static/js/config.js?v=1.02"></script>
<script src="/static/js/config.js?v=1.04"></script>
<!-- temporary local copies of the libraries start here -->
<!-- <script type="text/javascript" src="https://unpkg.com/lightweight-charts/dist/lightweight-charts.standalone.production.js"></script> -->
<script type="text/javascript" src="/static/js/libs/lightweightcharts/lightweight-charts.standalone.production410.js"></script>
<script src="/static/js/dynamicbuttons.js?v=1.03"></script>
<script type="text/javascript" src="/static/js/libs/lightweightcharts/lightweight-charts.standalone.production413.js"></script>
<script src="/static/js/dynamicbuttons.js?v=1.05"></script>
<!-- <script src="/static/js/utils.js?v=1.01"></script> -->
<!-- new util structure and exports and colors -->
<script src="/static/js/utils/utils.js?v=1.02"></script>
<script src="/static/js/utils/exports.js?v=1.02"></script>
<script src="/static/js/utils/colors.js?v=1.02"></script>
<script src="/static/js/utils/utils.js?v=1.06"></script>
<script src="/static/js/utils/exports.js?v=1.04"></script>
<script src="/static/js/utils/colors.js?v=1.04"></script>
<script src="/static/js/instantindicators.js?v=1.01"></script>
<script src="/static/js/archivechart.js?v=1.03"></script>
<script src="/static/js/instantindicators.js?v=1.04"></script>
<script src="/static/js/archivechart.js?v=1.05"></script>
<!-- <script src="/static/js/archivetables.js?v=1.05"></script> -->
<!-- archiveTables split into separate files -->
<script src="/static/js/tables/archivetable/init.js?v=1.07"></script>
<script src="/static/js/tables/archivetable/functions.js?v=1.06"></script>
<script src="/static/js/tables/archivetable/modals.js?v=1.05"></script>
<script src="/static/js/tables/archivetable/handlers.js?v=1.05"></script>
<script src="/static/js/tables/archivetable/init.js?v=1.12"></script>
<script src="/static/js/tables/archivetable/functions.js?v=1.11"></script>
<script src="/static/js/tables/archivetable/modals.js?v=1.07"></script>
<script src="/static/js/tables/archivetable/handlers.js?v=1.11"></script>
<!-- Runmanager functionality -->
<script src="/static/js/tables/runmanager/init.js?v=1.1"></script>
<script src="/static/js/tables/runmanager/functions.js?v=1.08"></script>
<script src="/static/js/tables/runmanager/modals.js?v=1.07"></script>
<script src="/static/js/tables/runmanager/handlers.js?v=1.07"></script>
<script src="/static/js/livewebsocket.js?v=1.01"></script>
<script src="/static/js/realtimechart.js?v=1.01"></script>
<script src="/static/js/mytables.js?v=1.01"></script>
<script src="/static/js/livewebsocket.js?v=1.02"></script>
<script src="/static/js/realtimechart.js?v=1.02"></script>
<script src="/static/js/mytables.js?v=1.03"></script>
<script src="/static/js/testlist.js?v=1.01"></script>
<script src="/static/js/ml.js?v=1.02"></script>
<script src="/static/js/common.js?v=1.01"></script>
<script src="/static/js/configform.js?v=1.01"></script>
<!-- <script src="/static/js/scheduler.js?v=1.01"></script> -->
</body>
</html>


@ -638,7 +638,7 @@ $(document).ready(function () {
else{
$('#editstratvars').val(JSON.stringify(row.stratvars,null,2));
}
$('#edittransferables').val(JSON.stringify(row.transferables,null,2));
$('#editstratjson').val(row.strat_json);
}

File diff suppressed because one or more lines are too long


@ -90,9 +90,55 @@ $(document).ready(function () {
monaco.languages.register({ id: 'python' });
monaco.languages.register({ id: 'json' });
//Register mylogs language
monaco.languages.register({ id: 'mylogs' });
// Configure the mylogs language
monaco.languages.setLanguageConfiguration('mylogs', {
comments: {
lineComment: '//', // Adjust if your logs use a different comment symbol
},
brackets: [['[', ']'], ['{', '}']], // Array and object brackets
autoClosingPairs: [
{ open: '{', close: '}', notIn: ['string'] },
{ open: '"', close: '"', notIn: ['string', 'comment'] },
{ open: "'", close: "'", notIn: ['string', 'comment'] },
],
});
monaco.languages.setMonarchTokensProvider('mylogs', {
tokenizer: {
root: [
[/#.*/, 'comment'], // Comments (if applicable)
// Timestamps
[/\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+/, 'timestamp'],
// Log Levels
[/\b(INFO|DEBUG|WARNING|ERROR|CRITICAL)\b/, 'log-level'],
// Strings
[/".*"/, 'string'],
[/'.*'/, 'string'],
// Key-Value Pairs
[/[A-Za-z_]+\s*:/, 'key'],
[/-?\d+\.\d+/, 'number.float'], // Floating-point
[/-?\d+/, 'number.integer'], // Integers
[/\btrue\b/, 'boolean.true'],
[/\bfalse\b/, 'boolean.false'],
// Other Words and Symbols
[/[A-Za-z_]+/, 'identifier'],
[/[ \t\r\n]+/, 'white'],
[/[\[\]{}(),]/, 'delimiter'], // Expand if more delimiters exist
]
}
});
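// Note: these custom token names (timestamp, log-level, key, ...) only get colors once the
// active theme defines matching rules; the tomlTheme-dark theme used by the log viewer below
// is assumed to include them.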
monaco.languages.register({ id: 'toml' });
// Define the TOML language configuration
monaco.languages.setLanguageConfiguration('toml', {
comments: {
@ -621,7 +667,6 @@ $(document).ready(function () {
})
});
//button run
$('#button_run').click(function () {
row = stratinRecords.row('.selected').data();
@ -953,7 +998,18 @@ var runnerRecords =
render: function ( data, type, row ) {
return format_date(data)
},
},
{
targets: [4], //symbol
render: function ( data, type, row ) {
if (type === 'display') {
//console.log("arch")
var color = getColorForId(row.strat_id);
return '<span style="color:' + color + ';">'+data+'</span>';
}
return data;
},
},
],
// select: {
// style: 'multi'


@ -0,0 +1,31 @@
function get_system_info() {
console.log('Requesting system info')
$.ajax({
url: '/system-info',
type: 'GET',
beforeSend: function (xhr) {
xhr.setRequestHeader('X-API-Key',
API_KEY); },
success: function(response) {
$.each(response, function(index, item) {
if (index=="disk_space") {
$('#disk-gauge-bar').css('width', response.disk_space.used_percentage + '%');
$('#free-space').text('Free: ' + response.disk_space.free + ' GB');
$('#total-space').text('Total: ' + response.disk_space.total + ' GB');
$('#used-percent').text('Used: ' + response.disk_space.used_percentage + '%');
} else {
var formatted_item = JSON.stringify(item, null, 4)
$('#system-info-output').append('<p>' + index + ': ' + formatted_item + '</p>');
}
});
},
error: function(xhr, status, error) {
$('#disk-gauge-bar').html('An error occurred: ' + error + xhr.responseText + status);
}
});
}
$(document).ready(function(){
get_system_info()
});
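
For orientation, the /system-info backend that feeds this call can be a thin psutil wrapper. A minimal sketch, assuming FastAPI and the field names consumed above (GB units to match the labels; extra top-level keys are rendered generically into #system-info-output):

import psutil
from fastapi import FastAPI

app = FastAPI()

@app.get("/system-info")
def system_info():
    disk = psutil.disk_usage("/")
    return {
        "disk_space": {
            "total": round(disk.total / 2**30, 1),   # GB, feeds the #total-space label
            "free": round(disk.free / 2**30, 1),     # GB, feeds the #free-space label
            "used_percentage": disk.percent,         # drives the gauge bar width
        },
        # optional extra sections, rendered generically by the frontend loop
        "memory": psutil.virtual_memory()._asdict(),
    }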


@ -6,6 +6,7 @@ let editor_diff_arch1
let editor_diff_arch2
var archData = null
var batchHeaders = []
var editorLog = null
function refresh_arch_and_callback(row, callback) {
//console.log("entering refresh")
@ -78,10 +79,11 @@ function get_detail_and_chart(row) {
})
}
//rerun stratin
function run_day_again() {
//rerun stratin (use to rerun strategy and also to rerun live/paper as bt on same period)
function run_day_again(turnintobt=false) {
row = archiveRecords.row('.selected').data();
$('#button_runagain_arch').attr('disabled',true);
var button_name = turnintobt ? '#button_runbt_arch' : '#button_runagain_arch'
$(button_name).attr('disabled',true)
var record1 = new Object()
//console.log(JSON.stringify(rows))
@ -142,7 +144,7 @@ function run_day_again() {
//console.log("Result from second request:", result2);
//console.log("calling compare")
rerun_strategy(result1, result2)
rerun_strategy(result1, result2, turnintobt)
// Perform your action with the results from both requests
// Example:
@ -154,13 +156,22 @@ function run_day_again() {
});
function rerun_strategy(archRunner, stratData) {
function rerun_strategy(archRunner, stratData, turnintobt) {
record1 = archRunner
//console.log(record1)
var note_prefix = "RERUN "
if ((turnintobt) && ((record1.mode == 'live') || (record1.mode == 'paper'))) {
record1.mode = 'backtest'
record1.bt_from = record1.started
record1.bt_to = record1.stopped
note_prefix = "BT SAME PERIOD "
}
record1.note = note_prefix + record1.note
//so that we won't have to strip attributes again every time a new one is added in the future
//delete what is not needed and add what is needed
//in the future, rework this to build a new object instead
//so that we won't have to strip attributes again every time a new one is added in the future
delete record1["end_positions"];
delete record1["end_positions_avgp"];
delete record1["profit"];
@ -172,8 +183,6 @@ function run_day_again() {
delete record1["settings"];
delete record1["stratvars"];
record1.note = "RERUN " + record1.note
if (record1.bt_from == "") {delete record1["bt_from"];}
if (record1.bt_to == "") {delete record1["bt_to"];}
@ -212,7 +221,7 @@ function run_day_again() {
contentType: "application/json",
data: jsonString,
success:function(data){
$('#button_runagain_arch').attr('disabled',false);
$(button_name).attr('disabled',false);
setTimeout(function () {
runnerRecords.ajax.reload();
stratinRecords.ajax.reload();
@ -222,7 +231,7 @@ function run_day_again() {
var err = eval("(" + xhr.responseText + ")");
window.alert(JSON.stringify(xhr));
//console.log(JSON.stringify(xhr));
$('#button_runagain_arch').attr('disabled',false);
$(button_name).attr('disabled',false);
}
})
}
@ -453,8 +462,10 @@ function display_batch_report(batch_id) {
}
function refresh_logfile() {
logfile = $("#logFileSelect").val()
lines = 1200
$.ajax({
url:"/log?lines=30",
url:"/log?lines="+lines+"&logfile="+logfile,
beforeSend: function (xhr) {
xhr.setRequestHeader('X-API-Key',
API_KEY); },
@ -462,12 +473,34 @@ function refresh_logfile() {
contentType: "application/json",
dataType: "json",
success:function(response){
if (editorLog) {
editorLog.dispose();
}
if (response.lines.length == 0) {
$('#log-content').html("no records");
value = "no records";
// $('#log-content').html("no records");
}
else {
$('#log-content').html(response.lines.join('\n'));
//console.log(response.lines)
//var escapedLines = response.lines.map(line => escapeHtml(line));
value = response.lines.join('\n')
// $('#log-content').html(escapedLines.join('\n'));
}
require(["vs/editor/editor.main"], () => {
editorLog = monaco.editor.create(document.getElementById('log-container'), {
value: value,
language: 'mylogs',
theme: 'tomlTheme-dark',
automaticLayout: true,
readOnly: true
});
});
// Focus at the end of the file:
const model = editorLog.getModel();
const lastLineNumber = model.getLineCount();
const lastLineColumn = model.getLineMaxColumn(lastLineNumber);
editorLog.setPosition({ lineNumber: lastLineNumber, column: lastLineColumn });
editorLog.revealPosition({ lineNumber: lastLineNumber, column: lastLineColumn });
},
error: function(xhr, status, error) {
var err = eval("(" + xhr.responseText + ")");
@ -476,6 +509,14 @@ function refresh_logfile() {
})
}
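
The backend route for this call is not part of this diff; a minimal sketch of what it might look like, assuming FastAPI and the query parameters used above (the {"lines": [...]} shape is what the success handler expects):

from collections import deque
from fastapi import FastAPI

app = FastAPI()

@app.get("/log")
def tail_log(lines: int = 30, logfile: str = "strat.log"):
    # stream the file once, keeping only the last `lines` rows in memory
    with open(logfile, "r", errors="replace") as f:
        return {"lines": [row.rstrip("\n") for row in deque(f, maxlen=lines)]}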
function escapeHtml(text) {
return text
.replace(/&/g, "&amp;")
.replace(/</g, "&lt;")
.replace(/>/g, "&gt;")
.replace(/"/g, "&quot;")
.replace(/'/g, "&#039;");
}
function delete_arch_rows(ids) {
$.ajax({
url:"/archived_runners/",
@ -530,6 +571,7 @@ function generateStorageKey(batchId) {
function disable_arch_buttons() {
//disable buttons (enable on row selection)
$('#button_runagain_arch').attr('disabled','disabled');
$('#button_runbt_arch').attr('disabled','disabled');
$('#button_show_arch').attr('disabled','disabled');
$('#button_delete_arch').attr('disabled','disabled');
$('#button_delete_batch').attr('disabled','disabled');
@ -552,4 +594,10 @@ function enable_arch_buttons() {
$('#button_report').attr('disabled',false);
$('#button_export_xml').attr('disabled',false);
$('#button_export_csv').attr('disabled',false);
//Backtest same period button is displayed only when row with mode paper/live is selected
row = archiveRecords.row('.selected').data();
if ((row.mode == 'paper') || (row.mode == 'live')) {
$('#button_runbt_arch').attr('disabled',false);
}
}


@ -265,8 +265,8 @@ $(document).ready(function () {
$('#diff_first').text(record1.name);
$('#diff_second').text(record2.name);
$('#diff_first_id').text(data1.id);
$('#diff_second_id').text(data2.id);
$('#diff_first_id').text(data1.id + ' Batch: ' + data1.batch_id);
$('#diff_second_id').text(data2.id + ' Batch: ' + data2.batch_id);
//monaco
require(["vs/editor/editor.main"], () => {
@ -358,11 +358,20 @@ $(document).ready(function () {
})
});
$('#closeLogModal').click(function () {
editorLog.dispose()
});
//button to query log
$('#logRefreshButton').click(function () {
editorLog.dispose()
refresh_logfile()
});
$('#logFileSelect').change(function() {
refresh_logfile();
});
//button to open log modal
$('#button_show_log').click(function () {
window.$('#logModal').modal('show');
@ -441,7 +450,7 @@ $(document).ready(function () {
$('#editstratvars').val(JSON.stringify(row.stratvars,null,2));
}
$('#edittransferables').val(JSON.stringify(row.transferables,null,2));
$('#editstratjson').val(row.strat_json);
}
});
@ -458,6 +467,11 @@ $(document).ready(function () {
//run again button
$('#button_runagain_arch').click(run_day_again)
//run in bt mode
$('#button_runbt_arch').click(function() {
run_day_again(true);
});
//workaround for the select incorrectly marking group headers as well
// $('#archiveTable tbody').on('click', 'tr.group-header', function(event) {
// var $row = $(this);


@ -42,6 +42,8 @@ function initialize_archiveRecords() {
{data: 'end_positions_avgp', visible: true},
{data: 'metrics', visible: true},
{data: 'batch_id', visible: true},
{data: 'batch_profit', visible: false},
{data: 'batch_count', visible: false},
],
paging: true,
processing: true,
@ -68,30 +70,32 @@ function initialize_archiveRecords() {
{
targets: [5],
render: function ( data, type, row ) {
now = new Date(data)
if (type == "sort") {
return new Date(data).getTime();
}
//data = "2024-02-26T19:29:13.400621-05:00"
// Create a date object from the string; it represents the given moment in time (UTC)
var date = new Date(data);
tit = date.toLocaleString('cs-CZ', {
timeZone: 'America/New_York',
})
if (isToday(now)) {
if (isToday(date)) {
//console.log("volame isToday s", date)
//return local time only
return '<div title="'+tit+'">'+ 'dnes ' + format_date(data,false,true)+'</div>'
return '<div title="'+tit+'">'+ 'dnes ' + format_date(data,true,true)+'</div>'
}
else
{
//return local datetime
return '<div title="'+tit+'">'+ format_date(data,false,false)+'</div>'
return '<div title="'+tit+'">'+ format_date(data,true,false)+'</div>'
}
},
},
{
targets: [6],
render: function ( data, type, row ) {
now = new Date(data)
if (type == "sort") {
return new Date(data).getTime();
}
@ -100,14 +104,14 @@ function initialize_archiveRecords() {
timeZone: 'America/New_York',
})
if (isToday(now)) {
if (isToday(date)) {
//return local time only
return '<div title="'+tit+'" class="token level comment">'+ 'dnes ' + format_date(data,false,true)+'</div>'
return '<div title="'+tit+'" class="token level comment">'+ 'dnes ' + format_date(data,true,true)+'</div>'
}
else
{
//return local datetime
return '<div title="'+tit+'" class="token level number">'+ format_date(data,false,false)+'</div>'
return '<div title="'+tit+'" class="token level number">'+ format_date(data,true,false)+'</div>'
}
},
},
@ -237,6 +241,8 @@ function initialize_archiveRecords() {
var groupId = group ? group : 'no-batch-id-' + firstRowData.id;
var stateKey = 'dt-group-state-' + groupId;
var state = localStorage.getItem(stateKey);
var profit = firstRowData.batch_profit
var itemCount = firstRowData.batch_count
// Iterate over each row in the group to set the data attribute
// also set visibility for each node according to the saved settings
@ -252,10 +258,10 @@ function initialize_archiveRecords() {
});
// Initialize variables for the group
var itemCount = 0;
//var itemCount = 0;
var period = '';
var batch_note = '';
var profit = '';
//var profit = '';
var started = null;
var stratinId = null;
var symbol = null;
@ -284,13 +290,23 @@ function initialize_archiveRecords() {
//if we have a batch_id, check whether its settings are already cached and if so use them
//if not, load them
//This code parses the header info out of the notes; it is relevant only for
//backtest batches, not for paper and live, where the number of days is unknown and the note can change
//in the future this frontend parsing will be replaced by a batch table in the db that persists
//this data
if (group) {
const existingBatch = batchHeaders.find(batch => batch.batch_id == group);
//not yet in the batch array - build a header
if (!existingBatch) {
itemCount = extractNumbersFromString(firstRowData.note);
try {profit = firstRowData.metrics.profit.batch_sum_profit;}
catch (e) {profit = 'NA'}
// itemCount = extractNumbersFromString(firstRowData.note);
// if (!itemCount) {
// itemCount="NA"
// }
// try { profit = firstRowData.metrics.profit.batch_sum_profit;}
// catch (e) {profit = 'NA'}
// if (!profit) {profit = 'NA'}
period = firstRowData.note ? firstRowData.note.substring(0, 14) : '';
try {
batch_note = firstRowData.note ? firstRowData.note.split("N:")[1].trim() : ''
@ -298,15 +314,22 @@ function initialize_archiveRecords() {
started = firstRowData.started
stratinId = firstRowData.strat_id
symbol = firstRowData.symbol
if (period.startsWith("SCHED")) {
period = "SCHEDULER";
}
var newBatchHeader = {batch_id:group, batch_note:batch_note, profit:profit, itemCount:itemCount, period:period, started:started, stratinId:stratinId, symbol:symbol};
batchHeaders.push(newBatchHeader)
}
//already in the array, but we have a newer one (e.g. added during a backtest) - update it
else if (new Date(existingBatch.started) < new Date(firstRowData.started)) {
itemCount = extractNumbersFromString(firstRowData.note);
try {profit = firstRowData.metrics.profit.batch_sum_profit;}
catch (e) {profit = 'NA'}
// try {itemCount = extractNumbersFromString(firstRowData.note);}
// catch (e) {itemCount = 'NA'}
// try {profit = firstRowData.metrics.profit.batch_sum_profit;}
// catch (e) {profit = 'NA'}
period = firstRowData.note ? firstRowData.note.substring(0, 14) : '';
if (period.startsWith("SCHED")) {
period = "SCHEDULER";
}
try {
batch_note = firstRowData.note ? firstRowData.note.split("N:")[1].trim() : ''
} catch (e) { batch_note = ''}


@ -0,0 +1,100 @@
function refresh_runmanager_and_callback(row, callback) {
//console.log("entering refresh")
var request = $.ajax({
url: "/run_manager_records/"+row.id,
beforeSend: function (xhr) {
xhr.setRequestHeader('X-API-Key',
API_KEY); },
method:"GET",
contentType: "application/json",
dataType: "json",
success:function(data){
//console.log("fetched data ok")
//console.log(JSON.stringify(data,null,2));
},
error: function(xhr, status, error) {
var err = eval("(" + xhr.responseText + ")");
window.alert(JSON.stringify(xhr));
console.log(JSON.stringify(xhr));
}
});
// Handling the responses of both requests
$.when(request).then(function(response) {
// Both requests have completed successfully
//console.log("Result from request:", response);
//console.log("Response received. calling callback")
//call callback function
callback(response)
}, function(error) {
// Handle errors from either request here
// Example:
console.error("Error from first request:", error);
console.log("requesting id error")
});
}
function delete_runmanager_row(id) {
$.ajax({
url:"/run_manager_records/"+id,
beforeSend: function (xhr) {
xhr.setRequestHeader('X-API-Key',
API_KEY); },
method:"DELETE",
contentType: "application/json",
dataType: "json",
// data: JSON.stringify(ids),
success:function(data){
$('#delFormRunmanager')[0].reset();
window.$('#delModalRunmanager').modal('hide');
$('#deleterunmanager').attr('disabled', false);
//console.log(data)
runmanagerRecords.ajax.reload();
disable_runmanager_buttons()
},
error: function(xhr, status, error) {
var err = eval("(" + xhr.responseText + ")");
window.alert(JSON.stringify(xhr));
console.log(JSON.stringify(xhr));
$('#deleterunmanager').attr('disabled', false);
//archiveRecords.ajax.reload();
}
})
}
//enable/disable depending on whether row(s) are selected
function disable_runmanager_buttons() {
//disable buttons (enable on row selection)
//$('#button_add_sched').attr('disabled','disabled');
$('#button_edit_sched').attr('disabled','disabled');
$('#button_delete_sched').attr('disabled','disabled');
$('#button_history_sched').attr('disabled','disabled');
}
function enable_runmanager_buttons() {
//enable buttons
//$('#button_add_sched').attr('disabled',false);
$('#button_edit_sched').attr('disabled',false);
$('#button_delete_sched').attr('disabled',false);
$('#button_history_sched').attr('disabled',false);
}
// Function to update options
function updateSelectOptions(type) {
var allOptions = {
'paper': '<option value="paper">paper</option>',
'live': '<option value="live">live</option>',
'backtest': '<option value="backtest">backtest</option>',
'prep': '<option value="prep">prep</option>'
};
var allowedOptions = (type === "schedule") ? ['paper', 'live'] : Object.keys(allOptions);
var $select = $('#runmanmode');
$select.empty(); // Clear current options
allowedOptions.forEach(function(opt) {
$select.append(allOptions[opt]); // Append allowed options
});
}
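// e.g. updateSelectOptions('schedule') limits the mode select to paper/live; any other type exposes all four modes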


@ -0,0 +1,296 @@
/* <button title="Create new" id="button_add_sched" class="btn btn-outline-success btn-sm">Add</button>
<button title="Edit selected" id="button_edit_sched" class="btn btn-outline-success btn-sm">Edit</button>
<button title="Delete selected" id="button_delete_sched" class="btn btn-outline-success btn-sm">Delete</button>
id="delModalRunmanager"
id="addeditModalRunmanager" id="runmanagersubmit" == "Add vs Edit"
*/
// Function to apply filter
function applyFilter(filter) {
switch (filter) {
case 'filterSchedule':
runmanagerRecords.column(1).search('schedule').draw();
break;
case 'filterQueue':
runmanagerRecords.column(1).search('queue').draw();
break;
// default:
// runmanagerRecords.search('').columns().search('').draw();
// break;
}
}
// Function to get the ID of the currently active filter
function getCurrentFilter() {
var activeFilter = $('input[name="filterOptions"]:checked').attr('id');
console.log("activeFilter", activeFilter)
return activeFilter;
}
// Function to show/hide input fields based on the current filter
function updateInputFields() {
var activeFilter = getCurrentFilter();
switch (activeFilter) {
case 'filterSchedule':
$('#runmantestlist_id_div').hide();
$('#runmanbt_from_div').hide();
$('#runmanbt_to_div').hide();
$('#runmanvalid_from_div').show();
$('#runmanvalid_to_div').show();
$('#runmanstart_time_div').show();
$('#runmanstop_time_div').show();
break;
case 'filterQueue':
$('#runmantestlist_id_div').show();
$('#runmanbt_from_div').show();
$('#runmanbt_to_div').show();
$('#runmanvalid_from_div').hide();
$('#runmanvalid_to_div').hide();
$('#runmanstart_time_div').hide();
$('#runmanstop_time_div').hide();
break;
default:
//$('#inputForSchedule, #inputForQueue').hide();
break;
}
}
//event handlers for runmanager table
$(document).ready(function () {
initialize_runmanagerRecords();
runmanagerRecords.ajax.reload();
disable_runmanager_buttons();
//on click on #button_refresh_sched call runmanagerRecords.ajax.reload()
$('#button_refresh_sched').click(function () {
runmanagerRecords.ajax.reload();
});
// Event listener for changes in the radio buttons
$('input[name="filterOptions"]').on('change', function() {
var selectedFilter = $(this).attr('id');
applyFilter(selectedFilter);
// Save the selected filter to local storage
localStorage.setItem('selectedFilter', selectedFilter);
});
// Load the last selected filter from local storage and apply it
var lastSelectedFilter = localStorage.getItem('selectedFilter');
if (lastSelectedFilter) {
$('#' + lastSelectedFilter).prop('checked', true).change();
}
//listen for changes on weekday enabling button
$('#runman_enable_weekdays').change(function() {
if ($(this).is(':checked')) {
$('.weekday-checkboxes').show();
} else {
$('.weekday-checkboxes').hide();
}
});
//selectable rows in runmanager table
$('#runmanagerTable tbody').on('click', 'tr', function () {
if ($(this).hasClass('selected')) {
//$(this).removeClass('selected');
//add a condition here so that disable is only called when no other tr[data-group-name] row is selected
// Check if there are no other selected rows before disabling buttons
if ($('#runmanagerTable tr.selected').length === 1) {
disable_runmanager_buttons();
}
//disable_arch_buttons()
} else {
//archiveRecords.$('tr.selected').removeClass('selected');
$(this).addClass('selected');
enable_runmanager_buttons()
}
});
//delete button
$('#button_delete_sched').click(function () {
row = runmanagerRecords.row('.selected').data();
window.$('#delModalRunmanager').modal('show');
$('#delidrunmanager').val(row.id);
// $('#action').val('delRecord');
// $('#save').val('Delete');
});
//button add
$('#button_add_sched').click(function () {
window.$('#addeditModalRunmanager').modal('show');
$('#addeditFormRunmanager')[0].reset();
//$("#runmanid").prop('readonly', false);
if (getCurrentFilter() == 'filterQueue') {
mode = 'queue';
} else {
mode = 'schedule';
}
//set modus
$('#runmanmoddus').val(mode);
//updates fields according to selected type
updateInputFields();
updateSelectOptions(mode);
// Initially, check the value of "batch" and enable/disable "btfrom" and "btto" accordingly
if ($("#runmantestlist_id").val() !== "") {
$("#runmanbt_from, #runmanbt_to").prop("disabled", true);
} else {
$("#runmanbt_from, #runmanbt_to").prop("disabled", false);
}
// Listen for changes in the "batch" input and diasble/enable "btfrom" and "btto" accordingly
$("#runmantestlist_id").on("input", function() {
if ($(this).val() !== "") {
// If "batch" is not empty, disable "from" and "to"
$("#runmanbt_from, #runmanbt_to").prop("disabled", true);
} else {
// If "batch" is empty, enable "from" and "to"
$("#runmanbt_from, #runmanbt_to").prop("disabled", false);
}
});
$('.modal-title_run').html("<i class='fa fa-plus'></i> Add Record");
$('#runmanagersubmit').val('Add');
$('#runmanager_enable_weekdays').prop('checked', false);
$('.weekday-checkboxes').hide();
});
//edit button
$('#button_edit_sched').click(function () {
row = runmanagerRecords.row('.selected').data();
if (row == undefined) {
return
}
window.$('#addeditModalRunmanager').modal('show');
//set fields as readonly
//$("#runmanid").prop('readonly', true);
//$("#runmanmoddus").prop('readonly', true);
console.log("pred editem puvodni row", row)
refresh_runmanager_and_callback(row, show_edit_modal)
function show_edit_modal(row) {
console.log("pred editem refreshnuta row", row);
$('#addeditFormRunmanager')[0].reset();
$('.modal-title_run').html("<i class='fa fa-plus'></i> Edit Record");
$('#runmanagersubmit').val('Edit');
//updates fields according to selected type
updateInputFields();
// get shared attributes
$('#runmanid').val(row.id);
$('#runmanhistory').val(row.history);
$('#runmanlast_processed').val(row.last_processed);
$('#runmanstrat_id').val(row.strat_id);
$('#runmanmode').val(row.mode);
$('#runmanmoddus').val(row.moddus);
$('#runmanaccount').val(row.account);
$('#runmanstatus').val(row.status);
$('#runmanbatch_id').val(row.batch_id);
$('#runmanrunner_id').val(row.runner_id);
$("#runmanilog_save").prop("checked", row.ilog_save);
$('#runmannote').val(row.note);
$('#runmantestlist_id').val(row.testlist_id);
$('#runmanbt_from').val(row.bt_from);
$('#runmanbt_to').val(row.bt_to);
$('#runmanvalid_from').val(row.valid_from);
$('#runmanvalid_to').val(row.valid_to);
$('#runmanstart_time').val(row.start_time);
$('#runmanstop_time').val(row.stop_time);
// Initially, check the value of "batch" and enable/disable "from" and "to" accordingly
if ($("#runmantestlist_id").val() !== "") {
$("#runmanbt_from, #runmanbt_to").prop("disabled", true);
} else {
$("#runmanbt_from, #runmanbt_to").prop("disabled", false);
}
// Listen for changes in the "batch" input
$("#runmantestlist_id").on("input", function() {
if ($(this).val() !== "") {
// If "batch" is not empty, disable "from" and "to"
$("#runmanbt_from, #runmanbt_to").prop("disabled", true);
} else {
// If "batch" is empty, enable "from" and "to"
$("#runmanbt_from, #runmanbt_to").prop("disabled", false);
}
});
type = $('#runmanmoddus').val();
updateSelectOptions(type);
//add weekdays_filter transformation from string "1,2,3" to array [1,2,3]
// Assuming row.weekdays_filter is available here
var weekdayFilter = row.weekdays_filter;
//
if (weekdayFilter) {
$('#runman_enable_weekdays').prop('checked', true);
$(".weekday-checkboxes").show();
// Map numbers to weekday names
var dayOfWeekMap = {
"0": "monday",
"1": "tuesday",
"2": "wednesday",
"3": "thursday",
"4": "friday",
"5": "saturday", // Adjust if needed for your mapping
"6": "sunday" // Adjust if needed for your mapping
};
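// note: 0 = Monday here, consistent with the weekdays array in the table renderer and
// (presumably) Python's datetime.weekday() on the backend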
// Iterate through the selected days
$.each(weekdayFilter, function(index, dayIndex) {
var dayOfWeek = dayOfWeekMap[dayIndex];
if (dayOfWeek) { // Make sure the day exists in the map
$("#" + dayOfWeek).prop("checked", true);
}
});
}
else {
$('#runman_enable_weekdays').prop('checked', false);
$(".weekday-checkboxes").hide();
}
}
});
//history button
$('#button_history_sched').click(function () {
row = runmanagerRecords.row('.selected').data();
if (row == undefined) {
return
}
window.$('#historyModalRunmanager').modal('show');
//set fields as readonly
//$("#runmanid").prop('readonly', true);
//$("#runmanmoddus").prop('readonly', true);
//console.log("pred editem puvodni row", row)
refresh_runmanager_and_callback(row, show_history_modal)
function show_history_modal(row) {
//console.log("pred editem refreshnuta row", row);
$('#historyModalRunmanagerForm')[0].reset();
// get shared attributes
$('#RunmanId').val(row.id);
var date = new Date(row.last_processed);
formatted = date.toLocaleString('cs-CZ', {
timeZone: 'America/New_York',
})
$('#Runmanlast_processed').val(formatted);
$('#Runmanhistory').val(row.history);
}
});
});


@ -0,0 +1,322 @@
var runmanagerRecords = null
//equivalent of ready
function initialize_runmanagerRecords() {
//run manager table
runmanagerRecords =
$('#runmanagerTable').DataTable( {
ajax: {
url: '/run_manager_records/',
dataSrc: '',
method:"GET",
contentType: "application/json",
// dataType: "json",
beforeSend: function (xhr) {
xhr.setRequestHeader('X-API-Key',
API_KEY); },
data: function (d) {
return JSON.stringify(d);
},
error: function(xhr, status, error) {
//var err = eval("(" + xhr.responseText + ")");
//window.alert(JSON.stringify(xhr));
console.log(JSON.stringify(xhr));
}
},
columns: [ { data: 'id' },
{ data: 'moddus' },
{ data: 'strat_id' },
{data: 'symbol'},
{data: 'account'},
{data: 'mode'},
{data: 'note'},
{data: 'ilog_save'},
{data: 'bt_from'},
{data: 'bt_to'},
{data: 'weekdays_filter', visible: true},
{data: 'batch_id', visible: true},
{data: 'start_time', visible: true},
{data: 'stop_time', visible: true},
{data: 'status'},
{data: 'last_processed', visible: true},
{data: 'history', visible: false},
{data: 'valid_from', visible: true},
{data: 'valid_to', visible: true},
{data: 'testlist_id', visible: true},
{data: 'strat_running', visible: true},
{data: 'runner_id', visible: true},
{data: 'market', visible: true},
],
paging: true,
processing: true,
serverSide: false,
columnDefs: [
{ //note
targets: [6],
render: function(data, type, row, meta) {
if (!data) return data;
var stateClass = 'truncated-text';
var uniqueId = 'note-' + row.id;
if (localStorage.getItem(uniqueId) === 'expanded') {
stateClass = 'expanded-text';
}
if (type === 'display') {
return '<div class="' + stateClass + '" id="' + uniqueId + '">' + data + '</div>';
}
return data;
},
},
{ //ilog_save
targets: [7],
render: function ( data, type, row ) {
//if ilog_save true
if (data) {
return '<span class="material-symbols-outlined">done_outline</span>'
}
else {
return null
}
},
},
{
targets: [10], //weekdays
render: function (data, type, row) {
if (!data) return data;
// Map each number in the array to a weekday
var weekdays = ["monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"];
return data.map(function(dayNumber) {
return weekdays[dayNumber];
}).join(', ');
},
},
{
targets: [0, 21], //internal id, runner_id
render: function ( data, type, row ) {
if (!data) return data;
if (type === 'display') {
return '<div class="tdnowrap" data-bs-toggle="tooltip" data-bs-placement="top" title="'+data+'">'+data+'</div>';
}
return data;
},
},
{
targets: [2], //strat_id
render: function ( data, type, row ) {
if (type === 'display') {
//console.log("arch")
var color = getColorForId(data);
return '<div class="tdnowrap" data-bs-toggle="tooltip" data-bs-placement="top" title="'+data+'"><span class="color-tag" style="background-color:' + color + ';"></span>'+data+'</div>';
}
return data;
},
},
{
targets: [3,12,13], //symbol, start_time, stop_time
render: function ( data, type, row ) {
if (type === 'display') {
//console.log("arch")
var color = getColorForId(row.strat_id);
return '<span style="color:' + color + ';">'+data+'</span>';
}
return data;
},
},
{
targets: [16], //history
render: function ( data, type, row ) {
if (type === 'display') {
if (!data) data = "";
return '<div data-bs-toggle="tooltip" data-bs-placement="top" title="'+data+'">'+data+'</div>';
}
return data;
},
},
{
targets: [14], //status
render: function ( data, type, row ) {
if (type === 'display') {
//console.log("arch")
var color = data == "active" ? "#3f953f" : "#f84c4c";
return '<span style="color:' + color + ';">'+data+'</span>';
}
return data;
},
},
{
targets: [20], //strat_running
render: function ( data, type, row ) {
if (type === 'display') {
if (!data) data = "";
console.log("running", data)
//var color = data == "active" ? "#3f953f" : "#f84c4c";
data = data ? "running" : ""
return '<div title="' + row.runner_id + '" style="color:#3f953f;">'+data+'</div>';
}
return data;
},
},
// {
// targets: [0,17],
// render: function ( data, type, row ) {
// if (!data) return data
// return '<div class="tdnowrap" title="'+data+'">'+data+'</i>'
// },
// },
{
targets: [15,17, 18, 8, 9], //last_processed, valid_from, valid_to, bt_from, bt_to
render: function ( data, type, row ) {
if (!data) return data
if (type == "sort") {
return new Date(data).getTime();
}
var date = new Date(data);
tit = date.toLocaleString('cs-CZ', {
timeZone: 'America/New_York',
})
return '<div title="'+tit+'">'+ format_date(data,true,false)+'</div>'
// if (isToday(now)) {
// //return local time only
// return '<div title="'+tit+'">'+ 'dnes ' + format_date(data,true,true)+'</div>'
// }
// else
// {
// //return local datetime
// return '<div title="'+tit+'">'+ format_date(data,true,false)+'</div>'
// }
},
},
// {
// targets: [6],
// render: function ( data, type, row ) {
// now = new Date(data)
// if (type == "sort") {
// return new Date(data).getTime();
// }
// var date = new Date(data);
// tit = date.toLocaleString('cs-CZ', {
// timeZone: 'America/New_York',
// })
// if (isToday(now)) {
// //return local time only
// return '<div title="'+tit+'" class="token level comment">'+ 'dnes ' + format_date(data,false,true)+'</div>'
// }
// else
// {
// //return local datetime
// return '<div title="'+tit+'" class="token level number">'+ format_date(data,false,false)+'</div>'
// }
// },
// },
// {
// targets: [9,10],
// render: function ( data, type, row ) {
// if (type == "sort") {
// return new Date(data).getTime();
// }
// //console.log(data)
// //market datetime
// return data ? format_date(data, true) : data
// },
// },
// {
// targets: [2],
// render: function ( data, type, row ) {
// return '<div class="tdname tdnowrap" title="'+data+'">'+data+'</div>'
// },
// },
// // {
// // targets: [4],
// // render: function ( data, type, row ) {
// // return '<div class="tdname tdnowrap" title="'+data+'">'+data+'</div>'
// // },
// // },
// {
// targets: [16],
// render: function ( data, type, row ) {
// //console.log("metrics", data)
// try {
// data = JSON.parse(data)
// }
// catch (error) {
// //console.log(error)
// }
// var res = JSON.stringify(data)
// var unquoted = res.replace(/"([^"]+)":/g, '$1:')
// //show only a short summary if available, otherwise show everything; the title always gets everything
// //console.log(data)
// short = null
// if ((data) && (data.profit) && (data.profit.sum)) {
// short = data.profit.sum
// }
// else {
// short = unquoted
// }
// return '<div class="tdmetrics" title="'+unquoted+'">'+short+'</div>'
// },
// },
// {
// targets: [4],
// render: function ( data, type, row ) {
// return '<div class="tdnote" title="'+data+'">'+data+'</div>'
// },
// },
// {
// targets: [13,14,15],
// render: function ( data, type, row ) {
// return '<div class="tdsmall">'+data+'</div>'
// },
// },
// {
// targets: [11],
// render: function ( data, type, row ) {
// //if ilog_save true
// if (data) {
// return '<span class="material-symbols-outlined">done_outline</span>'
// }
// else {
// return null
// }
// },
// },
{
targets: [4], //account
render: function ( data, type, row ) {
//shorten account names for display
if (data == "ACCOUNT1") {
res="ACC1"
}
else if (data == "ACCOUNT2") {
res="ACC2"
}
else { res=data}
return res
},
},
{
targets: [5], //mode
render: function ( data, type, row ) {
//shorten mode name for display
if (data == "backtest") {
res="bt"
}
else { res=data}
return res
},
}
],
order: [[1, 'asc']],
select: {
info: true,
style: 'multi',
//selector: 'tbody > tr:not(.group-header)'
selector: 'tbody > tr:not(.group-header)'
},
paging: true
});
}


@ -0,0 +1,195 @@
//delete modal
$("#delModalRunmanager").on('submit','#delFormRunmanager', function(event){
event.preventDefault();
$('#deleterunmanager').attr('disabled','disabled');
//get val from #delidrunmanager
id = $('#delidrunmanager').val();
delete_runmanager_row(id);
});
//add api
// fetch(`/run_manager_records/`, {
// method: 'POST',
// headers: {
// 'Content-Type': 'application/json',
// 'X-API-Key': API_KEY
// },
// body: JSON.stringify(newRecord)
// })
// fetch(`/run_manager_records/${recordId}`, {
// method: 'PATCH',
// headers: {
// 'Content-Type': 'application/json',
// 'X-API-Key': API_KEY
// },
// body: JSON.stringify(updatedData)
// })
function getCheckedWeekdays() {
const checkboxes = document.querySelectorAll('input[name="weekdays_filter[]"]:checked');
const selectedDays = Array.from(checkboxes).map(checkbox => checkbox.value);
return selectedDays;
}
//submit form
$("#addeditModalRunmanager").on('submit','#addeditFormRunmanager', function(event){
//event.preventDefault();
//code for add
if ($('#runmanagersubmit').val() == "Add") {
event.preventDefault();
//set id as editable
$('#runmanagersubmit').attr('disabled','disabled');
//trow = runmanagerRecords.row('.selected').data();
//note = $('#editnote').val()
// Handle weekdays functionality
var weekdays = [];
if ($('#runman_enable_weekdays').is(':checked')) {
$('#addeditFormRunmanager input[name="weekdays"]:checked').each(function() {
var weekday = $(this).val();
switch(weekday) {
case 'monday': weekdays.push(0); break;
case 'tuesday': weekdays.push(1); break;
case 'wednesday': weekdays.push(2); break;
case 'thursday': weekdays.push(3); break;
case 'friday': weekdays.push(4); break;
// Add cases for Saturday and Sunday if needed
}
});
}
console.log("weekdays pole", weekdays)
var formData = $(this).serializeJSON();
console.log("formData", formData)
delete formData["enable_weekdays"]
delete formData["weekdays"]
//if checked, apply the filter; otherwise leave it unset
if (weekdays.length > 0) {
formData.weekdays_filter = weekdays
}
console.log(formData)
if ($('#runmanilog_save').prop('checked')) {
formData.ilog_save = true;
}
else
{
formData.ilog_save = false;
}
//if (formData.batch_id == "") {delete formData["batch_id"];}
//walk all attributes and delete any that are ""; the backend will supply the defaults
for (let key in formData) {
if (formData.hasOwnProperty(key) && formData[key] === "") {
delete formData[key];
}
}
jsonString = JSON.stringify(formData);
console.log("json string pro formData pred odeslanim", jsonString)
$.ajax({
url:"/run_manager_records/",
beforeSend: function (xhr) {
xhr.setRequestHeader('X-API-Key',
API_KEY); },
method:"POST",
contentType: "application/json",
// dataType: "json",
data: jsonString,
success:function(data){
$('#addeditFormRunmanager')[0].reset();
window.$('#addeditModalRunmanager').modal('hide');
$('#runmanagersubmit').attr('disabled', false);
runmanagerRecords.ajax.reload();
disable_runmanager_buttons();
},
error: function(xhr, status, error) {
var err = eval("(" + xhr.responseText + ")");
window.alert(JSON.stringify(xhr));
console.log(JSON.stringify(xhr));
$('#runmanagersubmit').attr('disabled', false);
}
})
}
//code for edit
else {
event.preventDefault();
$('#runmanagersubmit').attr('disabled','disabled');
//trow = runmanagerRecords.row('.selected').data();
//note = $('#editnote').val()
// Handle weekdays functionality
var weekdays = [];
if ($('#runman_enable_weekdays').is(':checked')) {
$('#addeditFormRunmanager input[name="weekdays"]:checked').each(function() {
var weekday = $(this).val();
switch(weekday) {
case 'monday': weekdays.push(0); break;
case 'tuesday': weekdays.push(1); break;
case 'wednesday': weekdays.push(2); break;
case 'thursday': weekdays.push(3); break;
case 'friday': weekdays.push(4); break;
// Add cases for Saturday and Sunday if needed
}
});
}
var formData = $(this).serializeJSON();
delete formData["enable_weekdays"]
delete formData["weekdays"]
//if checked, apply the filter; otherwise leave it unset
if (weekdays.length > 0) {
formData.weekdays_filter = weekdays
}
console.log(formData)
if ($('#runmanilog_save').prop('checked')) {
formData.ilog_save = true;
}
else
{
formData.ilog_save = false;
}
//walk the form attributes and delete any that are ""; the backend supplies the defaults - i.e. the original value is cleared
for (let key in formData) {
if (formData.hasOwnProperty(key) && formData[key] === "") {
delete formData[key];
}
}
jsonString = JSON.stringify(formData);
console.log("EDIT json string pro formData pred odeslanim", jsonString);
$.ajax({
url:"/run_manager_records/"+formData.id,
beforeSend: function (xhr) {
xhr.setRequestHeader('X-API-Key',
API_KEY); },
method:"PATCH",
contentType: "application/json",
// dataType: "json",
data: jsonString,
success:function(data){
console.log("EDIT success data", data);
$('#addeditFormRunmanager')[0].reset();
window.$('#addeditModalRunmanager').modal('hide');
$('#runmanagersubmit').attr('disabled', false);
runmanagerRecords.ajax.reload();
disable_runmanager_buttons();
},
error: function(xhr, status, error) {
var err = eval("(" + xhr.responseText + ")");
window.alert(JSON.stringify(xhr));
console.log(JSON.stringify(xhr));
$('#runmanagersubmit').attr('disabled', false);
}
});
}
});


@ -371,9 +371,10 @@ function initialize_chart() {
}
chart = LightweightCharts.createChart(document.getElementById('chart'), chartOptions);
chart.applyOptions({ timeScale: { visible: true, timeVisible: true, secondsVisible: true }, crosshair: {
chart.applyOptions({ timeScale: { visible: true, timeVisible: true, secondsVisible: true, minBarSpacing: 0.003}, crosshair: {
mode: LightweightCharts.CrosshairMode.Normal, labelVisible: true
}})
console.log("chart intiialized")
}
//maybe set 'last value visible' attributes
@ -990,12 +991,26 @@ JSON.safeStringify = (obj, indent = 2) => {
return retVal;
};
function isToday(someDate) {
const today = new Date()
return someDate.getDate() == today.getDate() &&
someDate.getMonth() == today.getMonth() &&
someDate.getFullYear() == today.getFullYear()
}
function isToday(someDate) {
// Convert input date to Eastern Time
var dateInEastern = new Date(someDate.toLocaleString('en-US', { timeZone: 'America/New_York' }));
//console.log("vstupuje ",someDate)
//console.log("americky ",dateInEastern)
// Get today's date in Eastern Time
var todayInEastern = new Date(new Date().toLocaleString('en-US', { timeZone: 'America/New_York' }));
return dateInEastern.getDate() === todayInEastern.getDate() &&
dateInEastern.getMonth() === todayInEastern.getMonth() &&
dateInEastern.getFullYear() === todayInEastern.getFullYear();
}
// function isToday(someDate) {
// const today = new Date()
// return someDate.getDate() == today.getDate() &&
// someDate.getMonth() == today.getMonth() &&
// someDate.getFullYear() == today.getFullYear()
// }
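
The same check sketched in Python with zoneinfo, to make the intent explicit: both timestamps have to be converted to the exchange timezone before their calendar fields are compared, because a late-evening UTC moment can fall on a different date in New York (illustrative sketch, not code from this repo):

from datetime import datetime, timezone
from zoneinfo import ZoneInfo

NY = ZoneInfo("America/New_York")

def is_today(some_date: datetime) -> bool:
    # compare calendar dates in the exchange timezone, not the viewer's local one
    return some_date.astimezone(NY).date() == datetime.now(timezone.utc).astimezone(NY).date()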
//https://www.w3schools.com/jsref/jsref_tolocalestring.asp
function format_date(datum, markettime = false, timeonly = false) {


@ -250,6 +250,17 @@ strong {
--bs-form-invalid-border-color: #ea868f;
}
.btn-check:checked+.btn, .btn.active, .btn.show, .btn:first-child:active, :not(.btn-check)+.btn:active {
color: var(--bs-btn-active-color);
background-color: #3a5962;
border-color: #3a5962;
}
.btn-outline-primary {
--bs-btn-color: #94b1b3;
--bs-btn-border-color: #3a5a62;
}
.form-label {
margin-top: 0.5em;
color: var(--bs-emphasis-color);
@ -983,3 +994,24 @@ pre {
#datepicker:disabled {
background-color: #f2f2f2;
}
#disk-gauge-container {
text-align: center;
width: 400px;
}
#disk-gauge {
width: 100%;
height: 20px;
background-color: #ddd;
border-radius: 10px;
overflow: hidden;
}
#disk-gauge-bar {
height: 100%;
background-color: #4285F4;
width: 0%; /* Initial state */
border-radius: 10px;
}


@ -9,7 +9,7 @@ from alpaca.trading.enums import TradeEvent, OrderStatus
from v2realbot.indicators.indicators import ema
import orjson
from datetime import datetime
#from rich import print
from rich import print as printanyway
from random import randrange
from alpaca.common.exceptions import APIError
import numpy as np
@ -35,36 +35,62 @@ class StrategyClassicSL(Strategy):
max_sum_profit_to_quit_rel = safe_get(self.state.vars, "max_sum_profit_to_quit_rel", None)
max_sum_loss_to_quit_rel = safe_get(self.state.vars, "max_sum_loss_to_quit_rel", None)
#load the hard/soft cutoff directive type
hard_cutoff = safe_get(self.state.vars, "hard_cutoff", False)
rel_profit = round(float(np.sum(self.state.rel_profit_cum)),5)
if max_sum_profit_to_quit_rel is not None:
if rel_profit >= float(max_sum_profit_to_quit_rel):
self.state.ilog(e=f"QUITTING MAX SUM REL PROFIT REACHED {max_sum_profit_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
msg = f"QUITTING {hard_cutoff=} MAX SUM REL PROFIT REACHED {max_sum_profit_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}"
printanyway(msg)
self.state.ilog(e=msg)
self.state.vars.pending = "max_sum_profit_to_quit_rel"
send_to_telegram(f"QUITTING MAX SUM REL PROFIT REACHED {max_sum_profit_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.signal_stop = True
if self.mode not in [Mode.BT, Mode.PREP]:
send_to_telegram(msg)
if hard_cutoff:
self.hard_stop = True
else:
self.soft_stop = True
return True
if max_sum_loss_to_quit_rel is not None:
if rel_profit < 0 and rel_profit <= float(max_sum_loss_to_quit_rel):
self.state.ilog(e=f"QUITTING MAX SUM REL LOSS REACHED {max_sum_loss_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
msg=f"QUITTING {hard_cutoff=} MAX SUM REL LOSS REACHED {max_sum_loss_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}"
printanyway(msg)
self.state.ilog(e=msg)
self.state.vars.pending = "max_sum_loss_to_quit_rel"
send_to_telegram(f"QUITTING MAX SUM REL LOSS REACHED {max_sum_loss_to_quit_rel=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.signal_stop = True
if self.mode not in [Mode.BT, Mode.PREP]:
send_to_telegram(msg)
if hard_cutoff:
self.hard_stop = True
else:
self.soft_stop = True
return True
if max_sum_profit_to_quit is not None:
if float(self.state.profit) >= float(max_sum_profit_to_quit):
self.state.ilog(e=f"QUITTING MAX SUM ABS PROFIT REACHED {max_sum_profit_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
msg = f"QUITTING {hard_cutoff=} MAX SUM ABS PROFIT REACHED {max_sum_profit_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}"
printanyway(msg)
self.state.ilog(e=msg)
self.state.vars.pending = "max_sum_profit_to_quit"
send_to_telegram(f"QUITTING MAX SUM ABS PROFIT REACHED {max_sum_profit_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.signal_stop = True
if self.mode not in [Mode.BT, Mode.PREP]:
send_to_telegram(msg)
if hard_cutoff:
self.hard_stop = True
else:
self.soft_stop = True
return True
if max_sum_loss_to_quit is not None:
if float(self.state.profit) < 0 and float(self.state.profit) <= float(max_sum_loss_to_quit):
self.state.ilog(e=f"QUITTING MAX SUM ABS LOSS REACHED {max_sum_loss_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
msg = f"QUITTING {hard_cutoff=} MAX SUM ABS LOSS REACHED {max_sum_loss_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}"
printanyway(msg)
self.state.ilog(e=msg)
self.state.vars.pending = "max_sum_loss_to_quit"
send_to_telegram(f"QUITTING MAX SUM ABS LOSS REACHED {max_sum_loss_to_quit=} {self.state.profit=} {rel_profit=} relprofits:{str(self.state.rel_profit_cum)}")
self.signal_stop = True
if self.mode not in [Mode.BT, Mode.PREP]:
send_to_telegram(msg)
if hard_cutoff:
self.hard_stop = True
else:
self.soft_stop = True
return True
return False
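# The hard/soft distinction in one place: hard_cutoff=True sets hard_stop, which breaks the
# strategy's main loop, while hard_cutoff=False sets soft_stop, which keeps data and indicators
# flowing but skips next() (see the main loop changes below). A hypothetical stratvars fragment
# enabling a soft daily cutoff (keys mirror the safe_get lookups above; values are illustrative):
#
#   max_sum_profit_to_quit_rel = 0.01   # quit after +1% cumulative relative profit
#   max_sum_loss_to_quit_rel = -0.02    # ...or after -2% cumulative relative loss
#   hard_cutoff = false                 # soft stop: keep running, just stop trading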
@ -149,6 +175,10 @@ class StrategyClassicSL(Strategy):
self.state.rel_profit_cum.append(rel_profit)
rel_profit_cum_calculated = round(np.sum(self.state.rel_profit_cum),5)
#for martingale, update the loss_series_cnt
self.state.vars["transferables"]["martingale"]["cont_loss_series_cnt"] = 0 if rel_profit > 0 else self.state.vars["transferables"]["martingale"]["cont_loss_series_cnt"]+1
self.state.ilog(lvl=1, e=f"update cont_loss_series_cnt na {self.state.vars['transferables']['martingale']['cont_loss_series_cnt']}")
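#note: the counter resets on any profitable trade and increments on a loss, which is exactly
#the input a martingale position sizer needs; a hypothetical sizer (multiplier and cap are
#illustrative assumptions, not values from this repo):
#  size = int(base_qty * 2.0 ** min(cont_loss_series_cnt, 3))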
self.state.ilog(e=f"BUY notif - SHORT PROFIT: {partial_exit=} {partial_last=} {round(float(trade_profit),3)} celkem:{round(float(self.state.profit),3)} rel:{float(rel_profit)} rel_cum:{round(rel_profit_cum_calculated,7)}", msg=str(data.event), rel_profit_cum=str(self.state.rel_profit_cum), bought_amount=bought_amount, avg_costs=avg_costs, trade_qty=data.qty, trade_price=data.price, orderid=str(data.order.id))
#write the profit into prescr.trades
@ -185,7 +215,8 @@ class StrategyClassicSL(Strategy):
setattr(tradeData, "rel_profit_cum", rel_profit_cum_calculated)
#check max profit/loss; if we are shutting down, no follow-up reversal is done
if await self.stop_when_max_profit_loss() is False:
#the max loss check is done only on FILL, when the total amount is known
if data.event == TradeEvent.FILL and await self.stop_when_max_profit_loss() is False:
#IF REVERSAL REQUIRED - reverse position is added to prescr.Trades with the same signal name
#only on a full FILL
@ -293,6 +324,10 @@ class StrategyClassicSL(Strategy):
self.state.rel_profit_cum.append(rel_profit)
rel_profit_cum_calculated = round(np.sum(self.state.rel_profit_cum),5)
#for martingale, update the loss_series_cnt
self.state.vars["transferables"]["martingale"]["cont_loss_series_cnt"] = 0 if rel_profit > 0 else self.state.vars["transferables"]["martingale"]["cont_loss_series_cnt"]+1
self.state.ilog(lvl=1, e=f"update cont_loss_series_cnt na {self.state.vars['transferables']['martingale']['cont_loss_series_cnt']}")
self.state.ilog(e=f"SELL notif - LONG PROFIT {partial_exit=} {partial_last=}:{round(float(trade_profit),3)} celkem:{round(float(self.state.profit),3)} rel:{float(rel_profit)} rel_cum:{round(rel_profit_cum_calculated,7)}", msg=str(data.event), rel_profit_cum = str(self.state.rel_profit_cum), sold_amount=sold_amount, avg_costs=avg_costs, trade_qty=data.qty, trade_price=data.price, orderid=str(data.order.id))
#write the profit into prescr.trades
@ -328,7 +363,8 @@ class StrategyClassicSL(Strategy):
setattr(tradeData, "rel_profit_cum", rel_profit_cum_calculated)
#here probably update the actual entry price (total quantity (order.qty) and avg_costs); the same for the other direction
if await self.stop_when_max_profit_loss() is False:
#the max loss check is done only on FILL, when the total amount is known
if data.event == TradeEvent.FILL and await self.stop_when_max_profit_loss() is False:
#IF REVERSAL REQUIRED - reverse position is added to prescr.Trades with same signal name
if data.event == TradeEvent.FILL and self.state.vars.requested_followup is not None:
@ -400,7 +436,7 @@ class StrategyClassicSL(Strategy):
populate_all_indicators(item, self.state)
#for data preparation we do not call next
if self.mode == Mode.PREP:
if self.mode == Mode.PREP or self.soft_stop:
return
else:
self.next(item, self.state)
@ -417,12 +453,12 @@ class StrategyClassicSL(Strategy):
#this is closing a short position
if int(self.state.positions) < 0 and (int(self.state.positions) + int(sizer)) > 0:
self.state.ilog(e="buy nelze nakoupit vic nez shortuji", positions=self.state.positions, size=size)
print("buy nelze nakoupit vic nez shortuji")
printanyway("buy nelze nakoupit vic nez shortuji")
return -2
if int(self.state.positions) >= self.state.vars.maxpozic:
self.state.ilog(e="buy Maxim mnozstvi naplneno", positions=self.state.positions)
print("max mnostvi naplneno")
printanyway("max mnostvi naplneno")
return 0
self.state.blockbuy = 1
@ -441,13 +477,13 @@ class StrategyClassicSL(Strategy):
#this is closing a long position
if int(self.state.positions) > 0 and (int(self.state.positions) - int(size)) < 0:
self.state.ilog(e="nelze prodat vic nez longuji", positions=self.state.positions, size=size)
print("nelze prodat vic nez longuji")
printanyway("nelze prodat vic nez longuji")
return -2
#if shorting and already at max positions
if int(self.state.positions) < 0 and abs(int(self.state.positions)) >= self.state.vars.maxpozic:
self.state.ilog(e="short - Maxim mnozstvi naplneno", positions=self.state.positions, size=size)
print("max mnostvi naplneno")
printanyway("short - Maxim mnozstvi naplneno")
return 0
#self.state.blocksell = 1


@ -6,7 +6,7 @@ from v2realbot.utils.utils import AttributeDict, zoneNY, is_open_rush, is_close_
from v2realbot.utils.tlog import tlog
from v2realbot.utils.ilog import insert_log, insert_log_multiple_queue
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Order, Account
from v2realbot.config import BT_DELAYS, get_key, HEARTBEAT_TIMEOUT, QUIET_MODE, LOG_RUNNER_EVENTS, ILOG_SAVE_LEVEL_FROM,PROFILING_NEXT_ENABLED, PROFILING_OUTPUT_DIR, AGG_EXCLUDED_TRADES
from v2realbot.config import get_key, HEARTBEAT_TIMEOUT, PROFILING_NEXT_ENABLED, PROFILING_OUTPUT_DIR
import queue
#from rich import print
from v2realbot.loader.aggregator import TradeAggregator2Queue, TradeAggregator2List, TradeAggregator
@ -29,6 +29,7 @@ from rich import print as printnow
from collections import defaultdict
import v2realbot.strategyblocks.activetrade.sl.optimsl as optimsl
from tqdm import tqdm
import v2realbot.utils.config_handler as cfh
if PROFILING_NEXT_ENABLED:
from pyinstrument import Profiler
@ -79,7 +80,8 @@ class Strategy:
self.pe = pe
self.se = se
#signal stop - internal
self.signal_stop = False
self.hard_stop = False #indicates a hard stop, i.e. the strategy shuts down
self.soft_stop = False #indicates a soft stop (e.g. when max profit/loss is reached): the strategy keeps running and producing data, just without trading
#rework the queue to be dynamic - depending on how we want to work with multiresolutions
#for now just a single q1
@ -93,7 +95,7 @@ class Strategy:
align: StartBarAlign = StartBarAlign.ROUND,
mintick: int = 0,
exthours: bool = False,
excludes: list = AGG_EXCLUDED_TRADES):
excludes: list = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES')):
##TODO create self.datas_here containing dict - queue - SYMBOL - RecType -
##hardcoded for now
@ -327,8 +329,8 @@ class Strategy:
elif self.rectype == RecordType.TRADE:
self.state.last_trade_time = item['t']
if self.mode == Mode.BT or self.mode == Mode.PREP:
self.bt.time = self.state.last_trade_time + BT_DELAYS.trigger_to_strat
self.state.time = self.state.last_trade_time + BT_DELAYS.trigger_to_strat
self.bt.time = self.state.last_trade_time + cfh.config_handler.get_val('BT_DELAYS','trigger_to_strat')
self.state.time = self.state.last_trade_time + cfh.config_handler.get_val('BT_DELAYS','trigger_to_strat')
elif self.mode == Mode.LIVE or self.mode == Mode.PAPER:
self.state.time = datetime.now().timestamp()
#ic('time updated')
@ -424,7 +426,7 @@ class Strategy:
#main strat loop
print(self.name, "Waiting for DATA",self.q1.qsize())
with tqdm(total=self.q1.qsize()) as pbar:
with tqdm(total=self.q1.qsize(), desc=self.name + "-Ingesting Aggregated") as pbar:
while True:
try:
#block 5s, after that check signals
@ -432,7 +434,7 @@ class Strategy:
#printnow(current_thread().name, "Items waiting in queue:", self.q1.qsize())
except queue.Empty:
#check internal signals - for profit/loss optim etc - valid for runner
if self.signal_stop:
if self.hard_stop:
print(current_thread().name, "Stopping signal - internal")
break
@ -453,7 +455,7 @@ class Strategy:
if item == "last" or self.se.is_set():
print(current_thread().name, "stopping")
break
elif self.signal_stop:
elif self.hard_stop:
print(current_thread().name, "Stopping signal - internal")
break
elif self.pe.is_set():
@ -655,7 +657,7 @@ class Strategy:
if len(self.state.iter_log_list) > 0:
rt_out["iter_log"] = self.state.iter_log_list
#print(rt_out)
printnow(rt_out)
print("RTQUEUE INSERT")
#send current values to Realtime display on frontend
@ -805,7 +807,7 @@ class StrategyState:
self.iter_log_list = None
def ilog(self, e: str = None, msg: str = None, lvl: int = 1, **kwargs):
if lvl < ILOG_SAVE_LEVEL_FROM:
if lvl < cfh.config_handler.get_val('ILOG_SAVE_LEVEL_FROM'):
return
if self.mode == Mode.LIVE or self.mode == Mode.PAPER:
@ -830,5 +832,3 @@ class StrategyState:
self.iter_log_list.append(row)
row["name"] = self.name
print(row)
#generic parameter for now - rework per RUN?
#if LOG_RUNNER_EVENTS: insert_log(self.runner_id, time=self.time, logdict=row)


@ -0,0 +1,26 @@
from v2realbot.strategyblocks.indicators.custom.classes.indicatorbase import IndicatorBase
import pywt
import numpy as np
class DWT(IndicatorBase):
def __init__(self, state=None, wavelet='db1', levels=2):
super().__init__(state)
self.wavelet = wavelet
self.levels = levels
def next(self, close):
coeffs = pywt.wavedec(close, self.wavelet, level=self.levels)
# Zeroing out all detail coefficients
coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
# Reconstruct the signal using only the approximation coefficients
reconstructed_signal = pywt.waverec(coeffs, self.wavelet)
# Handle length difference
length_difference = len(close) - len(reconstructed_signal)
if length_difference > 0:
reconstructed_signal = np.pad(reconstructed_signal, (0, length_difference), 'constant', constant_values=(0, 0))
self.state.indicators["MultiLevelDWT"] = reconstructed_signal.tolist()
return float(reconstructed_signal[-1])
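For context, a minimal standalone sketch of the same wavelet-denoising idea outside the IndicatorBase plumbing (the synthetic series is illustrative; only numpy and pywt are assumed):

import numpy as np
import pywt

def dwt_smooth(close, wavelet='db1', levels=2):
    # decompose, zero out the detail coefficients, reconstruct from the approximation only
    coeffs = pywt.wavedec(close, wavelet, level=levels)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    smoothed = pywt.waverec(coeffs, wavelet)
    return smoothed[:len(close)]  # waverec may return one extra sample for odd lengths

noisy = np.sin(np.linspace(0, 6, 65)) + np.random.normal(0, 0.1, 65)
print(dwt_smooth(noisy)[-1])  # last smoothed value, matching what next() returns above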

View File

@@ -10,6 +10,7 @@ class SuperTrendTV(IndicatorBase):
See conversation: [Indicator Plugin Builder](https://chat.openai.com/g/g-aCKuSmbIe-indicator-plugin-builder/c/1ad650dc-05f1-4cf6-b936-772c0ea86ffa)
inspired by https://www.tradingview.com/script/r6dAP7yi/
"""
def __init__(self, atr_period=10, atr_multiplier=3.0, state=None):
super().__init__(state)
self.atr_period = atr_period
@@ -19,7 +20,7 @@ class SuperTrendTV(IndicatorBase):
self.closes = deque(maxlen=atr_period)
self.up = None
self.down = None
self.trend = 1
self.trend = 0
def next(self, high, low, close):
self.highs.append(high[-1])
@@ -44,7 +45,14 @@
self.up = max(up, self.up) if close[-2] > self.up else up
self.down = min(dn, self.down) if close[-2] < self.down else dn
previous_trend = self.trend
self.trend = 1 if (self.trend == -1 and close[-1] > self.down) else -1 if (self.trend == 1 and close[-1] < self.up) else self.trend
# Update trend for the first time if it's still at initial state
if self.trend == 0:
self.trend = 1 if close[-1] > self.down else -1 if close[-1] < self.up else 0
else:
# Update trend based on previous values if it's not at initial state
self.trend = 1 if (self.trend == -1 and close[-1] > self.down) else -1 if (self.trend == 1 and close[-1] < self.up) else self.trend
#previous_trend = self.trend
#self.trend = 1 if (self.trend == -1 and close[-1] > self.down) else -1 if (self.trend == 1 and close[-1] < self.up) else self.trend
return [self.up, self.down, self.trend]
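The net effect of the change: with self.trend seeded to 0, the first decided bar infers direction from price against the bands instead of assuming an initial uptrend. A toy illustration of that first decision (band values hypothetical):

trend, close, up, down = 0, 101.0, 103.0, 100.0  # initial state, illustrative bands
if trend == 0:
    trend = 1 if close > down else -1 if close < up else 0
print(trend)  # 1 - direction inferred from the first bar, not assumed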

View File

@@ -0,0 +1,112 @@
from v2realbot.common.model import RunDay, StrategyInstance, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest
import v2realbot.controller.services as cs
from v2realbot.utils.utils import slice_dict_lists,zoneUTC,safe_get, AttributeDict, filter_timeseries_by_timestamp
#id = "b11c66d9-a9b6-475a-9ac1-28b11e1b4edf"
#state = AttributeDict(vars={})
from rich import print
from traceback import format_exc
def attach_previous_data(state):
"""""
Attaches data from previous runner of the same batch.
"""""
print("ATTACHING PREVIOUS DATA")
try:
runner : Runner
#get batch_id of the current runner
res, runner = cs.get_runner(state.runner_id)
if res < 0:
print(f"Couldn't get previous runner {state.runner_id} error: {runner}")
return None
if runner.batch_id is None:
print(f"No batch_id found for runner {runner.id}")
return None
batch_id = runner.batch_id
#batch_id = "6a6b0bcf"
res, runner_ids =cs.get_archived_runnerslist_byBatchID(batch_id, "desc")
if res < 0:
msg = f"error whne fetching runners of batch {batch_id} {runner_ids}"
print(msg)
return None
if runner_ids is None or len(runner_ids) == 0:
print(f"NO runners found for batch {batch_id} {runner_ids}")
return None
last_runner = runner_ids[0]
print("Previous runner identified:", last_runner)
#get archived header - to get transferables
runner_header : RunArchive = None
res, runner_header = cs.get_archived_runner_header_byID(last_runner)
if res < 0:
print(f"Error when fetching runner header {last_runner}")
return None
state.vars["transferables"] = runner_header.transferables
print("INITIALIZED transferables", state.vars["transferables"])
#get details from the runner
print(f"Fetching runner details of {last_runner}")
res, val = cs.get_archived_runner_details_byID(last_runner)
if res < 0:
print(f"no archived runner {last_runner}")
return None
detail = RunArchiveDetail(**val)
#print("toto jsme si dotahnuli", detail.bars)
if len(detail.bars["time"]) == 0:
print(f"no bars for runner {last_runner}")
return None
# from stratvars directives
attach_previous_bar_data = safe_get(state.vars, "attach_previous_bar_data", 50)
attach_previous_tick_data = safe_get(state.vars, "attach_previous_tick_data", None)
#indicators datetime utc
indicators = slice_dict_lists(d=detail.indicators[0],last_item=attach_previous_bar_data, time_to_datetime=True)
#time -datetime utc, updated - timestamp float
bars = slice_dict_lists(d=detail.bars, last_item=attach_previous_bar_data, time_to_datetime=True)
cbar_ids = {}
#align the tick data with the bar data
if attach_previous_tick_data is None:
oldest_timestamp = bars["updated"][0]
#returns only values older than oldest_timestamp
cbar_inds = filter_timeseries_by_timestamp(detail.indicators[1], oldest_timestamp)
else:
cbar_inds = slice_dict_lists(d=detail.indicators[1],last_item=attach_previous_tick_data)
#USE these as INITs - TODO stop here and compare
#print("state.indicatorsL", state.indicators, "NEW:", indicators)
state.indicators = AttributeDict(**indicators)
print("transfered indicators:", len(state.indicators["time"]))
#print("state.bars", state.bars, "NEW:", bars)
state.bars = AttributeDict(bars)
print("transfered bars:", len(state.bars["time"]))
#print("state.cbar_indicators", state.cbar_indicators, "NEW:", cbar_inds)
state.cbar_indicators = AttributeDict(cbar_inds)
print("transfered ticks:", len(state.cbar_indicators["time"]))
print("TRANSFERABLEs INITIALIZED")
#bars
#transferable_state_vars = ["martingale", "batch_profit"]
#1. on init these keys from state.vars are mapped into ext_data: ext_data["transferrables"]["martingale"] = state.vars["martingale"]
#2. on transfer everything from ext_data["transferrables"] is put back into the same-named state.vars["martingale"]
#3. at the end of the day it is stored in the transferables column of RunArchive
#add dailyBars from extData
# if hasattr(detail, "ext_data") and "dailyBars" in detail.ext_data:
# state.dailyBars = detail.ext_data["dailyBars"]
return
except Exception as e:
print(str(e)+format_exc())
return None
# if __name__ == "__main__":
# attach_previous_data(state)
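slice_dict_lists is what carries over only the tail of the previous day's series; its real signature lives in v2realbot.utils.utils, but the assumed core behavior is roughly this sketch:

def slice_tail(d, last_item):
    # keep only the last `last_item` entries of every list in the dict (sketch only)
    return {k: v[-last_item:] for k, v in d.items()}

print(slice_tail({"time": [1, 2, 3, 4], "close": [10, 11, 12, 13]}, 2))
# {'time': [3, 4], 'close': [12, 13]}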

View File

@@ -4,9 +4,9 @@ from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, Foll
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, print, safe_get, is_still, is_window_open, eval_cond_dict, crossed_down, crossed_up, crossed, is_pivot, json_serial, pct_diff, create_new_bars, slice_dict_lists
from v2realbot.utils.directive_utils import get_conditions_from_configuration
import mlroom.utils.mlutils as ml
#import mlroom.utils.mlutils as ml
from v2realbot.common.model import SLHistory
from v2realbot.config import KW, MODEL_DIR
from v2realbot.config import KW, MODEL_DIR, _ml_module_loaded
from uuid import uuid4
from datetime import datetime
#import random
@@ -17,8 +17,14 @@ from rich import print as printanyway
from threading import Event
from traceback import format_exc
def load_ml_model(modelname, modelversion, MODEL_DIR):
global ml
import mlroom.utils.mlutils as ml
return ml.load_model(modelname, modelversion, None, MODEL_DIR)
def initialize_dynamic_indicators(state):
#initialize every indicator that has TYPE in its stratvars
##printanyway(state.vars, state)
dict_copy = state.vars.indicators.copy()
for indname, indsettings in dict_copy.items():
@@ -68,7 +74,8 @@ def initialize_dynamic_indicators(state):
modelname = safe_get(indsettings["cp"], 'name', None)
modelversion = safe_get(indsettings["cp"], 'version', "1")
if modelname is not None:
state.vars.loaded_models[modelname] = ml.load_model(modelname, modelversion, None, MODEL_DIR)
state.vars.loaded_models[modelname] = load_ml_model(modelname, modelversion, MODEL_DIR)
# state.vars.loaded_models[modelname] = ml.load_model(modelname, modelversion, None, MODEL_DIR)
if state.vars.loaded_models[modelname] is not None:
printanyway(f"model {modelname} loaded")
else:
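The diff above makes the mlroom dependency optional: the module-level import is commented out and load_ml_model imports it lazily, only when an ML indicator actually requests a model. One common way an availability flag like the _ml_module_loaded imported above can be derived, purely as a sketch that assumes nothing about the real implementation:

import importlib.util

# True when the optional ML package is installed; actually importing it stays deferred
_ml_module_loaded = importlib.util.find_spec("mlroom") is not None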

View File

@@ -78,6 +78,7 @@ def execute_prescribed_trades(state: StrategyState, data):
size = state.vars.chunk
res = state.sell(size=size)
if isinstance(res, int) and res < 0:
print(f"error in required operation SHORT {res}")
raise Exception(f"error in required operation SHORT {res}")
#the default goalprice is set later, in the notification

View File

@@ -78,7 +78,7 @@ def execute_signal_generator(state, data, name):
last_update=datetime.fromtimestamp(state.time).astimezone(zoneNY),
status=TradeStatus.READY,
generated_by=name,
size=multiplier*state.vars.chunk,
size=int(multiplier*state.vars.chunk),
size_multiplier = multiplier,
direction=TradeDirection.LONG,
entry_price=None,
@@ -90,7 +90,7 @@ def execute_signal_generator(state, data, name):
last_update=datetime.fromtimestamp(state.time).astimezone(zoneNY),
status=TradeStatus.READY,
generated_by=name,
size=multiplier*state.vars.chunk,
size=int(multiplier*state.vars.chunk),
size_multiplier = multiplier,
direction=TradeDirection.SHORT,
entry_price=None,
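The int(...) casts added above matter because the multiplier can now be a non-integral float (from cubic interpolation or martingale sizing), while order sizes must be whole shares. For example:

multiplier, chunk = 1.5, 7
print(multiplier * chunk)       # 10.5 - a float, not a valid share quantity
print(int(multiplier * chunk))  # 10   - truncated toward zero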

View File

@@ -9,6 +9,7 @@ from traceback import format_exc
from v2realbot.strategyblocks.newtrade.conditions import go_conditions_met, common_go_preconditions_check
from v2realbot.strategyblocks.indicators.helpers import get_source_series
import numpy as np
from scipy.interpolate import interp1d
def get_size(state: StrategyState, data, signaloptions: dict, direction: TradeDirection):
return state.vars.chunk * get_multiplier(state, data, signaloptions, direction)
@@ -134,10 +135,34 @@ def get_multiplier(state: StrategyState, data, signaloptions: dict, direction: T
return multiplier
state.ilog(lvl=1,e=f"SIZER - Input value of {pattern_source} value {input_value}", options=options, time=state.time)
multiplier = np.interp(input_value, pattern_source_axis, pattern_size_axis)
#original simple interpolation
#multiplier = np.interp(input_value, pattern_source_axis, pattern_size_axis)
# Updated interpolation function for smoother results
# Create the interpolation function
f = interp1d(pattern_source_axis, pattern_size_axis, kind='cubic')
# Interpolate the input value using the interpolation function
multiplier = f(input_value)
state.ilog(lvl=1,e=f"SIZER - Interpolated value {multiplier}", input_value=input_value, pattern_source_axis=pattern_source_axis, pattern_size_axis=pattern_size_axis, options=options, time=state.time)
if multiplier > 1 or multiplier <= 0:
martingale_enabled = utls.safe_get(options, "martingale_enabled", False)
#the number of consecutive losing trades drives the multiplier (0 -> 1x, 1 loss -> 2x, 2 in a row -> 4x, etc.)
if martingale_enabled:
#martingale base - the exponent base, typically 2
base = float(utls.safe_get(options, "martingale_base", 2))
#current number of consecutive losses
cont_loss_series_cnt = state.vars["transferables"]["martingale"]["cont_loss_series_cnt"]
if cont_loss_series_cnt == 0:
multiplier = 1
else:
multiplier = base ** cont_loss_series_cnt
state.ilog(lvl=1,e=f"SIZER - MARTINGALE {multiplier}", options=options, time=state.time, cont_loss_series_cnt=cont_loss_series_cnt)
if (martingale_enabled is False and multiplier > 1) or multiplier <= 0:
state.ilog(lvl=1,e=f"SIZER - Mame nekde problem MULTIPLIER mimo RANGE ERROR {multiplier}", options=options, time=state.time)
multiplier = 1
return multiplier
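A quick worked check of the martingale sizing above (base 2 as in the default, with the zero-loss special case):

base = 2.0
for cont_loss_series_cnt in range(4):
    multiplier = 1 if cont_loss_series_cnt == 0 else base ** cont_loss_series_cnt
    print(cont_loss_series_cnt, multiplier)  # 0 -> 1, 1 -> 2.0, 2 -> 4.0, 3 -> 8.0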

View File


@@ -0,0 +1,585 @@
import argparse
#import v2realbot.reporting.metricstoolsimage as mt
import matplotlib
import matplotlib.dates as mdates
matplotlib.use('Agg') # Set the Matplotlib backend to 'Agg'
import matplotlib.pyplot as plt
import seaborn as sns
import pandas as pd
from datetime import datetime
from typing import List
from enum import Enum
import numpy as np
import v2realbot.controller.services as cs
#from rich import print
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, OrderSide
from io import BytesIO
from v2realbot.utils.historicals import get_historical_bars
from alpaca.data.timeframe import TimeFrame, TimeFrameUnit
import mlroom.utils.ext_services as es
from v2realbot.common.db import pool, execute_with_retry
from v2realbot.utils.utils import ltp, isrising, isfalling,trunc,AttributeDict
import tqdm
# start_date = datetime(2020, 1, 1, 0, 0, 0, 0, zoneNY)
# end_date = datetime(2024, 1, 2, 0, 0, 0, 0, zoneNY)
# bars= get_historical_bars("BLK", start_date, end_date, TimeFrame.Hour)
# print("bars for given period",bars)
#upload image to the remote server scp 4bea3a54.png david@142.132.188.109:/home/david/media/basic/
#copied and adapted function that can select the target server
def generate_trading_report_image(runner_ids: list = None, batch_id: str = None, stream: bool = False, server: str = None):
#TODO add drawdown and min/max profits that are not cumulative, think it through
#TODO list of runner_ids
#TODO add a separate REST API for creating a runner and a batch + for removing an archived runner
if runner_ids is None and batch_id is None:
return -2, f"runner_id or batch_id must be present"
if batch_id is not None:
res, runner_ids = es.get_archived_runners_list_by_batch_id(batch_id, server)
if res != 0:
print(f"no batch {batch_id} found")
return -1, f"no batch {batch_id} found"
trades = []
symbol = None
mode = None
sada_list = []
for id in tqdm.tqdm(runner_ids):
#get runner
res, sada =es.get_archived_runner_header_by_id(id, server)
if res != 0:
print(f"no runner {id} found")
return -1, f"no runner {id} found"
#sada = AttributeDict(**sada)
print("archrunner")
print(sada)
sada["started"]=datetime.fromisoformat(sada['started']) if sada['started'] else None
sada["stopped"]=datetime.fromisoformat(sada['stopped']) if sada['stopped'] else None
sada["bt_from"]=datetime.fromisoformat(sada['bt_from']) if sada['bt_from'] else None
sada["bt_to"]=datetime.fromisoformat(sada['bt_to']) if sada['bt_to'] else None
sada_list.append(sada)
symbol = sada["symbol"]
mode = sada["mode"]
# Parse trades
trades_dicts = sada["metrics"]["prescr_trades"]
for trade_dict in trades_dicts:
trade_dict['last_update'] = datetime.fromtimestamp(trade_dict.get('last_update')).astimezone(zoneNY) if trade_dict['last_update'] is not None else None
trade_dict['entry_time'] = datetime.fromtimestamp(trade_dict.get('entry_time')).astimezone(zoneNY) if trade_dict['entry_time'] is not None else None
trade_dict['exit_time'] = datetime.fromtimestamp(trade_dict.get('exit_time')).astimezone(zoneNY) if trade_dict['exit_time'] is not None else None
trades.append(Trade(**trade_dict))
#print(trades)
#get from to dates
#calculate start and end from the min and max days - since the input can be a list of runner_ids and not just a batch, or test sets
if mode in [Mode.BT,Mode.PREP]:
start_date = min(runner["bt_from"] for runner in sada_list)
end_date = max(runner["bt_to"] for runner in sada_list)
else:
start_date = min(runner["started"] for runner in sada_list)
end_date = max(runner["stopped"] for runner in sada_list)
#hour bars for backtested period
print(start_date,end_date)
bars= get_historical_bars(symbol, start_date, end_date, TimeFrame.Hour)
print("bars for given period",bars)
"""Bars a dictionary with the following keys:
* high: A list of high prices
* low: A list of low prices
* volume: A list of volumes
* close: A list of close prices
* hlcc4: A list of HLCC4 indicators
* open: A list of open prices
* time: A list of times in UTC (ISO 8601 format)
* trades: A list of number of trades
* resolution: A list of resolutions (all set to 'D')
* confirmed: A list of booleans (all set to True)
* vwap: A list of VWAP indicator
* updated: A list of booleans (all set to True)
* index: A list of integers (from 0 to the length of the list of daily bars)
"""
# Filter to only use trades with status 'CLOSED'
closed_trades = [trade for trade in trades if trade.status == TradeStatus.CLOSED]
if len(closed_trades) == 0:
return -1, "image generation no closed trades"
# Data extraction for the plots
exit_times = [trade.exit_time for trade in closed_trades if trade.exit_time is not None]
##cumulative_profits = [trade.profit_sum for trade in closed_trades if trade.profit_sum is not None]
profits = [trade.profit for trade in closed_trades if trade.profit is not None]
cumulative_profits = np.cumsum(profits)
wins = [trade.profit for trade in closed_trades if trade.profit > 0]
losses = [trade.profit for trade in closed_trades if trade.profit < 0]
wins_long = [trade.profit for trade in closed_trades if trade.profit > 0 and trade.direction == TradeDirection.LONG]
losses_long = [trade.profit for trade in closed_trades if trade.profit < 0 and trade.direction == TradeDirection.LONG]
wins_short = [trade.profit for trade in closed_trades if trade.profit > 0 and trade.direction == TradeDirection.SHORT]
losses_short = [trade.profit for trade in closed_trades if trade.profit < 0 and trade.direction == TradeDirection.SHORT]
directions = [trade.direction for trade in closed_trades]
long_profits = [trade.profit for trade in closed_trades if trade.direction == TradeDirection.LONG and trade.profit is not None]
short_profits = [trade.profit for trade in closed_trades if trade.direction == TradeDirection.SHORT and trade.profit is not None]
# Setting up dark mode for the plots
plt.style.use('dark_background')
# Optionally, you can further customize colors, labels, and axes
params = {
'axes.titlesize': 9,
'axes.labelsize': 8,
'xtick.labelsize': 9,
'ytick.labelsize': 9,
'axes.labelcolor': '#a9a9a9', #a1a3aa',
'axes.facecolor': '#121722', #'#0e0e0e', #202020', # Dark background for plot area
'axes.grid': False, # Turn off the grid globally
'grid.color': 'gray', # If the grid is on, set grid line color
'grid.linestyle': '--', # Grid line style
'grid.linewidth': 1,
'xtick.color': '#a9a9a9',
'ytick.color': '#a9a9a9',
'axes.edgecolor': '#a9a9a9'
}
plt.rcParams.update(params)
#Custom dark theme similar to the provided image
# dark_finance_theme = {
# 'background': '#1a1a1a', # Very dark (almost black) background
# 'text': '#eaeaea', # Light grey text for readability
# 'grid': '#333333', # Dark grey grid lines
# 'accent': '#2e91e5', # Bright blue accent for main elements
# 'secondary': '#e15f99', # Secondary pink/magenta color for highlights
# 'highlight': '#fcba03', # Gold-like color for special highlights
# }
# # Apply the theme settings
# plt.style.use('dark_background')
# plt.rcParams.update({
# 'figure.facecolor': dark_finance_theme['background'],
# 'axes.facecolor': dark_finance_theme['background'],
# 'axes.edgecolor': dark_finance_theme['text'],
# 'axes.labelcolor': dark_finance_theme['text'],
# 'axes.titlesize': 12,
# 'axes.labelsize': 10,
# 'xtick.color': dark_finance_theme['text'],
# 'xtick.labelsize': 8,
# 'ytick.color': dark_finance_theme['text'],
# 'ytick.labelsize': 8,
# 'grid.color': dark_finance_theme['grid'],
# 'grid.linestyle': '-',
# 'grid.linewidth': 0.6,
# 'legend.facecolor': dark_finance_theme['background'],
# 'legend.edgecolor': dark_finance_theme['background'],
# 'legend.fontsize': 10,
# 'text.color': dark_finance_theme['text'],
# 'lines.color': dark_finance_theme['accent'],
# 'patch.edgecolor': dark_finance_theme['accent'],
# })
if len(closed_trades) > 100:
fig, axs = plt.subplots(3, 4, figsize=(15, 10))
else:
# Create a combined figure for all plots; 11x7 is ideal for the 3x4 grid
fig, axs = plt.subplots(3, 4, figsize=(12, 7))
#TITLE
title = ""
cnt_ids = len(runner_ids)
if batch_id is not None:
title = "Batch: "+str(batch_id)+ " "
title += "Days: " + str(cnt_ids)
if cnt_ids == 1:
title += " ("+str(runner_ids[0])[0:14]+") "
if sada["mode"] == Mode.BT:
datum = sada["bt_from"]
else:
datum = sada["started"]
title += datum.strftime("%d.%m.%Y %H:%M")
# Add a title to the figure
fig.suptitle(title, fontsize=15, color='white')
# Plot 1: Overall Profit Summary Chart
total_wins = int(sum(wins))
total_losses = int(sum(losses))
net_profit = int(sum(profits))
sns.barplot(x=['Total', 'Wins','Losses'],
y=[net_profit, total_wins, total_losses],
ax=axs[0, 0])
axs[0, 0].set_title('Overall Profit Summary')
# Define the offset for placing text inside the bars
offset = max(total_wins, abs(total_losses), net_profit) * 0.05 # 5% of the highest (or lowest) bar value
# Function to place text annotation
def place_annotation(ax, x, value, offset):
va = 'top' if value >= 0 else 'bottom'
y = value - offset if value >= 0 else value + offset
ax.text(x, y, f'{value}', ha='center', va=va, color='black', fontsize=12)
# Annotate the Total Wins, Losses, and Net Profit bars
place_annotation(axs[0, 0], 0, net_profit, offset)
place_annotation(axs[0, 0], 1, total_wins, offset)
place_annotation(axs[0, 0], 2, total_losses, offset)
# Plot 2: LONG - profit summary
total_wins_long = int(sum(wins_long))
total_losses_long = int(sum(losses_long))
total_long = total_wins_long + total_losses_long
sns.barplot(x=['Total', 'Wins','Losses'],
y=[total_long, total_wins_long, total_losses_long],
ax=axs[0, 1])
axs[0, 1].set_title('LONG Profit Summary')
# Define the offset for placing text inside the bars
offset = max(total_wins_long, abs(total_losses_long)) * 0.05 # 5% of the highest (or lowest) bar value
place_annotation(axs[0, 1], 0, total_long, offset)
place_annotation(axs[0, 1], 1, total_wins_long, offset)
place_annotation(axs[0, 1], 2, total_losses_long, offset)
# Plot 3: SHORT - profit summary
total_wins_short =int(sum(wins_short))
total_losses_short = int(sum(losses_short))
total_short = total_wins_short + total_losses_short
sns.barplot(x=['Total', 'Wins', 'Losses'],
y=[total_short, total_wins_short,
total_losses_short],
ax=axs[0, 2])
axs[0, 2].set_title('SHORT Profit Summary')
# Define the offset for placing text inside the bars
offset = max(total_wins_short, abs(total_losses_short)) * 0.05 # 5% of the highest (or lowest) bar value
place_annotation(axs[0, 2], 0, total_short, offset)
place_annotation(axs[0, 2], 1, total_wins_short, offset)
place_annotation(axs[0, 2], 2, total_losses_short, offset)
# Plot 4: Trade Counts Bar Chart
long_count = len([trade for trade in closed_trades if trade.direction == TradeDirection.LONG])
short_count = len([trade for trade in closed_trades if trade.direction == TradeDirection.SHORT])
sns.barplot(x=['Long Trades', 'Short Trades'], y=[long_count, short_count], ax=axs[0, 3])
axs[0, 3].set_title('Trade Counts')
offset = max(long_count, short_count) * 0.05 # 5% of the highest (or lowest) bar value
place_annotation(axs[0, 3], 0, long_count, offset)
place_annotation(axs[0, 3], 1, short_count, offset)
#PLOT 5 - Heatmap (exit time)
# Creating a DataFrame for the heatmap
heatmap_data_list = []
for trade in trades:
if trade.status == TradeStatus.CLOSED:
day = trade.exit_time.strftime('%m-%d') # Format date as 'MM-DD'
#day = trade.exit_time.date()
hour = trade.exit_time.hour
profit = trade.profit
heatmap_data_list.append({'Day': day, 'Hour': hour, 'Profit': profit})
try:
heatmap_data = pd.DataFrame(heatmap_data_list)
heatmap_data = heatmap_data.groupby(['Day', 'Hour']).sum().reset_index()
heatmap_pivot = heatmap_data.pivot(index='Day', columns='Hour', values='Profit')
# Heatmap of Profits
sns.heatmap(heatmap_pivot, cmap='viridis', ax=axs[1, 0])
axs[1, 0].set_title('Heatmap of Profits (based on Exit time)')
axs[1, 0].set_xlabel('Hour of Day')
axs[1, 0].set_ylabel('Day')
except KeyError:
# Handle the case where there is no data
axs[1, 0].text(0.5, 0.5, 'No data available',
horizontalalignment='center',
verticalalignment='center',
transform=axs[1, 0].transAxes)
axs[1, 0].set_title('Heatmap of Profits (based on Exit time)')
# Plot 6: Profit/Loss Distribution Histogram
sns.histplot(profits, bins=30, ax=axs[1, 1], kde=True, color='skyblue')
axs[1, 1].set_title('Profit/Loss Distribution')
axs[1, 1].set_xlabel('Profit/Loss')
axs[1, 1].set_ylabel('Frequency')
# Plot 7
# - for 1 den: Position Size Distribution
# - for more days: Trade Duration vs. Profit/Loss
if len(runner_ids) == 1:
sizes = [trade.size for trade in closed_trades if trade.size is not None]
if sizes:
size_counts = {size: sizes.count(size) for size in set(sizes)}
sns.barplot(x=list(size_counts.keys()), y=list(size_counts.values()), ax=axs[1, 2])
axs[1, 2].set_title('Position Size Distribution')
else:
# Handle the case where there is no data
axs[1, 2].text(0.5, 0.5, 'No data available',
horizontalalignment='center',
verticalalignment='center',
transform=axs[1, 2].transAxes)
axs[1, 2].set_title('Position Size Distribution')
else:
trade_durations = []
trade_profits = []
#trade_volumes = [] # Assuming you have a way to measure the size/volume of each trade
trade_types = [] # 'Long' or 'Short'
for trade in trades:
if trade.status == TradeStatus.CLOSED:
duration = (trade.exit_time - trade.entry_time).total_seconds() / 60 # Duration in minutes (3600 for hours)
trade_durations.append(duration)
trade_profits.append(trade.profit)
##trade_volumes.append(trade.size) # or any other measure of trade size
trade_types.append('Long' if trade.direction == TradeDirection.LONG else 'Short')
# Trade Duration vs. Profit/Loss
scatter_data = pd.DataFrame({
'Duration': trade_durations,
'Profit': trade_profits,
#'Volume': trade_volumes,
'Type': trade_types
})
#sns.scatterplot(data=scatter_data, x='Duration', y='Profit', size='Volume', hue='Type', ax=axs[1, 2])
sns.scatterplot(data=scatter_data, x='Duration', y='Profit', hue='Type', ax=axs[1, 2])
axs[1, 2].set_title('Trade Duration vs. Profit/Loss')
axs[1, 2].set_xlabel('Duration (Minutes)')
axs[1, 2].set_ylabel('Profit/Loss')
#Plot 8: Cumulative profit - either one day or multiple days, with the price development added below it
# Extract the closing prices and times
closing_prices = bars.get('close',[]) if bars is not None else []
#times = bars['time'] # Assuming this is a list of pandas Timestamp objects
times = pd.to_datetime(bars['time']) if bars is not None else [] # Ensure this is a Pandas datetime series
# # Plot the closing prices over time
# axs[0, 4].plot(times, closing_prices, color='blue')
# axs[0, 4].tick_params(axis='x', rotation=45) # Rotate date labels if necessary
# axs[0, 4].xaxis.set_major_formatter(mdates.DateFormatter('%H', tz=zoneNY))
if len(runner_ids)== 1:
if cumulative_profits.size > 0:
# Plot 3: Cumulative Profit Over Time with Max Profit Point
max_profit_time = exit_times[np.argmax(cumulative_profits)]
max_profit = max(cumulative_profits)
min_profit_time = exit_times[np.argmin(cumulative_profits)]
min_profit = min(cumulative_profits)
#Plot Cumulative Profit Over Time with Max Profit Point on the primary y-axis
# Create a secondary y-axis for the closing prices
ax2 = axs[1, 3].twinx()
ax2.plot(times, closing_prices, label='Closing Price', color='orange')
ax2.set_ylabel('Closing Price', color='orange')
ax2.tick_params(axis='y', labelcolor='orange')
# Set the limits for the x-axis to cover the full range of 'times'
if isinstance(times, pd.DatetimeIndex):
axs[1, 3].set_xlim(times.min(), times.max())
sns.lineplot(x=exit_times, y=cumulative_profits, ax=axs[1, 3], color='limegreen')
axs[1, 3].scatter(max_profit_time, max_profit, color='green', label='Max Profit')
axs[1, 3].scatter(min_profit_time, min_profit, color='red', label='Min Profit')
axs[1, 3].set_xlabel('Time')
axs[1, 3].set_ylabel('Cumulative Profit', color='limegreen')
axs[1, 3].tick_params(axis='y', labelcolor='limegreen')
axs[1, 3].xaxis.set_major_formatter(mdates.DateFormatter('%H', tz=zoneNY))
# Add legends to the plot
# lines, labels = axs[1, 3].get_legend_handles_labels()
# lines2, labels2 = ax2.get_legend_handles_labels()
# axs[1, 3].legend(lines + lines2, labels + labels2, loc='upper left')
else:
# Handle the case where cumulative_profits is empty
axs[1, 3].text(0.5, 0.5, 'No profit data available',
horizontalalignment='center',
verticalalignment='center',
transform=axs[1, 3].transAxes)
axs[1, 3].set_title('Cumulative Profit Over Time')
else:
# Calculate cumulative profit
# Additional Plot: Cumulative Profit Over Time
# Sort trades by exit time
# # Set the limits for the x-axis to cover the full range of 'times'
# axs[1, 3].set_xlim(times.min(), times.max())
sorted_trades = sorted([trade for trade in trades if trade.status == TradeStatus.CLOSED],
key=lambda x: x.exit_time)
cumulative_profits_sorted = np.cumsum([trade.profit for trade in sorted_trades])
exit_times_sorted = [trade.exit_time for trade in sorted_trades if trade.exit_time is not None]
# Create a secondary y-axis for the closing prices
ax2 = axs[1, 3].twinx()
ax2.plot(times, closing_prices, label='Closing Price', color='orange')
ax2.set_ylabel('Closing Price', color='orange')
ax2.tick_params(axis='y', labelcolor='orange')
axs[1, 3].set_xlim(times.min(), times.max())
# Plot Cumulative Profit Over Time on the primary y-axis
axs[1, 3].plot(exit_times_sorted, cumulative_profits_sorted, label='Cumulative Profit', color='blue')
axs[1, 3].set_xlabel('Time')
axs[1, 3].set_ylabel('Cumulative Profit', color='blue')
axs[1, 3].tick_params(axis='y', labelcolor='blue')
# Format dates on the x-axis
axs[1, 3].xaxis.set_major_formatter(mdates.DateFormatter('%d.%m.', tz=zoneNY))
axs[1, 3].tick_params(axis='x', rotation=45) # Rotate date labels if necessary
# Set the title
axs[1, 3].set_title('Cumulative Profit and Closing Price Over Time')
# Add legends to the plot
# axs[1, 3].legend(loc='upper left')
# ax2.legend(loc='upper right')
# Plot 9
# - for 1 day: Daily Relative Profit Chart
# - for more days: Heatmap of Profits (based on Entry time)
if len(runner_ids) == 1:
daily_rel_profits = [trade.rel_profit for trade in closed_trades if trade.rel_profit is not None]
sns.lineplot(x=range(len(daily_rel_profits)), y=daily_rel_profits, ax=axs[2, 0])
axs[2, 0].set_title('Daily Relative Profit')
else:
# Creating a DataFrame for the heatmap
heatmap_data_list = []
for trade in trades:
if trade.status == TradeStatus.CLOSED:
day = trade.entry_time.strftime('%m-%d') # Format date as 'MM-DD'
#day = trade.entry_time.date()
hour = trade.entry_time.hour
profit = trade.profit
heatmap_data_list.append({'Day': day, 'Hour': hour, 'Profit': profit})
heatmap_data = pd.DataFrame(heatmap_data_list)
heatmap_data = heatmap_data.groupby(['Day', 'Hour']).sum().reset_index()
heatmap_pivot = heatmap_data.pivot(index='Day', columns='Hour', values='Profit')
# Heatmap of Profits
sns.heatmap(heatmap_pivot, cmap='viridis', ax=axs[2, 0])
axs[2, 0].set_title('Heatmap of Profits (based on Entry time)')
axs[2, 0].set_xlabel('Hour of Day')
axs[2, 0].set_ylabel('Day')
# Plot 10: Profits Based on Hour of the Day (Entry)
entry_hours = [trade.entry_time.hour for trade in closed_trades if trade.entry_time is not None]
profits_by_hour = {}
for hour, trade in zip(entry_hours, closed_trades):
if hour not in profits_by_hour:
profits_by_hour[hour] = 0
profits_by_hour[hour] += trade.profit
# Sorting by hour for plotting
sorted_hours = sorted(profits_by_hour.keys())
sorted_profits = [profits_by_hour[hour] for hour in sorted_hours]
if sorted_profits:
sns.barplot(x=sorted_hours, y=sorted_profits, ax=axs[2, 1])
axs[2, 1].set_title('Profits by Hour of Day (Entry)')
axs[2, 1].set_xlabel('Hour of Day')
axs[2, 1].set_ylabel('Profit')
else:
# Handle the case where sorted_profits is empty
axs[2, 1].text(0.5, 0.5, 'No data available',
horizontalalignment='center',
verticalalignment='center',
transform=axs[2, 1].transAxes)
axs[2, 1].set_title('Profits by Hour of Day (Entry)')
# Plot 11: Profits Based on Hour of the Day - based on Exit
exit_hours = [trade.exit_time.hour for trade in closed_trades if trade.exit_time is not None]
profits_by_hour = {}
for hour, trade in zip(exit_hours, closed_trades):
if hour not in profits_by_hour:
profits_by_hour[hour] = 0
profits_by_hour[hour] += trade.profit
# Sorting by hour for plotting
sorted_hours = sorted(profits_by_hour.keys())
sorted_profits = [profits_by_hour[hour] for hour in sorted_hours]
if sorted_profits:
sns.barplot(x=sorted_hours, y=sorted_profits, ax=axs[2, 2])
axs[2, 2].set_title('Profits by Hour of Day (Exit)')
axs[2, 2].set_xlabel('Hour of Day')
axs[2, 2].set_ylabel('Profit')
else:
# Handle the case where sorted_profits is empty
axs[2, 2].text(0.5, 0.5, 'No data available',
horizontalalignment='center',
verticalalignment='center',
transform=axs[2, 2].transAxes)
axs[2, 2].set_title('Profits by Hour of Day (Exit)')
# Plot 12: Calculate profits by day of the week
day_of_week_profits = {i: 0 for i in range(7)} # Dictionary to store profits for each day of the week
for trade in trades:
if trade.status == TradeStatus.CLOSED:
day_of_week = trade.exit_time.weekday() # Monday is 0 and Sunday is 6
day_of_week_profits[day_of_week] += trade.profit
days = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri']
# Additional Plot: Strategy Performance by Day of the Week
axs[2, 3].bar(days, [day_of_week_profits[i] for i in range(5)])
axs[2, 3].set_title('Profit by Day of the Week')
axs[2, 3].set_xlabel('Day of the Week')
axs[2, 3].set_ylabel('Cumulative Profit')
#filename
file = batch_id if batch_id is not None else runner_ids[0]
image_file_name = f"{file}.png"
image_path = str(MEDIA_DIRECTORY / "basic" / image_file_name)
# Adjust layout and save the combined plot as an image
plt.tight_layout()
if stream is False:
plt.savefig(image_path)
plt.close()
return 0, None
else:
# Return the image as a BytesIO stream
img_stream = BytesIO()
plt.savefig(img_stream, format='png')
plt.close()
img_stream.seek(0) # Rewind the stream to the beginning
return 0, img_stream
##Generates BATCH REPORT again for the given batch_id
##USAGE: python createbatchimage.py <batch_id>
#Parse the command-line arguments
#parser = argparse.ArgumentParser(description="Generate trading report image with batch ID")
# parser.add_argument("server", type=str, help="The server IP for the report")
# parser.add_argument("batch_id", type=str, help="The batch ID for the report")
# args = parser.parse_args()
# batch_id = args.batch_id
# server = args.server
# Generate the report image, using local copy, which will replace the current e0639b45 4bea3a54
res, val = generate_trading_report_image(batch_id="4bea3a54", server="142.132.188.109")
# Print the result
if res == 0:
print("BATCH REPORT CREATED")
else:
print(f"BATCH REPORT ERROR - {val}")

View File

@@ -0,0 +1,320 @@
import matplotlib
import matplotlib.dates as mdates
#matplotlib.use('Agg') # Set the Matplotlib backend to 'Agg'
import matplotlib.pyplot as plt
import seaborn as sns
import pandas as pd
from datetime import datetime
from typing import List
from enum import Enum
import numpy as np
import v2realbot.controller.services as cs
from rich import print as richprint
from v2realbot.common.model import AnalyzerInputs
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from v2realbot.utils.utils import isrising, isfalling,zoneNY, price2dec, safe_get#, print
from pathlib import Path
from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, OrderSide
from io import BytesIO
from v2realbot.utils.historicals import get_historical_bars
from alpaca.data.timeframe import TimeFrame, TimeFrameUnit
from collections import defaultdict
from scipy.stats import zscore
from io import BytesIO
from typing import Tuple, Optional, List
from v2realbot.common.PrescribedTradeModel import TradeDirection, TradeStatus, Trade, TradeStoplossType
from collections import Counter
import vectorbtpro as vbt
# Function that adds 'resolution' seconds to the previous datetime (if it exists and is on the same day)
def adjust_datetime_iteratively(df, resolution):
adjusted_times = []
for i, current_time in enumerate(df.index):
if i == 0:
# The first entry is unchanged
adjusted_times.append(current_time)
continue
previous_time = adjusted_times[-1]
# Check if it's the same day
if previous_time.date() == current_time.date():
# Add resolution to the previous datetime
adjusted_time = previous_time + pd.Timedelta(seconds=resolution)
else:
# Different day, leave it as is
adjusted_time = current_time
adjusted_times.append(adjusted_time)
# Update DataFrame index
df.index = pd.DatetimeIndex(adjusted_times)
return df
def convert_to_dataframe(ohlcv):
"""
Convert a dictionary containing OHLCV data into a pandas DataFrame.
Parameters:
ohlcv (dict): Dictionary containing OHLCV data.
It should have keys 'time', 'open', 'high', 'low', 'close', 'volume', 'updated'.
'time' should be a list of float timestamps.
'updated' should be a list of Python datetimes in UTC time zone.
Returns:
pd.DataFrame: DataFrame containing the OHLCV data with the index converted to East coast US time.
"""
#if an 'index' key exists, rename it to 'custom_index' so it doesn't cause trouble in pandas
try:
if ohlcv.get('index', False):
ohlcv['custom_index'] = ohlcv.pop('index')
except Exception as e:
pass
#keys that should not go uppercase letter first
keys_not_to_upper = ["time", "updated"]
# Update keys not in the exclusion list
for key in list(ohlcv.keys()): # Iterate over a copy of the keys
if key not in keys_not_to_upper:
ohlcv[key.title()] = ohlcv.pop(key)
# Create DataFrame from the dictionary
df = pd.DataFrame(ohlcv)
# Convert 'time' to datetime and set as index
df['time'] = pd.to_datetime(df['time'], unit='s', utc=True)
df.set_index('time', inplace=True)
# Convert index to East coast US time zone
df.index = df.index.tz_convert('US/Eastern')
if 'updated' in df.columns:
df['updated'] = pd.to_datetime(df['updated'], unit='s', utc=True)
df['updated'] = df['updated'].dt.tz_convert('US/Eastern')
return df
def print(v, *args, **kwargs):
if v:
richprint(*args, **kwargs)
def load_batch(runner_ids: List = None, batch_id: str = None, space_resolution_evenly = False, main_session_only = True, merge_ind2bars = True, bars_columns = ['Open', 'High', 'Low', 'Close', 'Volume', 'Vwap'], indicators_columns = [], verbose = False) -> Tuple[int, dict]:
"""Load batches (all runners from single batch) into pandas dataframes
Args:
runner_ids (List, optional): A list of runner identifiers (e.g., stock tickers). Defaults to None.
batch_id (str, optional): The ID of a specific batch to retrieve. Defaults to None.
merge_ind2bars (bool, optional): merge indicator into bars dataframe. Defaults to True.
bars_columns (list, optional): List of columns to keep in bars df. Defaults to ['Open', 'High', 'Low', 'Close', 'Volume', 'Vwap'].
indicators_columns (list, optional): List of columns to keep in indicators df. Defaults to an empty list.
space_resolution_evenly: If True, alters the index so it is spaced evenly at the resolution given in ['resolution']
Returns:
Tuple[int, dict]: A tuple containing:
* An integer status code (0 on success, negative on error).
* A dictionary with keys bars, indicators and cbar_indicators, each holding a pandas DataFrame
"""
if runner_ids is None and batch_id is None:
return -2, f"runner_id or batch_id must be present", 0
if batch_id is not None:
res, runner_ids =cs.get_archived_runnerslist_byBatchID(batch_id)
if res != 0:
print(f"no batch {batch_id} found")
return -1, f"no batch {batch_id} found", 0
#DATA PREPARATION
bars = None
indicators = None
cnt = 0
dfs = dict(bars=[], indicators=[],cbar_indicators=[])
resolution = None
for id in runner_ids:
cnt += 1
#get runner detail
res, sada =cs.get_archived_runner_details_byID(id)
if res != 0:
print(f"no runner {id} found")
return -1, f"no runner {id} found", 0
if resolution is None:
resolution = sada["bars"]["resolution"][0]
print(verbose, f"Resolution : {resolution}")
#add daily bars limited to the required columns; we keep 'updated' since it is the mapping column to indicators
bars = convert_to_dataframe(sada["bars"])[bars_columns + ["updated"]]
#bars = bars.loc[:, bars_columns]
indicators = convert_to_dataframe(sada["indicators"][0])[indicators_columns]
#join indicators to bars dataframe
if merge_ind2bars:
#merge: 'time' in indicators corresponds to 'updated' in bars
bars = bars.reset_index()
bars = pd.merge(bars, indicators, left_on="updated", right_on="time", how="left")
bars = bars.set_index("time")
else:
dfs["indicators"].append(indicators)
#drop updated as mapping column
#bars = bars.drop("updated", axis=1)
dfs["bars"].append(bars)
#indicators = sada["indicators"][0]
#cbar_indicators = sada["indicators"][1]
#merge all days into single df
for key in dfs:
if len(dfs[key])>0:
concat_df = pd.concat(dfs[key], axis=0)
concat_df = concat_df.between_time('9:30', '16:00') if main_session_only else concat_df
# Count the number of duplicates (excluding the first occurrence)
num_duplicates = concat_df.index.duplicated().sum()
if num_duplicates > 0:
print(verbose, f"NOTE: DUPLICATES {num_duplicates}/{len(concat_df)} in {key}. REMOVING.")
concat_df = concat_df[~concat_df.index.duplicated()]
num_duplicates = concat_df.index.duplicated().sum()
print(verbose, f"Now there are {num_duplicates}/{len(concat_df)}")
if space_resolution_evenly and key != "cbar_indicators":
# Re-space the datetime index evenly according to resolution (in seconds)
concat_df = adjust_datetime_iteratively(concat_df, resolution)
dfs[key] = concat_df
return 0, dfs
if __name__ == "__main__":
res, df = load_batch(batch_id="e44a5075", space_resolution_evenly=True, indicators_columns=["Rsi14"], main_session_only=False)
if res < 0:
print("Error" + str(res) + str(df))
print(df)
df = df["bars"]
print(df.info(), df.head())
#filter columns
#columns_to_keep = ['Open', 'High', 'Low', 'Close', 'Volume', 'Vwap']
#df = df.loc[:, columns_to_keep]
#df = df.rename(columns={'index': 'custom_index'})
print(df.info(), df.head(), df.describe())
#filter times
#df = df.between_time('9:30', '16:00')
print(df.info())
# Set the frequency to 23 seconds
#df.index.freq = pd.tseries.offsets.Second(23)
# Check the frequency of the index
# Resample and aggregate the data
# resampled_df = df.resample('23S').agg({
# 'open': 'first',
# 'high': 'max',
# 'low': 'min',
# 'close': 'last',
# 'volume': 'sum'
# })
#df.index.freq = pd.infer_freq(df.index)
#print(df.index.freq)
# Set the frequency of the index explicitly - if a standard one exists (like 1T etc.); if it doesn't, custom_frequency will be used
#df.index.freq = pd.date_range(start=df.index[0], periods=len(df), freq='23S')
print(df.info())
vbt.settings.set_theme("dark")
vbt.settings['plotting']['layout']['width'] = 1280
vbt.settings.plotting.auto_rangebreaks = True
#load into vbt, with the symbol as a column
bar_data = vbt.Data.from_data({"BAC": df}, tz_convert="US/Eastern")
print(bar_data)
print(bar_data.close)
print(bar_data.data["BAC"]["Rsi14"])
bar_data.data["BAC"]["Rsi14"].vbt.plot().show()
print(bar_data["Rsi14"])
#ohlcv plot (sublot 2x1)
bar_data.data["BAC"].vbt.ohlcv.plot().show()
#create two subplots 3x1 (ohlcv + RSI)
# fig = vbt.make_subplots(rows=3, cols=1)
# bar_data.data["BAC"].vbt.ohlcv.plot(add_trace_kwargs=dict(row=1, col=1),fig=fig)
# bar_data.data["BAC"]["Rsi14"].vbt.plot(add_trace_kwargs=dict(row=3, col=1),fig=fig)
# fig.show()
#create subplots with alternate Y axis - RSI overlay
fig1 = vbt.make_subplots(specs=[[{"secondary_y": True}]])
bar_data.data["BAC"]["Close"].vbt.plot(add_trace_kwargs=dict(secondary_y=False),fig=fig1)
bar_data.data["BAC"].vbt.plot(add_trace_kwargs=dict(secondary_y=True),fig=fig1)
fig1.show()
puv_df = bar_data.data["BAC"]
bar_data23s = bar_data[["Open", "High", "Low", "Close", "Volume"]]
print(bar_data23s)
#resample by vbt
bar_data46s = bar_data23s.get().resample("46s").agg({
"Open": "first",
"High": "max",
"Low": "min",
"Close": "last",
"Volume": "sum"
})
print(bar_data46s)
res_data = bar_data46s.data["BAC"]
#bar_data23s.data["BAC"].ptable()
#bar_data23s = bar_data.resample("23S")
print(bar_data46s)
print(bar_data46s.close)
vbt.settings.plotting.auto_rangebreaks = True
bar_data46s.data["BAC"].vbt.ohlcv.plot().show()
#TARGET DAYS - only one day or range
# Target Date
#target_date = pd.to_datetime('2023-10-12', tz='US/Eastern')
# Date Range
start_date = pd.to_datetime('2024-03-12')
#end_date = pd.to_datetime('2023-10-14')
new_data = bar_data.transform(lambda df: df[df.index.date == start_date.date()])
#range: filtered_data = data[(data.index >= start_date) & (data.index <= end_date)]
print(new_data)
new_data.data["BAC"].vbt.ohlcv.plot().show()
# Filtering RANGE or DAY
# filtered_data = data[(data.index >= start_date) & (data.index <= end_date)]
# filtered_data = data[data.index.date == target_date.date()]
#custom aggregation
# ohlcv_agg = pd.DataFrame({
# 'Open': df.resample('1T')['Open'].first(),
# 'High': df.resample('1T')['High'].max(),
# 'Low': df.resample('1T')['Low'].min(),
# 'Close': df.resample('1T')['Close'].last(),
# 'Volume': df.resample('1T')['Volume'].sum()
# })
#Define a custom frequency with a timedelta of 23 seconds
# custom_frequency = pd.tseries.offsets.DateOffset(seconds=23)
# # Create a new DataFrame with the desired frequency
# new_index = pd.date_range(start=df.index[0], end=df.index[-1], freq=custom_frequency)
# new_df = pd.DataFrame(index=new_index)
# # Reindex the DataFrame
# df = df.reindex(new_df.index)
# # Now you can check the frequency of the index
# print(df.index.freq)
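As a sanity check of the adjust_datetime_iteratively helper defined at the top of this file, a toy frame at the 23-second resolution used above (dates illustrative):

import pandas as pd

idx = pd.DatetimeIndex(["2024-03-12 09:30:00", "2024-03-12 09:30:41",
                        "2024-03-13 09:30:00"])
toy = pd.DataFrame({"Close": [1.0, 2.0, 3.0]}, index=idx)
print(adjust_datetime_iteratively(toy, 23).index)
# same-day rows become 09:30:00 and 09:30:23; the new day is left untouched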

View File

@@ -1,6 +1,6 @@
import numpy as np
from scipy.interpolate import interp1d
import matplotlib.pyplot as plt
# Enter pattern X and Y here for VISUALIZATION
#minutes
@@ -14,16 +14,34 @@ pattern_y = [0.1, 0.5, 0.8, 1, 0.6, 0.1]
#total profit
# Generating a range of input values for interpolation
input_values = np.linspace(min(pattern_x), max(pattern_x), 500)
multipliers = np.interp(input_values, pattern_x, pattern_y)
input_values = np.linspace(min(pattern_x), max(pattern_x), 1000) # Increase the number of points
# Plotting
# Cubic spline interpolation
interp_func = interp1d(pattern_x, pattern_y, kind='cubic')
multipliers_cubic = interp_func(input_values)
multipliers_linear = np.interp(input_values, pattern_x, pattern_y)
# Plotting multipliers_linear and multipliers_cubic on the same canvas
plt.figure(figsize=(10, 6))
plt.plot(pattern_x, pattern_y, 'o', label='Original Points')
plt.plot(input_values, multipliers, label='Interpolated Values')
plt.plot(input_values, multipliers_linear, label='Linear Interpolation')
plt.plot(input_values, multipliers_cubic, label='Cubic Interpolation')
plt.xlabel('X values')
plt.ylabel('Interpolated Multipliers')
plt.title('Interpolation Chart')
plt.title('Interpolation Comparison')
plt.legend()
plt.grid(True)
plt.show()
# # Plotting multipliers_cubic
# plt.figure(figsize=(10, 6))
# plt.plot(pattern_x, pattern_y, 'o', label='Original Points')
# plt.plot(input_values, multipliers_cubic, label='Cubic Interpolation')
# plt.xlabel('X values')
# plt.ylabel('Interpolated Multipliers')
# plt.title('Cubic Interpolation Chart')
# plt.legend()
# plt.grid(True)
# plt.show()

View File

@@ -0,0 +1,64 @@
from alpaca.data.enums import DataFeed
from v2realbot.enums.enums import FillCondition
#Separate file that contains default values for all config variables
#they are loaded by the config_handler and then can be overriden on the fly
#by configuration profiles
#note: if the type is not simple (enum etc.), don't forget to add it to the config_handler get_val function so it gets transformed
#PREMIUM for MARKET order: if positive it means an absolute value (0.005), if negative it means a pct (0.0167) #0.005 is approximately 0.0167% of base price 30.
BT_FILL_PRICE_MARKET_ORDER_PREMIUM=0.005
#no dense print in the console
QUIET_MODE=True
BT_FILL_CONS_TRADES_REQUIRED=2
BT_FILL_LOG_SURROUNDING_TRADES= 10
LIVE_DATA_FEED=DataFeed.IEX
OFFLINE_MODE = False
#minimum spacing between trades that the aggregator lets through for CBAR (0.001 - blocks anything smaller than 1 ms)
GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN = 0.003
#normalized price for tick 0.01
NORMALIZED_TICK_BASE_PRICE = 30.00
#DEFAULT AGGREGATOR filter trades
#NOTE added F - Intermarket Sweep Order - it occasionally created spikes
AGG_EXCLUDED_TRADES = ['C','O','4','B','7','V','P','W','U','Z','F']
#how many consecutive trades with the fill price are necessary for LIMIT fill to happen in backtesting
#0 - optimistic, every knot high will fill the order
#N - N consecutive trades required
#not implemented yet
#minimum is 1; on Alpaca live it usually comes out to 7-8 for BAC, which closely matches the price moving past half a cent, i.e. 7-8 or FillCondition.SLOW
BT_FILL_CONS_TRADES_REQUIRED = 2
#during bt trade execution, logs the X surrounding trades around the one that triggers the fill
BT_FILL_LOG_SURROUNDING_TRADES = 10
#fill condition for limit order in bt
# fast - price has to be equal or bigger <=
# slow - price has to be bigger <
BT_FILL_CONDITION_BUY_LIMIT = FillCondition.SLOW
BT_FILL_CONDITION_SELL_LIMIT = FillCondition.SLOW
#backend counter of api requests
COUNT_API_REQUESTS = False
# ilog lvls = 0,1 - 0 debug, 1 info
ILOG_SAVE_LEVEL_FROM = 1
#currently only the prod server has access to LIVE
PROD_SERVER_HOSTNAMES = ['tradingeastcoast','David-MacBook-Pro.local'] #,'David-MacBook-Pro.local'
TEST_SERVER_HOSTNAMES = ['tradingtest']
""""
LATENCY DELAYS for LIVE eastcoast
.000 trigger - last_trade_time (.4246266)
+.020 vstup do strategie a BUY (.444606)
+.023 submitted (.469198)
+.008 filled (.476695552)
+.023 fill not(.499888)
"""
BT_DELAYS = {
"trigger_to_strat": 0.020,
"strat_to_sub": 0.023,
"sub_to_fill": 0.008,
"fill_to_not": 0.023,
#fill in according to live measurements
"limit_order_offset": 0,
}
#cfh.config_handler.get_val('BT_DELAYS','trigger_to_strat')
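The keys mirror the latency chain measured above; summing the legs up to the fill approximates the modeled trigger-to-fill latency:

BT_DELAYS = {"trigger_to_strat": 0.020, "strat_to_sub": 0.023,
             "sub_to_fill": 0.008, "fill_to_not": 0.023, "limit_order_offset": 0}
print(sum(BT_DELAYS[k] for k in ("trigger_to_strat", "strat_to_sub", "sub_to_fill")))
# 0.051 s - close to the ~0.052 s trigger -> fill observed in the live measurements above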

View File

@@ -0,0 +1,121 @@
import v2realbot.controller.configs as cfgservices
import orjson
from traceback import format_exc
from alpaca.data.enums import DataFeed
import v2realbot.utils.config_defaults as config_defaults
from v2realbot.enums.enums import FillCondition
from rich import print
# from v2realbot.utils.utils import print
def aggregate_configurations(module):
return {key: getattr(module, key) for key in dir(module) if key.isupper()}
#config handler - singleton pattern
#details https://chat.openai.com/share/e056af70-76da-4dbe-93a1-ecf99f0b0f29
#it is initialized on app start, loading default and updating based on active_profile settings
#also there is handler for updating active_profile which changes it immediately (in controller.config.update_config_item)
class ConfigHandler:
_instance = None
#this ensures that it is created only once
def __new__(cls):
if cls._instance is None:
cls._instance = super(ConfigHandler, cls).__new__(cls)
# Initialize your default config here in __new__, since it's only done once
# Default configuration
# Dynamically create the configuration dictionary
cls.default_config = aggregate_configurations(config_defaults)
cls._instance.active_config = cls._instance.default_config.copy()
cls._instance.active_profile = "default"
#if there is profile to be activated, it is loaded overriding default
cls._instance.activate_profile()
return cls._instance
def load_profile(self, profile_name):
"""
Load configuration profiles, JSON with all profiles is stored in config item 'profiles'
"""
try:
config_directive = "profiles"
ret, res = cfgservices.get_config_item_by_name(config_directive)
if ret < 0:
print(f"CONFIG OVERRIDE {config_directive} Error {res}")
return
else:
fetched_dict = orjson.loads(res["json_data"])
override_configuration = fetched_dict.get(profile_name, None)
if override_configuration is not None:
#first reset to default then override profile on top of them
self.active_config = self.default_config.copy()
self.active_config.update(override_configuration)
self.active_profile = profile_name
#print(f"Profile {profile_name} loaded successfully.")
#print("Current values:", self.active_config)
else:
print(f"Profile {profile_name} does not exist in config item: {config_directive}")
except Exception as e:
print(f"Error while fetching {profile_name} error:" + str(e) + format_exc())
def activate_profile(self):
"""
Activates the profile that is stored in the configuration as currently active.
"""
try:
config_directive = "active_profile"
ret, res = cfgservices.get_config_item_by_name(config_directive)
if ret < 0:
print(f"ERROR fetching item {config_directive} Error {res}")
return
else:
fetched_dict = orjson.loads(res["json_data"])
active_profile = fetched_dict.get("ACTIVE_PROFILE", None)
if active_profile is not None:
print("Activating profile", active_profile)
self.load_profile(active_profile)
else:
print("No ACTIVE_PROFILE element in config item: " + config_directive)
except Exception as e:
print(f"Error while activating profile:" + str(e) + format_exc())
def get_val(self, key, subkey=None):
"""
Retrieve a configuration value by key, optionally transforming it to the appropriate type.
Also supports nested dictionaries - with subkeys
"""
value = self.active_config.get(key, None)
if subkey and isinstance(value, dict):
return value.get(subkey, None)
match key:
case "LIVE_DATA_FEED":
return DataFeed(value) # Convert to DataFeed enum
case "BT_FILL_CONDITION_BUY_LIMIT":
return FillCondition(value)
case "BT_FILL_CONDITION_SELL_LIMIT":
return FillCondition(value)
case "AGG_EXCLUDED_TRADES":
return sorted(value) # Convert to sorted
# Add cases for other enumeration conversions or transformations as needed
case _:
return value
def print_current_config(self):
print(f"Active profile {self.active_profile} conf_values: {str(self.active_config)}")
# Global configuration - imported by modules that need it. In the future this can be changed to Dependency Injection (each service would take the config instance as an input parameter)
config_handler = ConfigHandler()
#print(f"{config_handler.active_profile=}")
#print("config handler initialized")
#this is how to get value
#config_handler.get_val('BT_FILL_PRICE_MARKET_ORDER_PREMIUM')
# config_handler.load_profile('profile1') # Assuming 'profile1.json' exists
# print(f"{config_handler.active_profile=}")
# config_handler.load_profile('profile2') # Assuming 'profile1.json' exists
# print(f"{config_handler.active_profile=}")
# config_handler.activate_profile() # Switch to profile according to active_profile directive
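Because __new__ caches _instance, every caller of ConfigHandler() receives the same object, which is what makes the module-level config_handler safe to import everywhere. A minimal check (note the first construction touches the config store via activate_profile):

a = ConfigHandler()
b = ConfigHandler()
print(a is b)  # True - both names point at the single cached instance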

View File

@@ -3,8 +3,8 @@ from alpaca.data.requests import StockLatestQuoteRequest, StockBarsRequest, Stoc
from alpaca.data import Quote, Trade, Snapshot, Bar
from alpaca.data.models import BarSet, QuoteSet, TradeSet
from alpaca.data.timeframe import TimeFrame, TimeFrameUnit
from v2realbot.utils.utils import zoneNY
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY
from v2realbot.utils.utils import zoneNY, send_to_telegram
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_PAPER_FEED
from alpaca.data.enums import DataFeed
from datetime import datetime, timedelta
import pandas as pd
@@ -12,6 +12,8 @@ from rich import print
from collections import defaultdict
from pandas import to_datetime
from msgpack.ext import Timestamp
import time
from traceback import format_exc
def convert_historical_bars(daily_bars):
"""Converts a list of daily bars into a dictionary with the specified keys.
@@ -49,6 +51,9 @@ def convert_historical_bars(daily_bars):
for i in range(len(daily_bars)):
bar = daily_bars[i]
if bar is None:
continue
# Calculate the HLCC4 indicator
hlcc4 = (bar['h'] + bar['l'] + bar['c'] + bar['o']) / 4
datum = to_datetime(bar['t'], utc=True)
@@ -80,15 +85,48 @@ def get_todays_open():
pass
##vrati historicke bary v nasem formatu
def get_historical_bars(symbol: str, time_from: datetime, time_to: datetime, timeframe: TimeFrame):
stock_client = StockHistoricalDataClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=True)
# snapshotRequest = StockSnapshotRequest(symbol_or_symbols=[symbol], feed=DataFeed.SIP)
# snapshotResponse = stock_client.get_stock_snapshot(snapshotRequest)
# print("snapshot", snapshotResponse)
# def get_historical_bars(symbol: str, time_from: datetime, time_to: datetime, timeframe: TimeFrame):
# stock_client = StockHistoricalDataClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=True)
# # snapshotRequest = StockSnapshotRequest(symbol_or_symbols=[symbol], feed=DataFeed.SIP)
# # snapshotResponse = stock_client.get_stock_snapshot(snapshotRequest)
# # print("snapshot", snapshotResponse)
bar_request = StockBarsRequest(symbol_or_symbols=symbol,timeframe=timeframe, start=time_from, end=time_to, feed=DataFeed.SIP)
bars: BarSet = stock_client.get_stock_bars(bar_request)
#print("puvodni bars", bars["BAC"])
if bars[symbol][0] is None:
return None
return convert_historical_bars(bars[symbol])
# bar_request = StockBarsRequest(symbol_or_symbols=symbol,timeframe=timeframe, start=time_from, end=time_to, feed=DataFeed.SIP)
# bars: BarSet = stock_client.get_stock_bars(bar_request)
# #print("puvodni bars", bars["BAC"])
# if bars[symbol][0] is None:
# return None
# return convert_historical_bars(bars[symbol])
def get_historical_bars(symbol: str, time_from: datetime, time_to: datetime, timeframe: TimeFrame, max_retries=5, backoff_factor=1):
"""
Fetches historical bar data with retries on failure.
:param symbol: Stock symbol.
:param time_from: Start time for the data.
:param time_to: End time for the data.
:param timeframe: Timeframe for the data.
:param max_retries: Maximum number of retries.
:param backoff_factor: Factor to determine the next sleep time.
:return: Converted historical bar data.
:raises: Exception if all retries fail.
"""
stock_client = StockHistoricalDataClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=True)
bar_request = StockBarsRequest(symbol_or_symbols=symbol, timeframe=timeframe, start=time_from, end=time_to, feed=ACCOUNT1_PAPER_FEED)
last_exception = None
for attempt in range(max_retries):
try:
bars = stock_client.get_stock_bars(bar_request)
if bars[symbol][0] is None:
return None
return convert_historical_bars(bars[symbol])
except Exception as e:
last_tb = format_exc()  # capture the traceback while the exception is still active
print(f"Historical bars load attempt {attempt + 1} failed: {str(e)} and {last_tb}")
last_exception = e
time.sleep(backoff_factor * (2 ** attempt))
print("All attempts to fetch historical bar data failed.")
send_to_telegram(f"Failed to fetch historical bar data after {max_retries} retries. Last exception: {str(last_exception)} and {last_tb}")
raise Exception(f"Failed to fetch historical bar data after {max_retries} retries. Last exception: {str(last_exception)} and {last_tb}")

View File

@ -6,7 +6,7 @@ import json
from datetime import datetime
from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account
from v2realbot.common.db import pool, insert_queue
import sqlite3
#By default returns a list of tuples, where the tuple members are the columns

View File

@ -1,11 +1,11 @@
import socket
from v2realbot.enums.enums import Env
from v2realbot.config import PROD_SERVER_HOSTNAMES, TEST_SERVER_HOSTNAMES
import v2realbot.utils.config_handler as cfh
def get_environment():
"""Determine if the current server is production or test based on hostname."""
hostname = socket.gethostname()
if hostname in PROD_SERVER_HOSTNAMES:
if hostname in cfh.config_handler.get_val('PROD_SERVER_HOSTNAMES'):
return Env.PROD
else:
return Env.TEST
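A minimal usage sketch of the environment check (the call site is hypothetical):

from v2realbot.enums.enums import Env

if get_environment() == Env.PROD:
    print("running on a production host")
else:
    print("running on a test host")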

View File

@ -13,19 +13,119 @@ from v2realbot.common.model import StrategyInstance, Runner, RunArchive, RunArch
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus, TradeStoplossType
from typing import List
import tomli
from v2realbot.config import DATA_DIR, QUIET_MODE,NORMALIZED_TICK_BASE_PRICE
from v2realbot.config import DATA_DIR, ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY
import requests
from uuid import UUID
#from decimal import Decimal
from enum import Enum
#from v2realbot.enums.enums import Order
from v2realbot.common.model import Order as btOrder, TradeUpdate as btTradeUpdate
from alpaca.trading.models import Order, TradeUpdate
from alpaca.trading.models import Order, TradeUpdate, Calendar
import numpy as np
import pandas as pd
from collections import deque
import socket
import numpy as np
from alpaca.trading.requests import GetCalendarRequest
from alpaca.trading.client import TradingClient
import time as timepkg
from traceback import format_exc
import re
import tempfile
import shutil
from filelock import FileLock
import v2realbot.utils.config_handler as cfh
import pandas_market_calendars as mcal
def validate_and_format_time(time_string):
"""
Validates if the given time string is in the format HH:MM or H:MM.
If valid, returns the standardized time string in HH:MM format.
Args:
time_string (str): The time string to validate.
Returns:
str or None: Standardized time string in HH:MM format if valid,
None otherwise.
"""
# Regular expression for matching the time format H:MM or HH:MM
time_pattern = re.compile(r'^([0-1]?[0-9]|2[0-3]):([0-5][0-9])$')
# Checking if the time string matches the pattern
if time_pattern.match(time_string):
# Standardize the time format to HH:MM
standardized_time = datetime.strptime(time_string, '%H:%M').strftime('%H:%M')
return standardized_time
else:
return None
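Illustrative inputs and outputs for the validator above:

validate_and_format_time("9:05")    # -> "09:05" (single-digit hour is padded)
validate_and_format_time("23:59")   # -> "23:59"
validate_and_format_time("24:00")   # -> None (hour out of range)
validate_and_format_time("0905")    # -> None (missing colon)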
def fetch_calendar_data(start: datetime, end: datetime) -> List[Calendar]:
"""
Fetches the trading schedule for the NYSE (New York Stock Exchange) between the specified start and end dates.
Args:
start (datetime): The start date for the trading schedule.
end (datetime): The end date for the trading schedule.
Returns:
List[Calendar]: A list of Calendar objects containing the trading dates and market open/close times.
Returns an empty list if no trading days are found within the specified range.
"""
nyse = mcal.get_calendar('NYSE')
schedule = nyse.schedule(start_date=start, end_date=end, tz='America/New_York')
if not schedule.empty:
schedule = (schedule.reset_index()
.rename(columns={"index": "date", "market_open": "open", "market_close": "close"})
.assign(date=lambda day: day['date'].dt.date.astype(str),
open=lambda day: day['open'].dt.strftime('%H:%M'),
close=lambda day: day['close'].dt.strftime('%H:%M'))
.to_dict(orient="records"))
cal_dates = [Calendar(**record) for record in schedule]
return cal_dates
else:
return []
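For illustration, a short usage sketch of the pandas_market_calendars variant; the date range is an example, and the date/open/close fields follow the column renames above:

from datetime import datetime

days = fetch_calendar_data(datetime(2024, 7, 1), datetime(2024, 7, 8))
for day in days:
    # e.g. "2024-07-01 09:30 16:00"; early closes show a shorter close time
    print(day.date, day.open, day.close)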
#Alpaca Calendar wrapper with retry
def fetch_calendar_data_from_alpaca(start, end, max_retries=5, backoff_factor=1):
"""
Attempts to fetch calendar data with exponential backoff. Raises an exception if all retries fail.
TODO: add a local caching mechanism here
:param client: Alpaca API client instance.
:param start: The start date for the calendar data.
:param end: The end date for the calendar data.
:param max_retries: Maximum number of retries.
:param backoff_factor: Factor to determine the next sleep time.
:return: Calendar data.
:raises: ConnectionError if all retries fail.
"""
# Ensure start and end are of type datetime.date
if isinstance(start, datetime):
start = start.date()
if isinstance(end, datetime):
end = end.date()
# Verify that start and end are datetime.date objects after conversion
if not all([isinstance(start, date), isinstance(end, date)]):
raise ValueError("start and end must be datetime.date objects")
clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False)
calendar_request = GetCalendarRequest(start=start, end=end)
last_exception = None
for attempt in range(max_retries):
try:
cal_dates = clientTrading.get_calendar(calendar_request)
richprint("Calendar data fetch successful", start, end)
return cal_dates
except Exception as e:
last_tb = format_exc()  # capture the traceback while the exception is still active
richprint(f"Attempt {attempt + 1} failed: {e}")
last_exception = e
timepkg.sleep(backoff_factor * (2 ** attempt))
richprint("****All attempts to fetch calendar data failed.****")
send_to_telegram(f"FETCH_CALENDAR_DATA_FAILED. {str(last_exception)} and {last_tb} BACKTEST STOPPED")
raise ConnectionError(f"Failed to fetch calendar data after {max_retries} retries. Last exception: {str(last_exception)} and {last_tb}")
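A worked example of the backoff schedule: with backoff_factor=1 and max_retries=5, the sleep after each failed attempt is backoff_factor * 2**attempt, i.e.

[1 * 2**attempt for attempt in range(5)]   # -> [1, 2, 4, 8, 16] seconds, ~31 s total before the ConnectionError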
def concatenate_weekdays(weekday_filter):
# Mapping of weekdays where 0 is Monday and 6 is Sunday
@ -37,21 +137,45 @@ def concatenate_weekdays(weekday_filter):
# Concatenate the weekday strings
return ','.join(weekday_strings)
def slice_dict_lists(d, last_item, to_tmstp = False):
def filter_timeseries_by_timestamp(timeseries, timestamp):
"""
Filter a timeseries dictionary, returning a new dictionary with entries
where the time value is greater than the provided timestamp.
Parameters:
- timeseries (dict): The original timeseries dictionary.
- timestamp (float): The timestamp to filter the timeseries by.
Returns:
- dict: A new timeseries dictionary filtered based on the provided timestamp.
"""
# Find indices where time values are greater than the provided timestamp
indices = [i for i, time in enumerate(timeseries['time']) if time > timestamp]
# Create a new dictionary with values filtered by the indices
filtered_timeseries = {key: [value[i] for i in indices] for key, value in timeseries.items()}
return filtered_timeseries
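Illustrative input and output for the filter above:

ts = {"time": [1.0, 2.0, 3.0], "price": [10.0, 11.0, 12.0]}
filter_timeseries_by_timestamp(ts, 1.5)
# -> {"time": [2.0, 3.0], "price": [11.0, 12.0]}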
def slice_dict_lists(d, last_item, to_tmstp = False, time_to_datetime = False):
"""Slices every list in the dictionary to the last last_item items.
Args:
d: A dictionary.
last_item: The number of items to keep at the end of each list.
to_tmstp: For "time" elements change it to timestamp from datetime if required.
to_tmstp: For "time" elements change it from datetime to timestamp from datetime if required.
time_to_datetime: For "time" elements change it from timestamp to datetime UTC if required.
Returns:
A new dictionary with the sliced lists.
Example conversion: datetime.fromtimestamp(data['updated']).astimezone(zoneUTC)
"""
sliced_d = {}
for key in d.keys():
if key == "time" and to_tmstp:
sliced_d[key] = [datetime.timestamp(t) for t in d[key][-last_item:]]
elif key == "time" and time_to_datetime:
sliced_d[key] = [datetime.fromtimestamp(t).astimezone(zoneUTC) for t in d[key][-last_item:]]
else:
sliced_d[key] = d[key][-last_item:]
return sliced_d
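Illustrative call of the slicer, exercising the to_tmstp conversion:

from datetime import datetime

d = {"time": [datetime(2024, 7, 1), datetime(2024, 7, 2)], "price": [10.0, 11.0]}
slice_dict_lists(d, 1, to_tmstp=True)
# -> {"time": [<timestamp of 2024-07-02>], "price": [11.0]}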
@ -392,11 +516,11 @@ def get_tick(price: float, normalized_ticks: float = 0.01):
For prices below 30 it returns 0.01; for prices above 30 it returns a proportionally scaled tick.
"""
if price<NORMALIZED_TICK_BASE_PRICE:
if price<cfh.config_handler.get_val('NORMALIZED_TICK_BASE_PRICE'):
return normalized_ticks
else:
#ratio of price vs base price
ratio = price/NORMALIZED_TICK_BASE_PRICE
ratio = price/cfh.config_handler.get_val('NORMALIZED_TICK_BASE_PRICE')
return price2dec(ratio*normalized_ticks)
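A worked example, assuming NORMALIZED_TICK_BASE_PRICE is 30 as the docstring suggests:

get_tick(25.0)   # 25 < 30 -> returns the default normalized tick 0.01
get_tick(60.0)   # ratio = 60 / 30 = 2 -> 2 * 0.01 = 0.02 (rounded by price2dec)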
def eval_cond_dict(cond: dict) -> tuple[bool, str]:
@ -550,6 +674,7 @@ def json_serial(obj):
Intervals: lambda obj: obj.__dict__,
SLHistory: lambda obj: obj.__dict__,
InstantIndicator: lambda obj: obj.__dict__,
StrategyInstance: lambda obj: obj.__dict__,
}
serializer = type_map.get(type(obj))
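The type_map acts as a dispatch table for json.dumps' default hook; a hedged usage sketch (the serialized object here is hypothetical):

import json

payload = json.dumps(sl_history_obj, default=json_serial)  # sl_history_obj: e.g. an SLHistory instance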
@ -578,23 +703,36 @@ def parse_toml_string(tomlst: str):
try:
tomlst = tomli.loads(tomlst)
except tomli.TOMLDecodeError as e:
print("Not valid TOML.", str(e))
return (-1, None)
msg = f"Not valid TOML: " + str(e)
richprint(msg)
return (-1, msg)
return (0, dict_replace_value(tomlst,"None",None))
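Illustrative success and failure cases; the (code, payload) tuple convention follows the returns above, and the behavior of dict_replace_value (swapping literal "None" strings for None) is assumed from its call:

code, result = parse_toml_string('a = 1\nb = "None"')
# code == 0; result == {'a': 1, 'b': None}
code, msg = parse_toml_string('a = = 1')
# code == -1; msg starts with "Not valid TOML:"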
#Class to persist strategy instances in a local pickle cache.
# A FileLock is used to prevent concurrent access to the cache file.
# The __init__ method reads the existing cache file within the lock to ensure it's not being written to simultaneously by another process.
# The save method writes to a temporary file first and then atomically moves it to the desired file location. This prevents the issue of partial file writes in case the process is interrupted during the write.
#For now a temporary fix so that another process does not write concurrently,
#until persistence to the db is implemented.
#Other processes can use the REST API get stratin endpoint.
class Store:
stratins : List[StrategyInstance] = []
stratins: List[StrategyInstance] = []
runners: List[Runner] = []
def __init__(self) -> None:
self.lock = FileLock(DATA_DIR + "/strategyinstances.lock")
self.db_file = DATA_DIR + "/strategyinstances.cache"
if os.path.exists(self.db_file):
with open (self.db_file, 'rb') as fp:
with self.lock, open(self.db_file, 'rb') as fp:
self.stratins = pickle.load(fp)
def save(self):
with open(self.db_file, 'wb') as fp:
pickle.dump(self.stratins, fp)
with self.lock:
temp_fd, temp_path = tempfile.mkstemp(dir=DATA_DIR)
with os.fdopen(temp_fd, 'wb') as temp_file:
pickle.dump(self.stratins, temp_file)
shutil.move(temp_path, self.db_file)
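For illustration, a round trip through the cache; constructing the StrategyInstance is elided:

store = Store()             # loads the pickle cache under the file lock
store.stratins.append(si)   # si: a StrategyInstance created elsewhere
store.save()                # writes a temp file, then moves it over the cache atomically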
qu = Queue()
@ -604,7 +742,7 @@ zoneUTC = pytz.utc
zonePRG = pytz.timezone('Europe/Amsterdam')
def print(*args, **kwargs):
if QUIET_MODE:
if cfh.config_handler.get_val('QUIET_MODE'):
pass
else:
####ic(*args, **kwargs)

v2trading_create_db.sql Normal file
View File

@ -0,0 +1,97 @@
BEGIN TRANSACTION;
CREATE TABLE IF NOT EXISTS "test_list" (
"id" varchar(32) NOT NULL,
"name" varchar(255) NOT NULL,
"dates" json NOT NULL
);
CREATE TABLE IF NOT EXISTS "runner_detail" (
"runner_id" varchar(32) NOT NULL,
"data" json NOT NULL,
PRIMARY KEY("runner_id")
);
CREATE TABLE IF NOT EXISTS "runner_header" (
"runner_id" varchar(32) NOT NULL,
"strat_id" TEXT,
"batch_id" TEXT,
"symbol" TEXT,
"name" TEXT,
"note" TEXT,
"started" TEXT,
"stopped" TEXT,
"mode" TEXT,
"account" TEXT,
"bt_from" TEXT,
"bt_to" TEXT,
"strat_json" TEXT,
"settings" TEXT,
"ilog_save" INTEGER,
"profit" NUMERIC,
"trade_count" INTEGER,
"end_positions" INTEGER,
"end_positions_avgp" NUMERIC,
"metrics" TEXT,
"stratvars_toml" TEXT,
"transferables" TEXT,
PRIMARY KEY("runner_id")
);
CREATE TABLE IF NOT EXISTS "config_table" (
"id" INTEGER,
"item_name" TEXT NOT NULL,
"json_data" JSON NOT NULL,
"item_lang" TEXT,
PRIMARY KEY("id" AUTOINCREMENT)
);
CREATE TABLE IF NOT EXISTS "runner_logs" (
"runner_id" varchar(32) NOT NULL,
"time" real NOT NULL,
"data" json NOT NULL
);
CREATE TABLE "run_manager" (
"moddus" TEXT NOT NULL,
"id" varchar(32),
"strat_id" varchar(32) NOT NULL,
"symbol" TEXT,
"account" TEXT NOT NULL,
"mode" TEXT NOT NULL,
"note" TEXT,
"ilog_save" BOOLEAN,
"bt_from" TEXT,
"bt_to" TEXT,
"weekdays_filter" TEXT,
"batch_id" TEXT,
"start_time" TEXT NOT NULL,
"stop_time" TEXT NOT NULL,
"status" TEXT NOT NULL,
"last_processed" TEXT,
"history" TEXT,
"valid_from" TEXT,
"valid_to" TEXT,
"testlist_id" TEXT,
"runner_id" varchar2(32),
"market" TEXT,
PRIMARY KEY("id")
);
CREATE INDEX IF NOT EXISTS idx_moddus ON run_manager (moddus);
CREATE INDEX IF NOT EXISTS idx_status ON run_manager (status);
CREATE INDEX IF NOT EXISTS idx_status_moddus ON run_manager (status, moddus);
CREATE INDEX IF NOT EXISTS idx_valid_from_to ON run_manager (valid_from, valid_to);
CREATE INDEX IF NOT EXISTS idx_stopped_batch_id ON runner_header (stopped, batch_id);
CREATE INDEX IF NOT EXISTS idx_search_value ON runner_header (strat_id, batch_id);
CREATE INDEX IF NOT EXISTS "index_runner_header_pk" ON "runner_header" (
"runner_id"
);
CREATE INDEX IF NOT EXISTS "index_runner_header_strat" ON "runner_header" (
"strat_id"
);
CREATE INDEX IF NOT EXISTS "index_runner_header_batch" ON "runner_header" (
"batch_id"
);
CREATE UNIQUE INDEX IF NOT EXISTS "index_runner_detail_pk" ON "runner_detail" (
"runner_id"
);
CREATE INDEX IF NOT EXISTS "index_runner_logs" ON "runner_logs" (
"runner_id",
"time"
);
INSERT INTO config_table VALUES (1, 'test', '{}', 'json');
COMMIT;
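The script can be applied with Python's built-in sqlite3 module; a minimal sketch, assuming the script sits next to a hypothetical v2trading.db:

import sqlite3

with open("v2trading_create_db.sql", encoding="utf-8") as fp:
    with sqlite3.connect("v2trading.db") as conn:
        conn.executescript(fp.read())  # runs the whole transaction above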