Compare commits
57 Commits
feature/mu...feature/ag
| SHA1 |
|---|
| 093fe5dfbc |
| 05b7725a25 |
| 3de7d23009 |
| 9e7d974ebd |
| 66a4cb5d7c |
| 0bf9aadb0c |
| 81ca678f55 |
| 96c7f7207f |
| 26b72763da |
| adc7c3c1b6 |
| a6343abe88 |
| 075984fcff |
| 5fce627fe3 |
| 8de1356aa8 |
| 7f47890cad |
| 8cf1aea2a8 |
| 9231c1d273 |
| 9391d89aab |
| 9cff5fe6a1 |
| 0e5cf5f3e0 |
| 90c33c0528 |
| e9e6534d2b |
| 5874528d23 |
| 985445d814 |
| 6c1f7f0e2e |
| 20aaa2ac23 |
| 691514b102 |
| 84903aff77 |
| 4887e32665 |
| ce99448a48 |
| 887ea0ef00 |
| af7b678699 |
| 04c63df045 |
| ebac207489 |
| 9f99ddc86a |
| e75fbc7194 |
| c4d05f47ff |
| f6e31f45f9 |
| c42b1c4e1e |
| 1bf11d0dc4 |
| 1abbb07390 |
| b58639454b |
| a7e83fe051 |
| 6795338eba |
| 9aa8b58877 |
| eff78e8157 |
| d8bcc4bb8f |
| 7abdf47545 |
| 1f8afef042 |
| df60d16eb4 |
| 535c2824b0 |
| 9cf936672d |
| c1ad713a12 |
| e9bb8b84ec |
| 603736d441 |
| 2c968691d1 |
| 435b4d899a |
CODEOWNERS (Normal file, 1 line)

@@ -0,0 +1 @@
```
* @drew2323
```
README.md (Normal file, 53 lines)

@@ -0,0 +1,53 @@
**README - V2TRADING - Advanced Algorithmic Trading Platform**

**Overview**

A custom-built algorithmic trading platform for research, backtesting, and automated trading. Its trading engine processes tick data, manages trades, and supports backtesting in a highly accurate and efficient manner.

**Key Features**

- **Trading Engine**: At the core of the platform is a trading engine that processes tick data in real time. It is responsible for aggregating data and managing the execution of trades, ensuring precision and speed in trade placement and execution.

- **High-Fidelity Backtesting Environment**: Strategies can be backtested with 1:1, tick-by-tick precision. This level of accuracy, down to the millisecond, mirrors live trading environments and is vital for developing and testing high-frequency trading strategies.

- **Custom Data Aggregation:** The platform includes a data aggregator that supports custom aggregation rules. This flexibility enables a variety of data analysis approaches, including non-time-based bars and other unique criteria; see the sketch below.
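  To illustrate the idea (a hedged sketch, not the platform's actual API; the function name, column names, and threshold are invented for the example), a volume-bar rule can be expressed as a cumulative-volume grouping over raw ticks:

  ```python
  import pandas as pd

  def volume_bars(ticks: pd.DataFrame, volume_per_bar: int = 10_000) -> pd.DataFrame:
      """Aggregate raw ticks (columns: price, size) into volume-based OHLCV bars."""
      bar_id = (ticks["size"].cumsum() // volume_per_bar).astype(int)  # new bar every N shares traded
      grouped = ticks.groupby(bar_id)
      return pd.DataFrame({
          "open": grouped["price"].first(),
          "high": grouped["price"].max(),
          "low": grouped["price"].min(),
          "close": grouped["price"].last(),
          "volume": grouped["size"].sum(),
      })
  ```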
- **Indicators**: Includes built-in [tulipy](https://tulipindicators.org/list) and [ta-lib](https://ta-lib.github.io/ta-lib-python/) indicators, plus templates for custom-built, multi-output stateful indicators (illustrated below).
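  As a rough illustration of the stateful, multi-output pattern (the class below is hypothetical, not the platform's actual template), such an indicator keeps its own window of state and emits several outputs per update:

  ```python
  from collections import deque
  import math

  class RollingMeanStd:
      """Hypothetical stateful indicator: consumes one close at a time,
      emits (mean, std) over the last `window` values."""
      def __init__(self, window: int = 14):
          self.values = deque(maxlen=window)  # internal state carried between ticks/bars

      def update(self, close: float) -> tuple[float, float]:
          self.values.append(close)
          n = len(self.values)
          mean = sum(self.values) / n
          var = sum((v - mean) ** 2 for v in self.values) / n
          return mean, math.sqrt(var)  # two outputs from one stateful indicator
  ```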
- **Machine Learning Integration:** The platform has recently been extended with machine learning capabilities, including modules for both training and inference that support the complete ML lifecycle. These models can be used within trading strategies for classification and for exploiting statistical edges; a sketch follows.
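  As a sketch of the inference side (the model path and threshold are assumptions for illustration; only the Keras call itself is real API), a trained classifier can be loaded once and scored per bar inside a strategy:

  ```python
  import numpy as np
  from keras.models import load_model

  model = load_model("models/entry_classifier.keras")  # hypothetical model file

  def classify_bar(features: np.ndarray) -> int:
      """Return 1 (enter) or 0 (stay flat) for one bar's feature vector."""
      proba = float(model.predict(features.reshape(1, -1), verbose=0)[0, 0])
      return int(proba > 0.5)  # the 0.5 threshold is an assumption, tune per strategy
  ```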
**Technology Stack**

**Backend and API:** The backbone of the platform is built with Python, using libraries such as FastAPI, NumPy, Keras, and JAX for high performance and scalability.

**Frontend:** The client side is developed in vanilla JavaScript and jQuery, using LightweightCharts for charting, with additional modules extending the platform's functionality. The frontend is slated for a future refactoring to modern frameworks such as Vue.js and Vuetify for a more robust user interface.

While the platform is fully functional and growing, ongoing development is planned, particularly around frontend enhancements and further integration of advanced machine learning techniques.

**Contributions**

Contributions to this project are welcome. Whether it's improving the frontend, enhancing the backend capabilities, or experimenting with new trading strategies and machine learning models, your input can help take this platform to the next level.

This repository represents a sophisticated and evolving tool for algorithmic traders, offering precision, speed, and a level of customization rarely found in open-source systems. Join us in shaping the future of algorithmic trading.

<p align="center">
Main screen with entry/exit points and stoploss lines<br>
<img width="700" alt="Main screen with entry/exit points and stoploss lines" src="https://github.com/drew2323/v2trading/assets/28433232/751d5b0e-ef64-453f-8e76-89a39db679c5">
</p>

<p align="center">
Main screen with tick based indicators<br>
<img width="700" alt="Main screen with tick based indicators" src="https://github.com/drew2323/v2trading/assets/28433232/4bf6128c-9b36-4e88-9da1-5a33319976a1">
</p>

<p align="center">
Indicator editor<br>
<img width="700" alt="Indicator editor" src="https://github.com/drew2323/v2trading/assets/28433232/cc417393-7b88-4eea-afcb-3a00402d0a8d">
</p>

<p align="center">
Strategy editor<br>
<img width="700" alt="Strategy editor" src="https://github.com/drew2323/v2trading/assets/28433232/74f67e7a-1efc-4f63-b763-7827b2337b6a">
</p>

<p align="center">
Strategy analytical tools<br>
<img width="700" alt="Strategy analytical tools" src="https://github.com/drew2323/v2trading/assets/28433232/4bf8b3c3-e430-4250-831a-e5876bb6b743">
</p>
_run_scheduler.sh (Executable file, 51 lines)

@@ -0,0 +1,51 @@
```bash
#!/bin/bash

# Approach: (https://chat.openai.com/c/43be8685-b27b-4e3b-bd18-0856f8d23d7e)
# cron runs this script every minute, New York time, within the 9-16 hour range on US trading days.
# The script also writes a "heartbeat" message to the log file, so the user
# knows that cron is running.

# Installation steps required:
#   chmod +x run_scheduler.sh
#   install the tzdata package: sudo apt-get install tzdata
#   crontab -e
#     CRON_TZ=America/New_York
#     * 9-16 * * 1-5 /home/david/v2trading/run_scheduler.sh
#   (runs every minute of hours 9-16, Monday to Friday, US Eastern time)

# Path to the Python script
PYTHON_SCRIPT="v2realbot/scheduler/scheduler.py"

# Log file path
LOG_FILE="job.log"

# Timezone for New York
TZ='America/New_York'
NY_DATE_TIME=$(TZ=$TZ date +'%Y-%m-%d %H:%M:%S')
echo "NY_DATE_TIME: $NY_DATE_TIME"

# Check if log file exists, create it if it doesn't
if [ ! -f "$LOG_FILE" ]; then
    touch "$LOG_FILE"
fi

# Check the last line of the log file
LAST_LINE=$(tail -n 1 "$LOG_FILE")

# Cron trigger message
CRON_TRIGGER="Cron trigger: $NY_DATE_TIME"

# Update the log
if [[ "$LAST_LINE" =~ "Cron trigger:" ]]; then
    # Replace the last line with the new trigger message
    # (fixed: the original used `sed -i '' '$ d'`, which is BSD/macOS syntax
    # and fails on the GNU sed implied by the apt-get install notes above)
    sed -i '$ d' "$LOG_FILE"
    echo "$CRON_TRIGGER" >> "$LOG_FILE"
else
    # Append a new cron trigger message
    echo "$CRON_TRIGGER" >> "$LOG_FILE"
fi

# FOR DEBUG - Run the Python script and append output to log file
python3 "$PYTHON_SCRIPT" >> "$LOG_FILE" 2>&1
```
deployall.sh (Executable file, 7 lines)

@@ -0,0 +1,7 @@
```bash
#!/bin/bash

# Navigate to your git repository directory

# Execute git commands
git push deploytest master
git push deploy master
```
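The two push targets assume correspondingly named git remotes already exist; they would be configured once with something like the following (the URLs are placeholders, not the project's actual hosts):

```bash
git remote add deploytest ssh://user@test-host/srv/git/v2trading.git
git remote add deploy ssh://user@prod-host/srv/git/v2trading.git
```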
Changes to the pinned Python package requirements:

```diff
@@ -30,10 +30,12 @@ dateparser==1.1.8
 decorator==5.1.1
 defusedxml==0.7.1
 dill==0.3.7
+dm-tree==0.1.8
 entrypoints==0.4
 exceptiongroup==1.1.3
 executing==1.2.0
 fastapi==0.95.0
+filelock==3.13.1
 Flask==2.2.3
 flatbuffers==23.5.26
 fonttools==4.39.0
@@ -46,7 +48,7 @@ google-auth-oauthlib==1.0.0
 google-pasta==0.2.0
 grpcio==1.58.0
 h11==0.14.0
-h5py==3.9.0
+h5py==3.10.0
 icecream==2.1.3
 idna==3.4
 imageio==2.31.6
@@ -54,12 +56,18 @@ importlib-metadata==6.1.0
 ipython==8.17.2
 ipywidgets==8.1.1
 itsdangerous==2.1.2
+jax==0.4.23
+jaxlib==0.4.23
 jedi==0.19.1
 Jinja2==3.1.2
 joblib==1.3.2
 jsonschema==4.17.3
 jupyterlab-widgets==3.0.9
-keras
+keras==3.0.2
+keras-core==0.1.7
+keras-nightly==3.0.3.dev2024010203
+keras-nlp-nightly==0.7.0.dev2024010203
+keras-tcn @ git+https://github.com/drew2323/keras-tcn.git@4bddb17a02cb2f31c9fe2e8f616b357b1ddb0e11
 kiwisolver==1.4.4
 libclang==16.0.6
 llvmlite==0.39.1
@@ -69,12 +77,12 @@ MarkupSafe==2.1.2
 matplotlib==3.8.2
 matplotlib-inline==0.1.6
 mdurl==0.1.2
-ml-dtypes==0.2.0
+ml-dtypes==0.3.1
-mlroom @ git+https://github.com/drew2323/mlroom.git
+mlroom @ git+https://github.com/drew2323/mlroom.git@692900e274c4e0542d945d231645c270fc508437
-keras-tcn @ git+https://github.com/drew2323/keras-tcn.git
 mplfinance==0.12.10b0
 msgpack==1.0.4
 mypy-extensions==1.0.0
+namex==0.0.7
 newtulipy==0.4.6
 numba==0.56.4
 numpy==1.23.5
@@ -85,6 +93,7 @@ packaging==23.0
 pandas==1.5.3
 param==1.13.0
 parso==0.8.3
+patsy==0.5.6
 pexpect==4.8.0
 Pillow==9.4.0
 plotly==5.13.1
@@ -111,6 +120,7 @@ python-multipart==0.0.6
 pytz==2022.7.1
 pytz-deprecation-shim==0.1.0.post0
 pyviz-comms==2.2.1
+PyWavelets==1.5.0
 PyYAML==6.0
 regex==2023.10.3
 requests==2.31.0
@@ -128,11 +138,21 @@ sniffio==1.3.0
 sseclient-py==1.7.2
 stack-data==0.6.3
 starlette==0.26.1
+statsmodels==0.14.1
 streamlit==1.20.0
 structlog==23.1.0
 TA-Lib==0.4.28
+tb-nightly==2.16.0a20240102
 tenacity==8.2.2
+tensorboard==2.15.1
+tensorboard-data-server==0.7.1
+tensorflow-addons==0.23.0
+tensorflow-estimator==2.15.0
+tensorflow-io-gcs-filesystem==0.34.0
 termcolor==2.3.0
+tf-estimator-nightly==2.14.0.dev2023080308
+tf-nightly==2.16.0.dev20240101
+tf_keras-nightly==2.16.0.dev2023123010
 threadpoolctl==3.2.0
 tinydb==4.7.1
 tinydb-serialization==2.1.0
@@ -143,12 +163,13 @@ toolz==0.12.0
 tornado==6.2
 tqdm==4.65.0
 traitlets==5.13.0
+typeguard==2.13.3
 typing_extensions==4.5.0
 tzdata==2023.2
 tzlocal==4.3
 urllib3==1.26.14
 uvicorn==0.21.1
--e git+https://github.com/drew2323/v2trading.git@523905ece6d99bf48a8952d39ced6a13f3b9a84e#egg=v2realbot
+-e git+https://github.com/drew2323/v2trading.git@b58639454be921f9f0c9dd1880491cfcfdfdf3b7#egg=v2realbot
 validators==0.20.0
 wcwidth==0.2.9
 webencodings==0.5.1
```
research/basic.ipynb (Normal file, 104044 lines)

File diff suppressed because it is too large
research/get_trades_at_once.ipynb (Normal file, 410 lines)

@@ -0,0 +1,410 @@

**[markdown]**

# Loading trades and vectorized aggregation

Describes how to fetch trades (remote/cached) and use the new vectorized aggregation to build bars of a given type (time, volume, dollar) and resolution.

`fetch_trades_parallel` fetches trades for a given symbol and interval, optionally filtering by trade conditions and minimum size; it returns `trades_df`.
`aggregate_trades` accepts `trades_df`, a resolution, and a bar type (VOLUME, TIME, DOLLAR), and returns the aggregated OHLCV dataframe `ohlcv_df`. A minimal call sequence is sketched below.
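A minimal call sequence implied by the description above; this sketch only re-uses imports and argument values that appear in the cells below, and exact defaults may differ:

```python
from datetime import datetime
from v2realbot.loader.aggregator_vectorized import fetch_trades_parallel, aggregate_trades
from v2realbot.enums.enums import BarType
from v2realbot.utils.utils import zoneNY

# Fetch raw trades for a window, then aggregate them into 1-second time bars.
day_start = zoneNY.localize(datetime(2024, 1, 11, 9, 30))
day_stop = zoneNY.localize(datetime(2024, 1, 12, 16, 0))
trades_df = fetch_trades_parallel("SPY", day_start, day_stop, minsize=100, max_workers=20)
ohlcv_df = aggregate_trades(symbol="SPY", trades_df=trades_df, resolution=1, type=BarType.TIME)
```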
**[code] In [1]:**

```python
import pandas as pd
import numpy as np
from numba import jit
from alpaca.data.historical import StockHistoricalDataClient
from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR
from alpaca.data.requests import StockTradesRequest
from v2realbot.enums.enums import BarType
import time
from datetime import datetime
from v2realbot.utils.utils import parse_alpaca_timestamp, ltp, zoneNY, send_to_telegram, fetch_calendar_data
import pyarrow
from v2realbot.loader.aggregator_vectorized import fetch_daily_stock_trades, fetch_trades_parallel, generate_time_bars_nb, aggregate_trades
import vectorbtpro as vbt
import v2realbot.utils.config_handler as cfh

vbt.settings.set_theme("dark")
vbt.settings['plotting']['layout']['width'] = 1280
vbt.settings.plotting.auto_rangebreaks = True
# Set the option to display with pagination
pd.set_option('display.notebook_repr_html', True)
pd.set_option('display.max_rows', 20)  # Number of rows per page
# pd.set_option('display.float_format', '{:.9f}'.format)

# trade filtering
exclude_conditions = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES')  # standard ['C','O','4','B','7','V','P','W','U','Z','F']
minsize = 100

symbol = "SPY"
# datetimes in zoneNY
day_start = datetime(2024, 1, 1, 9, 30, 0)
day_stop = datetime(2024, 1, 14, 16, 0, 0)
day_start = zoneNY.localize(day_start)
day_stop = zoneNY.localize(day_stop)
# filename of the trades_df parquet; dates are in ISO format but without the time-zone part
dir = DATA_DIR + "/notebooks/"
# the parquet interval cache reflects exclude-conditions and minsize filtering
file_trades = dir + f"trades_df-{symbol}-{day_start.strftime('%Y-%m-%dT%H:%M:%S')}-{day_stop.strftime('%Y-%m-%dT%H:%M:%S')}-{exclude_conditions}-{minsize}.parquet"
#file_trades = dir + f"trades_df-{symbol}-{day_start.strftime('%Y-%m-%dT%H:%M:%S')}-{day_stop.strftime('%Y-%m-%dT%H:%M:%S')}.parquet"
file_ohlcv = dir + f"ohlcv_df-{symbol}-{day_start.strftime('%Y-%m-%dT%H:%M:%S')}-{day_stop.strftime('%Y-%m-%dT%H:%M:%S')}-{exclude_conditions}-{minsize}.parquet"

# print all parquet files in the directory
import os
files = [f for f in os.listdir(dir) if f.endswith(".parquet")]
for f in files:
    print(f)
```

Output:

```text
Activating profile profile1
trades_df-BAC-2024-01-11T09:30:00-2024-01-12T16:00:00.parquet
trades_df-SPY-2024-01-01T09:30:00-2024-05-14T16:00:00.parquet
ohlcv_df-BAC-2024-01-11T09:30:00-2024-01-12T16:00:00.parquet
ohlcv_df-SPY-2024-01-01T09:30:00-2024-05-14T16:00:00.parquet
```
**[code] In [2]:**

```python
trades_df = fetch_daily_stock_trades(symbol, day_start, day_stop, exclude_conditions=exclude_conditions, minsize=minsize, force_remote=False, max_retries=5, backoff_factor=1)
trades_df
```

Output:

```text
NOT FOUND. Fetching from remote
KeyboardInterrupt: the traceback runs from fetch_daily_stock_trades
(v2realbot/loader/aggregator_vectorized.py:200) through alpaca's
StockHistoricalDataClient.get_stock_trades, requests/urllib3, and the ssl
socket read; the cell was interrupted while waiting for the HTTP response.
```
**[code] In [2]:**

```python
# Either load trades or ohlcv from parquet if they exist

#trades_df = fetch_trades_parallel(symbol, day_start, day_stop, exclude_conditions=exclude_conditions, minsize=50, max_workers=20)  # exclude_conditions=['C','O','4','B','7','V','P','W','U','Z','F']
# trades_df.to_parquet(file_trades, engine='pyarrow', compression='gzip')

trades_df = pd.read_parquet(file_trades, engine='pyarrow')
ohlcv_df = aggregate_trades(symbol=symbol, trades_df=trades_df, resolution=1, type=BarType.TIME)
ohlcv_df.to_parquet(file_ohlcv, engine='pyarrow', compression='gzip')

# ohlcv_df = pd.read_parquet(file_ohlcv, engine='pyarrow')
# trades_df = pd.read_parquet(file_trades, engine='pyarrow')
```

**[code]:**

```python
# list all files in the dir directory with the parquet extension
dir = DATA_DIR + "/notebooks/"
import os
files = [f for f in os.listdir(dir) if f.endswith(".parquet")]
file_name = ""
ohlcv_df = pd.read_parquet(file_ohlcv, engine='pyarrow')
```

**[code]:**

```python
ohlcv_df
```
**[code]:**

```python
import matplotlib.pyplot as plt
import seaborn as sns

# Calculate per-bar returns
ohlcv_df['returns'] = ohlcv_df['close'].pct_change().dropna()
# TODO: same as above, but with pct_change from 3 data points back, and only
# within the same date (else NaN)

# Plot the probability distribution curve
plt.figure(figsize=(10, 6))
sns.histplot(ohlcv_df['returns'].dropna(), kde=True, stat='probability', bins=30)  # fixed: was df['returns'], but the frame is named ohlcv_df
plt.title('Probability Distribution of Daily Returns')
plt.xlabel('Daily Returns')
plt.ylabel('Probability')
plt.show()
```
**[code]:**

```python
import pandas as pd
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Define the intervals from 5 to 20 bars; returns for each interval
# (maybe use a rolling window?)
intervals = range(5, 21, 5)

# Create columns for percentage returns over each interval
# (this step was implied but missing in the original cell; the normalization
# below reads returns_{N}, so it is assumed to come from pct_change(N))
for N in intervals:
    ohlcv_df[f'returns_{N}'] = ohlcv_df['close'].pct_change(N)

rolling_window = 50

# Normalize the returns using rolling mean and std
for N in intervals:
    column_name = f'returns_{N}'
    rolling_mean = ohlcv_df[column_name].rolling(window=rolling_window).mean()
    rolling_std = ohlcv_df[column_name].rolling(window=rolling_window).std()
    ohlcv_df[f'norm_{column_name}'] = (ohlcv_df[column_name] - rolling_mean) / rolling_std

# Display the dataframe with normalized return columns
ohlcv_df
```
**[code]:**

```python
# Calculate the sum of the normalized return columns for each row
ohlcv_df['sum_norm_returns'] = ohlcv_df[[f'norm_returns_{N}' for N in intervals]].sum(axis=1)

# Sort the DataFrame based on the sum of normalized returns in descending order
df_sorted = ohlcv_df.sort_values(by='sum_norm_returns', ascending=False)

# Display the top rows with the highest sum of normalized returns
df_sorted
```

**[code]:**

```python
# Drop initial rows with NaN values due to pct_change
ohlcv_df.dropna(inplace=True)

# Plotting the probability distribution curves
plt.figure(figsize=(14, 8))
for N in intervals:
    sns.kdeplot(ohlcv_df[f'returns_{N}'].dropna(), label=f'Returns {N}', fill=True)

plt.title('Probability Distribution of Percentage Returns')
plt.xlabel('Percentage Return')
plt.ylabel('Density')
plt.legend()
plt.show()
```

**[code]:**

```python
import matplotlib.pyplot as plt
import seaborn as sns

# Plot the probability distribution curve
plt.figure(figsize=(10, 6))
sns.histplot(ohlcv_df['returns'].dropna(), kde=True, stat='probability', bins=30)
plt.title('Probability Distribution of Daily Returns')
plt.xlabel('Daily Returns')
plt.ylabel('Probability')
plt.show()
```
**[code]:**

```python
# show only rows from ohlcv_df where returns > 0.0005
ohlcv_df[ohlcv_df['returns'] > 0.0005]

#ohlcv_df[ohlcv_df['returns'] < -0.005]
```
**[code]:**

```python
# ohlcv rows for date 2024-03-13 between 12:00 and 13:00
a = ohlcv_df.loc['2024-03-13 12:00:00':'2024-03-13 13:00:00']
a
```

**[code]:**

```python
ohlcv_df
```

**[code]:**

```python
trades_df
```

**[code]:**

```python
ohlcv_df.info()
```
**[code]:**

```python
trades_df.to_parquet("trades_df-spy-0111-0111.parquet", engine='pyarrow', compression='gzip')  # fixed extension typo: was ".parquett"
```

**[code]:**

```python
trades_df.to_parquet("trades_df-spy-111-0516.parquet", engine='pyarrow', compression='gzip', allow_truncated_timestamps=True)
```

**[code]:**

```python
ohlcv_df.to_parquet("ohlcv_df-spy-111-0516.parquet", engine='pyarrow', compression='gzip')
```
**[code]:**

```python
basic_data = vbt.Data.from_data(vbt.symbol_dict({symbol: ohlcv_df}), tz_convert=zoneNY)
vbt.settings['plotting']['auto_rangebreaks'] = True
basic_data.ohlcv.plot()
```

**[code]:**

```python
# access just BAC
#df_filtered = df.loc["BAC"]
```
Notebook metadata: kernelspec `.venv` (python3), language Python 3.10.11, nbformat 4 (minor 2).
1526  research/indcross_parametrized.ipynb  Normal file
File diff suppressed because one or more lines are too long
557  research/ohlc_persistance_test.ipynb  Normal file
File diff suppressed because one or more lines are too long
1602  research/prepare_aggregatied_data.ipynb  Normal file
File diff suppressed because it is too large  Load Diff
26673  research/rsi_alpaca.ipynb  Normal file
File diff suppressed because one or more lines are too long
1553  research/strat1/strat1_v1_MULTI.ipynb  Normal file
File diff suppressed because one or more lines are too long
1570  research/strat1/strat1_v1_SINGLE.ipynb  Normal file
File diff suppressed because one or more lines are too long
1553  research/strat_LINREG_MULTI/v1_MULTI.ipynb  Normal file
File diff suppressed because one or more lines are too long
44670  research/strat_LINREG_MULTI/v1_SINGLE.ipynb  Normal file
File diff suppressed because one or more lines are too long
23637  research/test.ipynb  Normal file
File diff suppressed because it is too large  Load Diff
105  research/test1.ipynb  Normal file
File diff suppressed because one or more lines are too long
421  research/test1sbars.ipynb  Normal file
@@ -0,0 +1,421 @@
{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from v2realbot.tools.loadbatch import load_batch\n",
    "from v2realbot.utils.utils import zoneNY\n",
    "import pandas as pd\n",
    "import numpy as np\n",
    "import vectorbtpro as vbt\n",
    "from itables import init_notebook_mode, show\n",
    "\n",
    "init_notebook_mode(all_interactive=True)\n",
    "\n",
    "vbt.settings.set_theme(\"dark\")\n",
    "vbt.settings['plotting']['layout']['width'] = 1280\n",
    "vbt.settings.plotting.auto_rangebreaks = True\n",
    "# Set the option to display with pagination\n",
    "pd.set_option('display.notebook_repr_html', True)\n",
    "pd.set_option('display.max_rows', 10) # Number of rows per page\n",
    "\n",
    "res, df = load_batch(batch_id=\"0fb5043a\", #46 days 1.3. - 6.5.\n",
    "                     space_resolution_evenly=False,\n",
    "                     indicators_columns=[\"Rsi14\"],\n",
    "                     main_session_only=True,\n",
    "                     verbose = False)\n",
    "if res < 0:\n",
    "    print(\"Error\" + str(res) + str(df))\n",
    "df = df[\"bars\"]\n",
    "\n",
    "df"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# filter dates"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#filter to specific days\n",
    "# dates_of_interest = pd.to_datetime(['2024-04-22', '2024-04-23']).tz_localize('US/Eastern')\n",
    "# filtered_df = df.loc[df.index.normalize().isin(dates_of_interest)]\n",
    "\n",
    "# df = filtered_df\n",
    "# df.info()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import plotly.io as pio\n",
    "pio.renderers.default = 'notebook'\n",
    "\n",
    "#load into vbt with the symbol as column\n",
    "basic_data = vbt.Data.from_data({\"BAC\": df}, tz_convert=zoneNY)\n",
    "start_date = pd.Timestamp('2024-03-12 09:30', tz=zoneNY)\n",
    "end_date = pd.Timestamp('2024-03-13 16:00', tz=zoneNY)\n",
    "\n",
    "#basic_data = basic_data.transform(lambda df: df[df.index.date == start_date.date()])\n",
    "#basic_data = basic_data.transform(lambda df: df[(df.index >= start_date) & (df.index <= end_date)])\n",
    "#basic_data.data[\"BAC\"].info()\n",
    "\n",
    "# fig = basic_data.plot(plot_volume=False)\n",
    "# pivot_info = basic_data.run(\"pivotinfo\", up_th=0.003, down_th=0.002)\n",
    "# #pivot_info.plot()\n",
    "# pivot_info.plot(fig=fig, conf_value_trace_kwargs=dict(visible=True))\n",
    "# fig.show()\n",
    "\n",
    "\n",
    "# rsi14 = basic_data.data[\"BAC\"][\"Rsi14\"].rename(\"Rsi14\")\n",
    "\n",
    "# rsi14.vbt.plot().show()\n",
    "#basic_data.xloc[\"09:30\":\"10:00\"].data[\"BAC\"].vbt.ohlcv.plot().show()\n",
    "\n",
    "vbt.settings.plotting.auto_rangebreaks = True\n",
    "#basic_data.data[\"BAC\"].vbt.ohlcv.plot()\n",
    "\n",
    "#basic_data.data[\"BAC\"]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "m1_data = basic_data[['Open', 'High', 'Low', 'Close', 'Volume']]\n",
    "\n",
    "m1_data.data[\"BAC\"]\n",
    "#m5_data = m1_data.resample(\"5T\")\n",
    "\n",
    "#m5_data.data[\"BAC\"].head(10)\n",
    "\n",
    "# m15_data = m1_data.resample(\"15T\")\n",
    "\n",
    "# m15 = m15_data.data[\"BAC\"]\n",
    "\n",
    "# m15.vbt.ohlcv.plot()\n",
    "\n",
    "# m1_data.wrapper.index\n",
    "\n",
    "# m1_resampler = m1_data.wrapper.get_resampler(\"1T\")\n",
    "# m1_resampler.index_difference(reverse=True)\n",
    "\n",
    "\n",
    "# m5_resampler.prettify()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# defining ENTRY WINDOW and forced EXIT window"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#m1_data.data[\"BAC\"].info()\n",
    "import datetime\n",
    "# Define the market open and close times\n",
    "market_open = datetime.time(9, 30)\n",
    "market_close = datetime.time(16, 0)\n",
    "entry_window_opens = 1\n",
    "entry_window_closes = 350\n",
    "\n",
    "forced_exit_start = 380\n",
    "forced_exit_end = 390\n",
    "\n",
    "forced_exit = m1_data.symbol_wrapper.fill(False)\n",
    "entry_window_open = m1_data.symbol_wrapper.fill(False)\n",
    "\n",
    "# Calculate the time difference in minutes from market open for each timestamp\n",
    "elapsed_min_from_open = (forced_exit.index.hour - market_open.hour) * 60 + (forced_exit.index.minute - market_open.minute)\n",
    "\n",
    "entry_window_open[(elapsed_min_from_open >= entry_window_opens) & (elapsed_min_from_open < entry_window_closes)] = True\n",
    "forced_exit[(elapsed_min_from_open >= forced_exit_start) & (elapsed_min_from_open < forced_exit_end)] = True\n",
    "\n",
    "#entry_window_open.info()\n",
    "# forced_exit.tail(100)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "close = m1_data.close\n",
    "\n",
    "rsi = vbt.RSI.run(close, window=14)\n",
    "\n",
    "long_entries = (rsi.rsi.vbt.crossed_below(20) & entry_window_open)\n",
    "long_exits = (rsi.rsi.vbt.crossed_above(70) | forced_exit)\n",
    "#long_entries.info()\n",
    "#number of trues and falses in long_entries\n",
    "long_entries.value_counts()\n",
    "#long_exits.value_counts()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def plot_rsi(rsi, close, entries, exits):\n",
    "    fig = vbt.make_subplots(rows=1, cols=1, shared_xaxes=True, specs=[[{\"secondary_y\": True}]], vertical_spacing=0.02, subplot_titles=(\"RSI\", \"Price\"))\n",
    "    close.vbt.plot(fig=fig, add_trace_kwargs=dict(secondary_y=True))\n",
    "    rsi.plot(fig=fig, add_trace_kwargs=dict(secondary_y=False))\n",
    "    entries.vbt.signals.plot_as_entries(rsi.rsi, fig=fig, add_trace_kwargs=dict(secondary_y=False))\n",
    "    exits.vbt.signals.plot_as_exits(rsi.rsi, fig=fig, add_trace_kwargs=dict(secondary_y=False))\n",
    "    return fig\n",
    "\n",
    "plot_rsi(rsi, close, long_entries, long_exits)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "vbt.phelp(vbt.Portfolio.from_signals)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "sl_stop = np.arange(0.03/100, 0.2/100, 0.02/100).tolist()\n",
    "# Using the round function\n",
    "sl_stop = [round(val, 4) for val in sl_stop]\n",
    "print(sl_stop)\n",
    "sl_stop = vbt.Param(sl_stop) #np.nan means no stop loss\n",
    "\n",
    "pf = vbt.Portfolio.from_signals(close=close, entries=long_entries, sl_stop=sl_stop, tp_stop = sl_stop, exits=long_exits,fees=0.0167/100, freq=\"1s\") #sl_stop=sl_stop, tp_stop = sl_stop, \n",
    "\n",
    "#pf.stats()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "pf.plot()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "pf[(0.0015,0.0013)].plot()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "pf[0.03].plot_trade_signals()\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# accessing pf as a multi index"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#pf[0.03].plot()\n",
    "#pf.order_records\n",
    "pf[(0.03)].stats()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#grouped statistics\n",
    "stats_df = pf.stats([\n",
    "    'total_return',\n",
    "    'total_trades',\n",
    "    'win_rate',\n",
    "    'expectancy'\n",
    "], agg_func=None)\n",
    "stats_df\n",
    "\n",
    "\n",
    "stats_df.nlargest(50, 'Total Return [%]')\n",
    "#stats_df.info()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "pf[(0.0011,0.0013)].plot()\n",
    "\n",
    "#pf[(0.0011,0.0013000000000000002)].plot()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from pandas.tseries.offsets import DateOffset\n",
    "\n",
    "temp_data = basic_data['2024-4-22']\n",
    "temp_data\n",
    "res1m = temp_data[[\"Open\", \"High\", \"Low\", \"Close\", \"Volume\"]]\n",
    "\n",
    "# Define a custom date offset that starts at 9:30 AM and spans 4 hours\n",
    "custom_offset = DateOffset(hours=4, minutes=30)\n",
    "\n",
    "# res1m = res1m.get().resample(\"4H\").agg({ \n",
    "#     \"Open\": \"first\",\n",
    "#     \"High\": \"max\",\n",
    "#     \"Low\": \"min\",\n",
    "#     \"Close\": \"last\",\n",
    "#     \"Volume\": \"sum\"\n",
    "# })\n",
    "\n",
    "res4h = res1m.resample(\"1h\", resample_kwargs=dict(origin=\"start\"))\n",
    "\n",
    "res4h.data\n",
    "\n",
    "res15m = res1m.resample(\"15T\", resample_kwargs=dict(origin=\"start\"))\n",
    "\n",
    "res15m.data[\"BAC\"]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "@vbt.njit\n",
    "def long_entry_place_func_nb(c, low, close, time_in_ns, rsi14, window_open, window_close):\n",
    "    market_open_minutes = 570 # 9 hours * 60 minutes + 30 minutes\n",
    "\n",
    "    for out_i in range(len(c.out)):\n",
    "        i = c.from_i + out_i\n",
    "\n",
    "        current_minutes = vbt.dt_nb.hour_nb(time_in_ns[i]) * 60 + vbt.dt_nb.minute_nb(time_in_ns[i])\n",
    "        #print(\"current_minutes\", current_minutes)\n",
    "        # Calculate elapsed minutes since market open at 9:30 AM\n",
    "        elapsed_from_open = current_minutes - market_open_minutes\n",
    "        elapsed_from_open = elapsed_from_open if elapsed_from_open >= 0 else 0\n",
    "        #print(\"elapsed_from_open\", elapsed_from_open)\n",
    "\n",
    "        #elapsed_from_open = elapsed_minutes_from_open_nb(time_in_ns)\n",
    "        in_window = elapsed_from_open > window_open and elapsed_from_open < window_close\n",
    "        #print(\"in_window\", in_window)\n",
    "        # if in_window:\n",
    "        #     print(\"in window\")\n",
    "\n",
    "        if in_window and rsi14[i] > 60: # and low[i, c.col] <= hit_price: # and hour == 9: # (4)!\n",
    "            return out_i\n",
    "    return -1\n",
    "\n",
    "@vbt.njit\n",
    "def long_exit_place_func_nb(c, high, close, time_index, tp, sl): # (5)!\n",
    "    entry_i = c.from_i - c.wait\n",
    "    entry_price = close[entry_i, c.col]\n",
    "    hit_price = entry_price * (1 + tp)\n",
    "    stop_price = entry_price * (1 - sl)\n",
    "    for out_i in range(len(c.out)):\n",
    "        i = c.from_i + out_i\n",
    "        # note: i + 1 runs past the end on the very last bar of the range; a bounds check would be needed there\n",
    "        last_bar_of_day = vbt.dt_nb.day_changed_nb(time_index[i], time_index[i + 1])\n",
    "\n",
    "        #print(next_day)\n",
    "        if last_bar_of_day: #if the next bar falls on a new day, exit on the last bar of the current day\n",
    "            print(\"now\", out_i)\n",
    "            return out_i\n",
    "        if close[i, c.col] >= hit_price or close[i, c.col] <= stop_price:\n",
    "            return out_i\n",
    "    return -1\n",
    "\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "df = pd.DataFrame(np.random.random(size=(5, 10)), columns=list('abcdefghij'))\n",
    "\n",
    "df"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "df.sum()"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": ".venv",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.11"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
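The entry/forced-exit masks in the notebook above reduce to a minutes-since-open computation over the bar index. A standalone sketch of the same idea in plain pandas (the values mirror the notebook: entries allowed from minute 1 to 350, forced exit in the last 10 minutes of the 390-minute session):

```python
import pandas as pd

idx = pd.date_range("2024-03-12 09:30", "2024-03-12 16:00", freq="1min", tz="US/Eastern")
elapsed = (idx.hour - 9) * 60 + (idx.minute - 30)  # minutes since the 09:30 open

entry_window_open = pd.Series((elapsed >= 1) & (elapsed < 350), index=idx)
forced_exit = pd.Series((elapsed >= 380) & (elapsed < 390), index=idx)  # last 10 minutes
```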
1639  research/test1sbars_roc.ipynb  Normal file
File diff suppressed because one or more lines are too long
45  restart.sh  Executable file
@@ -0,0 +1,45 @@
#!/bin/bash

# file: restart.sh

# Usage: ./restart.sh [test|prod|all]

# Define server addresses
TEST_SERVER="david@142.132.188.109"
PROD_SERVER="david@5.161.179.223"

# Define the remote directory where the script is located
REMOTE_DIR="v2trading"

# Check for argument
if [ "$#" -ne 1 ]; then
    echo "Usage: $0 [test|prod|all]"
    exit 1
fi

# Function to restart a server
restart_server() {
    local server=$1
    echo "Connecting to $server to restart the Python app..."
    ssh -t $server "cd $REMOTE_DIR && . ~/.bashrc && ./run.sh restart" # Sourcing .bashrc here
    echo "Operation completed on $server."
}

# Select the server based on the input argument
case $1 in
    test)
        restart_server $TEST_SERVER
        ;;
    prod)
        restart_server $PROD_SERVER
        ;;
    all)
        restart_server $TEST_SERVER
        restart_server $PROD_SERVER
        ;;
    *)
        echo "Invalid argument: $1. Use 'test', 'prod', or 'all'."
        exit 1
        ;;
esac
@@ -23,12 +23,12 @@ clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY,
 
 #get previous days bar
 
-datetime_object_from = datetime.datetime(2023, 10, 11, 4, 0, 00, tzinfo=datetime.timezone.utc)
+datetime_object_from = datetime.datetime(2024, 3, 9, 13, 29, 00, tzinfo=datetime.timezone.utc)
-datetime_object_to = datetime.datetime(2023, 10, 16, 16, 1, 00, tzinfo=datetime.timezone.utc)
+datetime_object_to = datetime.datetime(2024, 3, 11, 20, 1, 00, tzinfo=datetime.timezone.utc)
-calendar_request = GetCalendarRequest(start=datetime_object_from,end=datetime_object_to)
+# calendar_request = GetCalendarRequest(start=datetime_object_from,end=datetime_object_to)
-cal_dates = clientTrading.get_calendar(calendar_request)
+# cal_dates = clientTrading.get_calendar(calendar_request)
-print(cal_dates)
+# print(cal_dates)
-bar_request = StockBarsRequest(symbol_or_symbols="BAC",timeframe=TimeFrame.Day, start=datetime_object_from, end=datetime_object_to, feed=DataFeed.SIP)
+bar_request = StockBarsRequest(symbol_or_symbols="BAC",timeframe=TimeFrame.Minute, start=datetime_object_from, end=datetime_object_to, feed=DataFeed.SIP)
 
 # bars = client.get_stock_bars(bar_request).df
 
18  testy/createbatchimage.py  Normal file
@@ -0,0 +1,18 @@
import argparse
import v2realbot.reporting.metricstoolsimage as mt

# Parse the command-line arguments
# parser = argparse.ArgumentParser(description="Generate trading report image with batch ID")
# parser.add_argument("batch_id", type=str, help="The batch ID for the report")
# args = parser.parse_args()

# batch_id = args.batch_id

# Generate the report image
res, val = mt.generate_trading_report_image(batch_id="4d7dc163")

# Print the result
if res == 0:
    print("BATCH REPORT CREATED")
else:
    print(f"BATCH REPORT ERROR - {val}")
@@ -1,7 +1,9 @@
 import os,sys
 sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
 print(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
-from alpaca.data.historical import CryptoHistoricalDataClient, StockHistoricalDataClient
+import pandas as pd
+import numpy as np
+from alpaca.data.historical import StockHistoricalDataClient
 from alpaca.data.requests import CryptoLatestTradeRequest, StockLatestTradeRequest, StockLatestBarRequest, StockTradesRequest
 from alpaca.data.enums import DataFeed
 from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY
89  testy/getrunnerdetail.py  Normal file
@@ -0,0 +1,89 @@

from v2realbot.common.model import RunDay, StrategyInstance, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest
import v2realbot.controller.services as cs
from v2realbot.utils.utils import slice_dict_lists,zoneUTC,safe_get, AttributeDict
id = "b11c66d9-a9b6-475a-9ac1-28b11e1b4edf"
state = AttributeDict(vars={})

##basis for init_attached_data in strategy.init

# def get_previous_runner(state):
#     runner : Runner
#     res, runner = cs.get_runner(state.runner_id)
#     if res < 0:
#         print(f"Not running {id}")
#         return 0, None

#     return 0, runner.batch_id

def attach_previous_data(state):
    runner : Runner
    #get batch_id of current runner
    res, runner = cs.get_runner(state.runner_id)
    if res < 0 or runner.batch_id is None:
        print(f"Couldn't get previous runner {runner}")
        return None

    batch_id = runner.batch_id
    #batch_id = "6a6b0bcf"

    res, runner_ids = cs.get_archived_runnerslist_byBatchID(batch_id, "desc")
    if res < 0:
        msg = f"error when fetching runners of batch {batch_id} {runner_ids}"
        print(msg)
        return None

    if runner_ids is None or len(runner_ids) == 0:
        print(f"no runners found for batch {batch_id} {runner_ids}")
        return None

    last_runner = runner_ids[0]
    print("Previous runner identified:", last_runner)

    #get details from the runner
    res, val = cs.get_archived_runner_details_byID(last_runner)
    if res < 0:
        print(f"no archived runner {last_runner}")

    detail = RunArchiveDetail(**val)
    #print("this is what we fetched", detail.bars)

    # from stratvars directives
    attach_previous_bars_indicators = safe_get(state.vars, "attach_previous_bars_indicators", 50)
    attach_previous_cbar_indicators = safe_get(state.vars, "attach_previous_cbar_indicators", 50)
    # [stratvars]
    # attach_previous_bars_indicators = 50
    # attach_previous_cbar_indicators = 50

    #indicators: datetime utc
    indicators = slice_dict_lists(d=detail.indicators[0],last_item=attach_previous_bars_indicators, time_to_datetime=True)

    #time - datetime utc, updated - timestamp float
    bars = slice_dict_lists(d=detail.bars, last_item=attach_previous_bars_indicators, time_to_datetime=True)

    #cbar_indicators #float
    cbar_inds = slice_dict_lists(d=detail.indicators[1],last_item=attach_previous_cbar_indicators)

    #USE these as INITs - TODO stop here and compare first
    print(f"{state.indicators=} NEW:{indicators=}")
    state.indicators = indicators
    print(f"{state.bars=} NEW:{bars=}")
    state.bars = bars
    print(f"{state.cbar_indicators=} NEW:{cbar_inds=}")
    state.cbar_indicators = cbar_inds

    print("BARS and INDS INITIALIZED")
    #bars

    #further initializations go here, from ext_data
    print("EXT_DATA", detail.ext_data)
    #depending on certain settings, e.g. in the configuration, specific variables are used

    #adding dailyBars from extData
    # if hasattr(detail, "ext_data") and "dailyBars" in detail.ext_data:
    #     state.dailyBars = detail.ext_data["dailyBars"]

if __name__ == "__main__":
    attach_previous_data(state)
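The script leans on `slice_dict_lists` to trim the dict-of-lists payloads before attaching them to state. Its implementation is not part of this diff; judging purely from the call sites above, a hypothetical re-implementation might look roughly like this (illustration only, not the actual `v2realbot.utils.utils` code):

```python
from datetime import datetime, timezone

def slice_dict_lists_sketch(d: dict, last_item: int, time_to_datetime: bool = False) -> dict:
    # keep the trailing `last_item` elements of every list in the dict,
    # optionally converting the "time" list from float timestamps to UTC datetimes
    out = {}
    for key, values in d.items():
        sliced = values[-last_item:]
        if time_to_datetime and key == "time":
            sliced = [datetime.fromtimestamp(t, tz=timezone.utc) for t in sliced]
        out[key] = sliced
    return out
```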
74  testy/tablesizes.py  Normal file
@@ -0,0 +1,74 @@
import queue
import sqlite3
import threading
from appdirs import user_data_dir

DATA_DIR = user_data_dir("v2realbot")
sqlite_db_file = DATA_DIR + "/v2trading.db"

class ConnectionPool:
    def __init__(self, max_connections):
        self.max_connections = max_connections
        self.connections = queue.Queue(max_connections)
        self.lock = threading.Lock()

    def get_connection(self):
        with self.lock:
            if self.connections.empty():
                return self.create_connection()
            else:
                return self.connections.get()

    def release_connection(self, connection):
        with self.lock:
            self.connections.put(connection)

    def create_connection(self):
        connection = sqlite3.connect(sqlite_db_file, check_same_thread=False)
        return connection


pool = ConnectionPool(10)

def get_table_sizes_in_mb():
    # Connect to the SQLite database
    conn = pool.get_connection()
    cursor = conn.cursor()

    # Get the list of tables
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table';")
    tables = cursor.fetchall()

    # Dictionary to store table sizes
    table_sizes = {}

    for table in tables:
        table_name = table[0]

        # Get total number of rows in the table
        cursor.execute(f"SELECT COUNT(*) FROM {table_name};")
        row_count = cursor.fetchone()[0]

        if row_count > 0:
            # Sample a few rows (e.g., 10 rows) and calculate average row size
            cursor.execute(f"SELECT * FROM {table_name} LIMIT 10;")
            sample_rows = cursor.fetchall()
            total_sample_size = sum(sum(len(str(cell)) for cell in row) for row in sample_rows)
            avg_row_size = total_sample_size / len(sample_rows)

            # Estimate table size in megabytes
            size_in_mb = (avg_row_size * row_count) / (1024 * 1024)
        else:
            size_in_mb = 0

        table_sizes[table_name] = {'size_mb': size_in_mb, 'rows': row_count}

    conn.close()
    return table_sizes

# Usage example
db_path = 'path_to_your_database.db'
table_sizes = get_table_sizes_in_mb()
for table, info in table_sizes.items():
    print(f"Table: {table}, Size: {info['size_mb']} MB, Rows: {info['rows']}")
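Note that the row-sampling approach above only approximates on-disk size from Python string lengths. If an exact figure for the whole database file is enough, SQLite's pragmas give it directly (a short sketch reusing the same `sqlite_db_file`):

```python
import sqlite3

conn = sqlite3.connect(sqlite_db_file)
page_count = conn.execute("PRAGMA page_count;").fetchone()[0]  # total pages in the file
page_size = conn.execute("PRAGMA page_size;").fetchone()[0]    # bytes per page
print(f"database size: {page_count * page_size / (1024 * 1024):.1f} MB")
conn.close()
```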
66  testy/vectorbt/testHtml2MD.py  Normal file
@@ -0,0 +1,66 @@
import os
from bs4 import BeautifulSoup
import html2text

def convert_html_to_markdown(html_content, link_mapping):
    h = html2text.HTML2Text()
    h.ignore_links = False

    # Update internal links to point to the relevant sections in the Markdown
    soup = BeautifulSoup(html_content, 'html.parser')
    for a in soup.find_all('a', href=True):
        href = a['href']
        if href in link_mapping:
            a['href'] = f"#{link_mapping[href]}"

    return h.handle(str(soup))

def create_link_mapping(root_dir):
    link_mapping = {}
    for subdir, _, files in os.walk(root_dir):
        for file in files:
            if file == "index.html":
                relative_path = os.path.relpath(os.path.join(subdir, file), root_dir)
                chapter_id = relative_path.replace(os.sep, '-').replace('index.html', '')
                link_mapping[relative_path] = chapter_id
                link_mapping[relative_path.replace(os.sep, '/')] = chapter_id  # for URLs with slashes
    return link_mapping

def read_html_files(root_dir, link_mapping):
    markdown_content = []

    for subdir, _, files in os.walk(root_dir):
        relative_path = os.path.relpath(subdir, root_dir)
        if files and any(file == "index.html" for file in files):
            # Add directory as a heading based on its depth
            heading_level = relative_path.count(os.sep) + 1
            markdown_content.append(f"{'#' * heading_level} {relative_path}\n")

            for file in files:
                if file == "index.html":
                    file_path = os.path.join(subdir, file)
                    with open(file_path, 'r', encoding='utf-8') as f:
                        html_content = f.read()
                    soup = BeautifulSoup(html_content, 'html.parser')
                    title = soup.title.string if soup.title else "No Title"
                    chapter_id = os.path.relpath(file_path, root_dir).replace(os.sep, '-').replace('index.html', '')
                    markdown_content.append(f"<a id='{chapter_id}'></a>\n")
                    markdown_content.append(f"{'#' * (heading_level + 1)} {title}\n")
                    markdown_content.append(convert_html_to_markdown(html_content, link_mapping))

    return "\n".join(markdown_content)

def save_to_markdown_file(content, output_file):
    with open(output_file, 'w', encoding='utf-8') as f:
        f.write(content)

def main():
    root_dir = "./v2realbot/static/js/vbt/"
    output_file = "output.md"
    link_mapping = create_link_mapping(root_dir)
    markdown_content = read_html_files(root_dir, link_mapping)
    save_to_markdown_file(markdown_content, output_file)
    print(f"Markdown document created at {output_file}")

if __name__ == "__main__":
    main()
@@ -3,7 +3,7 @@ sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
 from v2realbot.strategy.base import StrategyState
 from v2realbot.strategy.StrategyOrderLimitVykladaciNormalizedMYSELL import StrategyOrderLimitVykladaciNormalizedMYSELL
 from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account
-from v2realbot.utils.utils import zoneNY, print
+from v2realbot.utils.utils import zoneNY, print, fetch_calendar_data, send_to_telegram
 from v2realbot.utils.historicals import get_historical_bars
 from datetime import datetime, timedelta
 from rich import print as printanyway
@@ -16,9 +16,9 @@ from v2realbot.strategyblocks.newtrade.signals import signal_search
 from v2realbot.strategyblocks.activetrade.activetrade_hub import manage_active_trade
 from v2realbot.strategyblocks.inits.init_indicators import initialize_dynamic_indicators
 from v2realbot.strategyblocks.inits.init_directives import intialize_directive_conditions
-from alpaca.trading.requests import GetCalendarRequest
+from v2realbot.strategyblocks.inits.init_attached_data import attach_previous_data
 from alpaca.trading.client import TradingClient
-from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR, OFFLINE_MODE
+from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR
 from alpaca.trading.models import Calendar
 from v2realbot.indicators.oscillators import rsi
 from v2realbot.indicators.moving_averages import sma
@@ -116,6 +116,10 @@ def init(state: StrategyState):
     #models
     state.vars.loaded_models = {}
+
+    #state attributes for martingale sizing mngmt
+    state.vars["transferables"] = {}
+    state.vars["transferables"]["martingale"] = dict(cont_loss_series_cnt=0)
+
     #INITIALIZE CBAR INDICATORS - move into its own function
     #state.cbar_indicators['ivwap'] = []
     state.vars.last_tick_price = 0
@@ -129,6 +133,9 @@ def init(state: StrategyState):
     initialize_dynamic_indicators(state)
     intialize_directive_conditions(state)
+
+    #attach part of yesterday's data: bars, indicators, cbar_indicators
+    attach_previous_data(state)
+
     #initialize indicator mapping (for use in operation) - maybe move into a separate function, or into base once it proves itself
     local_dict_cbar_inds = {key: state.cbar_indicators[key] for key in state.cbar_indicators.keys() if key != "time"}
     local_dict_inds = {key: state.indicators[key] for key in state.indicators.keys() if key != "time"}
@@ -167,10 +174,13 @@ def init(state: StrategyState):
     today = time_to.date()
     several_days_ago = today - timedelta(days=60)
     #printanyway(f"{today=}",f"{several_days_ago=}")
-    clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False)
+    #clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False)
     #get all market days from here to 40days ago
-    calendar_request = GetCalendarRequest(start=several_days_ago,end=today)
-    cal_dates = clientTrading.get_calendar(calendar_request)
+    #calendar_request = GetCalendarRequest(start=several_days_ago,end=today)
+
+    cal_dates = fetch_calendar_data(several_days_ago, today)
+    #cal_dates = clientTrading.get_calendar(calendar_request)
+
     #find the first market day - 40days ago
     #history_datetime_from = zoneNY.localize(cal_dates[0].open)
@@ -197,7 +207,12 @@ def init(state: StrategyState):
 
     #NOTE for now additional indicators are added into bars this way
     #WILL BE REWORKED - as part of custom resolutions and static indicators
+    if state.dailyBars is None:
+        print("Failed to load daily bars")
+        err_msg = f"Failed to load daily bars (get_historical_bars) for {state.symbol} from {history_datetime_from} to {history_datetime_to} in strat.init. Probably wrong symbol?"
+        send_to_telegram(err_msg)
+        raise Exception(err_msg)
+
     #RSI is returned for all bars + prepend with zeros: the first N values (per rsi length) are not computed
     rsi_calculated = rsi(state.dailyBars["vwap"], 14).tolist()
     num_zeros_to_prepend = len(state.dailyBars["vwap"]) - len(rsi_calculated)
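The zero-prepend step in the last hunk exists because the `rsi()` helper skips the warm-up bars, so the result must be padded to realign with its input series. A minimal self-contained illustration of that alignment step (hypothetical data; the real values come from `state.dailyBars`):

```python
def prepend_zeros(values, computed):
    # align an indicator that skips the first N warm-up bars back to the input length
    return [0.0] * (len(values) - len(computed)) + list(computed)

vwap = [10.0, 10.2, 10.1, 10.4, 10.3]
rsi_vals = [55.1, 48.7]  # e.g. an indicator that needed 3 warm-up bars
assert prepend_zeros(vwap, rsi_vals) == [0.0, 0.0, 0.0, 55.1, 48.7]
```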
|||||||
@ -40,10 +40,10 @@
|
|||||||
from uuid import UUID, uuid4
|
from uuid import UUID, uuid4
|
||||||
from alpaca.trading.enums import OrderSide, OrderStatus, TradeEvent, OrderType
|
from alpaca.trading.enums import OrderSide, OrderStatus, TradeEvent, OrderType
|
||||||
from v2realbot.common.model import TradeUpdate, Order
|
from v2realbot.common.model import TradeUpdate, Order
|
||||||
#from rich import print
|
from rich import print as printanyway
|
||||||
import threading
|
import threading
|
||||||
import asyncio
|
import asyncio
|
||||||
from v2realbot.config import BT_DELAYS, DATA_DIR, BT_FILL_CONDITION_BUY_LIMIT, BT_FILL_CONDITION_SELL_LIMIT, BT_FILL_LOG_SURROUNDING_TRADES, BT_FILL_CONS_TRADES_REQUIRED,BT_FILL_PRICE_MARKET_ORDER_PREMIUM
|
from v2realbot.config import DATA_DIR
|
||||||
from v2realbot.utils.utils import AttributeDict, ltp, zoneNY, trunc, count_decimals, print
|
from v2realbot.utils.utils import AttributeDict, ltp, zoneNY, trunc, count_decimals, print
|
||||||
from v2realbot.utils.tlog import tlog
|
from v2realbot.utils.tlog import tlog
|
||||||
from v2realbot.enums.enums import FillCondition
|
from v2realbot.enums.enums import FillCondition
|
||||||
@ -60,6 +60,7 @@ from v2realbot.utils.dash_save_html import make_static
|
|||||||
import dash_bootstrap_components as dbc
|
import dash_bootstrap_components as dbc
|
||||||
from dash.dependencies import Input, Output
|
from dash.dependencies import Input, Output
|
||||||
from dash import dcc, html, dash_table, Dash
|
from dash import dcc, html, dash_table, Dash
|
||||||
|
import v2realbot.utils.config_handler as cfh
|
||||||
""""
|
""""
|
||||||
LATENCY DELAYS
|
LATENCY DELAYS
|
||||||
.000 trigger - last_trade_time (.4246266)
|
.000 trigger - last_trade_time (.4246266)
|
||||||
@ -171,7 +172,7 @@ class Backtester:
|
|||||||
todel.append(order)
|
todel.append(order)
|
||||||
elif not self.symbol or order.symbol == self.symbol:
|
elif not self.symbol or order.symbol == self.symbol:
|
||||||
#pricteme mininimalni latency od submittu k fillu
|
#pricteme mininimalni latency od submittu k fillu
|
||||||
if order.submitted_at.timestamp() + BT_DELAYS.sub_to_fill > float(intime):
|
if order.submitted_at.timestamp() + cfh.config_handler.get_val('BT_DELAYS','sub_to_fill') > float(intime):
|
||||||
print(f"too soon for {order.id}")
|
print(f"too soon for {order.id}")
|
||||||
#try to execute
|
#try to execute
|
||||||
else:
|
else:
|
||||||
@ -197,7 +198,7 @@ class Backtester:
|
|||||||
#Mazeme, jinak je to hruza
|
#Mazeme, jinak je to hruza
|
||||||
#nechavame na konci trady, které muzeme potrebovat pro consekutivni pravidlo
|
#nechavame na konci trady, které muzeme potrebovat pro consekutivni pravidlo
|
||||||
#osetrujeme, kdy je malo tradu a oriznuti by slo do zaporu
|
#osetrujeme, kdy je malo tradu a oriznuti by slo do zaporu
|
||||||
del_to_index = index_end-2-BT_FILL_CONS_TRADES_REQUIRED
|
del_to_index = index_end-2-cfh.config_handler.get_val('BT_FILL_CONS_TRADES_REQUIRED')
|
||||||
del_to_index = del_to_index if del_to_index > 0 else 0
|
del_to_index = del_to_index if del_to_index > 0 else 0
|
||||||
del self.btdata[0:del_to_index]
|
del self.btdata[0:del_to_index]
|
||||||
##ic("after delete",len(self.btdata[0:index_end]))
|
##ic("after delete",len(self.btdata[0:index_end]))
|
||||||
@ -218,7 +219,7 @@ class Backtester:
|
|||||||
|
|
||||||
fill_time = None
|
fill_time = None
|
||||||
fill_price = None
|
fill_price = None
|
||||||
order_min_fill_time = o.submitted_at.timestamp() + BT_DELAYS.sub_to_fill
|
order_min_fill_time = o.submitted_at.timestamp() + cfh.config_handler.get_val('BT_DELAYS','sub_to_fill')
|
||||||
#ic(order_min_fill_time)
|
#ic(order_min_fill_time)
|
||||||
#ic(len(work_range))
|
#ic(len(work_range))
|
||||||
|
|
||||||
@ -240,17 +241,18 @@ class Backtester:
|
|||||||
#NASTVENI PODMINEK PLNENI
|
#NASTVENI PODMINEK PLNENI
|
||||||
fast_fill_condition = i[1] <= o.limit_price
|
fast_fill_condition = i[1] <= o.limit_price
|
||||||
slow_fill_condition = i[1] < o.limit_price
|
slow_fill_condition = i[1] < o.limit_price
|
||||||
if BT_FILL_CONDITION_BUY_LIMIT == FillCondition.FAST:
|
fill_cond_buy_limit = cfh.config_handler.get_val('BT_FILL_CONDITION_BUY_LIMIT')
|
||||||
|
if fill_cond_buy_limit == FillCondition.FAST:
|
||||||
fill_condition = fast_fill_condition
|
fill_condition = fast_fill_condition
|
||||||
elif BT_FILL_CONDITION_BUY_LIMIT == FillCondition.SLOW:
|
elif fill_cond_buy_limit == FillCondition.SLOW:
|
||||||
fill_condition = slow_fill_condition
|
fill_condition = slow_fill_condition
|
||||||
else:
|
else:
|
||||||
print("unknow fill condition")
|
print("unknow fill condition")
|
||||||
return -1
|
return -1
|
||||||
|
|
||||||
if float(i[0]) > float(order_min_fill_time+BT_DELAYS.limit_order_offset) and fill_condition:
|
if float(i[0]) > float(order_min_fill_time+cfh.config_handler.get_val('BT_DELAYS','limit_order_offset')) and fill_condition:
|
||||||
consec_cnt += 1
|
consec_cnt += 1
|
||||||
if consec_cnt == BT_FILL_CONS_TRADES_REQUIRED:
|
if consec_cnt == cfh.config_handler.get_val('BT_FILL_CONS_TRADES_REQUIRED'):
|
||||||
|
|
||||||
#(1679081919.381649, 27.88)
|
#(1679081919.381649, 27.88)
|
||||||
#ic(i)
|
#ic(i)
|
||||||
@ -261,10 +263,10 @@ class Backtester:
|
|||||||
#fill_price = i[1]
|
#fill_price = i[1]
|
||||||
|
|
||||||
print("FILL LIMIT BUY at", fill_time, datetime.fromtimestamp(fill_time).astimezone(zoneNY), "at",i[1])
|
print("FILL LIMIT BUY at", fill_time, datetime.fromtimestamp(fill_time).astimezone(zoneNY), "at",i[1])
|
||||||
if BT_FILL_LOG_SURROUNDING_TRADES != 0:
|
if cfh.config_handler.get_val('BT_FILL_LOG_SURROUNDING_TRADES') != 0:
|
||||||
#TODO loguru
|
#TODO loguru
|
||||||
print("FILL SURR TRADES: before",work_range[index-BT_FILL_LOG_SURROUNDING_TRADES:index])
|
print("FILL SURR TRADES: before",work_range[index-cfh.config_handler.get_val('BT_FILL_LOG_SURROUNDING_TRADES'):index])
|
||||||
print("FILL SURR TRADES: fill and after",work_range[index:index+BT_FILL_LOG_SURROUNDING_TRADES])
|
print("FILL SURR TRADES: fill and after",work_range[index:index+cfh.config_handler.get_val('BT_FILL_LOG_SURROUNDING_TRADES')])
|
||||||
break
|
break
|
||||||
else:
|
else:
|
||||||
consec_cnt = 0
|
consec_cnt = 0
|
||||||
@ -275,17 +277,18 @@ class Backtester:
|
|||||||
#NASTVENI PODMINEK PLNENI
|
#NASTVENI PODMINEK PLNENI
|
||||||
fast_fill_condition = i[1] >= o.limit_price
|
fast_fill_condition = i[1] >= o.limit_price
|
||||||
slow_fill_condition = i[1] > o.limit_price
|
slow_fill_condition = i[1] > o.limit_price
|
||||||
if BT_FILL_CONDITION_SELL_LIMIT == FillCondition.FAST:
|
fill_conf_sell_cfg = cfh.config_handler.get_val('BT_FILL_CONDITION_SELL_LIMIT')
|
||||||
|
if fill_conf_sell_cfg == FillCondition.FAST:
|
||||||
fill_condition = fast_fill_condition
|
fill_condition = fast_fill_condition
|
||||||
elif BT_FILL_CONDITION_SELL_LIMIT == FillCondition.SLOW:
|
elif fill_conf_sell_cfg == FillCondition.SLOW:
|
||||||
fill_condition = slow_fill_condition
|
fill_condition = slow_fill_condition
|
||||||
else:
|
else:
|
||||||
print("unknown fill condition")
|
print("unknown fill condition")
|
||||||
return -1
|
return -1
|
||||||
|
|
||||||
if float(i[0]) > float(order_min_fill_time+BT_DELAYS.limit_order_offset) and fill_condition:
|
if float(i[0]) > float(order_min_fill_time+cfh.config_handler.get_val('BT_DELAYS','limit_order_offset')) and fill_condition:
|
||||||
consec_cnt += 1
|
consec_cnt += 1
|
||||||
if consec_cnt == BT_FILL_CONS_TRADES_REQUIRED:
|
if consec_cnt == cfh.config_handler.get_val('BT_FILL_CONS_TRADES_REQUIRED'):
|
||||||
#(1679081919.381649, 27.88)
|
#(1679081919.381649, 27.88)
|
||||||
#ic(i)
|
#ic(i)
|
||||||
fill_time = i[0]
|
fill_time = i[0]
|
||||||
@ -297,10 +300,11 @@ class Backtester:
|
|||||||
|
|
||||||
#fill_price = i[1]
|
#fill_price = i[1]
|
||||||
print("FILL LIMIT SELL at", fill_time, datetime.fromtimestamp(fill_time).astimezone(zoneNY), "at",i[1])
|
print("FILL LIMIT SELL at", fill_time, datetime.fromtimestamp(fill_time).astimezone(zoneNY), "at",i[1])
|
||||||
if BT_FILL_LOG_SURROUNDING_TRADES != 0:
|
surr_trades_cfg = cfh.config_handler.get_val('BT_FILL_LOG_SURROUNDING_TRADES')
|
||||||
|
if surr_trades_cfg != 0:
|
||||||
#TODO loguru
|
#TODO loguru
|
||||||
print("FILL SELL SURR TRADES: before",work_range[index-BT_FILL_LOG_SURROUNDING_TRADES:index])
|
print("FILL SELL SURR TRADES: before",work_range[index-surr_trades_cfg:index])
|
||||||
print("FILL SELL SURR TRADES: fill and after",work_range[index:index+BT_FILL_LOG_SURROUNDING_TRADES])
|
print("FILL SELL SURR TRADES: fill and after",work_range[index:index+surr_trades_cfg])
|
||||||
break
|
break
|
||||||
else:
|
else:
|
||||||
consec_cnt = 0
|
consec_cnt = 0
|
||||||
@ -314,11 +318,16 @@ class Backtester:
|
|||||||
#ic(i)
|
#ic(i)
|
||||||
fill_time = i[0]
|
fill_time = i[0]
|
||||||
fill_price = i[1]
|
fill_price = i[1]
|
||||||
#přičteme MARKET PREMIUM z konfigurace (do budoucna mozna rozdilne pro BUY/SELL a nebo mozna z konfigurace pro dany itutl)
|
#přičteme MARKET PREMIUM z konfigurace (je v pct nebo abs) (do budoucna mozna rozdilne pro BUY/SELL a nebo mozna z konfigurace pro dany titul)
|
||||||
|
cfg_premium = cfh.config_handler.get_val('BT_FILL_PRICE_MARKET_ORDER_PREMIUM')
|
||||||
|
if cfg_premium < 0: #configured as percentage
|
||||||
|
premium = abs(cfg_premium) * fill_price / 100.0
|
||||||
|
else: #configured as absolute value
|
||||||
|
premium = cfg_premium
|
||||||
if o.side == OrderSide.BUY:
|
if o.side == OrderSide.BUY:
|
||||||
fill_price = fill_price + BT_FILL_PRICE_MARKET_ORDER_PREMIUM
|
fill_price = fill_price + premium
|
||||||
elif o.side == OrderSide.SELL:
|
elif o.side == OrderSide.SELL:
|
||||||
fill_price = fill_price - BT_FILL_PRICE_MARKET_ORDER_PREMIUM
|
fill_price = fill_price - premium
|
||||||
|
|
||||||
print("FILL ",o.side,"MARKET at", fill_time, datetime.fromtimestamp(fill_time).astimezone(zoneNY), "cena", i[1])
|
print("FILL ",o.side,"MARKET at", fill_time, datetime.fromtimestamp(fill_time).astimezone(zoneNY), "cena", i[1])
|
||||||
break
|
break
|
||||||
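The new premium block above encodes a sign convention: a negative configured value is read as a percentage of the fill price, a non-negative one as an absolute amount (this reading follows directly from the diff). Isolated for clarity:

```python
def premium_from_config(cfg_premium: float, fill_price: float) -> float:
    # negative config value => percentage of fill price; otherwise absolute value
    if cfg_premium < 0:
        return abs(cfg_premium) * fill_price / 100.0
    return cfg_premium

assert premium_from_config(-0.5, 200.0) == 1.0   # 0.5 % of 200
assert premium_from_config(0.02, 200.0) == 0.02  # flat 2 cents
```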
@@ -367,7 +376,7 @@ class Backtester:
     def _do_notification_with_callbacks(self, tradeupdate: TradeUpdate, time: float):
 
         #the fill time (including notification latency) must be propagated into the callback, so that any actions inside the callback happen with this time
-        self.time = time + float(BT_DELAYS.fill_to_not)
+        self.time = time + float(cfh.config_handler.get_val('BT_DELAYS','fill_to_not'))
         print("current bt.time",self.time)
         #print("FILL NOTIFICATION: ", tradeupdate)
         res = asyncio.run(self.order_fill_callback(tradeupdate))
@@ -470,11 +479,11 @@ class Backtester:
         print("BT: submit order entry")
 
         if not time or time < 0:
-            print("time must be set")
+            printanyway("time must be set")
             return -1
 
         if not size or int(size) < 0:
-            print("size must be greater than 0")
+            printanyway("size must be greater than 0")
             return -1
 
         if (order_type != OrderType.MARKET) and (order_type != OrderType.LIMIT):
@@ -482,11 +491,11 @@ class Backtester:
             return -1
 
         if not side == OrderSide.BUY and not side == OrderSide.SELL:
-            print("side buy/sell required")
+            printanyway("side buy/sell required")
             return -1
 
         if order_type == OrderType.LIMIT and count_decimals(price) > 2:
-            print("only 2 decimals supported", price)
+            printanyway("only 2 decimals supported", price)
             return -1
 
         #if the key doesn't exist in the account yet, create it
@@ -508,14 +517,14 @@ class Backtester:
 
             actual_minus_reserved = int(self.account[symbol][0]) - reserved
             if actual_minus_reserved > 0 and actual_minus_reserved - int(size) < 0:
-                print("not enough shares available to sell or shorting while long position",self.account[symbol][0],"reserved",reserved,"available",int(self.account[symbol][0]) - reserved,"selling",size)
+                printanyway("not enough shares available to sell or shorting while long position",self.account[symbol][0],"reserved",reserved,"available",int(self.account[symbol][0]) - reserved,"selling",size)
                 return -1
 
             #if is shorting - check available cash to short
             if actual_minus_reserved <= 0:
                 cena = price if price else self.get_last_price(time, self.symbol)
                 if (self.cash - reserved_price - float(int(size)*float(cena))) < 0:
-                    print("not enough cash for shorting. cash",self.cash,"reserved",reserved,"available",self.cash-reserved,"needed",float(int(size)*float(cena)))
+                    printanyway("ERROR: not enough cash for shorting. cash",self.cash,"reserved",reserved,"available",self.cash-reserved,"needed",float(int(size)*float(cena)))
                     return -1
 
         #check for available cash
@@ -534,14 +543,14 @@ class Backtester:
 
             #this is closing a short
            if actual_plus_reserved_qty < 0 and (actual_plus_reserved_qty + int(size)) > 0:
-                print("the short position must be closed first before a buy; res_qty, size", actual_plus_reserved_qty, size)
+                printanyway("the short position must be closed first before a buy; res_qty, size", actual_plus_reserved_qty, size)
                 return -1
 
             #standard long, checking cash
             if actual_plus_reserved_qty >= 0:
                 cena = price if price else self.get_last_price(time, self.symbol)
                 if (self.cash - reserved_price - float(int(size)*float(cena))) < 0:
-                    print("not enough cash to buy long. cash",self.cash,"reserved_qty",reserved_qty,"reserved_price",reserved_price, "available",self.cash-reserved_price,"needed",float(int(size)*float(cena)))
+                    printanyway("ERROR: not enough cash to buy long. cash",self.cash,"reserved_qty",reserved_qty,"reserved_price",reserved_price, "available",self.cash-reserved_price,"needed",float(int(size)*float(cena)))
                     return -1
 
         id = str(uuid4())
@@ -568,11 +577,11 @@ class Backtester:
         print("BT: replace order entry",id,size,price)
 
         if not price and not size:
-            print("size or price required")
+            printanyway("size or price required")
             return -1
 
         if len(self.open_orders) == 0:
-            print("BT: order doesnt exist")
+            printanyway("BT: order doesnt exist")
             return 0
         #with lock:
         for o in self.open_orders:
@@ -600,7 +609,7 @@ class Backtester:
         """
         print("BT: cancel order entry",id)
         if len(self.open_orders) == 0:
-            print("BTC: order doesnt exist")
+            printanyway("BTC: order doesnt exist")
             return 0
         #with lock:
         for o in self.open_orders:
@@ -820,10 +829,10 @@ class Backtester:
             Trades:''' + str(len(self.trades)))
         textik8 = html.Div('''
             Profit:''' + str(state.profit))
-        textik9 = html.Div(f"{BT_FILL_CONS_TRADES_REQUIRED=}")
-        textik10 = html.Div(f"{BT_FILL_LOG_SURROUNDING_TRADES=}")
-        textik11 = html.Div(f"{BT_FILL_CONDITION_BUY_LIMIT=}")
-        textik12 = html.Div(f"{BT_FILL_CONDITION_SELL_LIMIT=}")
+        textik9 = html.Div(f"{cfh.config_handler.get_val('BT_FILL_CONS_TRADES_REQUIRED')=}")
+        textik10 = html.Div(f"{cfh.config_handler.get_val('BT_FILL_LOG_SURROUNDING_TRADES')=}")
+        textik11 = html.Div(f"{cfh.config_handler.get_val('BT_FILL_CONDITION_BUY_LIMIT')=}")
+        textik12 = html.Div(f"{cfh.config_handler.get_val('BT_FILL_CONDITION_SELL_LIMIT')=}")
 
         orders_title = dcc.Markdown('## Open orders')
         trades_title = dcc.Markdown('## Trades')
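The recurring change in this file swaps module-level config constants for `cfh.config_handler.get_val(...)` lookups that can be re-read at runtime. The handler itself is not part of this diff; a hypothetical minimal sketch of the access pattern it must support (both the one- and two-argument forms appear above; the real class lives in `v2realbot.utils.config_handler`):

```python
class ConfigHandlerSketch:
    # illustration only, not the actual v2realbot implementation
    def __init__(self, cfg: dict):
        self._cfg = cfg  # e.g. parsed from a TOML config file

    def get_val(self, section, key=None):
        # one-arg form: get_val('BT_FILL_CONS_TRADES_REQUIRED')
        # two-arg form: get_val('BT_DELAYS', 'sub_to_fill')
        val = self._cfg[section]
        return val if key is None else val[key]

config_handler = ConfigHandlerSketch({"BT_DELAYS": {"sub_to_fill": 0.004},
                                      "BT_FILL_CONS_TRADES_REQUIRED": 3})
assert config_handler.get_val("BT_DELAYS", "sub_to_fill") == 0.004
assert config_handler.get_val("BT_FILL_CONS_TRADES_REQUIRED") == 3
```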
@@ -1,11 +1,8 @@
-from v2realbot.config import DATA_DIR
 import sqlite3
 import queue
 import threading
 import time
-from v2realbot.common.model import RunArchive, RunArchiveView
+from v2realbot.config import DATA_DIR
-from datetime import datetime
-import orjson

 sqlite_db_file = DATA_DIR + "/v2trading.db"
 # Define the connection pool
@@ -31,7 +28,7 @@ class ConnectionPool:
         return connection


-def execute_with_retry(cursor: sqlite3.Cursor, statement: str, params = None, retry_interval: int = 1) -> sqlite3.Cursor:
+def execute_with_retry(cursor: sqlite3.Cursor, statement: str, params = None, retry_interval: int = 2) -> sqlite3.Cursor:
     """get connection from pool and execute SQL statement with retry logic if required.

     Args:
@@ -60,53 +57,4 @@ def execute_with_retry(cursor: sqlite3.Cursor, statement: str, params = None, re
 pool = ConnectionPool(10)
 #for one shared connection (used for writes only in WAL mode)
 insert_conn = sqlite3.connect(sqlite_db_file, check_same_thread=False)
 insert_queue = queue.Queue()

-#converts a row dict back into an object, including re-typing
-def row_to_runarchiveview(row: dict) -> RunArchiveView:
-    return RunArchive(
-        id=row['runner_id'],
-        strat_id=row['strat_id'],
-        batch_id=row['batch_id'],
-        symbol=row['symbol'],
-        name=row['name'],
-        note=row['note'],
-        started=datetime.fromisoformat(row['started']) if row['started'] else None,
-        stopped=datetime.fromisoformat(row['stopped']) if row['stopped'] else None,
-        mode=row['mode'],
-        account=row['account'],
-        bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
-        bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
-        ilog_save=bool(row['ilog_save']),
-        profit=float(row['profit']),
-        trade_count=int(row['trade_count']),
-        end_positions=int(row['end_positions']),
-        end_positions_avgp=float(row['end_positions_avgp']),
-        metrics=orjson.loads(row['metrics']) if row['metrics'] else None
-    )
-
-#converts a row dict back into an object, including re-typing
-def row_to_runarchive(row: dict) -> RunArchive:
-    return RunArchive(
-        id=row['runner_id'],
-        strat_id=row['strat_id'],
-        batch_id=row['batch_id'],
-        symbol=row['symbol'],
-        name=row['name'],
-        note=row['note'],
-        started=datetime.fromisoformat(row['started']) if row['started'] else None,
-        stopped=datetime.fromisoformat(row['stopped']) if row['stopped'] else None,
-        mode=row['mode'],
-        account=row['account'],
-        bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
-        bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
-        strat_json=orjson.loads(row['strat_json']),
-        settings=orjson.loads(row['settings']),
-        ilog_save=bool(row['ilog_save']),
-        profit=float(row['profit']),
-        trade_count=int(row['trade_count']),
-        end_positions=int(row['end_positions']),
-        end_positions_avgp=float(row['end_positions_avgp']),
-        metrics=orjson.loads(row['metrics']),
-        stratvars_toml=row['stratvars_toml']
-    )
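*Editor's note:* `execute_with_retry` becomes the shared write path (it is used by `run_manager.py` further below). A minimal usage sketch; the table and values are illustrative only, the signature is from `db.py` above:

```python
# Minimal sketch: retrying a parameterized write through the shared pool.
conn = pool.get_connection()
try:
    cursor = conn.cursor()
    execute_with_retry(cursor,
                       'UPDATE run_manager SET note = ? WHERE id = ?',
                       ('nightly batch', 'some-record-id'),   # illustrative values
                       retry_interval=2)
    conn.commit()
finally:
    pool.release_connection(conn)
```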
@@ -1,13 +1,16 @@
-from uuid import UUID
+from uuid import UUID, uuid4
 from alpaca.trading.enums import OrderSide, OrderStatus, TradeEvent,OrderType
 #from utils import AttributeDict
 from rich import print
 from typing import Any, Optional, List, Union
 from datetime import datetime, date
-from pydantic import BaseModel
+from pydantic import BaseModel, Field
-from v2realbot.enums.enums import Mode, Account
+from v2realbot.enums.enums import Mode, Account, SchedulerStatus, Moddus, Market
 from alpaca.data.enums import Exchange

 #models for server side datatables
 # Model for individual column data
 class ColumnData(BaseModel):
@@ -91,12 +94,12 @@ class TestList(BaseModel):
 class Trade(BaseModel):
     symbol: str
     timestamp: datetime
-    exchange: Optional[Union[Exchange, str]]
+    exchange: Optional[Union[Exchange, str]] = None
     price: float
     size: float
     id: int
-    conditions: Optional[List[str]]
+    conditions: Optional[List[str]] = None
-    tape: Optional[str]
+    tape: Optional[str] = None

 #persisted object in pickle
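*Editor's note:* this hunk, and the many like it below, adapts the models to pydantic V2, where an `Optional[X]` annotation no longer implies a default of `None`: without an explicit default the field becomes required (it may still hold `None`, but it must be passed). A two-line illustration, not taken from the codebase:

```python
from typing import Optional
from pydantic import BaseModel

class M(BaseModel):
    tape: Optional[str]              # pydantic V2: still a *required* field
    conditions: Optional[str] = None # optional in both V1 and V2

M(tape=None)   # ok: value supplied explicitly
# M()          # would raise ValidationError in V2: field 'tape' is required
```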
@@ -111,8 +114,20 @@ class StrategyInstance(BaseModel):
     close_rush: int = 0
     stratvars_conf: str
     add_data_conf: str
-    note: Optional[str]
+    note: Optional[str] = None
-    history: Optional[str]
+    history: Optional[str] = None

+    def __setstate__(self, state: dict[Any, Any]) -> None:
+        """
+        Hack to allow unpickling models stored from pydantic V1
+        """
+        state.setdefault("__pydantic_extra__", {})
+        state.setdefault("__pydantic_private__", {})
+
+        if "__pydantic_fields_set__" not in state:
+            state["__pydantic_fields_set__"] = state.get("__fields_set__")
+
+        super().__setstate__(state)
+
 class RunRequest(BaseModel):
     id: UUID
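*Editor's note:* the `__setstate__` shim above matters because `StrategyInstance` is persisted in pickle (see the "#persisted object in pickle" comment), and objects written under pydantic V1 lack V2's internal slots. A hedged sketch of the load path it enables; the file name is illustrative:

```python
import pickle

# Illustrative only: loading a StrategyInstance pickled under pydantic V1.
# __setstate__ backfills __pydantic_extra__ / __pydantic_private__ and maps
# V1's __fields_set__ onto V2's __pydantic_fields_set__ before delegating up.
with open("stratin.pickle", "rb") as f:   # hypothetical path
    strat = pickle.load(f)
print(strat.symbol)   # fields behave normally after the shim runs
```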
@@ -122,8 +137,8 @@ class RunRequest(BaseModel):
     debug: bool = False
     strat_json: Optional[str] = None
     ilog_save: bool = False
-    bt_from: datetime = None
+    bt_from: Optional[datetime] = None
-    bt_to: datetime = None
+    bt_to: Optional[datetime] = None
     #weekdays filter
     #if specified, only these days are run
     weekdays_filter: Optional[list] = None
@@ -134,7 +149,34 @@ class RunRequest(BaseModel):
     cash: int = 100000
     skip_cache: Optional[bool] = False

+#Class layered on top of RunRequest and used by the scheduler; it only adds a few extra fields
+class RunManagerRecord(BaseModel):
+    moddus: Moddus
+    id: UUID = Field(default_factory=uuid4)
+    strat_id: UUID
+    symbol: Optional[str] = None
+    account: Account
+    mode: Mode
+    note: Optional[str] = None
+    ilog_save: bool = False
+    market: Optional[Market] = Market.US
+    bt_from: Optional[datetime] = None
+    bt_to: Optional[datetime] = None
+    #weekdays filter
+    #if specified, only these days are run
+    weekdays_filter: Optional[list] = None #list of strings 0-6 representing days to run
+    #GENERATED ID within a run; ties together all runners of a batch run
+    batch_id: Optional[str] = None
+    testlist_id: Optional[str] = None
+    start_time: str #time (HH:MM) that start function is called
+    stop_time: Optional[str] = None #time (HH:MM) that stop function is called
+    status: SchedulerStatus
+    last_processed: Optional[datetime] = None
+    history: Optional[str] = None
+    valid_from: Optional[datetime] = None # US East time zone datetime
+    valid_to: Optional[datetime] = None # US East time zone datetime
+    runner_id: Optional[UUID] = None #last runner_id from scheduler after the strategy is started
+    strat_running: Optional[bool] = None #automatically derived from the status of runner_id above; filled in by row_to_runmanager
 class RunnerView(BaseModel):
     id: UUID
     strat_id: UUID
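*Editor's note:* for orientation, a hedged construction of the new record. `Moddus("schedule")` and `SchedulerStatus("active")` are assumptions inferred from the `moddus = 'schedule'` / `status = 'active'` filters in the scheduler queries further below; the `Account` member value is invented:

```python
from uuid import uuid4

rec = RunManagerRecord(
    moddus=Moddus("schedule"),         # assumed enum value, see scheduler SQL below
    strat_id=uuid4(),
    account=Account("ACCOUNT1"),       # hypothetical member value
    mode=Mode.PAPER,                   # PAPER/LIVE are confirmed by get_key() in config.py
    start_time="09:30",                # HH:MM, validated by validate_and_format_time
    stop_time="15:55",
    status=SchedulerStatus("active"),  # assumed enum value
)
print(rec.id, rec.market)              # id auto-generated; market defaults to Market.US
```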
@@ -164,10 +206,10 @@ class Runner(BaseModel):
     run_name: Optional[str] = None
     run_note: Optional[str] = None
     run_ilog_save: Optional[bool] = False
-    run_trade_count: Optional[int]
+    run_trade_count: Optional[int] = None
-    run_profit: Optional[float]
+    run_profit: Optional[float] = None
-    run_positions: Optional[int]
+    run_positions: Optional[int] = None
-    run_avgp: Optional[float]
+    run_avgp: Optional[float] = None
     run_strat_json: Optional[str] = None
     run_stopped: Optional[datetime] = None
     run_paused: Optional[datetime] = None
@@ -201,41 +243,41 @@ class Bar(BaseModel):
     low: float
     close: float
     volume: float
-    trade_count: Optional[float]
+    trade_count: Optional[float] = 0
-    vwap: Optional[float]
+    vwap: Optional[float] = 0

 class Order(BaseModel):
     id: UUID
     submitted_at: datetime
-    filled_at: Optional[datetime]
+    filled_at: Optional[datetime] = None
-    canceled_at: Optional[datetime]
+    canceled_at: Optional[datetime] = None
     symbol: str
     qty: int
     status: OrderStatus
     order_type: OrderType
-    filled_qty: Optional[int]
+    filled_qty: Optional[int] = None
-    filled_avg_price: Optional[float]
+    filled_avg_price: Optional[float] = None
     side: OrderSide
-    limit_price: Optional[float]
+    limit_price: Optional[float] = None

 #entity for every complete FILL; it is linked to the prescribed_trade
 class TradeUpdate(BaseModel):
     event: Union[TradeEvent, str]
-    execution_id: Optional[UUID]
+    execution_id: Optional[UUID] = None
     order: Order
     timestamp: datetime
-    position_qty: Optional[float]
+    position_qty: Optional[float] = None
-    price: Optional[float]
+    price: Optional[float] = None
-    qty: Optional[float]
+    qty: Optional[float] = None
-    value: Optional[float]
+    value: Optional[float] = None
-    cash: Optional[float]
+    cash: Optional[float] = None
-    pos_avg_price: Optional[float]
+    pos_avg_price: Optional[float] = None
-    profit: Optional[float]
+    profit: Optional[float] = None
-    profit_sum: Optional[float]
+    profit_sum: Optional[float] = None
-    rel_profit: Optional[float]
+    rel_profit: Optional[float] = None
-    rel_profit_cum: Optional[float]
+    rel_profit_cum: Optional[float] = None
-    signal_name: Optional[str]
+    signal_name: Optional[str] = None
-    prescribed_trade_id: Optional[str]
+    prescribed_trade_id: Optional[str] = None


 class RunArchiveChange(BaseModel):
@@ -260,8 +302,7 @@ class RunArchive(BaseModel):
     bt_from: Optional[datetime] = None
     bt_to: Optional[datetime] = None
     strat_json: Optional[str] = None
-    ##to be decommissioned, replaced by stratvars_toml
-    stratvars: Optional[dict] = None
+    transferables: Optional[dict] = None #variables that are transferable to the next run
     settings: Optional[dict] = None
     ilog_save: Optional[bool] = False
     profit: float = 0
@@ -291,6 +332,8 @@ class RunArchiveView(BaseModel):
     end_positions: int = 0
     end_positions_avgp: float = 0
     metrics: Union[dict, str] = None
+    batch_profit: float = 0 # Total profit for the batch - now calculated during query
+    batch_count: int = 0 # Count of runs in the batch - now calculated during query

 #same but with pagination
 class RunArchiveViewPagination(BaseModel):
@@ -301,7 +344,7 @@ class RunArchiveViewPagination(BaseModel):

 #class for storing the stoploss history into ext_data
 class SLHistory(BaseModel):
-    id: Optional[UUID]
+    id: Optional[UUID] = None
     time: datetime
     sl_val: float

@@ -314,7 +357,7 @@ class RunArchiveDetail(BaseModel):
     indicators: List[dict]
     statinds: dict
     trades: List[TradeUpdate]
-    ext_data: Optional[dict]
+    ext_data: Optional[dict] = None


 class InstantIndicator(BaseModel):
v2realbot/common/transform.py (new file, +87 lines)
@@ -0,0 +1,87 @@
from v2realbot.common.model import RunArchive, RunArchiveView, RunManagerRecord
from datetime import datetime
import orjson
import v2realbot.controller.services as cs

#converts a row dict back into an object, including re-typing
def row_to_runmanager(row: dict) -> RunManagerRecord:
    is_running = cs.is_runner_running(row['runner_id']) if row['runner_id'] else False
    res = RunManagerRecord(
        moddus=row['moddus'],
        id=row['id'],
        strat_id=row['strat_id'],
        symbol=row['symbol'],
        mode=row['mode'],
        account=row['account'],
        note=row['note'],
        ilog_save=bool(row['ilog_save']),
        market=row['market'] if row['market'] is not None else None,
        bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
        bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
        weekdays_filter=[int(x) for x in row['weekdays_filter'].split(',')] if row['weekdays_filter'] else [],
        batch_id=row['batch_id'],
        testlist_id=row['testlist_id'],
        start_time=row['start_time'],
        stop_time=row['stop_time'],
        status=row['status'],
        #last_started=zoneNY.localize(datetime.fromisoformat(row['last_started'])) if row['last_started'] else None,
        last_processed=datetime.fromisoformat(row['last_processed']) if row['last_processed'] else None,
        history=row['history'],
        valid_from=datetime.fromisoformat(row['valid_from']) if row['valid_from'] else None,
        valid_to=datetime.fromisoformat(row['valid_to']) if row['valid_to'] else None,
        runner_id = row['runner_id'] if row['runner_id'] and is_running else None, #runner_id is only present if it is running
        strat_running = is_running) #beware: this flag may not be current when called from a separate process
    return res

#converts a row dict back into an object, including re-typing
def row_to_runarchiveview(row: dict) -> RunArchiveView:
    a = RunArchiveView(
        id=row['runner_id'],
        strat_id=row['strat_id'],
        batch_id=row['batch_id'],
        symbol=row['symbol'],
        name=row['name'],
        note=row['note'],
        started=datetime.fromisoformat(row['started']) if row['started'] else None,
        stopped=datetime.fromisoformat(row['stopped']) if row['stopped'] else None,
        mode=row['mode'],
        account=row['account'],
        bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
        bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
        ilog_save=bool(row['ilog_save']),
        profit=float(row['profit']),
        trade_count=int(row['trade_count']),
        end_positions=int(row['end_positions']),
        end_positions_avgp=float(row['end_positions_avgp']),
        metrics=orjson.loads(row['metrics']) if row['metrics'] else None,
        batch_profit=int(row['batch_profit']) if row['batch_profit'] and row['batch_id'] else 0,
        batch_count=int(row['batch_count']) if row['batch_count'] and row['batch_id'] else 0,
    )
    return a

#converts a row dict back into an object, including re-typing
def row_to_runarchive(row: dict) -> RunArchive:
    return RunArchive(
        id=row['runner_id'],
        strat_id=row['strat_id'],
        batch_id=row['batch_id'],
        symbol=row['symbol'],
        name=row['name'],
        note=row['note'],
        started=datetime.fromisoformat(row['started']) if row['started'] else None,
        stopped=datetime.fromisoformat(row['stopped']) if row['stopped'] else None,
        mode=row['mode'],
        account=row['account'],
        bt_from=datetime.fromisoformat(row['bt_from']) if row['bt_from'] else None,
        bt_to=datetime.fromisoformat(row['bt_to']) if row['bt_to'] else None,
        strat_json=orjson.loads(row['strat_json']),
        settings=orjson.loads(row['settings']),
        ilog_save=bool(row['ilog_save']),
        profit=float(row['profit']),
        trade_count=int(row['trade_count']),
        end_positions=int(row['end_positions']),
        end_positions_avgp=float(row['end_positions_avgp']),
        metrics=orjson.loads(row['metrics']),
        stratvars_toml=row['stratvars_toml'],
        transferables=orjson.loads(row['transferables']) if row['transferables'] else None
    )
@@ -2,66 +2,41 @@ from alpaca.data.enums import DataFeed
 from v2realbot.enums.enums import Mode, Account, FillCondition
 from appdirs import user_data_dir
 from pathlib import Path
+import os
+from collections import defaultdict
+from dotenv import load_dotenv
+# Global flag to track if the ml module has been imported (solution for long import times of tensorflow)
+#the first occurrence of using it will load it globally
+_ml_module_loaded = False
+
 #directory for generated images and basic reports
 MEDIA_DIRECTORY = Path(__file__).parent.parent.parent / "media"
 RUNNER_DETAIL_DIRECTORY = Path(__file__).parent.parent.parent / "runner_detail"

 #location of strat.log - it is used to fetch by gui
+LOG_PATH = Path(__file__).parent.parent
 LOG_FILE = Path(__file__).parent.parent / "strat.log"
+JOB_LOG_FILE = Path(__file__).parent.parent / "job.log"
+DOTENV_DIRECTORY = Path(__file__).parent.parent.parent
+ENV_FILE = DOTENV_DIRECTORY / '.env'

-#'0.0.0.0',
-#currently only the prod server has access to LIVE
-PROD_SERVER_HOSTNAMES = ['tradingeastcoast','David-MacBook-Pro.local'] #,'David-MacBook-Pro.local'
-TEST_SERVER_HOSTNAMES = ['tradingtest']
-
-#TODO move selected values into the config db and manage them via the GUI
-
-#DEFAULT AGGREGATOR filter trades
-#NOTE added F - Inter Market Sweep Order - it occasionally created spikes
-AGG_EXCLUDED_TRADES = ['C','O','4','B','7','V','P','W','U','Z','F']
-
-OFFLINE_MODE = False
-
-# ilog lvls = 0,1 - 0 debug, 1 info
-ILOG_SAVE_LEVEL_FROM = 1
-
-#minimum spacing between trades that the aggregator lets through for CBAR (0.001 - blocks anything under 1 ms)
-GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN = 0.003
-#normalized price for tick 0.01
-NORMALIZED_TICK_BASE_PRICE = 30.00
-LOG_RUNNER_EVENTS = False
-#no print in console
-QUIET_MODE = True
-#how many consecutive trades with the fill price are necessary for LIMIT fill to happen in backtesting
-#0 - optimistic, every knot high will fill the order
-#N - N consecutive trades required
-#not impl.yet
-#minimum is 1; on Alpaca live it usually comes out to 7-8 for BAC, very close to the price being beaten by half a cent, i.e. 7-8 or FillCondition.SLOW
-BT_FILL_CONS_TRADES_REQUIRED = 2
-#during bt trade execution logs X-surrounding trades of the one that triggers the fill
-BT_FILL_LOG_SURROUNDING_TRADES = 10
-#fill condition for limit order in bt
-# fast - price has to be equal or bigger <=
-# slow - price has to be bigger <
-BT_FILL_CONDITION_BUY_LIMIT = FillCondition.SLOW
-BT_FILL_CONDITION_SELL_LIMIT = FillCondition.SLOW
-#TBD TODO not implemented yet
-BT_FILL_PRICE_MARKET_ORDER_PREMIUM = 0.005
-#backend counter of api requests
-COUNT_API_REQUESTS = False
 #stratvars that cannot be changed in gui
 STRATVARS_UNCHANGEABLES = ['pendingbuys', 'blockbuy', 'jevylozeno', 'limitka']
-DATA_DIR = user_data_dir("v2realbot")
+DATA_DIR = user_data_dir("v2realbot", False)
 MODEL_DIR = Path(DATA_DIR)/"models"
 #BT DELAYS
 #profiling
 PROFILING_NEXT_ENABLED = False
 PROFILING_OUTPUT_DIR = DATA_DIR

-#FILL CONFIGURATION CLASS FOR BACKTESTING
+#LOAD DOTENV ENV VARIABLES
+if load_dotenv(ENV_FILE, verbose=True) is False:
+    raise Exception(f"Error loading .env file {ENV_FILE}")
+else:
+    print(f"Loaded env variables from file {ENV_FILE}")
+
-#WIP
+#WIP - FILL CONFIGURATION CLASS FOR BACKTESTING
 class BT_FILL_CONF:
     """"
     Class configuring backtest fills for a given symbol; if missing, there is a fallback to the generic settings above
@@ -75,24 +50,6 @@ class BT_FILL_CONF:
         self.BT_FILL_CONDITION_SELL_LIMIT=BT_FILL_CONDITION_SELL_LIMIT
         self.BT_FILL_PRICE_MARKET_ORDER_PREMIUM=BT_FILL_PRICE_MARKET_ORDER_PREMIUM

-""""
-LATENCY DELAYS for LIVE eastcoast
-.000 trigger - last_trade_time (.4246266)
-+.020 entry into the strategy and BUY (.444606)
-+.023 submitted (.469198)
-+.008 filled (.476695552)
-+.023 fill notification (.499888)
-"""
-#TODO rename the delay variables to be more descriptive and general
-class BT_DELAYS:
-    trigger_to_strat: float = 0.020
-    strat_to_sub: float = 0.023
-    sub_to_fill: float = 0.008
-    fill_to_not: float = 0.023
-    #to be filled in according to live measurements
-    limit_order_offset: float = 0

 class Keys:
     def __init__(self, api_key, secret_key, paper, feed) -> None:
         self.API_KEY = api_key
@@ -101,7 +58,8 @@ class Keys:
         self.FEED = feed

 # depending on the mode (PAPER, LIVE) returns an object
-# containing the keys for connecting to Alpaca
+# containing the keys for connecting to Alpaca - used for the Trading API and order-update websockets (credentials are relevant per strategy)
+#for real-time data, LIVE_DATA_API_KEY, LIVE_DATA_SECRET_KEY and LIVE_DATA_FEED below are used instead, since that is a server-wide setting
 def get_key(mode: Mode, account: Account):
     if mode not in [Mode.PAPER, Mode.LIVE]:
         print("has to be LIVE or PAPER only")
@@ -120,36 +78,85 @@ def get_key(mode: Mode, account: Account):
 #strategy instance main loop heartbeat
 HEARTBEAT_TIMEOUT=5

-WEB_API_KEY="david"
+WEB_API_KEY=os.environ.get('WEB_API_KEY')

 #PRIMARY PAPER
-ACCOUNT1_PAPER_API_KEY = 'PKGGEWIEYZOVQFDRY70L'
+ACCOUNT1_PAPER_API_KEY = os.environ.get('ACCOUNT1_PAPER_API_KEY')
-ACCOUNT1_PAPER_SECRET_KEY = 'O5Kt8X4RLceIOvM98i5LdbalItsX7hVZlbPYHy8Y'
+ACCOUNT1_PAPER_SECRET_KEY = os.environ.get('ACCOUNT1_PAPER_SECRET_KEY')
 ACCOUNT1_PAPER_MAX_BATCH_SIZE = 1
 ACCOUNT1_PAPER_PAPER = True
-ACCOUNT1_PAPER_FEED = DataFeed.SIP
+#ACCOUNT1_PAPER_FEED = DataFeed.SIP
+
+# Load the data feed type from environment variable
+data_feed_type_str = os.environ.get('ACCOUNT1_PAPER_FEED', 'iex') # Default to 'iex' if not set
+
+# Convert the string to DataFeed enum
+try:
+    ACCOUNT1_PAPER_FEED = DataFeed(data_feed_type_str)
+except ValueError:
+    # Handle the case where the environment variable does not match any enum member
+    print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT1_PAPER_FEED, defaulting to 'iex'")
+    ACCOUNT1_PAPER_FEED = DataFeed.IEX
+
 #PRIMARY LIVE
-ACCOUNT1_LIVE_API_KEY = 'AKB5HD32LPDZC9TPUWJT'
+ACCOUNT1_LIVE_API_KEY = os.environ.get('ACCOUNT1_LIVE_API_KEY')
-ACCOUNT1_LIVE_SECRET_KEY = 'Xq1wPSNOtwmlMTAd4cEmdKvNDgfcUYfrOaCccaAs'
+ACCOUNT1_LIVE_SECRET_KEY = os.environ.get('ACCOUNT1_LIVE_SECRET_KEY')
 ACCOUNT1_LIVE_MAX_BATCH_SIZE = 1
 ACCOUNT1_LIVE_PAPER = False
-ACCOUNT1_LIVE_FEED = DataFeed.SIP
+#ACCOUNT1_LIVE_FEED = DataFeed.SIP
+
+# Load the data feed type from environment variable
+data_feed_type_str = os.environ.get('ACCOUNT1_LIVE_FEED', 'iex') # Default to 'iex' if not set
+
+# Convert the string to DataFeed enum
+try:
+    ACCOUNT1_LIVE_FEED = DataFeed(data_feed_type_str)
+except ValueError:
+    # Handle the case where the environment variable does not match any enum member
+    print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT1_LIVE_FEED, defaulting to 'iex'")
+    ACCOUNT1_LIVE_FEED = DataFeed.IEX
+
 #SECONDARY PAPER - Martin
-ACCOUNT2_PAPER_API_KEY = 'PKPDTCQLNHCBC2D9GQFB'
+ACCOUNT2_PAPER_API_KEY = os.environ.get('ACCOUNT2_PAPER_API_KEY')
-ACCOUNT2_PAPER_SECRET_KEY = 'c1Z2V0gBleQmwHYCreqqTs45Jy33RqPGrofuSayz'
+ACCOUNT2_PAPER_SECRET_KEY = os.environ.get('ACCOUNT2_PAPER_SECRET_KEY')
 ACCOUNT2_PAPER_MAX_BATCH_SIZE = 1
 ACCOUNT2_PAPER_PAPER = True
-ACCOUNT2_PAPER_FEED = DataFeed.IEX
+#ACCOUNT2_PAPER_FEED = DataFeed.IEX

-# #SECONDARY PAPER
-# ACCOUNT2_PAPER_API_KEY = 'PK0OQHZG03PUZ1SC560V'
-# ACCOUNT2_PAPER_SECRET_KEY = 'cTglhm7kwRcZfFT27fQWz31sXaxadzQApFDW6Lat'
-# ACCOUNT2_PAPER_MAX_BATCH_SIZE = 1
-# ACCOUNT2_PAPER_PAPER = True
-# ACCOUNT2_PAPER_FEED = DataFeed.IEX
+# Load the data feed type from environment variable
+data_feed_type_str = os.environ.get('ACCOUNT2_PAPER_FEED', 'iex') # Default to 'iex' if not set
+
+# Convert the string to DataFeed enum
+try:
+    ACCOUNT2_PAPER_FEED = DataFeed(data_feed_type_str)
+except ValueError:
+    # Handle the case where the environment variable does not match any enum member
+    print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT2_PAPER_FEED, defaulting to 'iex'")
+    ACCOUNT2_PAPER_FEED = DataFeed.IEX
+
+#SECONDARY LIVE - Martin
+# ACCOUNT2_LIVE_API_KEY = os.environ.get('ACCOUNT2_LIVE_API_KEY')
+# ACCOUNT2_LIVE_SECRET_KEY = os.environ.get('ACCOUNT2_LIVE_SECRET_KEY')
+# ACCOUNT2_LIVE_MAX_BATCH_SIZE = 1
+# ACCOUNT2_LIVE_PAPER = True
+# #ACCOUNT2_LIVE_FEED = DataFeed.IEX
+
+# # Load the data feed type from environment variable
+# data_feed_type_str = os.environ.get('ACCOUNT2_LIVE_FEED', 'iex') # Default to 'iex' if not set
+
+# # Convert the string to DataFeed enum
+# try:
+#     ACCOUNT2_LIVE_FEED = DataFeed(data_feed_type_str)
+# except ValueError:
+#     # Handle the case where the environment variable does not match any enum member
+#     print(f"Invalid data feed type: {data_feed_type_str} in ACCOUNT2_LIVE_FEED, defaulting to 'iex'")
+#     ACCOUNT2_LIVE_FEED = DataFeed.IEX
+
+#for now, LIVE_DATA is set from account1_paper
+LIVE_DATA_API_KEY = ACCOUNT1_PAPER_API_KEY
+LIVE_DATA_SECRET_KEY = ACCOUNT1_PAPER_SECRET_KEY
+#LIVE_DATA_FEED is set in the config_handler

 class KW:
     activate: str = "activate"
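*Editor's note:* the four near-identical try/except feed-parsing blocks above could collapse into one helper. A hedged refactor sketch; the function name is invented here and is not part of the codebase, while `DataFeed` is the `alpaca.data.enums.DataFeed` enum already imported at the top of `config.py` (its members carry lowercase string values such as `"iex"` and `"sip"`):

```python
# Hypothetical helper collapsing the repeated feed-parsing blocks.
import os
from alpaca.data.enums import DataFeed

def feed_from_env(var_name: str, default: str = "iex") -> DataFeed:
    raw = os.environ.get(var_name, default)
    try:
        return DataFeed(raw)
    except ValueError:
        # env var did not match any enum member value
        print(f"Invalid data feed type: {raw} in {var_name}, defaulting to {default!r}")
        return DataFeed(default)

# ACCOUNT1_PAPER_FEED = feed_from_env('ACCOUNT1_PAPER_FEED')
# ACCOUNT1_LIVE_FEED = feed_from_env('ACCOUNT1_LIVE_FEED')
```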
v2realbot/controller/configs.py (new file, +112 lines)
@@ -0,0 +1,112 @@

import v2realbot.common.db as db
from v2realbot.common.model import ConfigItem
import v2realbot.utils.config_handler as ch

# region CONFIG db services
#TODO create a module for fetching values from Python (get_from_config(var_name, def_value)) - same as in js
#TODO consider moving from JSON to TOML
def get_all_config_items():
    conn = db.pool.get_connection()
    try:
        cursor = conn.cursor()
        cursor.execute('SELECT id, item_name, json_data FROM config_table')
        config_items = [{"id": row[0], "item_name": row[1], "json_data": row[2]} for row in cursor.fetchall()]
    finally:
        db.pool.release_connection(conn)
    return 0, config_items

# Function to get a config item by ID
def get_config_item_by_id(item_id):
    conn = db.pool.get_connection()
    try:
        cursor = conn.cursor()
        cursor.execute('SELECT item_name, json_data FROM config_table WHERE id = ?', (item_id,))
        row = cursor.fetchone()
    finally:
        db.pool.release_connection(conn)
    if row is None:
        return -2, "not found"
    else:
        return 0, {"item_name": row[0], "json_data": row[1]}

# Function to get a config item by name
def get_config_item_by_name(item_name):
    #print(item_name)
    conn = db.pool.get_connection()
    try:
        cursor = conn.cursor()
        #parameterized to avoid SQL injection via item_name
        query = "SELECT item_name, json_data FROM config_table WHERE item_name = ?"
        #print(query)
        cursor.execute(query, (item_name,))
        row = cursor.fetchone()
        #print(row)
    finally:
        db.pool.release_connection(conn)
    if row is None:
        return -2, "not found"
    else:
        return 0, {"item_name": row[0], "json_data": row[1]}

# Function to create a new config item
def create_config_item(config_item: ConfigItem):
    conn = db.pool.get_connection()
    try:
        try:
            cursor = conn.cursor()
            cursor.execute('INSERT INTO config_table (item_name, json_data) VALUES (?, ?)', (config_item.item_name, config_item.json_data))
            item_id = cursor.lastrowid
            conn.commit()
            print(item_id)
        finally:
            db.pool.release_connection(conn)

        return 0, {"id": item_id, "item_name":config_item.item_name, "json_data":config_item.json_data}
    except Exception as e:
        return -2, str(e)

# Function to update a config item by ID
def update_config_item(item_id, config_item: ConfigItem):
    conn = db.pool.get_connection()
    try:
        try:
            cursor = conn.cursor()
            cursor.execute('UPDATE config_table SET item_name = ?, json_data = ? WHERE id = ?', (config_item.item_name, config_item.json_data, item_id))
            conn.commit()

            #refreshing the active item is for now hard-wired: done on update of the "active_profile" item and at application startup
            if config_item.item_name == "active_profile":
                ch.config_handler.activate_profile()
        finally:
            db.pool.release_connection(conn)
        return 0, {"id": item_id, **config_item.dict()}
    except Exception as e:
        return -2, str(e)

# Function to delete a config item by ID
def delete_config_item(item_id):
    conn = db.pool.get_connection()
    try:
        cursor = conn.cursor()
        cursor.execute('DELETE FROM config_table WHERE id = ?', (item_id,))
        conn.commit()
    finally:
        db.pool.release_connection(conn)
    return 0, {"id": item_id}

# endregion

#Example of using config directive
# config_directive = "overrides"
# ret, res = get_config_item_by_name(config_directive)
# if ret < 0:
#     print(f"CONFIG OVERRIDE {config_directive} Error {res}")
# else:
#     config = orjson.loads(res["json_data"])

#     print("OVERRIDDEN CFG:", config)
#     for key, value in config.items():
#         if hasattr(cfg, key):
#             print(f"Overriding {key} with {value}")
#             setattr(cfg, key, value)
v2realbot/controller/run_manager.py (new file, +463 lines)
@@ -0,0 +1,463 @@
from typing import Any, List, Tuple
from uuid import UUID, uuid4
from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunDay, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest
from v2realbot.utils.utils import validate_and_format_time, AttributeDict, zoneNY, zonePRG, safe_get, dict_replace_value, Store, parse_toml_string, json_serial, is_open_hours, send_to_telegram, concatenate_weekdays, transform_data
from v2realbot.utils.ilog import delete_logs
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus, TradeStoplossType
from datetime import datetime
from v2realbot.loader.trade_offline_streamer import Trade_Offline_Streamer
from threading import Thread, current_thread, Event, enumerate
from v2realbot.config import STRATVARS_UNCHANGEABLES, ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, DATA_DIR, MEDIA_DIRECTORY, RUNNER_DETAIL_DIRECTORY
import importlib
from alpaca.trading.requests import GetCalendarRequest
from alpaca.trading.client import TradingClient
#from alpaca.trading.models import Calendar
from queue import Queue
from tinydb import TinyDB, Query, where
from tinydb.operations import set
import orjson
import numpy as np
from rich import print
import pandas as pd
from traceback import format_exc
from datetime import timedelta, time
from threading import Lock
import v2realbot.common.db as db
import v2realbot.common.transform as tr
from sqlite3 import OperationalError, Row
import v2realbot.strategyblocks.indicators.custom as ci
from v2realbot.strategyblocks.inits.init_indicators import initialize_dynamic_indicators
from v2realbot.strategyblocks.indicators.indicators_hub import populate_dynamic_indicators
from v2realbot.interfaces.backtest_interface import BacktestInterface
import os
import v2realbot.reporting.metricstoolsimage as mt
import gzip
import msgpack
import v2realbot.controller.services as cs
import v2realbot.scheduler.ap_scheduler as aps

# Functions for your 'run_manager' table

# CREATE TABLE "run_manager" (
#     "moddus" TEXT NOT NULL,
#     "id" varchar(32),
#     "strat_id" varchar(32) NOT NULL,
#     "symbol" TEXT,
#     "account" TEXT NOT NULL,
#     "mode" TEXT NOT NULL,
#     "note" TEXT,
#     "ilog_save" BOOLEAN,
#     "bt_from" TEXT,
#     "bt_to" TEXT,
#     "weekdays_filter" TEXT,
#     "batch_id" TEXT,
#     "start_time" TEXT NOT NULL,
#     "stop_time" TEXT NOT NULL,
#     "status" TEXT NOT NULL,
#     "last_processed" TEXT,
#     "history" TEXT,
#     "valid_from" TEXT,
#     "valid_to" TEXT,
#     "testlist_id" TEXT,
#     "runner_id" varchar2(32),
#     PRIMARY KEY("id")
# )

# CREATE INDEX idx_moddus ON run_manager (moddus);
# CREATE INDEX idx_status ON run_manager (status);
# CREATE INDEX idx_status_moddus ON run_manager (status, moddus);
# CREATE INDEX idx_valid_from_to ON run_manager (valid_from, valid_to);
# CREATE INDEX idx_stopped_batch_id ON runner_header (stopped, batch_id);
# CREATE INDEX idx_search_value ON runner_header (strat_id, batch_id);

##weekdays are stored as comma separated values
# Fetching (assume 'weekdays' field is a comma-separated string)
# weekday_str = record['weekdays']
# weekdays = [int(x) for x in weekday_str.split(',')]

# # ... logic to check whether today's weekday is in 'weekdays'

# # Storing
# weekdays = [1, 2, 5] # Example
# weekday_str = ",".join(str(x) for x in weekdays)
# update_data = {'weekdays': weekday_str}
# # ... use in an SQL UPDATE statement

# for row in records:
#     row['weekdays_filter'] = [int(x) for x in row['weekdays_filter'].split(',')] if row['weekdays_filter'] else []

#get stratin info return
# strat : StrategyInstance = None
# result, strat = cs.get_stratin("625760ac-6376-47fa-8989-1e6a3f6ab66a")
# if result == 0:
#     print(strat)
# else:
#     print("Error:", strat)

# Fetch all
#result, records = fetch_all_run_manager_records()

#TODO consider extending the output with strat_status (running/stopped)

def fetch_all_run_manager_records() -> tuple[int, list[RunManagerRecord]]:
    conn = db.pool.get_connection()
    try:
        conn.row_factory = Row
        cursor = conn.cursor()
        cursor.execute('SELECT * FROM run_manager')
        rows = cursor.fetchall()
        results = []
        #Transform each row to an object
        for row in rows:
            #add the transformed object into the result list
            results.append(tr.row_to_runmanager(row))

        return 0, results
    finally:
        conn.row_factory = None
        db.pool.release_connection(conn)

# Fetch by strategy_id
# result, record = fetch_run_manager_record_by_id('625760ac-6376-47fa-8989-1e6a3f6ab66a')
def fetch_run_manager_record_by_id(strategy_id) -> tuple[int, RunManagerRecord]:
    conn = db.pool.get_connection()
    try:
        conn.row_factory = Row
        cursor = conn.cursor()
        cursor.execute('SELECT * FROM run_manager WHERE id = ?', (str(strategy_id),))
        row = cursor.fetchone()
        if row is None:
            return -2, "not found"
        else:
            return 0, tr.row_to_runmanager(row)

    except Exception as e:
        print("ERROR while fetching record:", str(e) + format_exc())
        return -2, str(e) + format_exc()
    finally:
        conn.row_factory = None
        db.pool.release_connection(conn)

def add_run_manager_record(new_record: RunManagerRecord):
    #validation/standardization of time
    new_record.start_time = validate_and_format_time(new_record.start_time)
    if new_record.start_time is None:
        return -2, f"Invalid start_time format {new_record.start_time}"

    if new_record.stop_time is not None:
        new_record.stop_time = validate_and_format_time(new_record.stop_time)
        if new_record.stop_time is None:
            return -2, f"Invalid stop_time format {new_record.stop_time}"

    if new_record.batch_id is None:
        new_record.batch_id = str(uuid4())[:8]

    conn = db.pool.get_connection()
    try:
        strat : StrategyInstance = None
        result, strat = cs.get_stratin(id=str(new_record.strat_id))
        if result == 0:
            new_record.symbol = strat.symbol
        else:
            return -1, f"Strategy {new_record.strat_id} not found"

        cursor = conn.cursor()

        # Construct a suitable INSERT query based on your RunManagerRecord fields
        insert_query = """
            INSERT INTO run_manager (moddus, id, strat_id, symbol, account, mode, note, ilog_save,
                                     market, bt_from, bt_to, weekdays_filter, batch_id,
                                     start_time, stop_time, status, last_processed,
                                     history, valid_from, valid_to, testlist_id)
            VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
        """
        values = [
            new_record.moddus, str(new_record.id), str(new_record.strat_id), new_record.symbol, new_record.account, new_record.mode, new_record.note,
            int(new_record.ilog_save), new_record.market,
            new_record.bt_from.isoformat() if new_record.bt_from is not None else None,
            new_record.bt_to.isoformat() if new_record.bt_to is not None else None,
            ",".join(str(x) for x in new_record.weekdays_filter) if new_record.weekdays_filter else None,
            new_record.batch_id, new_record.start_time,
            new_record.stop_time, new_record.status,
            new_record.last_processed.isoformat() if new_record.last_processed is not None else None,
            new_record.history,
            new_record.valid_from.isoformat() if new_record.valid_from is not None else None,
            new_record.valid_to.isoformat() if new_record.valid_to is not None else None,
            new_record.testlist_id
        ]
        db.execute_with_retry(cursor, insert_query, values)
        conn.commit()

        #Add APS scheduler job refresh
        res, result = aps.initialize_jobs()
        if res < 0:
            return -2, f"Error initializing jobs: {res} {result}"

        return 0, new_record.id # Assuming success; you might return something more descriptive
    except Exception as e:
        print("ERROR while adding record:", str(e) + format_exc())
        return -2, str(e) + format_exc()
    finally:
        db.pool.release_connection(conn)

# Update (example)
# update_data = {'last_started': '2024-02-13 10:35:00'}
# result, message = update_run_manager_record('625760ac-6376-47fa-8989-1e6a3f6ab66a', update_data)
def update_run_manager_record(record_id, updated_record: RunManagerRecord):
    #validation/standardization of time
    updated_record.start_time = validate_and_format_time(updated_record.start_time)
    if updated_record.start_time is None:
        return -2, f"Invalid start_time format {updated_record.start_time}"

    if updated_record.stop_time is not None:
        updated_record.stop_time = validate_and_format_time(updated_record.stop_time)
        if updated_record.stop_time is None:
            return -2, f"Invalid stop_time format {updated_record.stop_time}"

    conn = db.pool.get_connection()
    try:
        cursor = conn.cursor()

        #strategy lookup check, in case the strategy still exists
        strat : StrategyInstance = None
        result, strat = cs.get_stratin(id=str(updated_record.strat_id))
        if result == 0:
            updated_record.symbol = strat.symbol
        else:
            return -1, f"Strategy {updated_record.strat_id} not found"

        #remove values with None, so they are not updated
        #updated_record_dict = updated_record.dict(exclude_none=True)

        # Construct the update query and handle weekdays conversion
        update_query = 'UPDATE run_manager SET '
        update_params = []
        for key, value in updated_record.dict().items(): # Iterate over model attributes
            if key in ['id', 'strat_running']: # Skip the primary key and the derived flag
                continue
            update_query += f"{key} = ?, "
            if key == "ilog_save":
                value = int(value)
            elif key in ["strat_id", "runner_id"]:
                value = str(value) if value else None
            elif key == "weekdays_filter":
                value = ",".join(str(x) for x in value) if value else None
            elif key in ['valid_from', 'valid_to', 'bt_from', 'bt_to', 'last_processed']:
                value = value.isoformat() if value else None
            update_params.append(value)
        # if 'weekdays_filter' in updated_record.dict():
        #     updated_record.weekdays_filter = ",".join(str(x) for x in updated_record.weekdays_filter)
        update_query = update_query[:-2] # Remove trailing comma and space
        update_query += ' WHERE id = ?'
        update_params.append(str(record_id))

        db.execute_with_retry(cursor, update_query, update_params)
        #cursor.execute(update_query, update_params)
        conn.commit()

        #Add APS scheduler job refresh
        res, result = aps.initialize_jobs()
        if res < 0:
            return -2, f"Error initializing jobs: {res} {result}"

    except Exception as e:
        print("ERROR while updating record:", str(e) + format_exc())
        return -2, str(e) + format_exc()
    finally:
        db.pool.release_connection(conn)
    return 0, record_id

# result, message = delete_run_manager_record('625760ac-6376-47fa-8989-1e6a3f6ab66a')
def delete_run_manager_record(record_id):
    conn = db.pool.get_connection()
    try:
        cursor = conn.cursor()
        db.execute_with_retry(cursor, 'DELETE FROM run_manager WHERE id = ?', (str(record_id),))
        #cursor.execute('DELETE FROM run_manager WHERE id = ?', (str(strategy_id),))
        conn.commit()
    except Exception as e:
        print("ERROR while deleting record:", str(e) + format_exc())
        return -2, str(e) + format_exc()
    finally:
        db.pool.release_connection(conn)
    return 0, record_id
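*Editor's note:* for reference, the dynamic builder in `update_run_manager_record` above produces a statement of this shape; the field subset shown is illustrative:

```python
# Illustrative only: the UPDATE produced by the builder above.
# 'id' and 'strat_running' are skipped; every other model field is bound
# positionally, so update_params lists values in model-field order and
# ends with record_id for the WHERE clause:
#
#   UPDATE run_manager SET moddus = ?, strat_id = ?, symbol = ?, ...,
#          runner_id = ? WHERE id = ?
```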
def fetch_scheduled_candidates_for_start_and_stop(market_datetime_now, market) -> tuple[int, dict]:
    """
    Fetches all active records from the 'run_manager' table where the mode is 'schedule'. It checks if the current
    time in the America/New_York timezone is within the operational intervals specified by 'start_time' and 'stop_time'
    for each record. This function is designed to correctly handle scenarios where the operational interval crosses
    midnight, as well as intervals contained within a single day.

    The function localizes 'valid_from', 'valid_to', 'start_time', and 'stop_time' using the 'zoneNY' timezone object
    for accurate comparison with the current time.

    Parameters:
        market_datetime_now (datetime): The current date and time in the America/New_York timezone.
        market (str): The market identifier.

    Returns:
        Tuple[int, dict]: A tuple where the first element is a status code (0 for success, -2 for error), and the
        second element is a dictionary. This dictionary has keys 'start' and 'stop', each containing a list of
        RunManagerRecord objects meeting the respective criteria. If an error occurs, the second element is a
        descriptive error message.

    Note:
        - This function assumes that the 'zoneNY' pytz timezone object is properly defined and configured to represent
          the America/New York timezone.
        - It also assumes that the 'run_manager' table exists in the database with the required columns.
        - 'start_time' and 'stop_time' are expected to be strings representing times in 24-hour format.
        - If 'valid_from', 'valid_to', 'start_time', or 'stop_time' are NULL in the database, they are considered as
          having unlimited boundaries.

    Caution: there is one more edge case where this might not work: times configured for a strategy
    running across midnight, but the record is switched on only later, after midnight.
    (https://chat.openai.com/c/3c77674a-8a2c-45aa-afbd-ab140f473e07)
    """
    conn = db.pool.get_connection()
    try:
        conn.row_factory = Row
        cursor = conn.cursor()

        # Get current datetime in America/New York timezone
        market_datetime_now_str = market_datetime_now.strftime('%Y-%m-%d %H:%M:%S')
        current_time_str = market_datetime_now.strftime('%H:%M')
        print("current_market_datetime_str:", market_datetime_now_str)
        print("current_time_str:", current_time_str)

        # The select also supports scenarios where the strategy runs overnight
        # SQL query to fetch records with active status and date constraints for both start and stop times
        query = """
            SELECT *,
            CASE
                WHEN start_time <= stop_time AND (? >= start_time AND ? < stop_time) OR
                     start_time > stop_time AND (? >= start_time OR ? < stop_time) THEN 1
                ELSE 0
            END as is_start_time,
            CASE
                WHEN start_time <= stop_time AND (? >= stop_time OR ? < start_time) OR
                     start_time > stop_time AND (? >= stop_time AND ? < start_time) THEN 1
                ELSE 0
            END as is_stop_time
            FROM run_manager
            WHERE status = 'active' AND moddus = 'schedule' AND
            ((valid_from IS NULL OR strftime('%Y-%m-%d %H:%M:%S', valid_from) <= ?) AND
             (valid_to IS NULL OR strftime('%Y-%m-%d %H:%M:%S', valid_to) >= ?))
        """
        cursor.execute(query, (current_time_str, current_time_str, current_time_str, current_time_str,
                               current_time_str, current_time_str, current_time_str, current_time_str,
                               market_datetime_now_str, market_datetime_now_str))
        rows = cursor.fetchall()

        start_candidates = []
        stop_candidates = []
        for row in rows:
            run_manager_record = tr.row_to_runmanager(row)
            if row['is_start_time']:
                start_candidates.append(run_manager_record)
            if row['is_stop_time']:
                stop_candidates.append(run_manager_record)

        results = {'start': start_candidates, 'stop': stop_candidates}

        return 0, results
    except Exception as e:
        msg_err = f"ERROR while fetching records for start and stop times with datetime {market_datetime_now_str}: {str(e)} {format_exc()}"
        print(msg_err)
        return -2, msg_err
    finally:
        conn.row_factory = None
        db.pool.release_connection(conn)
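*Editor's note:* the CASE expressions above encode the midnight-crossing logic the docstring describes. The same check restated in Python for clarity; this helper does not exist in the codebase, it merely mirrors the SQL:

```python
def in_start_window(now_hm: str, start_hm: str, stop_hm: str) -> bool:
    # "HH:MM" strings compare correctly lexicographically
    if start_hm <= stop_hm:
        # same-day window, e.g. 09:30-16:00
        return start_hm <= now_hm < stop_hm
    # overnight window, e.g. 20:00-02:00
    return now_hm >= start_hm or now_hm < stop_hm

assert in_start_window("21:00", "20:00", "02:00") is True   # inside overnight window
assert in_start_window("03:00", "20:00", "02:00") is False  # past the overnight stop
```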
|
def fetch_startstop_scheduled_candidates(market_datetime_now, time_check, market = "US") -> tuple[int, list[RunManagerRecord]]:
|
||||||
|
"""
|
||||||
|
Fetches all active records from the 'run_manager' table where moddus is schedule, the current date and time
|
||||||
|
in the America/New_York timezone falls between the 'valid_from' and 'valid_to' datetime
|
||||||
|
fields, and either 'start_time' or 'stop_time' matches the specified condition with the current time.
|
||||||
|
If 'valid_from', 'valid_to', or the time column ('start_time'/'stop_time') are NULL, they are considered
|
||||||
|
as having unlimited boundaries.
|
||||||
|
|
||||||
|
The function localizes the 'valid_from', 'valid_to', and the time column times using the 'zoneNY'
|
||||||
|
timezone object for accurate comparison with the current time.
|
||||||
|
|
||||||
|
Parameters:
|
||||||
|
market_datetime_now (datetime): Current datetime in the market timezone.
|
||||||
|
market (str): The market for which to fetch candidates.
|
||||||
|
time_check (str): Either 'start' or 'stop', indicating which time condition to check.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Tuple[int, list[RunManagerRecord]]: A tuple where the first element is a status code
|
||||||
|
(0 for success, -2 for error), and the second element is a list of RunManagerRecord
|
||||||
|
objects meeting the criteria. If an error occurs, the second element is a descriptive
|
||||||
|
error message.
|
||||||
|
|
||||||
|
Note:
|
||||||
|
This function assumes that the 'zoneNY' pytz timezone object is properly defined and
|
||||||
|
configured to represent the America/New York timezone. It also assumes that the
|
||||||
|
'run_manager' table exists in the database with the columns as described in the
|
||||||
|
provided schema.
|
||||||
|
"""
|
||||||
|
if time_check not in ['start', 'stop']:
|
||||||
|
return -2, "Invalid time_check parameter. Must be 'start' or 'stop'."
|
||||||
|
|
||||||
|
conn = db.pool.get_connection()
|
||||||
|
try:
|
||||||
|
conn.row_factory = Row
|
||||||
|
cursor = conn.cursor()
|
||||||
|
|
||||||
|
# Get current datetime in America/New York timezone
|
||||||
|
market_datetime_now_str = market_datetime_now.strftime('%Y-%m-%d %H:%M:%S')
|
||||||
|
current_time_str = market_datetime_now.strftime('%H:%M')
|
||||||
|
print("current_market_datetime_str:", market_datetime_now_str)
|
||||||
|
print("current_time_str:", current_time_str)
|
||||||
|
|
||||||
|
# SQL query to fetch records with active status, date constraints, and time condition
|
||||||
|
time_column = 'start_time' if time_check == 'start' else 'stop_time'
|
||||||
|
query = f"""
|
||||||
|
SELECT * FROM run_manager
|
||||||
|
WHERE status = 'active' AND moddus = 'schedule' AND
|
||||||
|
((valid_from IS NULL OR strftime('%Y-%m-%d %H:%M:%S', valid_from) <= ?) AND
|
||||||
|
(valid_to IS NULL OR strftime('%Y-%m-%d %H:%M:%S', valid_to) >= ?)) AND
|
||||||
|
({time_column} IS NULL OR {time_column} <= ?)
|
||||||
|
"""
|
||||||
|
cursor.execute(query, (market_datetime_now_str, market_datetime_now_str, current_time_str))
|
||||||
|
rows = cursor.fetchall()
|
||||||
|
results = [tr.row_to_runmanager(row) for row in rows]
|
||||||
|
|
||||||
|
return 0, results
|
||||||
|
except Exception as e:
|
||||||
|
msg_err = f"ERROR while fetching records based on {time_check} time with datetime {market_datetime_now_str}: {str(e)} {format_exc()}"
|
||||||
|
print(msg_err)
|
||||||
|
return -2, msg_err
|
||||||
|
finally:
|
||||||
|
conn.row_factory = None
|
||||||
|
db.pool.release_connection(conn)
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
res, sada = fetch_startstop_scheduled_candidates(datetime.now().astimezone(zoneNY), "start")
|
||||||
|
if res == 0:
|
||||||
|
print(sada)
|
||||||
|
else:
|
||||||
|
print("Error:", sada)
|
||||||
|
|
||||||
|
# from apscheduler.schedulers.background import BackgroundScheduler
|
||||||
|
# import time
|
||||||
|
|
||||||
|
# def print_hello():
|
||||||
|
# print("Hello")
|
||||||
|
|
||||||
|
# def schedule_job():
|
||||||
|
# scheduler = BackgroundScheduler()
|
||||||
|
# scheduler.add_job(print_hello, 'interval', seconds=10)
|
||||||
|
# scheduler.start()
|
||||||
|
|
||||||
|
# schedule_job()
|
||||||
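For orientation, a hedged sketch of how a scheduler loop might consume this function. `process_candidate` and the 60-second interval are illustrative assumptions, not part of the module:

```python
# Minimal polling sketch (assumptions: zoneNY and fetch_startstop_scheduled_candidates
# are importable in this scope; process_candidate is a hypothetical callback).
import time
from datetime import datetime

def poll_schedule(process_candidate, interval_s: int = 60):
    """Periodically fetch due 'start'/'stop' candidates and hand them to a callback."""
    while True:
        now = datetime.now().astimezone(zoneNY)
        for check in ("start", "stop"):
            res, records = fetch_startstop_scheduled_candidates(now, check)
            if res == 0:
                for rec in records:
                    process_candidate(check, rec)  # e.g. start or stop the matching runner
            else:
                print("scheduler fetch error:", records)
        time.sleep(interval_s)
```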
@@ -3,7 +3,7 @@ from uuid import UUID, uuid4
 import pickle
 from alpaca.data.historical import StockHistoricalDataClient
 from alpaca.data.requests import StockTradesRequest, StockBarsRequest
 from alpaca.data.enums import DataFeed
 from alpaca.data.timeframe import TimeFrame
 from v2realbot.strategy.base import StrategyState
 from v2realbot.enums.enums import RecordType, StartBarAlign, Mode, Account, OrderSide
@@ -14,7 +14,7 @@ from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeSt
 from datetime import datetime
 from v2realbot.loader.trade_offline_streamer import Trade_Offline_Streamer
 from threading import Thread, current_thread, Event, enumerate
-from v2realbot.config import STRATVARS_UNCHANGEABLES, ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, DATA_DIR,BT_FILL_CONS_TRADES_REQUIRED,BT_FILL_LOG_SURROUNDING_TRADES,BT_FILL_CONDITION_BUY_LIMIT,BT_FILL_CONDITION_SELL_LIMIT, GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN, MEDIA_DIRECTORY, RUNNER_DETAIL_DIRECTORY, OFFLINE_MODE
+from v2realbot.config import STRATVARS_UNCHANGEABLES, ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_PAPER_FEED, ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, ACCOUNT1_LIVE_FEED, DATA_DIR, MEDIA_DIRECTORY, RUNNER_DETAIL_DIRECTORY
 import importlib
 from alpaca.trading.requests import GetCalendarRequest
 from alpaca.trading.client import TradingClient
@@ -24,23 +24,25 @@ from tinydb import TinyDB, Query, where
 from tinydb.operations import set
 import orjson
 import numpy as np
-from numpy import ndarray
 from rich import print
 import pandas as pd
 from traceback import format_exc
 from datetime import timedelta, time
 from threading import Lock
-from v2realbot.common.db import pool, execute_with_retry, row_to_runarchive, row_to_runarchiveview
+from v2realbot.common.db import pool, execute_with_retry
+import v2realbot.common.transform as tr
 from sqlite3 import OperationalError, Row
 import v2realbot.strategyblocks.indicators.custom as ci
 from v2realbot.strategyblocks.inits.init_indicators import initialize_dynamic_indicators
 from v2realbot.strategyblocks.indicators.indicators_hub import populate_dynamic_indicators
+from v2realbot.strategyblocks.inits.init_attached_data import attach_previous_data
 from v2realbot.interfaces.backtest_interface import BacktestInterface
 import os
-from v2realbot.reporting.metricstoolsimage import generate_trading_report_image
-import msgpack
+import v2realbot.reporting.metricstoolsimage as mt
 import gzip
 import os
+import msgpack
+import v2realbot.utils.config_handler as cfh
 #import gc
 #from pyinstrument import Profiler
 #adding lock to ensure thread safety of TinyDB (in future will be migrated to proper db)
@@ -81,7 +83,7 @@ def get_all_stratins():
     else:
         return (0, [])

-def get_stratin(id: UUID):
+def get_stratin(id: UUID) -> List[StrategyInstance]:
     for i in db.stratins:
         if str(i.id) == str(id):
             return (0, i)
@@ -101,12 +103,12 @@ def create_stratin(si: StrategyInstance):
     #validate toml
     res, stp = parse_toml_string(si.stratvars_conf)
     if res < 0:
-        return (-1,"stratvars invalid")
+        return (-1,f"stratvars invalid: {stp}")
     res, adp = parse_toml_string(si.add_data_conf)
     if res < 0:
-        return (-1, "None")
+        return (-1, f"add data conf invalid {adp}")
     si.id = uuid4()
-    print(si)
+    #print(si)
     db.stratins.append(si)
     db.save()
     #print(db.stratins)
@@ -118,10 +120,10 @@ def modify_stratin(si: StrategyInstance, id: UUID):
         return (-1, "strat is running, use modify_stratin_running")
     res, stp = parse_toml_string(si.stratvars_conf)
     if res < 0:
-        return (-1, "stratvars invalid")
+        return (-1, f"stratvars invalid {stp}")
     res, adp = parse_toml_string(si.add_data_conf)
     if res < 0:
-        return (-1, "add data conf invalid")
+        return (-1, f"add data conf invalid {adp}")
     for i in db.stratins:
         if str(i.id) == str(id):
             #print("removing",i)
@@ -179,14 +181,14 @@ def modify_stratin_running(si: StrategyInstance, id: UUID):
     #validate toml
     res,stp = parse_toml_string(si.stratvars_conf)
     if res < 0:
-        return (-1, "new stratvars format invalid")
+        return (-1, f"new stratvars format invalid {stp}")
     for i in db.stratins:
         if str(i.id) == str(id):
             if not is_stratin_running(id=str(id)):
                 return (-1, "not running")
             res,stp_old = parse_toml_string(i.stratvars_conf)
             if res < 0:
-                return (-1, "current stratin stratvars invalid")
+                return (-1, f"current stratin stratvars invalid {stp_old}")
             #TODO reload running strat
             #print(stp)
             #print("starting injection", stp)
@@ -243,13 +245,14 @@ def pause_runner(id: UUID):
             return (0, "paused runner " + str(i.id))
     print("no ID found")
     return (-1, "not running instance found")

-def stop_runner(id: UUID = None):
+#allows to delete runner based on runner_id, strat_id or all (both none)
+#also supports a strat_id value passed in id
+def stop_runner(id: UUID = None, strat_id: UUID = None):
     chng = []
     try:
         for i in db.runners:
             #print(i['id'])
-            if id is None or str(i.id) == id:
+            if (id is None and strat_id is None) or str(i.id) == str(id) or str(i.strat_id) == str(strat_id) or str(i.strat_id) == str(id):
                 chng.append(i.id)
                 print("Sending STOP signal to Runner", i.id)
                 #just sending the signal, update is done in stop after plugin
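The widened signature lets callers stop a single runner, all runners of one strategy, or everything at once, and the id argument may also carry a strat_id. A hedged restatement of the new matching rule as a pure predicate, convenient for testing it in isolation (the function name and field names are illustrative):

```python
# Pure-function sketch of the stop_runner matching condition above.
def matches_stop_target(runner_id, runner_strat_id, id=None, strat_id=None) -> bool:
    if id is None and strat_id is None:
        return True  # no filter given: stop everything
    return (str(runner_id) == str(id)
            or str(runner_strat_id) == str(strat_id)
            or str(runner_strat_id) == str(id))  # id may also carry a strat_id

assert matches_stop_target("r1", "s1") is True                  # stop all
assert matches_stop_target("r1", "s1", id="r1") is True         # by runner_id
assert matches_stop_target("r1", "s1", id="s1") is True         # strat_id passed in id
assert matches_stop_target("r1", "s1", strat_id="s2") is False  # different strategy
```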
@@ -349,13 +352,28 @@ def capsule(target: object, db: object, inter_batch_params: dict = None):
                 db.runners.remove(i)
         #create the report image for the RUNNER
         try:
-            res, val = generate_trading_report_image(runner_ids=[str(i.id)])
+            res, val = mt.generate_trading_report_image(runner_ids=[str(i.id)])
             if res == 0:
                 print("DAILY REPORT IMAGE CREATED")
             else:
                 print(f"Daily report ERROR - {val}")
         except Exception as e:
-            print("Failed to create the report image", str(e)+format_exc())
+            err_msg = "Failed to create the daily report image" + str(e)+format_exc()
+            send_to_telegram(err_msg)
+            print(err_msg)
+        #for LIVE and PAPER with a batch_id filled in, the batch file is created here (for BT this is driven by batch_manager)
+        if i.run_mode in [Mode.LIVE, Mode.PAPER] and i.batch_id is not None:
+            try:
+                res, val = mt.generate_trading_report_image(batch_id=i.batch_id)
+                if res == 0:
+                    print("BATCH REPORT CREATED")
+                else:
+                    print(f"BATCH REPORT ERROR - {val}")
+            except Exception as e:
+                err_msg = f"Failed to create the batch report image for {i.strat_id} and batch {i.batch_id}" + str(e)+format_exc()
+                send_to_telegram(err_msg)
+                print(err_msg)

         target.release()
         print("Runner STOPPED")
@@ -395,7 +413,7 @@ def run_batch_stratin(id: UUID, runReq: RunRequest):
     def get_market_days_in_interval(datefrom, dateto, note = None, id = None):
         #getting dates from the calendar
         clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False, paper=True)
-        calendar_request = GetCalendarRequest(start=datefrom,end=dateto)
+        calendar_request = GetCalendarRequest(start=datefrom.date(),end=dateto.date())
        cal_dates = clientTrading.get_calendar(calendar_request)
         #list(Calendar)
         # Calendar
@@ -429,7 +447,7 @@ def run_batch_stratin(id: UUID, runReq: RunRequest):
             cal_list.append(RunDay(start = start_time, end = end_time, note = note, id = id))

         print(f"Getting interval dates from - to - RESULT ({len(cal_list)}):")
-        print(cal_list)
+        #print(cal_list)
         return cal_list

     #getting days to run into RunDays format
@@ -476,6 +494,9 @@ def run_batch_stratin(id: UUID, runReq: RunRequest):
 # either it waits for completion in runners, or it manages this differently with a single runner?
 # to be figured out
 # logging goes only to print for now

+##OFFLINE BATCH RUN MANAGER (generates the batch_id, manages the data linking of runners (inter_batch_data) and generates the batch report
+## and of course runs the individual days
 def batch_run_manager(id: UUID, runReq: RunRequest, rundays: list[RunDay]):
     #here we can iterate over the intervals
     #wait until one interval finishes, then start the next
@@ -567,14 +588,16 @@ def batch_run_manager(id: UUID, runReq: RunRequest, rundays: list[RunDay]):
     runReq = None
     #create the report image for the batch
     try:
-        res, val = generate_trading_report_image(batch_id=batch_id)
+        res, val = mt.generate_trading_report_image(batch_id=batch_id)
         if res == 0:
             print("BATCH REPORT CREATED")
         else:
             print(f"BATCH REPORT ERROR - {val}")
+
     except Exception as e:
-        print("Failed to create the report image", str(e)+format_exc())
+        err_msg = "Failed to create the batch report image" + str(e)+format_exc()
+        send_to_telegram(err_msg)
+        print(err_msg)

     #gc.collect()
@@ -596,10 +619,10 @@ def run_stratin(id: UUID, runReq: RunRequest, synchronous: bool = False, inter_b
     #validate toml
     res, stp = parse_toml_string(i.stratvars_conf)
     if res < 0:
-        return (-1, "stratvars invalid")
+        return (-1, f"stratvars invalid {stp}")
     res, adp = parse_toml_string(i.add_data_conf)
     if res < 0:
-        return (-1, "add data conf invalid")
+        return (-1, f"add data conf invalid {adp}")
     id = uuid4()
     print(f"RUN {id} INITIATED")
     name = i.name
@@ -697,7 +720,7 @@ def get_trade_history(symbol: str, timestamp_from: float, timestamp_to:float):
     #datetime_object_from = datetime(2023, 4, 14, 15, 51, 38, tzinfo=zoneNY)
     #datetime_object_to = datetime(2023, 4, 14, 15, 51, 39, tzinfo=zoneNY)
     client = StockHistoricalDataClient(ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, raw_data=False)
-    trades_request = StockTradesRequest(symbol_or_symbols=symbol, feed = DataFeed.SIP, start=datetime_object_from, end=datetime_object_to)
+    trades_request = StockTradesRequest(symbol_or_symbols=symbol, feed = ACCOUNT1_LIVE_FEED, start=datetime_object_from, end=datetime_object_to)
     all_trades = client.get_stock_trades(trades_request)
     #print(all_trades[symbol])
     return 0, all_trades[symbol]
@@ -866,13 +889,9 @@ def archive_runner(runner: Runner, strat: StrategyInstance, inter_batch_params:
         rectype=strat.state.rectype,
         cache_used=strat.dataloader.cache_used if isinstance(strat.dataloader, Trade_Offline_Streamer) else None,
         configs=dict(
-            GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN=GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN,
-            BT_FILL_CONS_TRADES_REQUIRED=BT_FILL_CONS_TRADES_REQUIRED,
-            BT_FILL_LOG_SURROUNDING_TRADES=BT_FILL_LOG_SURROUNDING_TRADES,
-            BT_FILL_CONDITION_BUY_LIMIT=BT_FILL_CONDITION_BUY_LIMIT,
-            BT_FILL_CONDITION_SELL_LIMIT=BT_FILL_CONDITION_SELL_LIMIT))
+            CONFIG_HANDLER=dict(profile=cfh.config_handler.active_profile, values=cfh.config_handler.active_config)))

     #add profit of this batch iteration to batch_sum_profit
     if inter_batch_params is not None:
         inter_batch_params["batch_profit"] += round(float(strat.state.profit),2)
@@ -907,7 +926,8 @@ def archive_runner(runner: Runner, strat: StrategyInstance, inter_batch_params:
         end_positions=strat.state.positions,
         end_positions_avgp=round(float(strat.state.avgp),3),
         metrics=results_metrics,
-        stratvars_toml=runner.run_stratvars_toml
+        stratvars_toml=runner.run_stratvars_toml,
+        transferables=strat.state.vars["transferables"]
     )

     #flatten indicators from numpy array
@@ -915,7 +935,7 @@ def archive_runner(runner: Runner, strat: StrategyInstance, inter_batch_params:
     #list of indicators, each with its own time axis
     flattened_indicators_list = []
     for key, value in strat.state.indicators.items():
-        if isinstance(value, ndarray):
+        if isinstance(value, np.ndarray):
             #print("is numpy", key,value)
             flattened_indicators[key]= value.tolist()
             #print("changed numpy:",value.tolist())
@@ -925,7 +945,7 @@ def archive_runner(runner: Runner, strat: StrategyInstance, inter_batch_params:
     flattened_indicators_list.append(flattened_indicators)
     flattened_indicators = {}
     for key, value in strat.state.cbar_indicators.items():
-        if isinstance(value, ndarray):
+        if isinstance(value, np.ndarray):
             #print("is numpy", key,value)
             flattened_indicators[key]= value.tolist()
             #print("changed numpy:",value.tolist())
@@ -988,7 +1008,7 @@ def get_all_archived_runners() -> list[RunArchiveView]:
         rows = c.fetchall()
         results = []
         for row in rows:
-            results.append(row_to_runarchiveview(row))
+            results.append(tr.row_to_runarchiveview(row))
     finally:
         conn.row_factory = None
         pool.release_connection(conn)
@@ -1018,7 +1038,7 @@ def get_all_archived_runners() -> list[RunArchiveView]:
 #         c.execute(paginated_query)
 #         rows = c.fetchall()

-#         results = [row_to_runarchiveview(row) for row in rows]
+#         results = [tr.row_to_runarchiveview(row) for row in rows]

 #     finally:
 #         conn.row_factory = None
@@ -1032,7 +1052,7 @@ def get_all_archived_runners() -> list[RunArchiveView]:

 #new version to support search and ordering
 #TODO do I have an index on strat_id and batch_id?
-def get_all_archived_runners_p(request: DataTablesRequest) -> Tuple[int, RunArchiveViewPagination]:
+def get_all_archived_runners_p_original(request: DataTablesRequest) -> Tuple[int, RunArchiveViewPagination]:
     conn = pool.get_connection()
     search_value = request.search.value # Extract the search value from the request
     try:
@@ -1068,7 +1088,80 @@ def get_all_archived_runners_p(request: DataTablesRequest) -> Tuple[int, RunArch
         c.execute(filtered_count_query, {'search_value': f'%{search_value}%'})
         filtered_count = c.fetchone()[0]

-        results = [row_to_runarchiveview(row) for row in rows]
+        results = [tr.row_to_runarchiveview(row) for row in rows]
+
+    finally:
+        conn.row_factory = None
+        pool.release_connection(conn)
+
+    try:
+        obj = RunArchiveViewPagination(draw=request.draw, recordsTotal=total_count, recordsFiltered=filtered_count, data=results)
+        return 0, obj
+    except Exception as e:
+        return -2, str(e) + format_exc()
+
+
+#new version with batch_id asc sorting https://chat.openai.com/c/64511445-5181-411b-b9d0-51d16930bf71
+#This version groups records with the same batch_id correctly (by the batch maximum) and interleaves non-batch records among them by their stopped date - a record is placed after or before each group (according to the group's max date)
+#thanks to that, batches and non-batches sort correctly, and when a record is added to a batch, the batch shows up on top
+def get_all_archived_runners_p(request: DataTablesRequest) -> Tuple[int, RunArchiveViewPagination]:
+    conn = pool.get_connection()
+    search_value = request.search.value # Extract the search value from the request
+    try:
+        conn.row_factory = Row
+        c = conn.cursor()
+
+        # Total count query
+        total_count_query = """
+            SELECT COUNT(*) FROM runner_header
+            WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value OR symbol like :search_value OR name like :search_value)
+        """
+        c.execute(total_count_query, {'search_value': f'%{search_value}%'})
+        total_count = c.fetchone()[0]
+
+        # Paginated query with advanced sorting logic
+        paginated_query = f"""
+            WITH GroupedData AS (
+                SELECT runner_id, strat_id, batch_id, symbol, name, note, started,
+                       stopped, mode, account, bt_from, bt_to, ilog_save, profit,
+                       trade_count, end_positions, end_positions_avgp, metrics,
+                       MAX(stopped) OVER (PARTITION BY batch_id) AS max_stopped,
+                       SUM(profit) OVER (PARTITION BY batch_id) AS batch_profit,
+                       COUNT(*) OVER (PARTITION BY batch_id) AS batch_count
+                FROM runner_header
+                WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value OR symbol like :search_value OR name like :search_value)
+            ),
+            InterleavedGroups AS (
+                SELECT *,
+                       CASE
+                           WHEN batch_id IS NOT NULL THEN max_stopped
+                           ELSE stopped
+                       END AS sort_key
+                FROM GroupedData
+            )
+            SELECT runner_id, strat_id, batch_id, symbol, name, note, started,
+                   stopped, mode, account, bt_from, bt_to, ilog_save, profit,
+                   trade_count, end_positions, end_positions_avgp, metrics,
+                   batch_profit, batch_count
+            FROM InterleavedGroups
+            ORDER BY
+                sort_key DESC,
+                CASE WHEN batch_id IS NOT NULL THEN 0 ELSE 1 END,
+                stopped DESC
+            LIMIT {request.length} OFFSET {request.start}
+        """
+        c.execute(paginated_query, {'search_value': f'%{search_value}%'})
+        rows = c.fetchall()
+
+        # Filtered count query
+        filtered_count_query = """
+            SELECT COUNT(*) FROM runner_header
+            WHERE (:search_value = '' OR strat_id LIKE :search_value OR batch_id LIKE :search_value OR symbol like :search_value OR name like :search_value)
+        """
+        c.execute(filtered_count_query, {'search_value': f'%{search_value}%'})
+        filtered_count = c.fetchone()[0]
+
+        results = [tr.row_to_runarchiveview(row) for row in rows]

     finally:
         conn.row_factory = None
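To make the interleaving logic above concrete, here is a self-contained toy run of the same window-function idea on an in-memory SQLite table (the column set is reduced to the essentials; this mirrors, but is not, the production query):

```python
# Batched rows sort by the newest 'stopped' in their batch; non-batch rows
# interleave by their own 'stopped' date.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE runner_header (runner_id TEXT, batch_id TEXT, stopped TEXT)")
conn.executemany("INSERT INTO runner_header VALUES (?,?,?)", [
    ("r1", "b1", "2024-01-01"),
    ("r2", "b1", "2024-01-05"),   # lifts the whole b1 group to 2024-01-05
    ("r3", None, "2024-01-03"),   # non-batch row, placed by its own stopped date
])
rows = conn.execute("""
    WITH GroupedData AS (
        SELECT runner_id, batch_id, stopped,
               MAX(stopped) OVER (PARTITION BY batch_id) AS max_stopped
        FROM runner_header
    )
    SELECT runner_id FROM GroupedData
    ORDER BY CASE WHEN batch_id IS NOT NULL THEN max_stopped ELSE stopped END DESC,
             CASE WHEN batch_id IS NOT NULL THEN 0 ELSE 1 END,
             stopped DESC
""").fetchall()
print([r[0] for r in rows])  # ['r2', 'r1', 'r3'] - the batch rises above the lone runner
```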
@@ -1103,7 +1196,7 @@ def get_archived_runner_header_byID(id: UUID) -> RunArchive:
         row = c.fetchone()

     if row:
-        return 0, row_to_runarchive(row)
+        return 0, tr.row_to_runarchive(row)
     else:
         return -2, "not found"

@@ -1129,17 +1222,43 @@ def get_archived_runner_header_byID(id: UUID) -> RunArchive:
 #     else:
 #         return 0, res

-#returns the list of runners with the given batch_id
-def get_archived_runnerslist_byBatchID(batch_id: str):
+# #returns the list of runners with the given batch_id
+# def get_archived_runnerslist_byBatchID(batch_id: str):
+#     conn = pool.get_connection()
+#     try:
+#         cursor = conn.cursor()
+#         cursor.execute(f"SELECT runner_id FROM runner_header WHERE batch_id='{str(batch_id)}'")
+#         runner_list = [row[0] for row in cursor.fetchall()]
+#     finally:
+#         pool.release_connection(conn)
+#     return 0, runner_list
+
+#update that allows to sort
+def get_archived_runnerslist_byBatchID(batch_id: str, sort_order: str = "asc"):
+    """
+    Fetches all runner records by batch_id, sorted by the 'started' column.
+
+    :param batch_id: The batch ID to filter runners by.
+    :param sort_order: The sort order of the 'started' column. Defaults to 'asc'.
+        Accepts 'asc' for ascending or 'desc' for descending order.
+    :return: A tuple with the first element being a status code and the second being the list of runner_ids.
+    """
+    # Validate sort_order
+    if sort_order.lower() not in ['asc', 'desc']:
+        return -1, [] # Returning an error code and an empty list in case of invalid sort_order
+
     conn = pool.get_connection()
     try:
         cursor = conn.cursor()
-        cursor.execute(f"SELECT runner_id FROM runner_header WHERE batch_id='{str(batch_id)}'")
+        query = f"""SELECT runner_id FROM runner_header
+                    WHERE batch_id=?
+                    ORDER BY datetime(started) {sort_order.upper()}"""
+        cursor.execute(query, (batch_id,))
         runner_list = [row[0] for row in cursor.fetchall()]
     finally:
         pool.release_connection(conn)
     return 0, runner_list

 def insert_archive_header(archeader: RunArchive):
     conn = pool.get_connection()
     try:
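Note the security improvement in the rewrite above: batch_id is now bound as a SQL parameter instead of being interpolated into the string, and only the whitelisted sort direction is interpolated (ORDER BY direction cannot be bound as a parameter). A small sketch of why the whitelist matters:

```python
# Direction keywords cannot be passed as bound parameters, so they are
# interpolated - but only after validation against a fixed whitelist.
def order_clause(sort_order: str) -> str:
    if sort_order.lower() not in ("asc", "desc"):
        raise ValueError("sort_order must be 'asc' or 'desc'")
    return f"ORDER BY datetime(started) {sort_order.upper()}"

print(order_clause("desc"))           # ORDER BY datetime(started) DESC
# order_clause("asc; DROP TABLE x")   # raises ValueError instead of reaching SQL
```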
@@ -1148,11 +1267,11 @@ def insert_archive_header(archeader: RunArchive):

         res = c.execute("""
             INSERT INTO runner_header
-            (runner_id, strat_id, batch_id, symbol, name, note, started, stopped, mode, account, bt_from, bt_to, strat_json, settings, ilog_save, profit, trade_count, end_positions, end_positions_avgp, metrics, stratvars_toml)
+            (runner_id, strat_id, batch_id, symbol, name, note, started, stopped, mode, account, bt_from, bt_to, strat_json, settings, ilog_save, profit, trade_count, end_positions, end_positions_avgp, metrics, stratvars_toml, transferables)
             VALUES
-            (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
+            (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
             """,
-            (str(archeader.id), str(archeader.strat_id), archeader.batch_id, archeader.symbol, archeader.name, archeader.note, archeader.started, archeader.stopped, archeader.mode, archeader.account, archeader.bt_from, archeader.bt_to, orjson.dumps(archeader.strat_json).decode('utf-8'), orjson.dumps(archeader.settings).decode('utf-8'), archeader.ilog_save, archeader.profit, archeader.trade_count, archeader.end_positions, archeader.end_positions_avgp, orjson.dumps(archeader.metrics, default=json_serial, option=orjson.OPT_PASSTHROUGH_DATETIME).decode('utf-8'), archeader.stratvars_toml))
+            (str(archeader.id), str(archeader.strat_id), archeader.batch_id, archeader.symbol, archeader.name, archeader.note, archeader.started, archeader.stopped, archeader.mode, archeader.account, archeader.bt_from, archeader.bt_to, orjson.dumps(archeader.strat_json).decode('utf-8'), orjson.dumps(archeader.settings).decode('utf-8'), archeader.ilog_save, archeader.profit, archeader.trade_count, archeader.end_positions, archeader.end_positions_avgp, orjson.dumps(archeader.metrics, default=json_serial, option=orjson.OPT_PASSTHROUGH_DATETIME).decode('utf-8'), archeader.stratvars_toml, orjson.dumps(archeader.transferables).decode('utf-8')))

         #retry not yet supported for statement format above
         #res = execute_with_retry(c,statement)
@@ -1476,7 +1595,7 @@ def preview_indicator_byTOML(id: UUID, indicator: InstantIndicator, save: bool =
     # print(row)
     res, toml_parsed = parse_toml_string(tomlino)
     if res < 0:
-        return (-2, "toml invalid")
+        return (-2, f"toml invalid: {toml_parsed}")

     #print("parsed toml", toml_parsed)

@@ -1571,9 +1690,17 @@ def preview_indicator_byTOML(id: UUID, indicator: InstantIndicator, save: bool =
     state.ind_mapping = {**local_dict_inds, **local_dict_bars, **local_dict_cbar_inds}
     #print("IND MAPPING DONE:", state.ind_mapping)

+    ##initialize required vars from strat init
+    state.vars["loaded_models"] = {}
+    #state attributes for martingale sizing mngmt
+    state.vars["transferables"] = {}
+    state.vars["transferables"]["martingale"] = dict(cont_loss_series_cnt=0)
+
     ##initialize dynamic indicators
     initialize_dynamic_indicators(state)
+    #TODO consider attached data (we only need transferables from it, i.e. somehow find the previous runner and carry over its transferables from the start)
+    #probably adapt attach_previous_data, or make a special version
+    #attach_previous_data(state)

     # print("subtype")
     # function = "ci."+subtype+"."+subtype
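The martingale counter initialized above persists in `state.vars["transferables"]`, which is now also archived and intended to carry over between runs. A hedged sketch of how a strategy callback might use it; the doubling rule, cap, and function names are illustrative assumptions, not the platform's actual sizing logic:

```python
# Illustrative martingale sizing based on the transferables state above.
def next_position_size(state, base_size: int = 1, max_doublings: int = 4) -> int:
    cnt = state.vars["transferables"]["martingale"]["cont_loss_series_cnt"]
    return base_size * (2 ** min(cnt, max_doublings))  # cap runaway growth

def on_trade_closed(state, profit: float) -> None:
    mg = state.vars["transferables"]["martingale"]
    mg["cont_loss_series_cnt"] = 0 if profit > 0 else mg["cont_loss_series_cnt"] + 1
```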
@@ -1714,10 +1841,10 @@ def preview_indicator_byTOML(id: UUID, indicator: InstantIndicator, save: bool =

         #we return a list where position 0 is bar indicators and position 1 is tick indicators
         if output == "bar":
-            return 0, [output_dict, []]
+            return 0, [output_dict, {}]
             #return 0, [new_inds[indicator.name], []]
         else:
-            return 0, [[], output_dict]
+            return 0, [{}, output_dict]
             #return 0, [[], new_tick_inds[indicator.name]]

     except Exception as e:
@@ -1757,107 +1884,18 @@ def delete_indicator_byName(id: UUID, indicator: InstantIndicator):
         print(str(e) + format_exc())
         return -2, str(e)

-# region CONFIG db services
-#TODO create a module for fetching from python (get_from_config(var_name, def_value)) - same as in js
-#TODO consider moving from JSON to TOML
-def get_all_config_items():
-    conn = pool.get_connection()
-    try:
-        cursor = conn.cursor()
-        cursor.execute('SELECT id, item_name, json_data FROM config_table')
-        config_items = [{"id": row[0], "item_name": row[1], "json_data": row[2]} for row in cursor.fetchall()]
-    finally:
-        pool.release_connection(conn)
-    return 0, config_items
-
-# Function to get a config item by ID
-def get_config_item_by_id(item_id):
-    conn = pool.get_connection()
-    try:
-        cursor = conn.cursor()
-        cursor.execute('SELECT item_name, json_data FROM config_table WHERE id = ?', (item_id,))
-        row = cursor.fetchone()
-    finally:
-        pool.release_connection(conn)
-    if row is None:
-        return -2, "not found"
-    else:
-        return 0, {"item_name": row[0], "json_data": row[1]}
-
-# Function to get a config item by name
-def get_config_item_by_name(item_name):
-    #print(item_name)
-    conn = pool.get_connection()
-    try:
-        cursor = conn.cursor()
-        query = f"SELECT item_name, json_data FROM config_table WHERE item_name = '{item_name}'"
-        #print(query)
-        cursor.execute(query)
-        row = cursor.fetchone()
-        #print(row)
-    finally:
-        pool.release_connection(conn)
-    if row is None:
-        return -2, "not found"
-    else:
-        return 0, {"item_name": row[0], "json_data": row[1]}
-
-# Function to create a new config item
-def create_config_item(config_item: ConfigItem):
-    conn = pool.get_connection()
-    try:
-        try:
-            cursor = conn.cursor()
-            cursor.execute('INSERT INTO config_table (item_name, json_data) VALUES (?, ?)', (config_item.item_name, config_item.json_data))
-            item_id = cursor.lastrowid
-            conn.commit()
-            print(item_id)
-        finally:
-            pool.release_connection(conn)
-        return 0, {"id": item_id, "item_name":config_item.item_name, "json_data":config_item.json_data}
-    except Exception as e:
-        return -2, str(e)
-
-# Function to update a config item by ID
-def update_config_item(item_id, config_item: ConfigItem):
-    conn = pool.get_connection()
-    try:
-        try:
-            cursor = conn.cursor()
-            cursor.execute('UPDATE config_table SET item_name = ?, json_data = ? WHERE id = ?', (config_item.item_name, config_item.json_data, item_id))
-            conn.commit()
-        finally:
-            pool.release_connection(conn)
-        return 0, {"id": item_id, **config_item.dict()}
-    except Exception as e:
-        return -2, str(e)
-
-# Function to delete a config item by ID
-def delete_config_item(item_id):
-    conn = pool.get_connection()
-    try:
-        cursor = conn.cursor()
-        cursor.execute('DELETE FROM config_table WHERE id = ?', (item_id,))
-        conn.commit()
-    finally:
-        pool.release_connection(conn)
-    return 0, {"id": item_id}
-
-# endregion
-
 #returns b
 def get_alpaca_history_bars(symbol: str, datetime_object_from: datetime, datetime_object_to: datetime, timeframe: TimeFrame):
     """Returns Bar object
     """
     try:
+        result = []
         client = StockHistoricalDataClient(ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, raw_data=False)
         #datetime_object_from = datetime(2023, 2, 27, 18, 51, 38, tzinfo=datetime.timezone.utc)
         #datetime_object_to = datetime(2023, 2, 27, 21, 51, 39, tzinfo=datetime.timezone.utc)
-        bar_request = StockBarsRequest(symbol_or_symbols=symbol,timeframe=timeframe, start=datetime_object_from, end=datetime_object_to, feed=DataFeed.SIP)
+        bar_request = StockBarsRequest(symbol_or_symbols=symbol,timeframe=timeframe, start=datetime_object_from, end=datetime_object_to, feed=ACCOUNT1_LIVE_FEED)
        #print("before df")
         bars = client.get_stock_bars(bar_request)
-        result = []
         ##we add a minute on both sides to be safe, for the frontend
         business_hours = {
             # monday = 0, tuesday = 1, ... same pattern as date.weekday()
@@ -1888,12 +1926,25 @@ def get_alpaca_history_bars(symbol: str, datetime_object_from: datetime, datetim
         #bars.data[symbol]
         return 0, result
     except Exception as e:
-        print(str(e) + format_exc())
-        if OFFLINE_MODE:
-            print("OFFLINE MODE ENABLED")
-            return 0, []
-        return -2, str(e)
+        # Workaround for the error when no data is found: an AttributeError with this specific message
+        if isinstance(e, AttributeError) and str(e) == "'NoneType' object has no attribute 'items'":
+            print("Caught the specific AttributeError: 'NoneType' object has no attribute 'items' means NO DATA FOUND")
+            print(str(e) + format_exc())
+            return 0, result
+        else:
+            print(str(e) + format_exc())
+            if cfh.config_handler.get_val('OFFLINE_MODE'):
+                print("OFFLINE MODE ENABLED")
+                return 0, []
+            return -2, str(e)
 # change_archived_runner
 # delete_archived_runner_details

+#Example of using config directive
+# config_directive = "python"
+# ret, res = get_config_item_by_name(config_directive)
+# if ret < 0:
+#     print(f"Error {res}")
+# else:
+#     config = orjson.loads(res["json_data"])
+#     print(config)
@@ -1,6 +1,11 @@
 from enum import Enum
 from alpaca.trading.enums import OrderSide, OrderStatus, OrderType

+class BarType(str, Enum):
+    TIME = "time"
+    VOLUME = "volume"
+    DOLLAR = "dollar"
+
 class Env(str, Enum):
     PROD = "prod"
     TEST = "test"
@@ -52,6 +57,16 @@ class Account(str, Enum):
     """
     ACCOUNT1 = "ACCOUNT1"
     ACCOUNT2 = "ACCOUNT2"

+class Moddus(str, Enum):
+    """
+    Moddus for a RunManager record
+
+    schedule - scheduled record
+    queue - queued record
+    """
+    SCHEDULE = "schedule"
+    QUEUE = "queue"
+
 class RecordType(str, Enum):
     """
     Represents output of aggregator
@@ -64,6 +79,15 @@ class RecordType(str, Enum):
     CBARRENKO = "cbarrenko"
     TRADE = "trade"

+class SchedulerStatus(str, Enum):
+    """
+    ACTIVE - active scheduling
+    SUSPENDED - suspended from scheduling
+    """
+    ACTIVE = "active"
+    SUSPENDED = "suspended"
+
 class Mode(str, Enum):
     """
     LIVE - live on production
@@ -77,7 +101,6 @@ class Mode(str, Enum):
     BT = "backtest"
     PREP = "prep"

-
 class StartBarAlign(str, Enum):
     """
     Represents first bar start time alignment according to timeframe
@@ -85,4 +108,10 @@ class StartBarAlign(str, Enum):
     RANDOM = first bar starts when first trade occurs
     """
     ROUND = "round"
     RANDOM = "random"
+
+class Market(str, Enum):
+    US = "US"
+    CRYPTO = "CRYPTO"
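The new Moddus, SchedulerStatus, and Market enums back the run_manager scheduling feature introduced earlier. A hedged usage sketch; the record object and its field names are assumed to mirror the run_manager columns referenced above:

```python
# Illustrative filter combining the new enums (field names assumed).
from v2realbot.enums.enums import Moddus, SchedulerStatus, Market

def is_due_for_scheduling(record) -> bool:
    return (record.moddus == Moddus.SCHEDULE
            and record.status == SchedulerStatus.ACTIVE
            and record.market == Market.US)
```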
@@ -2,9 +2,9 @@ from alpaca.trading.enums import OrderSide, OrderType
 from threading import Lock
 from v2realbot.interfaces.general_interface import GeneralInterface
 from v2realbot.backtesting.backtester import Backtester
-from v2realbot.config import BT_DELAYS, COUNT_API_REQUESTS
 from datetime import datetime
 from v2realbot.utils.utils import zoneNY
+import v2realbot.utils.config_handler as cfh

 """"
 backtester methods can be called
@@ -19,7 +19,7 @@ class BacktestInterface(GeneralInterface):
     def __init__(self, symbol, bt: Backtester) -> None:
         self.symbol = symbol
         self.bt = bt
-        self.count_api_requests = COUNT_API_REQUESTS
+        self.count_api_requests = cfh.config_handler.get_val('COUNT_API_REQUESTS')
         self.mincnt = list([dict(minute=0,count=0)])
         #TODO we can probably drop time from the API and BT will take it directly from self.time (do not forget the + BT_DELAYS)
         # self.time = self.bt.time
@@ -43,33 +43,33 @@ class BacktestInterface(GeneralInterface):
     def buy(self, size = 1, repeat: bool = False):
         self.count()
         #add REST API latency
-        return self.bt.submit_order(time=self.bt.time + BT_DELAYS.strat_to_sub,symbol=self.symbol,side=OrderSide.BUY,size=size,order_type = OrderType.MARKET)
+        return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.BUY,size=size,order_type = OrderType.MARKET)

     """buy limit"""
     def buy_l(self, price: float, size: int = 1, repeat: bool = False, force: int = 0):
         self.count()
-        return self.bt.submit_order(time=self.bt.time + BT_DELAYS.strat_to_sub,symbol=self.symbol,side=OrderSide.BUY,size=size,price=price,order_type = OrderType.LIMIT)
+        return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.BUY,size=size,price=price,order_type = OrderType.LIMIT)

     """sell market"""
     def sell(self, size = 1, repeat: bool = False):
         self.count()
-        return self.bt.submit_order(time=self.bt.time + BT_DELAYS.strat_to_sub,symbol=self.symbol,side=OrderSide.SELL,size=size,order_type = OrderType.MARKET)
+        return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.SELL,size=size,order_type = OrderType.MARKET)

     """sell limit"""
     async def sell_l(self, price: float, size = 1, repeat: bool = False):
         self.count()
-        return self.bt.submit_order(time=self.bt.time + BT_DELAYS.strat_to_sub,symbol=self.symbol,side=OrderSide.SELL,size=size,price=price,order_type = OrderType.LIMIT)
+        return self.bt.submit_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),symbol=self.symbol,side=OrderSide.SELL,size=size,price=price,order_type = OrderType.LIMIT)

     """replace order"""
     async def repl(self, orderid: str, price: float = None, size: int = None, repeat: bool = False):
         self.count()
-        return self.bt.replace_order(time=self.bt.time + BT_DELAYS.strat_to_sub,id=orderid,size=size,price=price)
+        return self.bt.replace_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'),id=orderid,size=size,price=price)

     """cancel order"""
     #TBD exec beforehand?
     def cancel(self, orderid: str):
         self.count()
-        return self.bt.cancel_order(time=self.bt.time + BT_DELAYS.strat_to_sub, id=orderid)
+        return self.bt.cancel_order(time=self.bt.time + cfh.config_handler.get_val('BT_DELAYS','strat_to_sub'), id=orderid)

     """get positions ->(size,avgp)"""
     #TBD exec beforehand?
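A recurring theme in this changeset is replacing module-level config constants with `cfh.config_handler.get_val(key, subkey)` lookups, so values can change per profile without re-imports. A minimal sketch of that contract; the real implementation lives in v2realbot.utils.config_handler and may differ in details:

```python
# Hedged sketch of the get_val contract used throughout these diffs.
class ConfigHandler:
    def __init__(self, active_profile: str, profiles: dict):
        self.active_profile = active_profile
        self.active_config = profiles[active_profile]

    def get_val(self, key, subkey=None):
        val = self.active_config[key]
        return val[subkey] if subkey is not None else val

config_handler = ConfigHandler("default", {"default": {
    "COUNT_API_REQUESTS": True,
    "BT_DELAYS": {"strat_to_sub": 0.004},   # illustrative values
}})
print(config_handler.get_val("BT_DELAYS", "strat_to_sub"))  # 0.004
```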
@@ -40,7 +40,9 @@ class LiveInterface(GeneralInterface):

             return market_order.id
         except Exception as e:
-            print("Failed to send buy", str(e))
+            reason = "Market buy failed:" + str(e) + format_exc()
+            print(reason)
+            send_to_telegram(reason)
             return -1

     """buy limit"""
@@ -65,7 +67,9 @@ class LiveInterface(GeneralInterface):

             return limit_order.id
         except Exception as e:
-            print("Failed to send the limit order", str(e))
+            reason = "Failed to send the buy limit order:" + str(e) + format_exc()
+            print(reason)
+            send_to_telegram(reason)
             return -1

     """sell market"""
@@ -87,7 +91,9 @@ class LiveInterface(GeneralInterface):

             return market_order.id
         except Exception as e:
-            print("Failed to send sell", str(e))
+            reason = "Failed to send sell:" + str(e) + format_exc()
+            print(reason)
+            send_to_telegram(reason)
             return -1

     """sell limit"""
@@ -112,8 +118,9 @@ class LiveInterface(GeneralInterface):
             return limit_order.id

         except Exception as e:
-            print("Failed to send sell_l", str(e))
-            #raise Exception(e)
+            reason = "Failed to send the sell limit order:" + str(e) + format_exc()
+            print(reason)
+            send_to_telegram(reason)
             return -1

     """order replace"""
@@ -136,7 +143,9 @@ class LiveInterface(GeneralInterface):
             if e.code == 42210000: return orderid
             else:
                 ##maybe just always return ok here
-                print("Failed to replace the profit order. Problem",str(e))
+                reason = "Failed to replace the profit order. Problem:" + str(e) + format_exc()
+                print(reason)
+                send_to_telegram(reason)
                 return -1
                 #raise Exception(e)
@@ -150,7 +159,9 @@ class LiveInterface(GeneralInterface):
             #order doesnt exist
             if e.code == 40410000: return 0
             else:
-                print("Failed to cancel the order", str(e))
+                reason = "Failed to cancel the order:" + str(e) + format_exc()
+                print(reason)
+                send_to_telegram(reason)
                 #raise Exception(e)
                 return -1
@@ -178,7 +189,9 @@ class LiveInterface(GeneralInterface):
             #list of Orders (orderlist[0].id)
             return orderlist
         except Exception as e:
-            print("Error while fetching orders.", str(e))
+            reason = "Error while fetching orders:" + str(e) + format_exc()
+            print(reason)
+            send_to_telegram(reason)
             #raise Exception (e)
             return -1
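Every except branch above now repeats the same three steps (build reason, print, send to Telegram). A hedged sketch of a helper that could deduplicate them; `report_error` is hypothetical and not in the codebase, while `send_to_telegram` is assumed available as in the module above:

```python
# Possible consolidation of the repeated error-reporting pattern.
from traceback import format_exc

def report_error(prefix: str, e: Exception) -> int:
    reason = prefix + str(e) + format_exc()
    print(reason)
    send_to_telegram(reason)  # assumed helper, as used in LiveInterface
    return -1                 # matches the branches' return value
```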
v2realbot/loader/agg_vect.ipynb (new file, 1411 lines): file diff suppressed because it is too large.
@@ -11,10 +11,10 @@ import threading
 from copy import deepcopy
 from msgpack import unpackb
 import os
-from v2realbot.config import DATA_DIR, GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN, AGG_EXCLUDED_TRADES
-import pickle
+from v2realbot.config import DATA_DIR
 import dill
 import gzip
+import v2realbot.utils.config_handler as cfh

 class TradeAggregator:
     def __init__(self,
@@ -25,7 +25,7 @@ class TradeAggregator:
                  align: StartBarAlign = StartBarAlign.ROUND,
                  mintick: int = 0,
                  exthours: bool = False,
-                 excludes: list = AGG_EXCLUDED_TRADES,
+                 excludes: list = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES'),
                  skip_cache: bool = False):
         """
         UPDATED VERSION - returns multiple records
@@ -48,7 +48,7 @@ class TradeAggregator:
         self.excludes = excludes
         self.skip_cache = skip_cache

-        if mintick >= resolution:
+        if resolution > 0 and mintick >= resolution:
             print("Mintick must be smaller than resolution")
             raise Exception
@@ -293,7 +293,7 @@ class TradeAggregator:
             self.diff_price = True
         self.last_price = data['p']

-        if float(data['t']) - float(self.lasttimestamp) < GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN:
+        if float(data['t']) - float(self.lasttimestamp) < cfh.config_handler.get_val('GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN'):
             self.trades_too_close = True
         else:
             self.trades_too_close = False
@@ -320,13 +320,13 @@ class TradeAggregator:
             #TODO: for the future: when there are fewer trades, always sort them into the correct interval
             #align the time of the first bar to the timeframe it belongs to (e.g. 5, 10, 15 ...) (ROUND)
             if self.align == StartBarAlign.ROUND and self.bar_start == 0:
-                t = datetime.fromtimestamp(data['t'])
+                t = datetime.fromtimestamp(data['t'], tz=zoneUTC)
                 t = t - timedelta(seconds=t.second % self.resolution,microseconds=t.microsecond)
                 self.bar_start = datetime.timestamp(t)
             #or we use the trade date rounded to seconds (RANDOM)
             else:
                 #store its timestamp (we compute resolution from it)
-                t = datetime.fromtimestamp(int(data['t']))
+                t = datetime.fromtimestamp(int(data['t']), tz=zoneUTC)
                 #timestamp
                 self.bar_start = int(data['t'])
@@ -376,7 +376,7 @@ class TradeAggregator:
         if self.mintick != 0 and self.lastBarConfirmed:
             #x seconds must pass from the start of the new bar before we send an update
             #start of the new bar + X s must be greater than the current trade
-            if (self.newBar['time'] + timedelta(seconds=self.mintick)) > datetime.fromtimestamp(data['t']):
+            if (self.newBar['time'] + timedelta(seconds=self.mintick)) > datetime.fromtimestamp(data['t'], tz=zoneUTC):
                 #print("waiting for mintick")
                 return []
             else:
@@ -540,7 +540,7 @@ class TradeAggregator:
             self.diff_price = True
         self.last_price = data['p']

-        if float(data['t']) - float(self.lasttimestamp) < GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN:
+        if float(data['t']) - float(self.lasttimestamp) < cfh.config_handler.get_val('GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN'):
             self.trades_too_close = True
         else:
             self.trades_too_close = False
@@ -712,7 +712,7 @@ class TradeAggregator:
             self.diff_price = True
         self.last_price = data['p']

-        if float(data['t']) - float(self.lasttimestamp) < GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN:
+        if float(data['t']) - float(self.lasttimestamp) < cfh.config_handler.get_val('GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN'):
             self.trades_too_close = True
         else:
             self.trades_too_close = False
||||||
@@ -756,8 +756,14 @@ class TradeAggregator:
         In the strategy you must account for the fact that the open of an unconfirmed bar is not final.
         """

+        if self.resolution < 0: # Treat as percentage
+            reference_price = self.lastConfirmedBar['close'] if self.lastConfirmedBar is not None else float(data['p'])
+            brick_size = abs(self.resolution) * reference_price / 100.0
+        else: # Treat as absolute value (number of ticks)
+            brick_size = self.resolution
+
         #number of ticks, e.g. 10 ticks; possibly percentages later
-        brick_size = self.resolution
+        #brick_size = self.resolution
         #confirmed bars ready to be returned
         confirmedBars = []
         #confirms the existing bar and marks it for return
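Note on the hunk above: a negative resolution is now interpreted as a percentage brick size relative to the last confirmed close. A quick worked check of that arithmetic (values are illustrative, not from the repo):

```python
# Illustrative check of the percentage brick-size arithmetic above.
resolution = -1          # negative -> interpret as 1 % of the reference price
reference_price = 250.0  # hypothetical last confirmed close
brick_size = abs(resolution) * reference_price / 100.0
assert brick_size == 2.5  # a new Renko-style brick forms every $2.50
```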
@@ -866,7 +872,7 @@ class TradeAggregator:
         self.diff_price = True
         self.last_price = data['p']

-        if float(data['t']) - float(self.lasttimestamp) < GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN:
+        if float(data['t']) - float(self.lasttimestamp) < cfh.config_handler.get_val('GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN'):
             self.trades_too_close = True
         else:
             self.trades_too_close = False
@@ -962,7 +968,7 @@ class TradeAggregator2Queue(TradeAggregator):
    Child of TradeAggregator - sends items to given queue
    In the future others will be added - TradeAggToTxT etc.
    """
-    def __init__(self, symbol: str, queue: Queue, rectype: RecordType = RecordType.BAR, resolution: int = 5, minsize: int = 100, update_ltp: bool = False, align: StartBarAlign = StartBarAlign.ROUND, mintick: int = 0, exthours: bool = False, excludes: list = AGG_EXCLUDED_TRADES, skip_cache: bool = False):
+    def __init__(self, symbol: str, queue: Queue, rectype: RecordType = RecordType.BAR, resolution: int = 5, minsize: int = 100, update_ltp: bool = False, align: StartBarAlign = StartBarAlign.ROUND, mintick: int = 0, exthours: bool = False, excludes: list = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES'), skip_cache: bool = False):
         super().__init__(rectype=rectype, resolution=resolution, minsize=minsize, update_ltp=update_ltp, align=align, mintick=mintick, exthours=exthours, excludes=excludes, skip_cache=skip_cache)
         self.queue = queue
         self.symbol = symbol

@@ -1007,7 +1013,7 @@ class TradeAggregator2List(TradeAggregator):
    """
    stores records to the list
    """
-    def __init__(self, symbol: str, btdata: list, rectype: RecordType = RecordType.BAR, resolution: int = 5, minsize: int = 100, update_ltp: bool = False, align: StartBarAlign = StartBarAlign.ROUND, mintick: int = 0, exthours: bool = False, excludes: list = AGG_EXCLUDED_TRADES, skip_cache: bool = False):
+    def __init__(self, symbol: str, btdata: list, rectype: RecordType = RecordType.BAR, resolution: int = 5, minsize: int = 100, update_ltp: bool = False, align: StartBarAlign = StartBarAlign.ROUND, mintick: int = 0, exthours: bool = False, excludes: list = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES'), skip_cache: bool = False):
         super().__init__(rectype=rectype, resolution=resolution, minsize=minsize, update_ltp=update_ltp, align=align, mintick=mintick, exthours=exthours, excludes=excludes, skip_cache=skip_cache)
         self.btdata = btdata
         self.symbol = symbol
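The recurring change in this file swaps module-level constants (GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN, AGG_EXCLUDED_TRADES) for lookups through v2realbot.utils.config_handler. The handler itself is not part of this diff; the following is a hypothetical sketch of the interface the calls above assume (names and default values are illustrative):

```python
# Hypothetical sketch of the interface used by cfh.config_handler.get_val();
# the real v2realbot.utils.config_handler is not shown in this diff.
class ConfigHandler:
    def __init__(self, defaults: dict):
        self._values = dict(defaults)

    def get_val(self, key: str, default=None):
        # Return the configured value, falling back to an optional default
        return self._values.get(key, default)

    def print_current_config(self):
        for key, value in sorted(self._values.items()):
            print(f"{key} = {value}")

config_handler = ConfigHandler({
    'GROUP_TRADES_WITH_TIMESTAMP_LESS_THAN': 0.000001,   # illustrative value
    'AGG_EXCLUDED_TRADES': ['C', 'O', '4', 'B', '7'],    # illustrative value
})
```

One design consequence worth noting: using get_val(...) as a default argument value in the __init__ signatures above evaluates once, at function definition time, so later configuration changes do not affect the default.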

570	v2realbot/loader/aggregator_vectorized.py	(Normal file)
@@ -0,0 +1,570 @@
+import pandas as pd
+import numpy as np
+from numba import jit
+from alpaca.data.historical import StockHistoricalDataClient
+from sqlalchemy import column
+from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR
+from alpaca.data.requests import StockTradesRequest
+import time as time_module
+from v2realbot.utils.utils import parse_alpaca_timestamp, ltp, zoneNY, send_to_telegram, fetch_calendar_data
+import pyarrow
+from traceback import format_exc
+from datetime import timedelta, datetime, time
+from concurrent.futures import ThreadPoolExecutor
+import os
+import gzip
+import pickle
+import random
+from alpaca.data.models import BarSet, QuoteSet, TradeSet
+import v2realbot.utils.config_handler as cfh
+from v2realbot.enums.enums import BarType
+from tqdm import tqdm
+"""
+Module used for vectorized aggregation of trades.
+
+Includes fetch (remote/cached) methods and Numba-optimized aggregator functions for TIME BASED, VOLUME BASED and DOLLAR BARS
+"""
+
+def aggregate_trades(symbol: str, trades_df: pd.DataFrame, resolution: int, type: BarType = BarType.TIME):
+    """
+    Accepts a dataframe with trades keyed by symbol. Prepares the dataframe as
+    numpy and calls the Numba-optimized aggregator for the given bar type (time/volume/dollar).
+    """
+    trades_df = trades_df.loc[symbol]
+    trades_df = trades_df.reset_index()
+    ticks = trades_df[['timestamp', 'price', 'size']].to_numpy()
+    # Extract the timestamps column (assuming it's the first column)
+    timestamps = ticks[:, 0]
+    # Convert the timestamps to Unix timestamps in seconds with microsecond precision
+    unix_timestamps_s = np.array([ts.timestamp() for ts in timestamps], dtype='float64')
+    # Replace the original timestamps in the NumPy array with the converted Unix timestamps
+    ticks[:, 0] = unix_timestamps_s
+    ticks = ticks.astype(np.float64)
+    #based on type, the specific aggregator function is called
+    match type:
+        case BarType.TIME:
+            ohlcv_bars = generate_time_bars_nb(ticks, resolution)
+        case BarType.VOLUME:
+            ohlcv_bars = generate_volume_bars_nb(ticks, resolution)
+        case BarType.DOLLAR:
+            ohlcv_bars = generate_dollar_bars_nb(ticks, resolution)
+        case _:
+            raise ValueError("Invalid bar type. Supported types are 'time', 'volume' and 'dollar'.")
+    # Convert the resulting array back to a DataFrame
+    columns = ['time', 'open', 'high', 'low', 'close', 'volume', 'trades']
+    if type == BarType.DOLLAR:
+        columns.append('amount')
+    columns.append('updated')
+    if type == BarType.TIME:
+        columns.append('vwap')
+        columns.append('buyvolume')
+        columns.append('sellvolume')
+    if type == BarType.VOLUME:
+        columns.append('buyvolume')
+        columns.append('sellvolume')
+    ohlcv_df = pd.DataFrame(ohlcv_bars, columns=columns)
+    ohlcv_df['time'] = pd.to_datetime(ohlcv_df['time'], unit='s').dt.tz_localize('UTC').dt.tz_convert(zoneNY)
+    #print(ohlcv_df['updated'])
+    ohlcv_df['updated'] = pd.to_datetime(ohlcv_df['updated'], unit="s").dt.tz_localize('UTC').dt.tz_convert(zoneNY)
+    # Round to microseconds to maintain six decimal places
+    ohlcv_df['updated'] = ohlcv_df['updated'].dt.round('us')
+
+    ohlcv_df.set_index('time', inplace=True)
+    #ohlcv_df.index = ohlcv_df.index.tz_localize('UTC').tz_convert(zoneNY)
+    return ohlcv_df
+
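For reference, a minimal usage sketch of aggregate_trades on hand-made data. It assumes the (symbol, timestamp) MultiIndex that convert_dict_to_multiindex_df below produces is built by hand, so the sketch runs without network access; symbol, prices and sizes are made up:

```python
# Hypothetical usage of aggregate_trades() on synthetic trades.
import pandas as pd
from v2realbot.loader.aggregator_vectorized import aggregate_trades
from v2realbot.enums.enums import BarType

idx = pd.MultiIndex.from_tuples(
    [("BAC", pd.Timestamp("2024-02-05 09:30:00.000100", tz="America/New_York")),
     ("BAC", pd.Timestamp("2024-02-05 09:30:02.500000", tz="America/New_York"))],
    names=["symbol", "timestamp"])
trades = pd.DataFrame({"price": [33.10, 33.12], "size": [100.0, 200.0]}, index=idx)

bars = aggregate_trades(symbol="BAC", trades_df=trades, resolution=5, type=BarType.TIME)
print(bars[["open", "high", "low", "close", "volume"]])  # one 5-second bar
```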
+# Function to ensure fractional seconds are present
+def ensure_fractional_seconds(timestamp):
+    if '.' not in timestamp:
+        # Inserting .000000 before the timezone indicator 'Z'
+        return timestamp.replace('Z', '.000000Z')
+    else:
+        return timestamp
+
+def convert_dict_to_multiindex_df(tradesResponse):
+    """
+    Converts a dictionary from cache or from remote (raw input) to a multiindex dataframe
+    with microsecond precision (from nanoseconds in the raw data).
+    """
+    # Create a DataFrame for each key and add the key as part of the MultiIndex
+    dfs = []
+    for key, values in tradesResponse.items():
+        df = pd.DataFrame(values)
+        # Rename columns
+        # Select and order columns explicitly
+        #print(df)
+        df = df[['t', 'x', 'p', 's', 'i', 'c', 'z']]
+        df.rename(columns={'t': 'timestamp', 'c': 'conditions', 'p': 'price', 's': 'size', 'x': 'exchange', 'z': 'tape', 'i': 'id'}, inplace=True)
+        df['symbol'] = key  # Add ticker as a column
+
+        # Apply the function to ensure all timestamps have fractional seconds
+        #consider whether to keep this, or apply it only on a specific error from to_datetime
+        #possibly add a more efficient approach later, i.e. replacing NaT - https://chatgpt.com/c/d2be6f87-b38f-4050-a1c6-541d100b1474
+        df['timestamp'] = df['timestamp'].apply(ensure_fractional_seconds)
+
+        df['timestamp'] = pd.to_datetime(df['timestamp'], errors='coerce')  # Convert 't' from string to datetime before setting it as an index
+
+        #Adjust to microsecond precision
+        df.loc[df['timestamp'].notna(), 'timestamp'] = df['timestamp'].dt.floor('us')
+
+        df.set_index(['symbol', 'timestamp'], inplace=True)  # Set the multi-level index using both 'ticker' and 't'
+        df = df.tz_convert(zoneNY, level='timestamp')
+        dfs.append(df)
+
+    # Concatenate all DataFrames into a single DataFrame with MultiIndex
+    final_df = pd.concat(dfs)
+
+    return final_df
+
+def dict_to_df(tradesResponse, start, end, exclude_conditions = None, minsize = None):
+    """
+    Transforms the dict to a TradeSet, then to a df, and makes it zone aware.
+    Also filters to start and end if necessary (e.g. only 9:30 to 15:40 is required).
+
+    NOTE: we assume tradesResponse is a dict of raw data (cached/remote)
+    """
+
+    df = convert_dict_to_multiindex_df(tradesResponse)
+
+    #REQUIRED FILTERING
+    #if the start is later or the end is earlier, we trim the data
+    if (start.time() > time(9, 30) or end.time() < time(16, 0)):
+        print(f"filtrujeme {start.time()} {end.time()}")
+        # Define the time range
+        # start_time = pd.Timestamp(start.time(), tz=zoneNY).time()
+        # end_time = pd.Timestamp(end.time(), tz=zoneNY).time()
+
+        # Create a mask to filter rows within the specified time range
+        mask = (df.index.get_level_values('timestamp') >= start) & \
+               (df.index.get_level_values('timestamp') <= end)
+
+        # Apply the mask to the DataFrame
+        df = df[mask]
+
+    if exclude_conditions is not None:
+        print(f"excluding conditions {exclude_conditions}")
+        # Create a mask to exclude rows with any of the specified conditions
+        mask = df['conditions'].apply(lambda x: any(cond in exclude_conditions for cond in x))
+
+        # Filter out the rows with specified conditions
+        df = df[~mask]
+
+    if minsize is not None:
+        print(f"minsize {minsize}")
+        #minimum trade size filter
+        df = df[df['size'] >= minsize]
+    return df
+
+def fetch_daily_stock_trades(symbol, start, end, exclude_conditions=None, minsize=None, force_remote=False, max_retries=5, backoff_factor=1):
+    #doc for this function
+    """
+    Attempts to fetch stock trades either from cache or remote. When remote, it uses a retry mechanism with exponential backoff.
+    It also stores the data to the cache if it is not already there.
+    Using force_remote forces remote data to always be used, thus refreshing the cache for these dates.
+    Attributes:
+    :param symbol: The stock symbol to fetch trades for.
+    :param start: The start time for the trade data.
+    :param end: The end time for the trade data.
+    :exclude_conditions: list of string conditions to exclude from the data
+    :minsize: minimum size of trade to be included in the data
+    :force_remote: will always use remote data and refresh the cache
+    :param max_retries: Maximum number of retries.
+    :param backoff_factor: Factor to determine the next sleep time.
+    :return: TradesResponse object.
+    :raises: ConnectionError if all retries fail.
+
+    We use the trade cache only for main session requests = 9:30 to 16:00.
+    In the future, store the whole day as BAC-20240203.cache.gz and filter either the main session or extended hours from it.
+    For now only the main session is stored, in BAC-timestampopen-timestampclose.cache.gz.
+    """
+    is_same_day = start.date() == end.date()
+    # Determine if the requested times fall within the main session
+    in_main_session = (time(9, 30) <= start.time() < time(16, 0)) and (time(9, 30) <= end.time() <= time(16, 0))
+    file_path = ''
+
+    if in_main_session:
+        filename_start = zoneNY.localize(datetime.combine(start.date(), time(9, 30)))
+        filename_end = zoneNY.localize(datetime.combine(end.date(), time(16, 0)))
+        daily_file = f"{symbol}-{int(filename_start.timestamp())}-{int(filename_end.timestamp())}.cache.gz"
+        file_path = f"{DATA_DIR}/tradecache/{daily_file}"
+        if not force_remote and os.path.exists(file_path):
+            print(f"Searching {str(start.date())} cache: " + daily_file)
+            with gzip.open(file_path, 'rb') as fp:
+                tradesResponse = pickle.load(fp)
+            print("FOUND in CACHE", daily_file)
+            return dict_to_df(tradesResponse, start, end, exclude_conditions, minsize)
+
+    print("NOT FOUND. Fetching from remote")
+    client = StockHistoricalDataClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=True)
+    stockTradeRequest = StockTradesRequest(symbol_or_symbols=symbol, start=start, end=end)
+    last_exception = None
+
+    for attempt in range(max_retries):
+        try:
+            tradesResponse = client.get_stock_trades(stockTradeRequest)
+            is_empty = not tradesResponse[symbol]
+            print(f"Remote fetched: {is_empty=}", start, end)
+            if in_main_session and not is_empty:
+                current_time = datetime.now().astimezone(zoneNY)
+                if not (start < current_time < end):
+                    with gzip.open(file_path, 'wb') as fp:
+                        pickle.dump(tradesResponse, fp)
+                    print("Saving to Trade CACHE", file_path)
+
+                else:  # Don't save the cache if the market is still open
+                    print("Not saving trade cache, market still open today")
+            return pd.DataFrame() if is_empty else dict_to_df(tradesResponse, start, end, exclude_conditions, minsize)
+        except Exception as e:
+            print(f"Attempt {attempt + 1} failed: {e}")
+            last_exception = e
+            time_module.sleep(backoff_factor * (2 ** attempt) + random.uniform(0, 1))  # Adding random jitter
+
+    print("All attempts to fetch data failed.")
+    raise ConnectionError(f"Failed to fetch stock trades after {max_retries} retries. Last exception: {str(last_exception)} and {format_exc()}")
+
+
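As a concrete illustration of the cache key scheme described in the docstring above (symbol and date are illustrative; zoneNY is assumed to be the pytz America/New_York zone used elsewhere in the repo):

```python
# Illustrative computation of the tradecache file name used above.
from datetime import datetime, time
import pytz

zoneNY = pytz.timezone("America/New_York")
day = datetime(2024, 2, 5)
filename_start = zoneNY.localize(datetime.combine(day.date(), time(9, 30)))
filename_end = zoneNY.localize(datetime.combine(day.date(), time(16, 0)))
daily_file = f"BAC-{int(filename_start.timestamp())}-{int(filename_end.timestamp())}.cache.gz"
print(daily_file)  # BAC-1707143400-1707166800.cache.gz
```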
+def fetch_trades_parallel(symbol, start_date, end_date, exclude_conditions = cfh.config_handler.get_val('AGG_EXCLUDED_TRADES'), minsize = 100, force_remote = False, max_workers=None):
+    """
+    Fetches trades for each day between start_date and end_date during market hours (9:30-16:00) in parallel and concatenates them into a single DataFrame.
+
+    :param symbol: Stock symbol.
+    :param start_date: Start date as datetime.
+    :param end_date: End date as datetime.
+    :return: DataFrame containing all trades from start_date to end_date.
+    """
+    futures = []
+    results = []
+
+    market_open_days = fetch_calendar_data(start_date, end_date)
+    day_count = len(market_open_days)
+    print("Contains", day_count, " market days")
+    max_workers = min(10, max(2, day_count // 2)) if max_workers is None else max_workers  # Heuristic: half the number of days to process, but at least 2 and no more than 10
+
+    with ThreadPoolExecutor(max_workers=max_workers) as executor:
+        #for single_date in (start_date + timedelta(days=i) for i in range((end_date - start_date).days + 1)):
+        for market_day in tqdm(market_open_days, desc="Processing market days"):
+            #start = datetime.combine(single_date, time(9, 30))  # Market opens at 9:30 AM
+            #end = datetime.combine(single_date, time(16, 0))  # Market closes at 4:00 PM
+
+            interval_from = zoneNY.localize(market_day.open)
+            interval_to = zoneNY.localize(market_day.close)
+
+            #trim the interval if a later start or an earlier end is requested
+            start = start_date if interval_from < start_date else interval_from
+            #start = max(start_date, interval_from)
+            end = end_date if interval_to > end_date else interval_to
+            #end = min(end_date, interval_to)
+
+            future = executor.submit(fetch_daily_stock_trades, symbol, start, end, exclude_conditions, minsize, force_remote)
+            futures.append(future)
+
+    for future in tqdm(futures, desc="Fetching data"):
+        try:
+            result = future.result()
+            results.append(result)
+        except Exception as e:
+            print(f"Error fetching data for a day: {e}")
+
+    # Batch concatenation to improve speed
+    batch_size = 10
+    batches = [results[i:i + batch_size] for i in range(0, len(results), batch_size)]
+    final_df = pd.concat([pd.concat(batch, ignore_index=False) for batch in batches], ignore_index=False)
+
+    return final_df
+
+    #original version
+    #return pd.concat(results, ignore_index=False)
+
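End to end, the two entry points above compose naturally. A hedged sketch follows; the symbol and dates are examples, and the call will hit the Alpaca API (or the local tradecache) with the ACCOUNT1 credentials from config:

```python
# Hypothetical end-to-end run: fetch a week of trades, then build volume bars.
from datetime import datetime
import pytz
from v2realbot.loader.aggregator_vectorized import fetch_trades_parallel, aggregate_trades
from v2realbot.enums.enums import BarType

zoneNY = pytz.timezone("America/New_York")
start = zoneNY.localize(datetime(2024, 2, 5, 9, 30))
end = zoneNY.localize(datetime(2024, 2, 9, 16, 0))

trades = fetch_trades_parallel("BAC", start, end, minsize=100)
bars = aggregate_trades("BAC", trades, resolution=100_000, type=BarType.VOLUME)  # 100k-share bars
```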
+@jit(nopython=True)
+def generate_dollar_bars_nb(ticks, amount_per_bar):
+    """
+    Generates dollar-based bars from ticks.
+
+    There is also a simple prevention of aggregating across different days,
+    as described here https://chatgpt.com/c/17804fc1-a7bc-495d-8686-b8392f3640a2
+    Downside: days are split by UTC (which is ok for the main session, but for extended hours it should be reworked by preprocessing a new column identifying the session)
+
+    When a trade is split into multiple bars it is counted as a trade in each of the bars.
+    Other option: the trade count could be proportionally distributed by weight (0.2 to the 1st bar, 0.8 to the 2nd bar) - but this is not implemented yet
+    https://chatgpt.com/c/ff4802d9-22a2-4b72-8ab7-97a91e7a515f
+    """
+    ohlcv_bars = []
+    remaining_amount = amount_per_bar
+
+    # Initialize bar values based on the first tick to avoid uninitialized values
+    open_price = ticks[0, 1]
+    high_price = ticks[0, 1]
+    low_price = ticks[0, 1]
+    close_price = ticks[0, 1]
+    volume = 0
+    trades_count = 0
+    current_day = np.floor(ticks[0, 0] / 86400)  # Calculate the initial day from the first tick timestamp
+    bar_time = ticks[0, 0]  # Initialize bar time with the time of the first tick
+
+    for tick in ticks:
+        tick_time = tick[0]
+        price = tick[1]
+        tick_volume = tick[2]
+        tick_amount = price * tick_volume
+        tick_day = np.floor(tick_time / 86400)  # Calculate the day of the current tick
+
+        # Check if the new tick is from a different day, then close the current bar
+        if tick_day != current_day:
+            if trades_count > 0:
+                ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, amount_per_bar, tick_time])
+            # Reset for the new day using the current tick data
+            open_price = price
+            high_price = price
+            low_price = price
+            close_price = price
+            volume = 0
+            trades_count = 0
+            remaining_amount = amount_per_bar
+            current_day = tick_day
+            bar_time = tick_time
+
+        # Start a new bar if needed because of the dollar value
+        while tick_amount > 0:
+            if tick_amount < remaining_amount:
+                # Add the entire tick to the current bar
+                high_price = max(high_price, price)
+                low_price = min(low_price, price)
+                close_price = price
+                volume += tick_volume
+                remaining_amount -= tick_amount
+                trades_count += 1
+                tick_amount = 0
+            else:
+                # Calculate the amount of volume that fits within the remaining dollar amount
+                volume_to_add = remaining_amount / price
+                volume += volume_to_add  # Update the volume here before appending and resetting
+
+                # Append the partially filled bar to the list
+                ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count + 1, amount_per_bar, tick_time])
+
+                # Fill the current bar and continue with a new bar
+                tick_volume -= volume_to_add
+                tick_amount -= remaining_amount
+
+                # Reset bar values for the new bar using the current tick data
+                open_price = price
+                high_price = price
+                low_price = price
+                close_price = price
+                volume = 0  # Reset volume for the new bar
+                trades_count = 0
+                remaining_amount = amount_per_bar
+
+                # Increment bar time if splitting a trade
+                if tick_volume > 0:  #if there is a remainder left in the trade, set the bar time a microsecond later
+                    bar_time = tick_time + 1e-6
+                else:
+                    bar_time = tick_time  #otherwise use the tick's time
+                #bar_time = tick_time
+
+    # Add the last bar if it contains any trades
+    if trades_count > 0:
+        ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, amount_per_bar, tick_time])
+
+    return np.array(ohlcv_bars)
+
+
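A quick worked example of the splitting logic above (numbers are illustrative): with amount_per_bar = 1000, a single 30-share trade at $50 carries $1,500, so it completes the current bar with the 20 shares that fit into the remaining $1,000 and rolls the leftover 10 shares into a new bar.

```python
# Illustrative arithmetic for splitting one trade across dollar bars.
amount_per_bar = 1000.0
price, tick_volume = 50.0, 30.0
tick_amount = price * tick_volume         # 1500.0 dollars carried by this trade
remaining_amount = amount_per_bar         # fresh bar
volume_to_add = remaining_amount / price  # 20 shares complete the bar
leftover = tick_volume - volume_to_add    # 10 shares open the next bar
assert (volume_to_add, leftover) == (20.0, 10.0)
```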
+@jit(nopython=True)
+def generate_volume_bars_nb(ticks, volume_per_bar):
+    """
+    Generates volume-based bars from ticks.
+
+    NOTE: days are split here by UTC (it doesn't aggregate trades from different days),
+    which is ok for the main session, but it needs rework for extended hours by preprocessing ticks_df and introducing a session column.
+
+    When a trade is split into multiple bars it is counted as a trade in each of the bars.
+    Other option: the trade count could be proportionally distributed by weight (0.2 to the 1st bar, 0.8 to the 2nd bar) - but this is not implemented yet
+    https://chatgpt.com/c/ff4802d9-22a2-4b72-8ab7-97a91e7a515f
+    """
+    ohlcv_bars = []
+    remaining_volume = volume_per_bar
+
+    # Initialize bar values based on the first tick to avoid uninitialized values
+    open_price = ticks[0, 1]
+    high_price = ticks[0, 1]
+    low_price = ticks[0, 1]
+    close_price = ticks[0, 1]
+    volume = 0
+    trades_count = 0
+    current_day = np.floor(ticks[0, 0] / 86400)  # Calculate the initial day from the first tick timestamp
+    bar_time = ticks[0, 0]  # Initialize bar time with the time of the first tick
+    buy_volume = 0  # Volume of buy trades
+    sell_volume = 0  # Volume of sell trades
+    prev_price = ticks[0, 1]  # Initialize previous price for the first tick
+
+    for tick in ticks:
+        tick_time = tick[0]
+        price = tick[1]
+        tick_volume = tick[2]
+        tick_day = np.floor(tick_time / 86400)  # Calculate the day of the current tick
+
+        # Check if the new tick is from a different day, then close the current bar
+        if tick_day != current_day:
+            if trades_count > 0:
+                ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, tick_time, buy_volume, sell_volume])
+            # Reset for the new day using the current tick data
+            open_price = price
+            high_price = price
+            low_price = price
+            close_price = price
+            volume = 0
+            trades_count = 0
+            remaining_volume = volume_per_bar
+            current_day = tick_day
+            bar_time = tick_time  # Update bar time to the current tick time
+            buy_volume = 0
+            sell_volume = 0
+            # Reset the previous tick price (the imbalance is calculated from scratch for each day)
+            prev_price = price
+
+        # Start a new bar if needed because of the volume
+        while tick_volume > 0:
+            if tick_volume < remaining_volume:
+                # Add the entire tick to the current bar
+                high_price = max(high_price, price)
+                low_price = min(low_price, price)
+                close_price = price
+                volume += tick_volume
+                remaining_volume -= tick_volume
+                trades_count += 1
+
+                # Update buy and sell volumes
+                if price > prev_price:
+                    buy_volume += tick_volume
+                elif price < prev_price:
+                    sell_volume += tick_volume
+
+                tick_volume = 0
+            else:
+                # Fill the current bar and continue with a new bar
+                volume_to_add = remaining_volume
+                volume += volume_to_add
+                tick_volume -= volume_to_add
+                trades_count += 1
+
+                # Update buy and sell volumes
+                if price > prev_price:
+                    buy_volume += volume_to_add
+                elif price < prev_price:
+                    sell_volume += volume_to_add
+
+                # Append the completed bar to the list
+                ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, tick_time, buy_volume, sell_volume])
+
+                # Reset bar values for the new bar using the current tick data
+                open_price = price
+                high_price = price
+                low_price = price
+                close_price = price
+                volume = 0
+                trades_count = 0
+                remaining_volume = volume_per_bar
+                buy_volume = 0
+                sell_volume = 0
+
+                # Increment bar time if splitting a trade
+                if tick_volume > 0:  # If there's remaining volume in the trade, set bar time slightly later
+                    bar_time = tick_time + 1e-6
+                else:
+                    bar_time = tick_time  # Otherwise, set bar time to the tick time
+
+        prev_price = price
+
+    # Add the last bar if it contains any trades
+    if trades_count > 0:
+        ohlcv_bars.append([bar_time, open_price, high_price, low_price, close_price, volume, trades_count, tick_time, buy_volume, sell_volume])
+
+    return np.array(ohlcv_bars)
+
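The buyvolume/sellvolume split above uses a simple tick rule: upticks count as buys, downticks as sells, and trades at an unchanged price are left unclassified. A compact illustration of that rule in isolation (synthetic data):

```python
# Illustrative tick-rule classification as used for buyvolume/sellvolume above.
prices = [33.10, 33.11, 33.11, 33.09]
sizes = [100, 200, 50, 300]
buy_volume = sell_volume = 0.0
prev_price = prices[0]
for price, size in zip(prices[1:], sizes[1:]):
    if price > prev_price:
        buy_volume += size      # uptick -> buy
    elif price < prev_price:
        sell_volume += size     # downtick -> sell
    prev_price = price          # an unchanged price leaves both untouched
print(buy_volume, sell_volume)  # 200.0 300.0
```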
+@jit(nopython=True)
+def generate_time_bars_nb(ticks, resolution):
+    # Initialize the start and end time
+    start_time = np.floor(ticks[0, 0] / resolution) * resolution
+    end_time = np.floor(ticks[-1, 0] / resolution) * resolution
+
+    # # Calculate number of bars
+    # num_bars = int((end_time - start_time) // resolution + 1)
+
+    # Using a list to append data only when trades exist
+    ohlcv_bars = []
+
+    # Variables to track the current bar
+    current_bar_index = -1
+    open_price = 0
+    high_price = -np.inf
+    low_price = np.inf
+    close_price = 0
+    volume = 0
+    trades_count = 0
+    vwap_cum_volume_price = 0  # Cumulative volume * price
+    cum_volume = 0  # Cumulative volume for VWAP
+    buy_volume = 0  # Volume of buy trades
+    sell_volume = 0  # Volume of sell trades
+    prev_price = ticks[0, 1]  # Initialize previous price for the first tick
+    prev_day = np.floor(ticks[0, 0] / 86400)  # Calculate the initial day from the first tick timestamp
+
+    for tick in ticks:
+        curr_time = tick[0]  #updated time
+        tick_time = np.floor(tick[0] / resolution) * resolution
+        price = tick[1]
+        tick_volume = tick[2]
+        tick_day = np.floor(tick_time / 86400)  # Calculate the day of the current tick
+
+        #if the new tick is from a new day, reset the previous tick price (the imbalance calculation starts over)
+        if tick_day != prev_day:
+            prev_price = price
+            prev_day = tick_day
+
+        # Check if the tick belongs to a new bar
+        if tick_time != start_time + current_bar_index * resolution:
+            if current_bar_index >= 0 and trades_count > 0:  # Save the previous bar if trades happened
+                vwap = vwap_cum_volume_price / cum_volume if cum_volume > 0 else 0
+                ohlcv_bars.append([start_time + current_bar_index * resolution, open_price, high_price, low_price, close_price, volume, trades_count, curr_time, vwap, buy_volume, sell_volume])
+
+            # Reset bar values
+            current_bar_index = int((tick_time - start_time) / resolution)
+            open_price = price
+            high_price = price
+            low_price = price
+            volume = 0
+            trades_count = 0
+            vwap_cum_volume_price = 0
+            cum_volume = 0
+            buy_volume = 0
+            sell_volume = 0
+
+        # Update the OHLCV values for the current bar
+        high_price = max(high_price, price)
+        low_price = min(low_price, price)
+        close_price = price
+        volume += tick_volume
+        trades_count += 1
+        vwap_cum_volume_price += price * tick_volume
+        cum_volume += tick_volume
+
+        # Update buy and sell volumes
+        if price > prev_price:
+            buy_volume += tick_volume
+        elif price < prev_price:
+            sell_volume += tick_volume
+
+        prev_price = price
+
+    # Save the last processed bar
+    if trades_count > 0:
+        vwap = vwap_cum_volume_price / cum_volume if cum_volume > 0 else 0
+        ohlcv_bars.append([start_time + current_bar_index * resolution, open_price, high_price, low_price, close_price, volume, trades_count, curr_time, vwap, buy_volume, sell_volume])
+
+    return np.array(ohlcv_bars)
+
+# Example usage
+if __name__ == '__main__':
+    pass
+    #example in agg_vect.ipynb
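The bar-boundary logic above floors each Unix timestamp to a multiple of resolution. A quick check of that arithmetic with hypothetical timestamps:

```python
# Illustrative check of the flooring that assigns ticks to time bars.
import numpy as np

resolution = 5  # seconds per bar
ts = np.array([1707143400.1, 1707143402.9, 1707143405.0])  # Unix seconds (hypothetical)
buckets = np.floor(ts / resolution) * resolution
# The first two ticks land in the 1707143400 bucket; the third opens bucket 1707143405.
assert buckets[0] == buckets[1] == 1707143400.0 and buckets[2] == 1707143405.0
```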
@@ -1,14 +1,13 @@
 from v2realbot.loader.aggregator import TradeAggregator, TradeAggregator2List, TradeAggregator2Queue
 #from v2realbot.loader.cacher import get_cached_agg_data
 from alpaca.trading.requests import GetCalendarRequest
-from alpaca.trading.client import TradingClient
 from alpaca.data.live import StockDataStream
-from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR, OFFLINE_MODE
+from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, DATA_DIR
 from alpaca.data.enums import DataFeed
 from alpaca.data.historical import StockHistoricalDataClient
 from alpaca.data.requests import StockLatestQuoteRequest, StockBarsRequest, StockTradesRequest
 from threading import Thread, current_thread
-from v2realbot.utils.utils import parse_alpaca_timestamp, ltp, zoneNY
+from v2realbot.utils.utils import parse_alpaca_timestamp, ltp, zoneNY, send_to_telegram, fetch_calendar_data
 from v2realbot.utils.tlog import tlog
 from datetime import datetime, timedelta, date
 from threading import Thread
@@ -26,13 +25,15 @@ from tqdm import tqdm
 import time
 from traceback import format_exc
 from collections import defaultdict
+import requests
+import v2realbot.utils.config_handler as cfh
 """
 Trade offline data streamer, based on Alpaca historical data.
 """
 class Trade_Offline_Streamer(Thread):
     #for BT we always connect to the primary account - we only pull historical data + the calendar
     client = StockHistoricalDataClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=True)
-    clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False)
+    #clientTrading = TradingClient(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=False)
     def __init__(self, time_from: datetime, time_to: datetime, btdata) -> None:
         # Call the Thread class's init function
         Thread.__init__(self)
@@ -64,6 +65,35 @@ class Trade_Offline_Streamer(Thread):
     def stop(self):
         pass

+    def fetch_stock_trades(self, symbol, start, end, max_retries=5, backoff_factor=1):
+        """
+        Attempts to fetch stock trades with exponential backoff. Raises an exception if all retries fail.
+
+        :param symbol: The stock symbol to fetch trades for.
+        :param start: The start time for the trade data.
+        :param end: The end time for the trade data.
+        :param max_retries: Maximum number of retries.
+        :param backoff_factor: Factor to determine the next sleep time.
+        :return: TradesResponse object.
+        :raises: ConnectionError if all retries fail.
+        """
+        stockTradeRequest = StockTradesRequest(symbol_or_symbols=symbol, start=start, end=end)
+        last_exception = None
+
+        for attempt in range(max_retries):
+            try:
+                tradesResponse = self.client.get_stock_trades(stockTradeRequest)
+                print("Remote Fetch DAY DATA Complete", start, end)
+                return tradesResponse
+            except Exception as e:
+                print(f"Attempt {attempt + 1} failed: {e}")
+                last_exception = e
+                time.sleep(backoff_factor * (2 ** attempt))
+
+        print("All attempts to fetch data failed.")
+        send_to_telegram(f"Failed to fetch stock trades after {max_retries} retries. Last exception: {str(last_exception)} and {format_exc()}")
+        raise ConnectionError(f"Failed to fetch stock trades after {max_retries} retries. Last exception: {str(last_exception)} and {format_exc()}")
+
     # Override the run() function of Thread class
     #async removed
     def main(self):
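For intuition, the retry loop above sleeps backoff_factor * 2**attempt seconds between attempts; with the defaults that gives the following schedule (a sketch, ignoring the random jitter that the vectorized loader's variant adds):

```python
# Illustrative backoff schedule for max_retries=5, backoff_factor=1.
backoff_factor, max_retries = 1, 5
delays = [backoff_factor * (2 ** attempt) for attempt in range(max_retries)]
print(delays)  # [1, 2, 4, 8, 16] -> up to 31 seconds of waiting before giving up
```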
@@ -74,6 +104,8 @@ class Trade_Offline_Streamer(Thread):
             print("call add streams to queue first")
             return 0

+        cfh.config_handler.print_current_config()
+
         #iterate over the streams
         for i in self.streams:
             self.uniquesymbols.add(i.symbol)
@@ -107,25 +139,21 @@
         #datetime.fromtimestamp(data['updated']).astimezone(zoneNY))
         #REFACTOR STARTS HERE
         #print(f"{self.time_from=} {self.time_to=}")

-        if OFFLINE_MODE:
+        if cfh.config_handler.get_val('OFFLINE_MODE'):
             #just one day - same like time_from
             den = str(self.time_to.date())
             bt_day = Calendar(date=den,open="9:30",close="16:00")
             cal_dates = [bt_day]
         else:
-            calendar_request = GetCalendarRequest(start=self.time_from,end=self.time_to)
-            #this is a workaround for now - put it into a retry function and generally design the exception handling so that I get notified and it is immediately visible in the log and on the frontend
-            try:
-                cal_dates = self.clientTrading.get_calendar(calendar_request)
-            except Exception as e:
-                print("CHYBA - retrying in 4s: " + str(e) + format_exc())
-                time.sleep(5)
-                cal_dates = self.clientTrading.get_calendar(calendar_request)
+            start_date = self.time_from # Assuming this is your start date
+            end_date = self.time_to # Assuming this is your end date
+            cal_dates = fetch_calendar_data(start_date, end_date)

         #only the main session is supported so far

+        live_data_feed = cfh.config_handler.get_val('LIVE_DATA_FEED')
+
         #only 1 symbol is supported so far, rework into a for-loop over all symbols from symbpole
         #the minimal unit for the CACHE is 1 day - and only market open to market close (extended hours not supported yet)
         for day in cal_dates:
@@ -168,9 +196,10 @@
             # stream.send_cache_to_output(cache)
             # to_rem.append(stream)

-            #we only deal with the cache when backtesting a whole day
+            #we only deal with the cache when backtesting a whole day and we have the sip datapoint (iex is not cached)
             #if not, we neither read from nor write to the cache
-            if self.time_to >= day.close and self.time_from <= day.open:
+            if (self.time_to >= day.close and self.time_from <= day.open) and live_data_feed == DataFeed.SIP:
                 #this paragraph is skipped if "dont_use_cache" is set
                 stream_btdata = self.to_run[symbpole[0]][0]
                 cache_btdata, file_btdata = stream_btdata.get_cache(day.open, day.close)
@@ -213,14 +242,22 @@
                     print("Loading from Trade CACHE", file_path)
                 #the daily file doesn't exist
                 else:
-                    # TODO refactor to process multiple symbols at once (multithreaded), for now we assume only 1
-                    stockTradeRequest = StockTradesRequest(symbol_or_symbols=symbpole[0], start=day.open,end=day.close)
-                    tradesResponse = self.client.get_stock_trades(stockTradeRequest)
+                    #implement retry mechanism
+                    symbol = symbpole[0] # Assuming symbpole[0] is your target symbol
+                    day_open = day.open # Assuming day.open is the start time
+                    day_close = day.close # Assuming day.close is the end time
+
+                    tradesResponse = self.fetch_stock_trades(symbol, day_open, day_close)
+
+                    # # TODO refactor to process multiple symbols at once (multithreaded), for now we assume only 1
+                    # stockTradeRequest = StockTradesRequest(symbol_or_symbols=symbpole[0], start=day.open,end=day.close)
+                    # tradesResponse = self.client.get_stock_trades(stockTradeRequest)
                     print("Remote Fetch DAY DATA Complete", day.open, day.close)

-                    #if it's today and the market hasn't closed yet, we don't save the cache
-                    if day.open < datetime.now().astimezone(zoneNY) < day.close:
-                        print("not saving trade cache, market still open today")
+                    #if it's today and the market hasn't closed yet, we don't save the cache; with the iex datapoint we don't cache at all
+                    if (day.open < datetime.now().astimezone(zoneNY) < day.close) or live_data_feed == DataFeed.IEX:
+                        print("not saving trade cache, market still open today or IEX datapoint")
                         #ic(datetime.now().astimezone(zoneNY))
                         #ic(day.open, day.close)
                     else:
@@ -258,7 +295,7 @@
             cnt = 1


-            for t in tqdm(tradesResponse[symbol]):
+            for t in tqdm(tradesResponse[symbol], desc="Loading Trades"):

                 #since the whole day is here, we only pass on the relevant trades
                 #i.e. if start_time < trade < end_time
@@ -271,6 +308,9 @@
                 #tmp = to_datetime(t['t'], utc=True).timestamp()

+                #occasionally a None row appeared in the response
+                if t is None:
+                    continue
+
                 datum = to_datetime(t['t'], utc=True)

@@ -4,7 +4,7 @@
 """
 from v2realbot.loader.aggregator import TradeAggregator2Queue
 from alpaca.data.live import StockDataStream
-from v2realbot.config import ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_PAPER_FEED
+from v2realbot.config import LIVE_DATA_API_KEY, LIVE_DATA_SECRET_KEY
 from alpaca.data.historical import StockHistoricalDataClient
 from alpaca.data.requests import StockLatestQuoteRequest, StockBarsRequest, StockTradesRequest
 from threading import Thread, current_thread
@@ -12,6 +12,7 @@ from v2realbot.utils.utils import parse_alpaca_timestamp, ltp
 from datetime import datetime, timedelta
 from threading import Thread, Lock
 from msgpack import packb
+import v2realbot.utils.config_handler as cfh

 """
 Shared streamer (can be shared amongst concurrently running strategies)
@@ -19,9 +20,12 @@ from msgpack import packb
     by strategies
 """
 class Trade_WS_Streamer(Thread):
+    live_data_feed = cfh.config_handler.get_val('LIVE_DATA_FEED')
     ##this ws streamer is a single one for everyone, i.e. we hard-wire the paid data of the primary account (regardless of paper or live)
-    client = StockDataStream(ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, raw_data=True, websocket_params={}, feed=ACCOUNT1_PAPER_FEED)
+    msg = f"Realtime Websocket connection will use FEED: {live_data_feed} and credential of ACCOUNT1"
+    print(msg)
+    #cfh.config_handler.print_current_config()
+    client = StockDataStream(LIVE_DATA_API_KEY, LIVE_DATA_SECRET_KEY, raw_data=True, websocket_params={}, feed=live_data_feed)
     #uniquesymbols = set()
     _streams = []
     #to_run = dict()
@@ -38,10 +42,23 @@ class Trade_WS_Streamer(Thread):
         return False

     def add_stream(self, obj: TradeAggregator2Queue):
+        print(Trade_WS_Streamer.msg)
         print("stav pred pridavanim", Trade_WS_Streamer._streams)
         Trade_WS_Streamer._streams.append(obj)
         if Trade_WS_Streamer.client._running is False:
             print("websocket zatim nebezi, pouze pridavame do pole")
+
+            #here we would refresh the client (if live_data_feed has changed)
+
+            # live_data_feed = cfh.config_handler.get_val('LIVE_DATA_FEED')
+            # #after testing, switch only if live_data_feed has changed
+            # #if live_data_feed != Trade_WS_Streamer.live_data_feed:
+            # #    Trade_WS_Streamer.live_data_feed = live_data_feed
+            # msg = f"REFRESH OF CLIENT! Realtime Websocket connection will use FEED: {live_data_feed} and credential of ACCOUNT1"
+            # print(msg)
+            # #cfh.config_handler.print_current_config()
+            # Trade_WS_Streamer.client = StockDataStream(LIVE_DATA_API_KEY, LIVE_DATA_SECRET_KEY, raw_data=True, websocket_params={}, feed=live_data_feed)
+
         else:
             print("websocket client bezi")
             if self.symbol_exists(obj.symbol):
@@ -59,7 +76,12 @@ class Trade_WS_Streamer(Thread):
         #if it is the last item at all, stop the client from running
         if len(Trade_WS_Streamer._streams) == 0:
             print("removed last item from WS, stopping the client")
-            Trade_WS_Streamer.client.stop()
+            #Trade_WS_Streamer.client.stop_ws()
+            #Trade_WS_Streamer.client.stop()
+            #try to explicitly perform the websocket disconnect steps ourselves
+            if Trade_WS_Streamer.client._stop_stream_queue.empty():
+                Trade_WS_Streamer.client._stop_stream_queue.put_nowait({"should_stop": True})
+            Trade_WS_Streamer.client._should_run = False
             return

         if not self.symbol_exists(obj.symbol):

@@ -1,7 +1,7 @@
 import os,sys
 sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
 os.environ["KERAS_BACKEND"] = "jax"
-from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY, LOG_FILE, MODEL_DIR
+from v2realbot.config import WEB_API_KEY, DATA_DIR, MEDIA_DIRECTORY, LOG_PATH, MODEL_DIR
 from alpaca.data.timeframe import TimeFrame, TimeFrameUnit
 from datetime import datetime
 from rich import print
@@ -9,10 +9,9 @@ from fastapi import FastAPI, Depends, HTTPException, status, File, UploadFile, R
 from fastapi.security import APIKeyHeader
 import uvicorn
 from uuid import UUID
-import v2realbot.controller.services as cs
 from v2realbot.utils.ilog import get_log_window
-from v2realbot.common.model import StrategyInstance, RunnerView, RunRequest, Trade, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, Bar, RunArchiveChange, TestList, ConfigItem, InstantIndicator, DataTablesRequest, AnalyzerInputs
+from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunnerView, RunRequest, Trade, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, Bar, RunArchiveChange, TestList, ConfigItem, InstantIndicator, DataTablesRequest, AnalyzerInputs
-from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, HTTPException, status, WebSocketException, Cookie, Query
+from fastapi import FastAPI, WebSocket, WebSocketDisconnect, Depends, HTTPException, status, WebSocketException, Cookie, Query, Request
 from fastapi.responses import FileResponse, StreamingResponse, JSONResponse
 from fastapi.staticfiles import StaticFiles
 from fastapi.security import HTTPBasic, HTTPBasicCredentials
@@ -36,9 +35,15 @@ from traceback import format_exc
 #from v2realbot.reporting.optimizecutoffs import find_optimal_cutoff
 import v2realbot.reporting.analyzer as ci
 import shutil
-from starlette.responses import JSONResponse
+from starlette.responses import JSONResponse, HTMLResponse, FileResponse, RedirectResponse
 import mlroom
 import mlroom.utils.mlutils as ml
+from typing import List
+import v2realbot.controller.run_manager as rm
+import v2realbot.scheduler.ap_scheduler as aps
+import re
+import v2realbot.controller.configs as cf
+import v2realbot.controller.services as cs
 #from async io import Queue, QueueEmpty
 #
 # install()
@@ -69,14 +74,52 @@ def api_key_auth(api_key: str = Depends(X_API_KEY)):
         status_code=status.HTTP_401_UNAUTHORIZED,
         detail="Forbidden"
     )

+def authenticate_user(credentials: HTTPBasicCredentials = Depends(HTTPBasic())):
+    correct_username = "david"
+    correct_password = "david"
+
+    if credentials.username == correct_username and credentials.password == correct_password:
+        return True
+    else:
+        raise HTTPException(
+            status_code=status.HTTP_401_UNAUTHORIZED,
+            detail="Incorrect username or password",
+            headers={"WWW-Authenticate": "Basic"},
+        )
+
+
 app = FastAPI()
 root = os.path.dirname(os.path.abspath(__file__))
-app.mount("/static", StaticFiles(html=True, directory=os.path.join(root, 'static')), name="static")
+#app.mount("/static", StaticFiles(html=True, directory=os.path.join(root, 'static')), name="static")
 app.mount("/media", StaticFiles(directory=str(MEDIA_DIRECTORY)), name="media")
 #app.mount("/", StaticFiles(html=True, directory=os.path.join(root, 'static')), name="www")

 security = HTTPBasic()
+@app.get("/static/{path:path}")
+async def static_files(request: Request, path: str, authenticated: bool = Depends(authenticate_user)):
+    root = os.path.dirname(os.path.abspath(__file__))
+    static_dir = os.path.join(root, 'static')
+
+    if not path or path == "/":
+        file_path = os.path.join(static_dir, 'index.html')
+    else:
+        file_path = os.path.join(static_dir, path)
+
+    # Check if path is a directory
+    if os.path.isdir(file_path):
+        # If it's a directory, try to serve index.html within that directory
+        index_path = os.path.join(file_path, 'index.html')
+        if os.path.exists(index_path):
+            return FileResponse(index_path)
+        else:
+            # Optionally, you can return a directory listing or a custom 404 page here
+            return HTMLResponse("Directory listing not enabled.", status_code=403)
+
+    if not os.path.exists(file_path):
+        raise HTTPException(status_code=404, detail="File not found")
+
+    return FileResponse(file_path)
+
 def get_current_username(
     credentials: Annotated[HTTPBasicCredentials, Depends(security)]
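To exercise the Basic-auth protected static route added above, a client must send credentials with every request; a minimal sketch, assuming a local instance on port 8000:

```python
# Hypothetical client call against a locally running instance.
import requests

resp = requests.get("http://localhost:8000/static/index.html", auth=("david", "david"))
print(resp.status_code)  # 200 if the file exists, 401 without valid credentials
```

FastAPI resolves Depends(authenticate_user) before the endpoint body runs, so unauthenticated requests are rejected with 401 before any filesystem access happens.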
@@ -98,9 +141,9 @@ async def get_api_key(
     return session or api_key

 #TODO rework from Async?
-@app.get("/static")
-async def get(username: Annotated[str, Depends(get_current_username)]):
-    return FileResponse("index.html")
+# @app.get("/static")
+# async def get(username: Annotated[str, Depends(get_current_username)]):
+#     return FileResponse("index.html")

 @app.websocket("/runners/{runner_id}/ws")
 async def websocket_endpoint(
@@ -249,11 +292,13 @@ def _run_stratin(stratin_id: UUID, runReq: RunRequest):
     runReq.bt_to = zoneNY.localize(runReq.bt_to)
     #if we are running over test intervals or more than one day is requested - we run it as a batch, day by day
     #in the future expose this on the FE as a flag
-    if runReq.mode != Mode.LIVE and runReq.test_batch_id is not None or (runReq.bt_from.date() != runReq.bt_to.date()):
+    #print(runReq)
+    if runReq.mode not in [Mode.LIVE, Mode.PAPER] and (runReq.test_batch_id is not None or (runReq.bt_from is not None and runReq.bt_to is not None and runReq.bt_from.date() != runReq.bt_to.date())):
         res, id = cs.run_batch_stratin(id=stratin_id, runReq=runReq)
     else:
-        if runReq.weekdays_filter is not None:
-            raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Weekday only for backtest mode with batch (not single day)")
+        #not necessary for live/paper, the weekdays are simply ignored; in the future maybe add validation if weekdays are present
+        #if runReq.weekdays_filter is not None:
+        #    raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Weekday only for backtest mode with batch (not single day)")
         res, id = cs.run_stratin(id=stratin_id, runReq=runReq)
     if res == 0: return id
     elif res < 0:
@@ -552,30 +597,68 @@ def _get_archived_runner_log_byID(runner_id: UUID, timestamp_from: float, timestamp_to: float):
    else:
        raise HTTPException(status_code=404, detail=f"No logs found with id: {runner_id} and between {timestamp_from} and {timestamp_to}")

+def remove_ansi_codes(text):
+    ansi_escape = re.compile(r'\x1B[@-_][0-?]*[ -/]*[@-~]')
+    return ansi_escape.sub('', text)
+
# endregion
# A simple function to read the last lines of a file
-def tail(file_path, n=10, buffer_size=1024):
-    with open(file_path, 'rb') as f:
-        f.seek(0, 2)  # Move to the end of the file
-        file_size = f.tell()
-        lines = []
-        buffer = bytearray()
-
-        for i in range(file_size // buffer_size + 1):
-            read_start = max(-buffer_size * (i + 1), -file_size)
-            f.seek(read_start, 2)
-            read_size = min(buffer_size, file_size - buffer_size * i)
-            buffer[0:0] = f.read(read_size)  # Prepend to buffer
-
-            if buffer.count(b'\n') >= n + 1:
-                break
-
-        lines = buffer.decode(errors='ignore').splitlines()[-n:]
-        return lines
+# def tail(file_path, n=10, buffer_size=1024):
+#     try:
+#         with open(file_path, 'rb') as f:
+#             f.seek(0, 2)  # Move to the end of the file
+#             file_size = f.tell()
+#             lines = []
+#             buffer = bytearray()
+#
+#             for i in range(file_size // buffer_size + 1):
+#                 read_start = max(-buffer_size * (i + 1), -file_size)
+#                 f.seek(read_start, 2)
+#                 read_size = min(buffer_size, file_size - buffer_size * i)
+#                 buffer[0:0] = f.read(read_size)  # Prepend to buffer
+#
+#                 if buffer.count(b'\n') >= n + 1:
+#                     break
+#
+#             lines = buffer.decode(errors='ignore').splitlines()[-n:]
+#             lines = [remove_ansi_codes(line) for line in lines]
+#             return lines
+#     except Exception as e:
+#         return [str(e) + format_exc()]
+
+# updated version that reads the file backwards line by line
+def tail(file_path, n=10):
+    try:
+        with open(file_path, 'rb') as f:
+            f.seek(0, 2)  # Move to the end of the file
+            file_size = f.tell()
+            lines = []
+            line = b''
+
+            f.seek(-1, 2)  # Start at the last byte
+            while len(lines) < n and f.tell() != 0:
+                byte = f.read(1)
+                if byte == b'\n':
+                    # Decode, remove ANSI codes, and append the line
+                    lines.append(remove_ansi_codes(line.decode(errors='ignore')))
+                    line = b''
+                else:
+                    line = byte + line
+                f.seek(-2, 1)  # Move backwards by two bytes
+
+            if line:
+                # Append any remaining line after removing ANSI codes
+                lines.append(remove_ansi_codes(line.decode(errors='ignore')))
+
+            return lines[::-1]  # Reverse the list to get the lines in correct order
+    except Exception as e:
+        return [str(e)]
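
For contrast, a sketch of a simpler tail built on collections.deque: it streams the whole file forward instead of seeking backwards, so it is shorter but touches every line. tail_simple is a hypothetical name and not part of this changeset; it reuses remove_ansi_codes from above.

from collections import deque

def tail_simple(file_path, n=10):
    # deque with maxlen keeps only the last n lines while streaming the file once
    with open(file_path, 'r', errors='ignore') as f:
        return [remove_ansi_codes(line.rstrip('\n')) for line in deque(f, maxlen=n)]

# e.g. tail_simple("strat.log", n=20)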

@app.get("/log", dependencies=[Depends(api_key_auth)])
-def read_log(lines: int = 10):
-    log_path = LOG_FILE
+def read_log(lines: int = 700, logfile: str = "strat.log"):
+    log_path = LOG_PATH / logfile
    return {"lines": tail(log_path, lines)}
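
A quick way to exercise the endpoint above, assuming the server listens on localhost:8000 (as in the uvicorn.run call later in this file) and that api_key_auth accepts an X-API-Key header carrying WEB_API_KEY, the same convention the scheduler module in this changeset uses:

import requests
from v2realbot.config import WEB_API_KEY

# fetch the last 50 lines of strat.log through the authenticated /log endpoint
resp = requests.get(
    "http://localhost:8000/log",
    params={"lines": 50, "logfile": "strat.log"},
    headers={"X-API-Key": WEB_API_KEY},
)
print(resp.json()["lines"])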

#get alpaca history bars
@@ -609,7 +692,7 @@ def _generate_report_image(runner_ids: list[UUID]):
        res, stream = generate_trading_report_image(runner_ids=runner_ids, stream=True)
        if res == 0: return StreamingResponse(stream, media_type="image/png", headers={"Content-Disposition": "attachment; filename=report.png"})
        elif res < 0:
-            raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error: {res}:{id}")
+            raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error: {res}:{stream}")
    except Exception as e:
        raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error: {str(e)}" + format_exc())

@@ -645,7 +728,8 @@ def _generate_analysis(analyzerInputs: AnalyzerInputs):

        if res == 0: return StreamingResponse(stream, media_type="image/png")
        elif res < 0:
-            raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error: {res}:{id}")
+            print("Error when generating analysis: ", str(stream))
+            raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error: {res}:{stream}")
    except Exception as e:
        raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error: {str(e)}" + format_exc())

@@ -674,7 +758,7 @@ def get_testlists():
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"No data found")

# API endpoint to retrieve a single record by ID
-@app.get('/testlists/{record_id}')
+@app.get('/testlists/{record_id}', dependencies=[Depends(api_key_auth)])
def get_testlist(record_id: str):
    res, testlist = cs.get_testlist_byID(record_id=record_id)

@@ -684,7 +768,7 @@ def get_testlist(record_id: str):
        raise HTTPException(status_code=404, detail='Record not found')

# API endpoint to update a record
-@app.put('/testlists/{record_id}')
+@app.put('/testlists/{record_id}', dependencies=[Depends(api_key_auth)])
def update_testlist(record_id: str, testlist: TestList):
    # Check if the record exists
    conn = pool.get_connection()

@@ -704,7 +788,7 @@ def update_testlist(record_id: str, testlist: TestList):
    return testlist

# API endpoint to delete a record
-@app.delete('/testlists/{record_id}')
+@app.delete('/testlists/{record_id}', dependencies=[Depends(api_key_auth)])
def delete_testlist(record_id: str):
    # Check if the record exists
    conn = pool.get_connection()

@@ -727,7 +811,7 @@ def delete_testlist(record_id: str):
# Get all config items
@app.get("/config-items/", dependencies=[Depends(api_key_auth)])
def get_all_items() -> list[ConfigItem]:
-    res, sada = cs.get_all_config_items()
+    res, sada = cf.get_all_config_items()
    if res == 0:
        return sada
    else:

@@ -737,7 +821,7 @@ def get_all_items() -> list[ConfigItem]:
# Get a config item by ID
@app.get("/config-items/{item_id}", dependencies=[Depends(api_key_auth)])
def get_item(item_id: int) -> ConfigItem:
-    res, sada = cs.get_config_item_by_id(item_id)
+    res, sada = cf.get_config_item_by_id(item_id)
    if res == 0:
        return sada
    else:

@@ -746,7 +830,7 @@ def get_item(item_id: int) -> ConfigItem:
# Get a config item by Name
@app.get("/config-items-by-name/", dependencies=[Depends(api_key_auth)])
def get_item(item_name: str) -> ConfigItem:
-    res, sada = cs.get_config_item_by_name(item_name)
+    res, sada = cf.get_config_item_by_name(item_name)
    if res == 0:
        return sada
    else:

@@ -755,7 +839,7 @@ def get_item(item_name: str) -> ConfigItem:
# Create a new config item
@app.post("/config-items/", dependencies=[Depends(api_key_auth)], status_code=status.HTTP_200_OK)
def create_item(config_item: ConfigItem) -> ConfigItem:
-    res, sada = cs.create_config_item(config_item)
+    res, sada = cf.create_config_item(config_item)
    if res == 0: return sada
    else:
        raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error not created: {res}:{id} {sada}")

@@ -764,11 +848,11 @@ def create_item(config_item: ConfigItem) -> ConfigItem:
# Update a config item by ID
@app.put("/config-items/{item_id}", dependencies=[Depends(api_key_auth)])
def update_item(item_id: int, config_item: ConfigItem) -> ConfigItem:
-    res, sada = cs.get_config_item_by_id(item_id)
+    res, sada = cf.get_config_item_by_id(item_id)
    if res != 0:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"No data found")

-    res, sada = cs.update_config_item(item_id, config_item)
+    res, sada = cf.update_config_item(item_id, config_item)
    if res == 0: return sada
    else:
        raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error not created: {res}:{id}")

@@ -777,17 +861,77 @@ def update_item(item_id: int, config_item: ConfigItem) -> ConfigItem:
# Delete a config item by ID
@app.delete("/config-items/{item_id}", dependencies=[Depends(api_key_auth)])
def delete_item(item_id: int) -> dict:
-    res, sada = cs.get_config_item_by_id(item_id)
+    res, sada = cf.get_config_item_by_id(item_id)
    if res != 0:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"No data found")

-    res, sada = cs.delete_config_item(item_id)
+    res, sada = cf.delete_config_item(item_id)
    if res == 0: return sada
    else:
        raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error not created: {res}:{id}")

# endregion

+# region scheduler
+# 1. Fetch All RunManagerRecords
+@app.get("/run_manager_records/", dependencies=[Depends(api_key_auth)], response_model=List[RunManagerRecord])
+# TODO: consider extending the output with strat_status (running/stopped)
+def get_all_run_manager_records():
+    result, records = rm.fetch_all_run_manager_records()
+    if result != 0:
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Error fetching records")
+    return records
+
+# 2. Fetch RunManagerRecord by ID
+@app.get("/run_manager_records/{record_id}", dependencies=[Depends(api_key_auth)], response_model=RunManagerRecord)
+# TODO: consider extending the output with strat_status (running/stopped)
+def get_run_manager_record(record_id: UUID):
+    result, record = rm.fetch_run_manager_record_by_id(record_id)
+    if result == -2:  # Record not found
+        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Record not found")
+    elif result != 0:
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail="Error fetching record")
+    return record
+
+# 3. Update RunManagerRecord
+@app.patch("/run_manager_records/{record_id}", dependencies=[Depends(api_key_auth)], status_code=status.HTTP_200_OK)
+def update_run_manager_record(record_id: UUID, update_data: RunManagerRecord):
+    # make dates zone-aware (zoneNY)
+    # if update_data.valid_from is not None:
+    #     update_data.valid_from = zoneNY.localize(update_data.valid_from)
+    # if update_data.valid_to is not None:
+    #     update_data.valid_to = zoneNY.localize(update_data.valid_to)
+    result, message = rm.update_run_manager_record(record_id, update_data)
+    if result == -2:  # Update failed
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=message)
+    elif result != 0:
+        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=f"Error during update {result} {message}")
+    return {"message": "Record updated successfully"}
+
+# 4. Delete RunManagerRecord
+@app.delete("/run_manager_records/{record_id}", dependencies=[Depends(api_key_auth)], status_code=status.HTTP_200_OK)
+def delete_run_manager_record(record_id: UUID):
+    result, message = rm.delete_run_manager_record(record_id)
+    if result == -2:  # Delete failed
+        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=message)
+    elif result != 0:
+        raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error during deletion {result} {message}")
+    return {"message": "Record deleted successfully"}
+
+@app.post("/run_manager_records/", status_code=status.HTTP_201_CREATED)
+def create_run_manager_record(new_record: RunManagerRecord, api_key_auth: Depends = Depends(api_key_auth)):
+    # make dates zone-aware - convert to zoneNY
+    # if new_record.valid_from is not None:
+    #     new_record.valid_from = zoneNY.localize(new_record.valid_from)
+    # if new_record.valid_to is not None:
+    #     new_record.valid_to = zoneNY.localize(new_record.valid_to)
+
+    result, record_id = rm.add_run_manager_record(new_record)
+    if result != 0:
+        raise HTTPException(status_code=status.HTTP_406_NOT_ACCEPTABLE, detail=f"Error during record creation: {result} {record_id}")
+    return {"id": record_id}
+# endregion
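
A hedged sketch of driving the CRUD above from a client, under the same localhost:8000 / X-API-Key assumptions; the payload keys follow the RunManagerRecord attributes referenced by ap_scheduler.py below, while the concrete values (the id, enum strings, times) are purely illustrative and may not match the model's validation:

import requests
from v2realbot.config import WEB_API_KEY

headers = {"X-API-Key": WEB_API_KEY}

# create a schedule record (illustrative values only)
payload = {
    "strat_id": "bc4ec7d2-249b-4799-a02f-f1ce66f83d4a",  # hypothetical stratin id
    "moddus": "schedule",
    "status": "active",
    "market": "US",
    "mode": "paper",
    "account": "ACCOUNT1",
    "start_time": "09:30",
    "stop_time": "15:55",
    "weekdays_filter": [0, 1, 2, 3, 4],
}
resp = requests.post("http://localhost:8000/run_manager_records/", json=payload, headers=headers)
record_id = resp.json()["id"]

# list all records
print(requests.get("http://localhost:8000/run_manager_records/", headers=headers).json())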

#model section
#UPLOAD MODEL
@app.post("/model/upload_model", dependencies=[Depends(api_key_auth)])
@@ -924,7 +1068,22 @@ if __name__ == "__main__":
        insert_thread = Thread(target=insert_queue2db)
        insert_thread.start()

+        # attach debugger to be able to debug scheduler jobs (they run in separate threads)
+        # debugpy.listen(('localhost', 5678))
+        # print("Waiting for debugger to attach...")
+        # debugpy.wait_for_client()  # Script will pause here until debugger is attached
+
+        # init scheduled tasks from the schedule table
+        # add APS scheduler job refresh
+        res, result = aps.initialize_jobs()
+        if res < 0:
+            raise Exception(f"Error {res} initializing APS jobs, error {result}")

        uvicorn.run("__main__:app", host="0.0.0.0", port=8000, reload=False)
+    except Exception as e:
+        print("Error initializing app: " + str(e) + format_exc())
+        aps.scheduler.shutdown(wait=False)
    finally:
        print("closing insert_conn connection")
        insert_conn.close()
0	v2realbot/scheduler/__init__.py	Normal file
310	v2realbot/scheduler/ap_scheduler.py	Normal file
@@ -0,0 +1,310 @@
from uuid import UUID
from typing import Any, List, Tuple
from uuid import UUID, uuid4
from v2realbot.enums.enums import Moddus, SchedulerStatus, RecordType, StartBarAlign, Mode, Account, OrderSide
from v2realbot.common.model import RunManagerRecord, StrategyInstance, RunDay, StrategyInstance, Runner, RunRequest, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, RunArchiveChange, Bar, TradeEvent, TestList, Intervals, ConfigItem, InstantIndicator, DataTablesRequest, Market
from v2realbot.utils.utils import validate_and_format_time, AttributeDict, zoneNY, zonePRG, safe_get, dict_replace_value, Store, parse_toml_string, json_serial, is_open_hours, send_to_telegram, concatenate_weekdays, transform_data
from v2realbot.common.PrescribedTradeModel import Trade, TradeDirection, TradeStatus, TradeStoplossType
from datetime import datetime
from v2realbot.config import JOB_LOG_FILE, STRATVARS_UNCHANGEABLES, ACCOUNT1_PAPER_API_KEY, ACCOUNT1_PAPER_SECRET_KEY, ACCOUNT1_LIVE_API_KEY, ACCOUNT1_LIVE_SECRET_KEY, DATA_DIR, MEDIA_DIRECTORY, RUNNER_DETAIL_DIRECTORY
import numpy as np
from rich import print as richprint
import v2realbot.controller.services as cs
import v2realbot.controller.run_manager as rm
import v2realbot.scheduler.scheduler as sch
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger
from apscheduler.job import Job

# NOTE: running a strategy across midnight is not supported yet - the weekday_filter has to be
# resolved first; there is currently only one filter for both start_time and stop_time, which
# would not work for strategies running across midnight (the stop would fall on the next day
# and the scheduler would not trigger it)

def format_apscheduler_jobs(jobs: list[Job]) -> list[dict]:
    if not jobs:
        print("No scheduled jobs.")
        return

    jobs_info = []

    for job in jobs:
        job_info = {
            "Job ID": job.id,
            "Next Run Time": job.next_run_time,
            "Job Function": job.func.__name__,
            "Trigger": str(job.trigger),
            "Job Args": ', '.join(map(str, job.args)),
            "Job Kwargs": ', '.join(f"{k}={v}" for k, v in job.kwargs.items())
        }
        jobs_info.append(job_info)

    return jobs_info

def get_day_of_week(weekdays_filter):
    if not weekdays_filter:
        return '*'  # All days of the week
    return ','.join(map(str, weekdays_filter))

# initialize_jobs runs:
# - on startup
# - triggered from add/update and delete
#
# full refresh for now; in the future only the changed item will be refreshed - see
# https://chat.openai.com/c/2a1423ee-59df-47ff-b073-0c49ade51ed7

# helper function returning strat_ids that appear in the scheduler more than once (the logic for them differs)
def stratin_occurences(all_records: list[RunManagerRecord]):
    # Count occurrences
    strat_id_counts = {}
    for record in all_records:
        if record.strat_id in strat_id_counts:
            strat_id_counts[record.strat_id] += 1
        else:
            strat_id_counts[record.strat_id] = 1

    # Find strat_id values that appear twice or more
    repeated_strat_ids = [strat_id for strat_id, count in strat_id_counts.items() if count >= 2]

    return 0, repeated_strat_ids

def initialize_jobs(run_manager_records: RunManagerRecord = None):
    """
    Initialize all scheduled jobs from RunManagerRecords with moddus = "schedule".
    Triggered on app init and on update of the table.
    It deletes all "scheduler_" prefixed jobs and schedules new ones based on the run manager table.
    The "scheduler_" prefix in the APS scheduler distinguishes schedule-type jobs and allows more job categories.

    Parameters
    ----------
    run_manager_records : RunManagerRecord, optional
        RunManagerRecords to initialize the jobs from, by default None

    Returns
    -------
    Tuple[int, Union[List[dict], str]]
        A tuple containing an error code and a message. If there is no error, the
        message will contain a list of dictionaries with information about the
        scheduled jobs, otherwise it will contain an error message.
    """
    if run_manager_records is None:
        res, run_manager_records = rm.fetch_all_run_manager_records()
        if res < 0:
            err_msg = f"Error {res} fetching all runmanager records, error {run_manager_records}"
            print(err_msg)
            return -2, err_msg

    scheduled_jobs = scheduler.get_jobs()

    #print(f"Current {len(scheduled_jobs)} scheduled jobs: {str(scheduled_jobs)}")
    for job in scheduled_jobs:
        if job.id.startswith("scheduler_"):
            scheduler.remove_job(job.id)
    record: RunManagerRecord = None
    for record in run_manager_records:
        if record.status == SchedulerStatus.ACTIVE and record.moddus == Moddus.SCHEDULE:
            day_of_week = get_day_of_week(record.weekdays_filter)

            hour, minute = map(int, record.start_time.split(':'))
            start_trigger = CronTrigger(day_of_week=day_of_week, hour=hour, minute=minute,
                                        start_date=record.valid_from, end_date=record.valid_to, timezone=zoneNY)
            stop_hour, stop_minute = map(int, record.stop_time.split(':'))
            stop_trigger = CronTrigger(day_of_week=day_of_week, hour=stop_hour, minute=stop_minute,
                                       start_date=record.valid_from, end_date=record.valid_to, timezone=zoneNY)

            # Schedule new jobs with the 'scheduler_' prefix
            scheduler.add_job(start_runman_record, start_trigger, id=f"scheduler_start_{record.id}", args=[record.id])
            scheduler.add_job(stop_runman_record, stop_trigger, id=f"scheduler_stop_{record.id}", args=[record.id])

    #scheduler.add_job(print_hello, 'interval', seconds=10, id=f"scheduler_testinterval")
    scheduled_jobs = scheduler.get_jobs()
    print(f"APS jobs refreshed ({len(scheduled_jobs)})")
    current_jobs_dict = format_apscheduler_jobs(scheduled_jobs)
    richprint(current_jobs_dict)
    return 0, current_jobs_dict


# umbrella function that handles error reporting and printing
def start_runman_record(id: UUID, debug_date=None):
    record = None
    res, record, msg = _start_runman_record(id=id, debug_date=debug_date)

    if record is not None:
        market_time_now = datetime.now().astimezone(zoneNY) if debug_date is None else debug_date
        record.last_processed = market_time_now
        formatted_date = market_time_now.strftime("%y.%m.%d %H:%M:%S")
        history_string = f"{formatted_date}"
        history_string += " STARTED" if res == 0 else "NOTE:" + msg if res == -1 else "ERROR:" + msg
        print(history_string)
        if record.history is None:
            record.history = history_string
        else:
            record.history += "\n" + history_string

        rs, msg_rs = update_runman_record(record)
        if rs < 0:
            msg_rs = f"Error saving result to history: {msg_rs}"
            print(msg_rs)
            send_to_telegram(msg_rs)

    if res < -1:
        msg = f"START JOB: {id} ERROR\n" + msg
        send_to_telegram(msg)
        print(msg)
    else:
        print(f"START JOB: {id} FINISHED {res}")
    return res, msg

def update_runman_record(record: RunManagerRecord):
    # update record (probably still needs tweaking - last_run and history)
    res, set = rm.update_run_manager_record(record.id, record)
    if res == 0:
        print(f"Record updated {set}")
        return 0, "OK"
    else:
        err_msg = f"STOP: Error updating {record.id} error {set} with values {record}"
        return -2, err_msg  # this stops processing of further records on error; consider continue

def stop_runman_record(id: UUID, debug_date=None):
    res, record, msg = _stop_runman_record(id=id, debug_date=debug_date)
    # results: 0 - ok, -1 not running/already running/not specific, -2 error

    # the report is always written to history if record is not None; any later error happened after the record was fetched
    if record is not None:
        market_time_now = datetime.now().astimezone(zoneNY) if debug_date is None else debug_date
        record.last_processed = market_time_now
        formatted_date = market_time_now.strftime("%y.%m.%d %H:%M:%S")
        history_string = f"{formatted_date}"
        history_string += " STOPPED" if res == 0 else "NOTE:" + msg if res == -1 else "ERROR:" + msg
        print(history_string)
        if record.history is None:
            record.history = history_string
        else:
            record.history += "\n" + history_string

        rs, msg_rs = update_runman_record(record)
        if rs < 0:
            msg_rs = f"Error saving result to history: {msg_rs}"
            print(msg_rs)
            send_to_telegram(msg_rs)

    if res < -1:
        msg = f"STOP JOB: {id} ERROR\n" + msg
        send_to_telegram(msg)
        print(msg)
    else:
        print(f"STOP JOB: {id} FINISHED")
    return res, msg

# start function that is called from the job
def _start_runman_record(id: UUID, debug_date=None):
    print(f"Start scheduled record {id}")

    record: RunManagerRecord = None
    res, result = rm.fetch_run_manager_record_by_id(id)
    if res < 0:
        result = "Error fetching run manager record by id: " + str(id) + " Error: " + str(result)
        return res, record, result

    record = result

    if record.market == Market.US or record.market == Market.CRYPTO:
        res, sada = sch.get_todays_market_times(market=record.market, debug_date=debug_date)
        if res == 0:
            market_time_now, market_open_datetime, market_close_datetime = sada
            print(f"OPEN:{market_open_datetime} CLOSE:{market_close_datetime}")
        else:
            sada = f"Market {record.market} Error getting market times (CLOSED): " + str(sada)
            return res, record, sada
    else:
        print("Market type is unknown.")
    if cs.is_stratin_running(record.strat_id):
        return -1, record, f"Stratin {record.strat_id} is already running"

    res, result = sch.run_scheduled_strategy(record)
    if res < 0:
        result = "Error running strategy: " + str(result)
        return res, record, result
    else:
        record.runner_id = UUID(result)

    return 0, record, record.runner_id

# stop function that is called from the job
def _stop_runman_record(id: UUID, debug_date=None):
    record = None
    # get all records
    print(f"Stopping record {id}")
    res, all_records = rm.fetch_all_run_manager_records()
    if res < 0:
        err_msg = f"Error {res} fetching all runmanager records, error {all_records}"
        return -2, record, err_msg

    record: RunManagerRecord = None
    for rec in all_records:
        if rec.id == id:
            record = rec
            break

    if record is None:
        return -2, record, f"Record id {id} not found"

    # strat_ids that are repeated
    res, repeated_strat_ids = stratin_occurences(all_records)
    if res < 0:
        err_msg = f"Error {res} finding repeated strat_ids, error {repeated_strat_ids}"
        return -2, record, err_msg

    if record.strat_running is True:
        # stop based on record.runner_id
        id_to_stop = record.runner_id

    # if the same strategy was started manually and there is only one of it, it is unambiguous - stop it
    elif cs.is_stratin_running(record.strat_id) and record.strat_id not in repeated_strat_ids:
        # stop based on record.strat_id
        id_to_stop = record.strat_id

    else:
        msg = f"strategy {record.strat_id} not RUNNING or not distinctive (manually launched or two strat_ids in scheduler)"
        print(msg)
        return -1, record, msg

    print(f"Requesting STOP {id_to_stop}")
    res, msg = cs.stop_runner(id=id_to_stop)
    if res < 0:
        msg = f"ERROR while STOPPING runner_id/strat_id {id_to_stop} {msg}"
        return -2, record, msg
    else:
        record.runner_id = None

    return 0, record, "finished"

# Global scheduler instance
scheduler = BackgroundScheduler(timezone=zoneNY)
scheduler.start()


if __name__ == "__main__":
    # use a naive datetime
    debug_date = None
    debug_date = datetime(2024, 2, 16, 9, 37, 0, 0)
    #debug_date = datetime(2024, 2, 16, 10, 30, 0, 0)
    #debug_date = datetime(2024, 2, 16, 16, 1, 0, 0)

    id = UUID("bc4ec7d2-249b-4799-a02f-f1ce66f83d4a")

    if debug_date is not None:
        # Localize the naive datetime object to the Eastern timezone
        debug_date = zoneNY.localize(debug_date)
        # debug_date formatted as a string in the format "23.12.2024 9:30"
        formatted_date = debug_date.strftime("%d.%m.%Y %H:%M")
        print("Scheduler.py NY time: ", formatted_date)
        print("ISO format", debug_date.isoformat())

    # res, result = start_runman_record(id=id, debug_date=debug_date)
    # print(f"CALL FINISHED, with {debug_date} RESULT: {res}, {result}")

    res, result = stop_runman_record(id=id, debug_date=debug_date)
    print(f"CALL FINISHED, with {debug_date} RESULT: {res}, {result}")
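
The start/stop wiring in initialize_jobs above reduces to one pair of cron jobs per active record. A self-contained sketch of that pattern, assuming zoneNY resolves to America/New_York as in v2realbot.utils; the job ids, callback bodies and times here are illustrative only:

from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.cron import CronTrigger
import pytz

zoneNY = pytz.timezone("America/New_York")
scheduler = BackgroundScheduler(timezone=zoneNY)
scheduler.start()

def start_job(record_id):
    print(f"starting {record_id}")

def stop_job(record_id):
    print(f"stopping {record_id}")

# weekdays 0-4 (Mon-Fri in APScheduler's cron numbering), start 9:30, stop 15:55 New York time
start_trigger = CronTrigger(day_of_week="0,1,2,3,4", hour=9, minute=30, timezone=zoneNY)
stop_trigger = CronTrigger(day_of_week="0,1,2,3,4", hour=15, minute=55, timezone=zoneNY)
scheduler.add_job(start_job, start_trigger, id="scheduler_start_demo", args=["demo"])
scheduler.add_job(stop_job, stop_trigger, id="scheduler_stop_demo", args=["demo"])

print(scheduler.get_jobs())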
439	v2realbot/scheduler/scheduler.py	Normal file
@@ -0,0 +1,439 @@
import json
import datetime
import v2realbot.controller.services as cs
import v2realbot.controller.run_manager as rm
from v2realbot.common.model import RunnerView, RunManagerRecord, StrategyInstance, Runner, RunRequest, Trade, RunArchive, RunArchiveView, RunArchiveViewPagination, RunArchiveDetail, Bar, RunArchiveChange, TestList, ConfigItem, InstantIndicator, DataTablesRequest, AnalyzerInputs, Market
from uuid import uuid4, UUID
from v2realbot.utils.utils import json_serial, send_to_telegram, zoneNY, zonePRG, zoneUTC, fetch_calendar_data
from datetime import datetime, timedelta, time
from traceback import format_exc
from rich import print
import requests
from v2realbot.config import WEB_API_KEY

# Original variant of the scheduler, which was meant to run at regular intervals
# and launch the scheduled items in RunManagerRecord.
# It has since been refactored to use apscheduler - a Python library for job
# scheduling - so each scheduled RunManagerRecord is now planned as a standalone
# job and triggered only once at the given time for start and stop.
# The new code lives in aps_scheduler.py.

def is_US_market_day(date):
    cal_dates = fetch_calendar_data(date, date)
    if len(cal_dates) == 0:
        print("Today is not a market day.")
        return False, cal_dates
    else:
        print("Market is open")
        return True, cal_dates

def get_todays_market_times(market, debug_date=None):
    try:
        if market == Market.US:
            # check all the conditions - maybe loop - conditions are on the left
            if debug_date is not None:
                nowNY = debug_date
            else:
                nowNY = datetime.now().astimezone(zoneNY)
            nowNY_date = nowNY.date()
            # is the market open - currently US only
            stat, calendar_dates = is_US_market_day(nowNY_date)
            if stat:
                # only the main session is supported for now
                market_open_datetime = zoneNY.localize(calendar_dates[0].open)
                market_close_datetime = zoneNY.localize(calendar_dates[0].close)
                return 0, (nowNY, market_open_datetime, market_close_datetime)
            else:
                return -1, "Market is closed."
        elif market == Market.CRYPTO:
            now_market_datetime = datetime.now().astimezone(zoneUTC)
            market_open_datetime = datetime.combine(datetime.now(), time.min)
            market_close_datetime = datetime.combine(datetime.now(), time.max)
            return 0, (now_market_datetime, market_open_datetime, market_close_datetime)
        else:
            return -1, "Market not supported"
    except Exception as e:
        err_msg = f"General error in {e} {format_exc()}"
        print(err_msg)
        return -2, err_msg

def get_running_strategies():
    # Construct the URL for the local REST API endpoint on port 8000
    api_url = "http://localhost:8000/runners/"

    # Headers for the request
    headers = {
        "X-API-Key": WEB_API_KEY
    }

    try:
        # Make the GET request to the API with the headers
        response = requests.get(api_url, headers=headers)

        # Check if the request was successful
        if response.status_code == 200:
            runners = response.json()
            print("Successfully fetched runners.")
            strat_ids = []
            ids = []

            for runner_view in runners:
                strat_ids.append(UUID(runner_view["strat_id"]))
                ids.append(UUID(runner_view["id"]))

            return 0, (strat_ids, ids)
        else:
            err_msg = f"Failed to fetch runners. Status Code: {response.status_code}, Response: {response.text}"
            print(err_msg)
            return -2, err_msg
    except requests.RequestException as e:
        err_msg = f"Request failed: {str(e)}"
        print(err_msg)
        return -2, err_msg

def stop_strategy(runner_id):
    # Construct the URL for the local REST API endpoint on port 8000 (alternatively 127.0.0.1)
    api_url = f"http://localhost:8000/runners/{runner_id}/stop"

    # Headers for the request
    headers = {
        "X-API-Key": WEB_API_KEY
    }

    try:
        # Make the PUT request to the API with the headers
        response = requests.put(api_url, headers=headers)

        # Check if the request was successful
        if response.status_code == 200:
            print(f"Runner/strat_id {runner_id} stopped successfully.")
            return 0, runner_id
        else:
            err_msg = f"Failed to stop runner {runner_id}. Status Code: {response.status_code}, Response: {response.text}"
            print(err_msg)
            return -2, err_msg
    except requests.RequestException as e:
        err_msg = f"Request failed: {str(e)}"
        print(err_msg)
        return -2, err_msg

def fetch_stratin(stratin_id):
    # Construct the URL for the REST API endpoint
    api_url = f"http://localhost:8000/stratins/{stratin_id}"

    # Headers for the request
    headers = {
        "X-API-Key": WEB_API_KEY
    }

    try:
        # Make the GET request to the API with the headers
        response = requests.get(api_url, headers=headers)

        # Check if the request was successful
        if response.status_code == 200:
            # Parse the response as a StrategyInstance object
            strategy_instance = response.json()
            #strategy_instance = response  # Assuming the response is in JSON format
            print(f"StrategyInstance fetched: {stratin_id}")
            return 0, strategy_instance
        else:
            err_msg = f"Failed to fetch StrategyInstance {stratin_id}. " \
                      f"Status Code: {response.status_code}, Response: {response.text}"
            print(err_msg)
            return -1, err_msg
    except requests.RequestException as e:
        err_msg = f"Request failed: {str(e)}"
        print(err_msg)
        return -2, err_msg

# return a list of strat_ids that are in the schedule table more than once
# TODO: this is a workaround until the candidates logic is moved from the SQL select to fetch_all_run_manager_records and Python logic
def stratin_occurences():
    # get all records
    res, all_records = rm.fetch_all_run_manager_records()
    if res < 0:
        err_msg = f"Error {res} fetching all runmanager records, error {all_records}"
        print(err_msg)
        return -2, err_msg

    # Count occurrences
    strat_id_counts = {}
    for record in all_records:
        if record.strat_id in strat_id_counts:
            strat_id_counts[record.strat_id] += 1
        else:
            strat_id_counts[record.strat_id] = 1

    # Find strat_id values that appear twice or more
    repeated_strat_ids = [strat_id for strat_id, count in strat_id_counts.items() if count >= 2]

    return 0, repeated_strat_ids

# in case debug_date is not provided, it takes the current time of the given market
# In the future this will loop over every supported market; currently US only
def startstop_scheduled(debug_date=None, market="US") -> tuple[int, str]:
    res, sada = get_todays_market_times(market=market, debug_date=debug_date)
    if res == 0:
        market_time_now, market_open_datetime, market_close_datetime = sada
        print(f"OPEN:{market_open_datetime} CLOSE:{market_close_datetime}")
    else:
        return res, sada

    # it is a market day
    res, candidates = rm.fetch_scheduled_candidates_for_start_and_stop(market_time_now, market)
    if res == 0:
        print(f"Candidates fetched, start: {len(candidates['start'])} stop: {len(candidates['stop'])}")
    else:
        return res, candidates

    if candidates is None or (len(candidates["start"]) == 0 and len(candidates["stop"]) == 0):
        return -1, f"No candidates found for {market_time_now} and {market}"
    # in the future, once runners are persisted, each strategy's state will live in RunManagerRecord
    # get current runners (possible optimization: fetch per section start/stop)
    res, sada = get_running_strategies()
    if res < 0:
        err_msg = f"Error fetching running strategies, error {sada}"
        print(err_msg)
        send_to_telegram(err_msg)
        return -2, err_msg
    strat_ids_running, runnerids_running = sada
    print(f"Currently running: {len(strat_ids_running)}")

    # ITERATE over START candidates
    record: RunManagerRecord = None
    print(f"START - Looping over {len(candidates['start'])} candidates")
    for record in candidates['start']:
        print("Candidate: ", record)

        if record.weekdays_filter is not None and len(record.weekdays_filter) > 0:
            curr_weekday = market_time_now.weekday()
            if curr_weekday not in record.weekdays_filter:
                print(f"Strategy {record.strat_id} not started, today {curr_weekday} not in weekdays filter {record.weekdays_filter}")
                continue
        # one strat_id can run only once at a time
        if record.strat_id in strat_ids_running:
            msg = f"strategy {record.strat_id} is already running"
            continue

        res, result = run_scheduled_strategy(record)
        if res < 0:
            send_to_telegram(result)
            print(result)
        else:
            record.runner_id = UUID(result)
            strat_ids_running.append(record.strat_id)
            runnerids_running.append(record.runner_id)

        record.last_processed = market_time_now
        history_string = f"{market_time_now.isoformat()} strategy " + ("STARTED" if res == 0 else "ERROR:" + result)

        if record.history is None:
            record.history = history_string
        else:
            record.history += "\n" + history_string

        # update record (probably still needs tweaking - last_run and history)
        res, set = rm.update_run_manager_record(record.id, record)
        if res == 0:
            print(f"Record in db updated {set}")
            #return 0, set
        else:
            err_msg = f"Error updating {record.id} error {set} with values {record}. Process stopped."
            print(err_msg)
            send_to_telegram(err_msg)
            return -2, err_msg  # this stops further processing; consider continue

    # if there are stop candidates, fetch the existing runners
    stop_candidates_cnt = len(candidates['stop'])

    if stop_candidates_cnt > 0:
        res, repeated_strat_ids = stratin_occurences()
        if res < 0:
            err_msg = f"Error {res} in calling stratin_occurences, error {repeated_strat_ids}"
            send_to_telegram(err_msg)
            return -2, err_msg

        # another OPEN ISSUE for STOP:
        # should a strategy's STOP_TIME depend on the day of the week? In other words, if a strategy
        # is scheduled for 9:30-10 on Monday, can I start it manually on Tuesday without the system
        # shutting it down? For now it is built so that the schedule defines a window in which the
        # strategy should run, and outside that window it is automatically stopped. The other option
        # is that the scheduler strictly watches only the strategies it started itself and ignores
        # the rest. In that case a strategy started manually later (e.g. to fix a bug) would be
        # ignored by the scheduler and not stopped even when its stop time is set.
        # Impacts: weekdays on stop and stratin_occurences

        # ITERATE over STOP candidates
        record: RunManagerRecord = None
        print(f"STOP - Looping over {stop_candidates_cnt} candidates")
        for record in candidates['stop']:
            print("Candidate: ", record)

            # This stratin_occurences contraption exists only so the scheduler also works on manually started strategies (in most cases).
            # When evaluating stop candidates:
            # - if there is only 1 strategy with a given strat_id in the schedules, we can go by strat_id - the running strategy with this strat_id will be stopped (even a manually started one)
            # - if there are more of them, we must go by the runner ids stored in the schedules
            #   (in that case the limitation is: a manually started strategy will not be stopped automatically - the system does not know which one it is)

            # find out whether the strategy is running

            # the strategy appears in the scheduler only once, we can use strat_id
            if record.strat_id not in repeated_strat_ids:
                if record.strat_id not in strat_ids_running:
                    msg = f"strategy {record.strat_id} NOT RUNNING"
                    print(msg)
                    continue
                else:
                    # stop it
                    id_to_stop = record.strat_id
            # strat_id is used in the scheduler more than once, we must use runner_id
            elif record.runner_id is not None and record.runner_id in runnerids_running:
                # stop it
                id_to_stop = record.runner_id
            # no distinctive condition
            else:
                # don't do anything
                print(f"strategy {record.strat_id} not RUNNING or not distinctive (manually launched or two strat_ids in scheduler)")
                continue

            print(f"Requesting STOP {id_to_stop}")
            res, msg = stop_strategy(id_to_stop)
            if res < 0:
                msg = f"ERROR while STOPPING runner_id/strat_id {id_to_stop} {msg}"
                send_to_telegram(msg)
            else:
                if record.strat_id in strat_ids_running:
                    strat_ids_running.remove(record.strat_id)
                if record.runner_id is not None and record.runner_id in runnerids_running:
                    runnerids_running.remove(record.runner_id)
                record.runner_id = None

            record.last_processed = market_time_now
            history_string = f"{market_time_now.isoformat()} strategy {record.strat_id} " + ("STOPPED" if res == 0 else "ERROR:" + msg)
            if record.history is None:
                record.history = history_string
            else:
                record.history += "\n" + history_string

            # update record (probably still needs tweaking - last_run and history)
            res, set = rm.update_run_manager_record(record.id, record)
            if res == 0:
                print(f"Record updated {set}")
            else:
                err_msg = f"Error updating {record.id} error {set} with values {record}"
                print(err_msg)
                send_to_telegram(err_msg)
                return -2, err_msg  # this stops processing of further records on error; consider continue

    return 0, "DONE"

## LIVE or PAPER
# this version uses the REST API; now that the job runs under apscheduler it could call cs.run_stratin directly
# TODO: rework
def run_scheduled_strategy(record: RunManagerRecord):
    # get strat_json
    sada: StrategyInstance = None
    res, sada = fetch_stratin(record.strat_id)
    if res == 0:
        # #TODO: verify this produces the same output as the JS
        # print("Sada", sada)
        # #strategy_instance = StrategyInstance(**sada)
        strat_json = json.dumps(sada, default=json_serial)
        # Replace escaped characters with their unescaped versions so it matches the JS output
        #strat_json = strat_json.replace('\\r\\n', '\r\n')
        #print(f"Strat_json fetched, {strat_json}")
    else:
        err_msg = f"Strategy {record.strat_id} not found. ERROR {sada}"
        print(err_msg)
        return -2, err_msg

    # TBD: maybe customize NOTE

    # if there is no batch_id, generate one and store it in the db
    # if record.batch_id is None:
    #     record.batch_id = str(uuid4())[:8]

    api_url = f"http://localhost:8000/stratins/{record.strat_id}/run"

    # Initialize RunRequest with record values
    runReq = {
        "id": str(record.strat_id),
        "strat_json": strat_json,
        "mode": record.mode,
        "account": record.account,
        "ilog_save": record.ilog_save,
        "weekdays_filter": record.weekdays_filter,
        "test_batch_id": record.testlist_id,
        "batch_id": record.batch_id or str(uuid4())[:8],
        "bt_from": record.bt_from.isoformat() if record.bt_from else None,
        "bt_to": record.bt_to.isoformat() if record.bt_to else None,
        "note": f"SCHED {record.start_time}-" + (record.stop_time if record.stop_time else "") + (record.note if record.note is not None else "")
    }

    # Headers for the request
    headers = {
        "X-API-Key": WEB_API_KEY
    }

    try:
        # Make the PUT request to the API with the headers
        response = requests.put(api_url, json=runReq, headers=headers)

        # Check if the request was successful
        if response.status_code == 200:
            print(f"Strategy {record.strat_id} started successfully.")
            return 0, response.json()
        else:
            err_msg = f"Strategy {record.strat_id} NOT started. Status Code: {response.status_code}, Response: {response.text}"
            print(err_msg)
            return -2, err_msg
    except requests.RequestException as e:
        err_msg = f"Request failed: {str(e)}"
        print(err_msg)
        return -2, err_msg

# # initialize RunRequest with record values
# runReq = RunRequest(id=record.strat_id,
#                     strat_json=strat_json,
#                     mode=record.mode,
#                     account=record.account,
#                     ilog_save=record.ilog_save,
#                     weekdays_filter=record.weekdays_filter,
#                     test_batch_id=record.testlist_id,
#                     batch_id=record.batch_id,
#                     bt_from=record.bt_from,
#                     bt_to=record.bt_to,
#                     note=record.note)
# # call the REST API to start the strategy

# # start the strategy
# res, sada = cs.run_stratin(id=record.strat_id, runReq=runReq, inter_batch_params=None)
# if res == 0:
#     print(f"Strategy {sada} started")
#     return 0, sada
# else:
#     err_msg = f"Strategy {record.strat_id} NOT started. ERROR {sada}"
#     print(err_msg)
#     return -2, err_msg

if __name__ == "__main__":
|
||||||
|
#use naive datetoime
|
||||||
|
debug_date = None
|
||||||
|
debug_date = datetime(2024, 2, 16, 16, 37, 0, 0)
|
||||||
|
#debug_date = datetime(2024, 2, 16, 10, 30, 0, 0)
|
||||||
|
#debug_date = datetime(2024, 2, 16, 16, 1, 0, 0)
|
||||||
|
|
||||||
|
if debug_date is not None:
|
||||||
|
# Localize the naive datetime object to the Eastern timezone
|
||||||
|
debug_date = zoneNY.localize(debug_date)
|
||||||
|
#debugdate formatted as string in format "23.12.2024 9:30"
|
||||||
|
formatted_date = debug_date.strftime("%d.%m.%Y %H:%M")
|
||||||
|
print("Scheduler.py NY time: ", formatted_date)
|
||||||
|
print("ISoformat", debug_date.isoformat())
|
||||||
|
|
||||||
|
res, msg = startstop_scheduled(debug_date=debug_date, market="US")
|
||||||
|
print(f"CALL FINISHED, with {debug_date} RESULT: {res}, {msg}")
|
||||||
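
The occurrence counting in stratin_occurences could also be expressed with collections.Counter; a compact sketch against the same (code, result) return contract - a hypothetical helper, not code from this changeset:

from collections import Counter
import v2realbot.controller.run_manager as rm

def stratin_occurences_compact():
    # same contract as above: (0, result) on success, (-2, message) on failure
    res, all_records = rm.fetch_all_run_manager_records()
    if res < 0:
        return -2, f"Error {res} fetching all runmanager records, error {all_records}"
    counts = Counter(record.strat_id for record in all_records)
    return 0, [strat_id for strat_id, n in counts.items() if n >= 2]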
@ -26,7 +26,7 @@
|
|||||||
|
|
||||||
|
|
||||||
<!-- <script src="https://code.jquery.com/jquery-3.6.4.js" integrity="sha256-a9jBBRygX1Bh5lt8GZjXDzyOB+bWve9EiO7tROUtj/E=" crossorigin="anonymous"></script> -->
|
<!-- <script src="https://code.jquery.com/jquery-3.6.4.js" integrity="sha256-a9jBBRygX1Bh5lt8GZjXDzyOB+bWve9EiO7tROUtj/E=" crossorigin="anonymous"></script> -->
|
||||||
<script src="/static/js/libs/jquery-3.6.4.js" integrity="sha256-a9jBBRygX1Bh5lt8GZjXDzyOB+bWve9EiO7tROUtj/E=" crossorigin="anonymous"></script>
|
<script src="/static/js/libs/jquery-3.6.4.js"></script>
|
||||||
|
|
||||||
<!-- <script src="https://cdn.datatables.net/1.13.4/js/jquery.dataTables.min.js"></script> -->
|
<!-- <script src="https://cdn.datatables.net/1.13.4/js/jquery.dataTables.min.js"></script> -->
|
||||||
<script src="/static/js/libs/jquery.dataTables.min.js"></script>
|
<script src="/static/js/libs/jquery.dataTables.min.js"></script>
|
||||||
@ -57,7 +57,7 @@
|
|||||||
<!-- <script src="https://code.jquery.com/jquery-3.5.1.js"></script> -->
|
<!-- <script src="https://code.jquery.com/jquery-3.5.1.js"></script> -->
|
||||||
|
|
||||||
|
|
||||||
<link rel="stylesheet" href="/static/main.css?v=1.06">
|
<link rel="stylesheet" href="/static/main.css?v=1.07">
|
||||||
<!-- <script src="https://cdnjs.cloudflare.com/ajax/libs/mousetrap/1.4.6/mousetrap.min.js"></script> -->
|
<!-- <script src="https://cdnjs.cloudflare.com/ajax/libs/mousetrap/1.4.6/mousetrap.min.js"></script> -->
|
||||||
|
|
||||||
<script src="/static/js/libs/mousetrap.min.js"></script>
|
<script src="/static/js/libs/mousetrap.min.js"></script>
|
||||||
@@ -298,6 +298,252 @@
 </div>
 </div>
 </div>
+</div>
+<!-- SCHEDULER -->
+<div id="runmanager-table" class="flex-items">
+<label data-bs-toggle="collapse" data-bs-target="#runmanager-table-inner">
+<h4>Run Manager</h4>
+</label>
+<div id="runmanager-table-inner" class="collapse show collapsible-section" style="width:58%">
+<div id="controls">
+<button title="Create new" id="button_add_sched" class="btn btn-outline-success btn-sm">Add</button>
+<button title="Edit selected" id="button_edit_sched" class="btn btn-outline-success btn-sm">Edit</button>
+<button title="Delete selected" id="button_delete_sched" class="btn btn-outline-success btn-sm">Delete</button>
+<button title="History" id="button_history_sched" class="btn btn-outline-success btn-sm">History</button>
+<button title="Refresh" id="button_refresh_sched" class="btn btn-outline-success btn-sm">Refresh</button>
+<div class="btn-group btn-group-toggle" data-toggle="buttons">
+<!-- <input type="radio" class="btn-check" name="filterOptions" id="filterNone" autocomplete="off" checked>
+<label class="btn btn-outline-primary" for="filterNone">All</label> -->
+<input type="radio" class="btn-check" name="filterOptions" id="filterSchedule" autocomplete="off" checked>
+<label class="btn btn-outline-primary" for="filterSchedule">Scheduled</label>
+<input type="radio" class="btn-check" name="filterOptions" id="filterQueue" autocomplete="off">
+<label class="btn btn-outline-primary" for="filterQueue">Queued</label>
+</div>
+</div>
+<table id="runmanagerTable" class="table-striped table dataTable" style="width:100%; border-color: #dce1dc;">
+<thead>
+<tr>
+<th>Id</th>
+<th>Type</th>
+<th>Strat_Id</th>
+<th>Symbol</th>
+<th>Account</th>
+<th>Mode</th>
+<th>Note</th>
+<th>Log</th>
+<th>BT_from</th>
+<th>BT_to</th>
+<th>days</th>
+<th>batch_id</th>
+<th>start</th>
+<th>stop</th>
+<th>status</th>
+<th>last_processed</th>
+<th>history</th>
+<th>valid_from</th>
+<th>valid_to</th>
+<th>testlist_id</th>
+<th>Running</th>
+<th>RunnerId</th>
+<th>Market</th>
+</tr>
+</thead>
+<tbody></tbody>
+</table>
+</div>
+<div id="delModalRunmanager" class="modal fade">
+<div class="modal-dialog">
+<form method="post" id="delFormRunmanager">
+<div class="modal-content">
+<div class="modal-header">
+<h4 class="modal-title"><i class="fa fa-plus"></i> Delete record</h4>
+<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
+</div>
+<div class="modal-body">
+<div class="form-group">
+<label for="delidrunmanager" class="form-label">Id</label>
+<!-- <div id="listofids"></div> -->
+<input type="text" class="form-control" id="delidrunmanager" name="id" placeholder="id" readonly>
+</div>
+</div>
+<div class="modal-footer">
+<input type="submit" name="delete" id="deleterunmanager" class="btn btn-primary" value="Delete" />
+<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
+</div>
+</div>
+</form>
+</div>
+</div>
+<div id="addeditModalRunmanager" class="modal fade">
+<div class="modal-dialog">
+<form method="post" id="addeditFormRunmanager">
+<div class="modal-content">
+<div class="modal-header">
+<h4 class="modal-title_run"><i class="fa fa-plus"></i> Add scheduler record</h4>
+<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
+</div>
+<div class="modal-body">
+<div class="form-group">
+<label for="runmanid" class="form-label">Record Id</label>
+<input type="text" class="form-control" id="runmanid" name="id" placeholder="auto generated id" readonly>
+</div>
+<div class="form-group">
+<label for="runmanmoddus" class="form-label">Type</label>
+<input type="text" class="form-control" id="runmanmoddus" name="moddus" readonly>
+</div>
+<div class="form-group">
+<label for="runmanstrat_id" class="form-label">StrategyId</label>
+<input type="text" class="form-control" id="runmanstrat_id" name="strat_id" placeholder="strategy id">
+</div>
+<div class="form-group">
+<label for="runmanmode" class="form-label">Mode</label>
+<select class="form-control" id="runmanmode" name="mode"><option value="paper">paper</option><option value="live">live</option><option value="backtest">backtest</option><option value="prep">prep</option></select>
+</div>
+<div class="form-group">
+<label for="runmanaccount" class="form-label">Account</label>
+<select class="form-control" id="runmanaccount" name="account"><option value="ACCOUNT1">ACCOUNT1</option><option value="ACCOUNT2">ACCOUNT2</option></select>
+</div>
+<div class="form-group">
+<label for="runmanstatus" class="form-label">Status</label>
+<select class="form-control" id="runmanstatus" name="status"><option value="active">active</option><option value="suspended">suspended</option></select>
+</div>
+<div class="form-group" id="runmanstart_time_div">
+<label for="runmanstart_time" class="form-label">Start Time</label>
+<input type="text" class="form-control" id="runmanstart_time" name="start_time" value="9:30" step="1">
+</div>
+<div class="form-group" id="runmanstop_time_div">
+<label for="runmanstop_time" class="form-label">Stop Time</label>
+<input type="text" class="form-control" id="runmanstop_time" name="stop_time" value="16:00" step="1">
+</div>
+<!-- for future queueing of backtests -->
+<div class="form-group" id="runmanbt_from_div">
+<label for="runmanbt_from" class="form-label">bt_from</label>
+<input type="datetime-local" class="form-control" id="runmanbt_from" name="bt_from" placeholder="2023-04-06T09:00:00Z" step="1">
+</div>
+<div class="form-group" id="runmanbt_to_div">
+<label for="runmanbt_to" class="form-label">bt_to</label>
+<input type="datetime-local" class="form-control" id="runmanbt_to" name="bt_to" placeholder="2023-04-06T09:00:00Z" step="1">
+</div>
+<div class="form-group" id="runmantestlist_id_div">
+<label for="runmantestlist_id" class="form-label">Test List ID</label>
+<input type="text" class="form-control" id="runmantestlist_id" name="testlist_id" placeholder="test intervals ID">
+</div>
+<!-- end of the future backtest queueing fields -->
+<!-- Initial checkbox for enabling weekday selection -->
+<div class="form-group">
+<div style="display:inline-flex">
+<label for="runman_enable_weekdays" class="form-label">Limit to Weekdays</label>
+<input type="checkbox" class="form-check" id="runman_enable_weekdays" name="enable_weekdays" aria-label="Enable Weekday Selection">
+</div>
+</div>
+<!-- Weekday checkboxes -->
+<div class="form-group weekday-checkboxes" style="display:none;">
+<!-- <label class="form-label">Select Weekdays:</label> -->
+<div>
+<input type="checkbox" id="monday" name="weekdays" value="monday">
+<label for="monday">Monday</label>
+</div>
+<div>
+<input type="checkbox" id="tuesday" name="weekdays" value="tuesday">
+<label for="tuesday">Tuesday</label>
+</div>
+<div>
+<input type="checkbox" id="wednesday" name="weekdays" value="wednesday">
+<label for="wednesday">Wednesday</label>
+</div>
+<div>
+<input type="checkbox" id="thursday" name="weekdays" value="thursday">
+<label for="thursday">Thursday</label>
+</div>
+<div>
+<input type="checkbox" id="friday" name="weekdays" value="friday">
+<label for="friday">Friday</label>
+</div>
+</div>
+<div class="form-group" id="runmanvalid_from_div">
+<label for="runmanvalid_from" class="form-label">Valid from</label>
+<input type="datetime-local" class="form-control" id="runmanvalid_from" name="valid_from" placeholder="2023-04-06T09:00:00Z" step="1">
+</div>
+<div class="form-group" id="runmanvalid_to_div">
+<label for="runmanvalid_to" class="form-label">Valid to</label>
+<input type="datetime-local" class="form-control" id="runmanvalid_to" name="valid_to" placeholder="2023-04-06T09:00:00Z" step="1">
+</div>
+<div class="form-group">
+<label for="runmanbatch_id" class="form-label">Batch ID</label>
+<input type="text" class="form-control" id="runmanbatch_id" name="batch_id" placeholder="batch id">
+</div>
+<div class="form-group">
+<div style="display:inline-flex">
+<label for="runmanilog_save" class="form-label">Enable logs</label>
+<input type="checkbox" class="form-check" id="runmanilog_save" name="ilog_save" aria-label="Enable logs">
+</div>
+</div>
+<div class="form-group">
+<label for="runmannote" class="form-label">Note</label>
+<textarea class="form-control" rows="1" id="runmannote" name="note"></textarea>
+</div>
+</div>
+<div class="modal-footer">
+<input type="hidden" name="runner_id" id="runmanrunner_id" />
+<input type="hidden" name="history" id="runmanhistory" />
+<input type="hidden" name="last_processed" id="runmanlast_processed" />
+<!--<input type="hidden" name="action" id="action" value="" />-->
+<input type="submit" id="runmanagersubmit" class="btn btn-primary" value="Add" />
+<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
+</div>
+</div>
+</form>
+</div>
+</div>
+<div id="historyModalRunmanager" class="modal fade">
+<div class="modal-dialog">
+<form method="post" id="historyModalRunmanagerForm">
+<div class="modal-content">
+<div class="modal-header">
+<h4 class="modal-title"><i class="fa fa-plus"></i> View History</h4>
+<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
+</div>
+<div class="modal-body">
+<div class="form-group">
+<label for="RunmanId" class="form-label">Id</label>
+<input type="text" class="form-control" id="RunmanId" name="id" placeholder="id" readonly>
+</div>
+<div class="form-group">
+<label for="Runmanlast_processed" class="form-label">Last processed</label>
+<input type="text" class="form-control" id="Runmanlast_processed" name="last_processed" readonly>
+</div>
+<div class="form-group">
+<label for="Runmanhistory" class="form-label">History</label>
+<textarea class="form-control" rows="8" id="Runmanhistory" name="history" readonly></textarea>
+</div>
+<!-- <div class="form-group">
+<label for="metrics" class="form-label">Metrics</label>
+<textarea class="form-control" rows="8" id="metrics" name="metrics"></textarea>
+</div>
+<div class="form-group">
+<label for="stratvars" class="form-label">Stratvars</label>
+<textarea class="form-control" rows="8" id="editstratvars" name="stratvars"></textarea>
+</div>
+<div class="form-group">
+<label for="strat_json" class="form-label">Strat JSON</label>
+<textarea class="form-control" rows="6" id="editstratjson" name="stratjson"></textarea>
+</div> -->
+</div>
+<div class="modal-footer">
+<!-- <input type="submit" name="delete" id="editarchive" class="btn btn-primary" value="Edit" /> -->
+<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
+</div>
+</div>
+</form>
+</div>
+</div>
 </div>
 <div id="archive-table" class="flex-items">
 <label data-bs-toggle="collapse" data-bs-target="#archive-table-inner">
@@ -316,6 +562,7 @@
 <button id="button_refresh" class="refresh btn btn-outline-success btn-sm">Refresh</button>
 <button title="Compare selected days" id="button_compare_arch" class="refresh btn btn-outline-success btn-sm">Compare</button>
 <button title="Run selected day" id="button_runagain_arch" class="refresh btn btn-outline-success btn-sm">Run Again(r)</button>
+<button title="Runs LIVE/PAPER in BT mode with same dates" id="button_runbt_arch" class="refresh btn btn-outline-success btn-sm">Backtest same period</button>
 <button title="Select all days on the page" id="button_selpage" class="btn btn-outline-success btn-sm">Select all</button>
 <button title="Export selected days to XML" id="button_export_xml" class="btn btn-outline-success btn-sm">Export xml</button>
 <button title="Export selected days to CSV" id="button_export_csv" class="btn btn-outline-success btn-sm">Export csv</button>
@@ -350,7 +597,9 @@
 <th>pos</th>
 <th>avgp</th>
 <th>metrics</th>
 <th>batchid</th>
+<th>batchprofit</th>
+<th>batchcount</th>
 </tr>
 </thead>
 <tbody></tbody>
@@ -403,27 +652,34 @@
 </div>
 <div id="logModal" class="modal fade" style="--bs-modal-width: 825px;">
 <div class="modal-dialog">
 <div class="modal-content">
 <div class="modal-header">
 <h4 class="modal-title"><i class="fa fa-plus"></i> Log</h4>
 <button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
 </div>
 <div class="modal-body">
 <div class="form-group">
+<label for="logFileSelect" class="form-label">Select Log File</label>
+<select class="form-select" id="logFileSelect" aria-label="Log file select">
+<!-- <option selected>Select a log file</option> -->
+<option value="strat.log" selected>strat.log</option>
+<option value="job.log">job.log</option>
+</select>
+</div>
+<div class="form-group mt-3">
 <label for="logHere" class="form-label">Log</label>
-<div id="log-container">
-<pre id="log-content"></pre>
+<div id="log-container" style="height:700px;border:1px solid black;">
+<!-- <pre id="log-content"></pre> -->
 </div>
 </div>
-<!-- <input type="text" class="form-control" id="delidarchive" name="delidarchive" placeholder="id"> -->
 </div>
 <div class="modal-footer">
 <button type="button" class="btn btn-primary" id="logRefreshButton" value="Refresh">Refresh</button>
-<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
+<button type="button" class="btn btn-secondary" id="closeLogModal" data-bs-dismiss="modal">Close</button>
 </div>
 </div>
 </div>
 </div>
 <div id="editModalArchive" class="modal fade">
 <div class="modal-dialog">
 <form method="post" id="editFormArchive">
@@ -449,6 +705,10 @@
 <label for="stratvars" class="form-label">Stratvars</label>
 <textarea class="form-control" rows="8" id="editstratvars" name="stratvars"></textarea>
 </div>
+<div class="form-group">
+<label for="edittransferables" class="form-label">Transferables</label>
+<textarea class="form-control" rows="8" id="edittransferables" name="stratvars"></textarea>
+</div>
 <div class="form-group">
 <label for="strat_json" class="form-label">Strat JSON</label>
 <textarea class="form-control" rows="6" id="editstratjson" name="stratjson"></textarea>
@@ -887,39 +1147,43 @@
 <BR>
 </div>
 </div>
-<script src="/static/js/config.js?v=1.02"></script>
+<script src="/static/js/config.js?v=1.04"></script>
 <!-- temporary local copies of the libraries start here -->
 <!-- <script type="text/javascript" src="https://unpkg.com/lightweight-charts/dist/lightweight-charts.standalone.production.js"></script> -->
-<script type="text/javascript" src="/static/js/libs/lightweightcharts/lightweight-charts.standalone.production410.js"></script>
+<script type="text/javascript" src="/static/js/libs/lightweightcharts/lightweight-charts.standalone.production413.js"></script>
-<script src="/static/js/dynamicbuttons.js?v=1.03"></script>
+<script src="/static/js/dynamicbuttons.js?v=1.05"></script>
 <!-- <script src="/static/js/utils.js?v=1.01"></script> -->
 <!-- new util structure and exports and colors -->
-<script src="/static/js/utils/utils.js?v=1.02"></script>
+<script src="/static/js/utils/utils.js?v=1.06"></script>
-<script src="/static/js/utils/exports.js?v=1.02"></script>
+<script src="/static/js/utils/exports.js?v=1.04"></script>
-<script src="/static/js/utils/colors.js?v=1.02"></script>
+<script src="/static/js/utils/colors.js?v=1.04"></script>
-<script src="/static/js/instantindicators.js?v=1.01"></script>
+<script src="/static/js/instantindicators.js?v=1.04"></script>
-<script src="/static/js/archivechart.js?v=1.03"></script>
+<script src="/static/js/archivechart.js?v=1.05"></script>
 <!-- <script src="/static/js/archivetables.js?v=1.05"></script> -->
 <!-- archiveTables split into separate files -->
-<script src="/static/js/tables/archivetable/init.js?v=1.07"></script>
+<script src="/static/js/tables/archivetable/init.js?v=1.12"></script>
-<script src="/static/js/tables/archivetable/functions.js?v=1.06"></script>
+<script src="/static/js/tables/archivetable/functions.js?v=1.11"></script>
-<script src="/static/js/tables/archivetable/modals.js?v=1.05"></script>
+<script src="/static/js/tables/archivetable/modals.js?v=1.07"></script>
-<script src="/static/js/tables/archivetable/handlers.js?v=1.05"></script>
+<script src="/static/js/tables/archivetable/handlers.js?v=1.11"></script>
+<!-- Runmanager functionality -->
+<script src="/static/js/tables/runmanager/init.js?v=1.1"></script>
+<script src="/static/js/tables/runmanager/functions.js?v=1.08"></script>
+<script src="/static/js/tables/runmanager/modals.js?v=1.07"></script>
+<script src="/static/js/tables/runmanager/handlers.js?v=1.07"></script>
-<script src="/static/js/livewebsocket.js?v=1.01"></script>
+<script src="/static/js/livewebsocket.js?v=1.02"></script>
-<script src="/static/js/realtimechart.js?v=1.01"></script>
+<script src="/static/js/realtimechart.js?v=1.02"></script>
-<script src="/static/js/mytables.js?v=1.01"></script>
+<script src="/static/js/mytables.js?v=1.03"></script>
 <script src="/static/js/testlist.js?v=1.01"></script>
 <script src="/static/js/ml.js?v=1.02"></script>
 <script src="/static/js/common.js?v=1.01"></script>
 <script src="/static/js/configform.js?v=1.01"></script>
+<!-- <script src="/static/js/scheduler.js?v=1.01"></script> -->
 </body>
 </html>
@@ -638,7 +638,7 @@ $(document).ready(function () {
 else{
 $('#editstratvars').val(JSON.stringify(row.stratvars,null,2));
 }
+$('#edittransferables').val(JSON.stringify(row.transferables,null,2));
 $('#editstratjson').val(row.strat_json);
 }

File diff suppressed because one or more lines are too long
@@ -90,9 +90,55 @@ $(document).ready(function () {
 monaco.languages.register({ id: 'python' });
 monaco.languages.register({ id: 'json' });
+//Register the mylogs language
+monaco.languages.register({ id: 'mylogs' });
 // Register the TOML language
+monaco.languages.setLanguageConfiguration('mylogs', {
+    comments: {
+        lineComment: '//', // Adjust if your logs use a different comment symbol
+    },
+    brackets: [['[', ']'], ['{', '}']], // Array and object brackets
+    autoClosingPairs: [
+        { open: '{', close: '}', notIn: ['string'] },
+        { open: '"', close: '"', notIn: ['string', 'comment'] },
+        { open: "'", close: "'", notIn: ['string', 'comment'] },
+    ],
+});
+monaco.languages.setMonarchTokensProvider('mylogs', {
+    tokenizer: {
+        root: [
+            [/#.*/, 'comment'], // Comments (if applicable)
+            // Timestamps
+            [/\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+/, 'timestamp'],
+            // Log levels
+            [/\b(INFO|DEBUG|WARNING|ERROR|CRITICAL)\b/, 'log-level'],
+            // Strings
+            [/".*"/, 'string'],
+            [/'.*'/, 'string'],
+            // Key-value pairs
+            [/[A-Za-z_]+\s*:/, 'key'],
+            [/-?\d+\.\d+/, 'number.float'], // Floating-point numbers
+            [/-?\d+/, 'number.integer'], // Integers
+            [/\btrue\b/, 'boolean.true'],
+            [/\bfalse\b/, 'boolean.false'],
+            // Other words and symbols
+            [/[A-Za-z_]+/, 'identifier'],
+            [/[ \t\r\n]+/, 'white'],
+            [/[\[\]{}(),]/, 'delimiter'], // Expand if more delimiters exist
+        ]
+    }
+});
 monaco.languages.register({ id: 'toml' });
 // Define the TOML language configuration
 monaco.languages.setLanguageConfiguration('toml', {
 comments: {
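The 'mylogs' tokenizer above only assigns token names; custom names such as timestamp and log-level stay unstyled unless some theme defines color rules for them. The commit reuses the existing 'tomlTheme-dark' theme for the log editor, but a dedicated theme could be declared as in the following sketch (theme name and colors are illustrative assumptions, not part of the commit):

// Sketch: map the custom token names emitted by the 'mylogs' Monarch
// tokenizer to colors. The theme name 'mylogs-dark' and the color values
// are assumed for illustration only.
monaco.editor.defineTheme('mylogs-dark', {
    base: 'vs-dark',   // inherit the stock dark theme
    inherit: true,
    rules: [
        { token: 'timestamp', foreground: '6a9955' },
        { token: 'log-level', foreground: 'c586c0', fontStyle: 'bold' },
        { token: 'key', foreground: '9cdcfe' },
    ],
    colors: {}
});
// Passing theme: 'mylogs-dark' to monaco.editor.create(...) would activate it.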
@@ -621,7 +667,6 @@ $(document).ready(function () {
 })
 });
-
 //button run
 $('#button_run').click(function () {
 row = stratinRecords.row('.selected').data();
@@ -953,7 +998,18 @@ var runnerRecords =
 render: function ( data, type, row ) {
 return format_date(data)
 },
+},
+{
+targets: [4], //symbol
+render: function ( data, type, row ) {
+if (type === 'display') {
+//console.log("arch")
+var color = getColorForId(row.strat_id);
+return '<span style="color:' + color + ';">'+data+'</span>';
+}
+return data;
+},
 },
 ],
 // select: {
 // style: 'multi'
@@ -6,6 +6,7 @@ let editor_diff_arch1
 let editor_diff_arch2
 var archData = null
 var batchHeaders = []
+var editorLog = null
 function refresh_arch_and_callback(row, callback) {
 //console.log("entering refresh")
@@ -78,10 +79,11 @@ function get_detail_and_chart(row) {
 })
 }
-//rerun stratin
-function run_day_again() {
+//rerun stratin (use to rerun strategy and also to rerun live/paper as bt on same period)
+function run_day_again(turnintobt=false) {
 row = archiveRecords.row('.selected').data();
-$('#button_runagain_arch').attr('disabled',true);
+var button_name = turnintobt ? '#button_runbt_arch' : '#button_runagain_arch'
+$(button_name).attr('disabled',true)
 var record1 = new Object()
 //console.log(JSON.stringify(rows))
@@ -142,7 +144,7 @@ function run_day_again() {
 //console.log("Result from second request:", result2);
 //console.log("calling compare")
-rerun_strategy(result1, result2)
+rerun_strategy(result1, result2, turnintobt)
 // Perform your action with the results from both requests
 // Example:
@@ -154,13 +156,22 @@ function run_day_again() {
 });
-function rerun_strategy(archRunner, stratData) {
+function rerun_strategy(archRunner, stratData, turnintobt) {
 record1 = archRunner
 //console.log(record1)
+var note_prefix = "RERUN "
+if ((turnintobt) && ((record1.mode == 'live') || (record1.mode == 'paper'))) {
+record1.mode = 'backtest'
+record1.bt_from = record1.started
+record1.bt_to = record1.stopped
+note_prefix = "BT SAME PERIOD "
+}
+record1.note = note_prefix + record1.note
+//so we won't have to remove attributes here each time a new attribute is added in the future
 //delete the unneeded attributes and add the needed ones
 //TODO: in the future, rework this to build a new object instead
-//so we won't have to remove attributes here each time a new attribute is added in the future
 delete record1["end_positions"];
 delete record1["end_positions_avgp"];
 delete record1["profit"];
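The hunk above flips a finished live/paper runner into a backtest over exactly the period it ran, by mutating the archived record in place. A non-mutating sketch of the same flip, for comparison (the helper name toBacktestRecord is hypothetical; the field names mode, started and stopped are taken from the diff):

// Sketch: the mode flip as a pure helper instead of in-place mutation.
function toBacktestRecord(rec) {
    if (rec.mode === 'live' || rec.mode === 'paper') {
        return Object.assign({}, rec, {
            mode: 'backtest',
            bt_from: rec.started,   // replay exactly the period that ran
            bt_to: rec.stopped,
            note: 'BT SAME PERIOD ' + rec.note
        });
    }
    return rec;
}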
@@ -172,8 +183,6 @@ function run_day_again() {
 delete record1["settings"];
 delete record1["stratvars"];
-
-record1.note = "RERUN " + record1.note
 if (record1.bt_from == "") {delete record1["bt_from"];}
 if (record1.bt_to == "") {delete record1["bt_to"];}
@@ -212,7 +221,7 @@ function run_day_again() {
 contentType: "application/json",
 data: jsonString,
 success:function(data){
-$('#button_runagain_arch').attr('disabled',false);
+$(button_name).attr('disabled',false);
 setTimeout(function () {
 runnerRecords.ajax.reload();
 stratinRecords.ajax.reload();
@@ -222,7 +231,7 @@ function run_day_again() {
 var err = eval("(" + xhr.responseText + ")");
 window.alert(JSON.stringify(xhr));
 //console.log(JSON.stringify(xhr));
-$('#button_runagain_arch').attr('disabled',false);
+$(button_name).attr('disabled',false);
 }
 })
 }
@@ -453,8 +462,10 @@ function display_batch_report(batch_id) {
 }
 function refresh_logfile() {
+logfile = $("#logFileSelect").val()
+lines = 1200
 $.ajax({
-url:"/log?lines=30",
+url:"/log?lines="+lines+"&logfile="+logfile,
 beforeSend: function (xhr) {
 xhr.setRequestHeader('X-API-Key',
 API_KEY); },
@@ -462,12 +473,34 @@ function refresh_logfile() {
 contentType: "application/json",
 dataType: "json",
 success:function(response){
+if (editorLog) {
+editorLog.dispose();
+}
 if (response.lines.length == 0) {
-$('#log-content').html("no records");
+value = "no records";
+// $('#log-content').html("no records");
 }
 else {
-$('#log-content').html(response.lines.join('\n'));
+//console.log(response.lines)
+//var escapedLines = response.lines.map(line => escapeHtml(line));
+value = response.lines.join('\n')
+// $('#log-content').html(escapedLines.join('\n'));
 }
+require(["vs/editor/editor.main"], () => {
+editorLog = monaco.editor.create(document.getElementById('log-container'), {
+value: value,
+language: 'mylogs',
+theme: 'tomlTheme-dark',
+automaticLayout: true,
+readOnly: true
+});
+// Focus at the end of the file. Note: this block must stay inside the
+// require callback so it only runs after the editor instance exists.
+const model = editorLog.getModel();
+const lastLineNumber = model.getLineCount();
+const lastLineColumn = model.getLineMaxColumn(lastLineNumber);
+editorLog.setPosition({ lineNumber: lastLineNumber, column: lastLineColumn });
+editorLog.revealPosition({ lineNumber: lastLineNumber, column: lastLineColumn });
+});
 },
 error: function(xhr, status, error) {
 var err = eval("(" + xhr.responseText + ")");
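One caveat with the string-concatenated URL above: the selected log file name is inserted unencoded. A sketch of a safer construction, using the same endpoint and parameter names as the hunk (the variable names match the surrounding function):

// Sketch: build the /log query string with proper encoding, so a file name
// containing spaces or '&' cannot break the request.
var params = new URLSearchParams({ lines: String(lines), logfile: logfile });
var url = "/log?" + params.toString();   // e.g. /log?lines=1200&logfile=strat.log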
@@ -476,6 +509,14 @@ function refresh_logfile() {
 })
 }
+function escapeHtml(text) {
+return text
+.replace(/&/g, "&amp;")
+.replace(/</g, "&lt;")
+.replace(/>/g, "&gt;")
+.replace(/"/g, "&quot;")
+.replace(/'/g, "&#039;");
+}
 function delete_arch_rows(ids) {
 $.ajax({
 url:"/archived_runners/",
@@ -530,6 +571,7 @@ function generateStorageKey(batchId) {
 function disable_arch_buttons() {
 //disable buttons (enable on row selection)
 $('#button_runagain_arch').attr('disabled','disabled');
+$('#button_runbt_arch').attr('disabled','disabled');
 $('#button_show_arch').attr('disabled','disabled');
 $('#button_delete_arch').attr('disabled','disabled');
 $('#button_delete_batch').attr('disabled','disabled');
@@ -552,4 +594,10 @@ function enable_arch_buttons() {
 $('#button_report').attr('disabled',false);
 $('#button_export_xml').attr('disabled',false);
 $('#button_export_csv').attr('disabled',false);
+
+//The "Backtest same period" button is enabled only when a row with mode paper/live is selected
+row = archiveRecords.row('.selected').data();
+if ((row.mode == 'paper') || (row.mode == 'live')) {
+$('#button_runbt_arch').attr('disabled',false);
+}
 }
@@ -265,8 +265,8 @@ $(document).ready(function () {
 $('#diff_first').text(record1.name);
 $('#diff_second').text(record2.name);
-$('#diff_first_id').text(data1.id);
-$('#diff_second_id').text(data2.id);
+$('#diff_first_id').text(data1.id + ' Batch: ' + data1.batch_id);
+$('#diff_second_id').text(data2.id + ' Batch: ' + data2.batch_id);
 //monaco
 require(["vs/editor/editor.main"], () => {
@@ -358,11 +358,20 @@ $(document).ready(function () {
 })
 });
+$('#closeLogModal').click(function () {
+editorLog.dispose()
+});
 //button to query log
 $('#logRefreshButton').click(function () {
+editorLog.dispose()
 refresh_logfile()
 });
+$('#logFileSelect').change(function() {
+refresh_logfile();
+});
 //button to open log modal
 $('#button_show_log').click(function () {
 window.$('#logModal').modal('show');
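Worth noting about the handlers above: dispose() does not clear the editorLog variable itself, so after the modal closes, refresh_logfile()'s `if (editorLog)` guard still sees a reference to an already disposed editor. A defensive sketch of the close handler (same element ids and variable as the hunk):

// Sketch: null the reference after disposing so later guards treat the
// editor as gone rather than re-disposing a dead instance.
$('#closeLogModal').click(function () {
    if (editorLog) {
        editorLog.dispose();
        editorLog = null;
    }
});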
@@ -441,7 +450,7 @@ $(document).ready(function () {
 $('#editstratvars').val(JSON.stringify(row.stratvars,null,2));
 }
+$('#edittransferables').val(JSON.stringify(row.transferables,null,2));
 $('#editstratjson').val(row.strat_json);
 }
 });
@@ -458,6 +467,11 @@ $(document).ready(function () {
 //run again button
 $('#button_runagain_arch').click(run_day_again)
+
+//run in bt mode
+$('#button_runbt_arch').click(function() {
+run_day_again(true);
+});
 //workaround for the incorrect select marking on group headers
 // $('#archiveTable tbody').on('click', 'tr.group-header', function(event) {
 // var $row = $(this);
@@ -42,6 +42,8 @@ function initialize_archiveRecords() {
 {data: 'end_positions_avgp', visible: true},
 {data: 'metrics', visible: true},
 {data: 'batch_id', visible: true},
+{data: 'batch_profit', visible: false},
+{data: 'batch_count', visible: false},
 ],
 paging: true,
 processing: true,
@@ -68,30 +70,32 @@ function initialize_archiveRecords() {
 {
 targets: [5],
 render: function ( data, type, row ) {
-now = new Date(data)
 if (type == "sort") {
 return new Date(data).getTime();
 }
+//data = "2024-02-26T19:29:13.400621-05:00"
+// Create a date object from the string; it represents the given moment in UTC time
 var date = new Date(data);
 tit = date.toLocaleString('cs-CZ', {
 timeZone: 'America/New_York',
 })
-if (isToday(now)) {
+if (isToday(date)) {
+//console.log("calling isToday with", date)
 //return local time only
-return '<div title="'+tit+'">'+ 'dnes ' + format_date(data,false,true)+'</div>'
+return '<div title="'+tit+'">'+ 'dnes ' + format_date(data,true,true)+'</div>'
 }
 else
 {
 //return local datetime
-return '<div title="'+tit+'">'+ format_date(data,false,false)+'</div>'
+return '<div title="'+tit+'">'+ format_date(data,true,false)+'</div>'
 }
 },
 },
 {
 targets: [6],
 render: function ( data, type, row ) {
-now = new Date(data)
 if (type == "sort") {
 return new Date(data).getTime();
 }
@@ -100,14 +104,14 @@ function initialize_archiveRecords() {
 timeZone: 'America/New_York',
 })
-if (isToday(now)) {
+if (isToday(date)) {
 //return local time only
-return '<div title="'+tit+'" class="token level comment">'+ 'dnes ' + format_date(data,false,true)+'</div>'
+return '<div title="'+tit+'" class="token level comment">'+ 'dnes ' + format_date(data,true,true)+'</div>'
 }
 else
 {
 //return local datetime
-return '<div title="'+tit+'" class="token level number">'+ format_date(data,false,false)+'</div>'
+return '<div title="'+tit+'" class="token level number">'+ format_date(data,true,false)+'</div>'
 }
 },
 },
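For reference, using the sample timestamp from the hunk's own comment, the tooltip string produced by the render above comes out roughly as follows (exact formatting depends on the browser's ICU data; the sample is trimmed to milliseconds, since the microsecond precision in the original string is only tolerated, not required, by most engines):

// The ISO string carries a -05:00 offset; toLocaleString re-renders that
// same moment as New York wall-clock time with Czech locale formatting.
var date = new Date("2024-02-26T19:29:13.400-05:00");
var tit = date.toLocaleString('cs-CZ', { timeZone: 'America/New_York' });
// tit is approximately "26. 2. 2024 19:29:13"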
@@ -237,6 +241,8 @@ function initialize_archiveRecords() {
 var groupId = group ? group : 'no-batch-id-' + firstRowData.id;
 var stateKey = 'dt-group-state-' + groupId;
 var state = localStorage.getItem(stateKey);
+var profit = firstRowData.batch_profit
+var itemCount = firstRowData.batch_count
 // Iterate over each row in the group to set the data attribute
 // and also set each node's visibility according to the settings
@@ -252,10 +258,10 @@ function initialize_archiveRecords() {
 });
 // Initialize variables for the group
-var itemCount = 0;
+//var itemCount = 0;
 var period = '';
 var batch_note = '';
-var profit = '';
+//var profit = '';
 var started = null;
 var stratinId = null;
 var symbol = null;
@@ -284,13 +290,23 @@ function initialize_archiveRecords() {
 //if we have a batch_id, check whether its settings already exist; if so, use them
 //if not, load them
+//This code parses the header info from the notes; it is relevant only for
+//backtest batches, not for paper and live, where the number of days is unknown and the note can change
+//in the future this frontend parsing will be replaced by a batch table in the DB that persists
+//this data
 if (group) {
 const existingBatch = batchHeaders.find(batch => batch.batch_id == group);
 //not yet in the batch array - create the header
 if (!existingBatch) {
-itemCount = extractNumbersFromString(firstRowData.note);
-try {profit = firstRowData.metrics.profit.batch_sum_profit;}
-catch (e) {profit = 'NA'}
+// itemCount = extractNumbersFromString(firstRowData.note);
+// if (!itemCount) {
+// itemCount="NA"
+// }
+// try { profit = firstRowData.metrics.profit.batch_sum_profit;}
+// catch (e) {profit = 'NA'}
+// if (!profit) {profit = 'NA'}
 period = firstRowData.note ? firstRowData.note.substring(0, 14) : '';
 try {
 batch_note = firstRowData.note ? firstRowData.note.split("N:")[1].trim() : ''
@@ -298,15 +314,22 @@
 started = firstRowData.started
 stratinId = firstRowData.strat_id
 symbol = firstRowData.symbol
+if (period.startsWith("SCHED")) {
+period = "SCHEDULER";
+}
 var newBatchHeader = {batch_id:group, batch_note:batch_note, profit:profit, itemCount:itemCount, period:period, started:started, stratinId:stratinId, symbol:symbol};
 batchHeaders.push(newBatchHeader)
 }
 //already in the array, but we have a newer one (e.g. it was added during a backtest) - update it
 else if (new Date(existingBatch.started) < new Date(firstRowData.started)) {
-itemCount = extractNumbersFromString(firstRowData.note);
-try {profit = firstRowData.metrics.profit.batch_sum_profit;}
-catch (e) {profit = 'NA'}
+// try {itemCount = extractNumbersFromString(firstRowData.note);}
+// catch (e) {itemCount = 'NA'}
+// try {profit = firstRowData.metrics.profit.batch_sum_profit;}
+// catch (e) {profit = 'NA'}
 period = firstRowData.note ? firstRowData.note.substring(0, 14) : '';
+if (period.startsWith("SCHED")) {
+period = "SCHEDULER";
+}
 try {
 batch_note = firstRowData.note ? firstRowData.note.split("N:")[1].trim() : ''
 } catch (e) { batch_note = ''}
100
v2realbot/static/js/tables/runmanager/functions.js
Normal file
@@ -0,0 +1,100 @@
+function refresh_runmanager_and_callback(row, callback) {
+    //console.log("entering refresh")
+    var request = $.ajax({
+        url: "/run_manager_records/"+row.id,
+        beforeSend: function (xhr) {
+            xhr.setRequestHeader('X-API-Key', API_KEY); },
+        method:"GET",
+        contentType: "application/json",
+        dataType: "json",
+        success:function(data){
+            //console.log("fetched data ok")
+            //console.log(JSON.stringify(data,null,2));
+        },
+        error: function(xhr, status, error) {
+            var err = eval("(" + xhr.responseText + ")");
+            window.alert(JSON.stringify(xhr));
+            console.log(JSON.stringify(xhr));
+        }
+    });
+
+    // Handling the response of the request
+    $.when(request).then(function(response) {
+        // The request has completed successfully
+        //console.log("Result from request:", response);
+        //console.log("Response received, calling the callback")
+        callback(response)
+    }, function(error) {
+        // Handle errors from the request here
+        console.error("Error from first request:", error);
+        console.log("requesting id error")
+    });
+}
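A small design note on the function above: $.ajax already returns a promise-like jqXHR, so wrapping a single request in $.when is equivalent to chaining on it directly; $.when only earns its keep when several requests are joined. A sketch of the single-request form, using the same endpoint, header and callback as above (the error message text is illustrative):

// Sketch: the same flow without the $.when wrapper.
$.ajax({
    url: "/run_manager_records/" + row.id,
    method: "GET",
    dataType: "json",
    beforeSend: function (xhr) { xhr.setRequestHeader('X-API-Key', API_KEY); }
}).then(
    function (response) { callback(response); },
    function (error) { console.error("run_manager_records request failed:", error); }
);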
+function delete_runmanager_row(id) {
+    $.ajax({
+        url:"/run_manager_records/"+id,
+        beforeSend: function (xhr) {
+            xhr.setRequestHeader('X-API-Key', API_KEY); },
+        method:"DELETE",
+        contentType: "application/json",
+        dataType: "json",
+        // data: JSON.stringify(ids),
+        success:function(data){
+            $('#delFormRunmanager')[0].reset();
+            window.$('#delModalRunmanager').modal('hide');
+            $('#deleterunmanager').attr('disabled', false);
+            //console.log(data)
+            runmanagerRecords.ajax.reload();
+            disable_runmanager_buttons()
+        },
+        error: function(xhr, status, error) {
+            var err = eval("(" + xhr.responseText + ")");
+            window.alert(JSON.stringify(xhr));
+            console.log(JSON.stringify(xhr));
+            $('#deleterunmanager').attr('disabled', false);
+            //archiveRecords.ajax.reload();
+        }
+    })
+}
+
+//enable/disable depending on whether row(s) are selected
+function disable_runmanager_buttons() {
+    //disable buttons (enabled on row selection)
+    //$('#button_add_sched').attr('disabled','disabled');
+    $('#button_edit_sched').attr('disabled','disabled');
+    $('#button_delete_sched').attr('disabled','disabled');
+    $('#button_history_sched').attr('disabled','disabled');
+}
+
+function enable_runmanager_buttons() {
+    //enable buttons
+    //$('#button_add_sched').attr('disabled',false);
+    $('#button_edit_sched').attr('disabled',false);
+    $('#button_delete_sched').attr('disabled',false);
+    $('#button_history_sched').attr('disabled',false);
+}
+
+// Function to update the Mode select options
+function updateSelectOptions(type) {
+    var allOptions = {
+        'paper': '<option value="paper">paper</option>',
+        'live': '<option value="live">live</option>',
+        'backtest': '<option value="backtest">backtest</option>',
+        'prep': '<option value="prep">prep</option>'
+    };
+
+    var allowedOptions = (type === "schedule") ? ['paper', 'live'] : Object.keys(allOptions);
+
+    var $select = $('#runmanmode');
+    $select.empty(); // Clear current options
+
+    allowedOptions.forEach(function(opt) {
+        $select.append(allOptions[opt]); // Append allowed options
+    });
+}
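Usage, as wired up in handlers.js below: the Mode select is narrowed for scheduled records and left full for queued ones:

// 'schedule' restricts the Mode select to the modes that make sense for a
// recurring schedule; any other type keeps the full list.
updateSelectOptions('schedule');  // -> paper, live
updateSelectOptions('queue');     // -> paper, live, backtest, prep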
296
v2realbot/static/js/tables/runmanager/handlers.js
Normal file
@@ -0,0 +1,296 @@
+/* <button title="Create new" id="button_add_sched" class="btn btn-outline-success btn-sm">Add</button>
+<button title="Edit selected" id="button_edit_sched" class="btn btn-outline-success btn-sm">Edit</button>
+<button title="Delete selected" id="button_delete_sched" class="btn btn-outline-success btn-sm">Delete</button>
+
+id="delModalRunmanager"
+id="addeditModalRunmanager" id="runmanagersubmit" == "Add vs Edit"
+*/
+
+// Function to apply a filter
+function applyFilter(filter) {
+    switch (filter) {
+        case 'filterSchedule':
+            runmanagerRecords.column(1).search('schedule').draw();
+            break;
+        case 'filterQueue':
+            runmanagerRecords.column(1).search('queue').draw();
+            break;
+        // default:
+        //     runmanagerRecords.search('').columns().search('').draw();
+        //     break;
+    }
+}
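applyFilter() above relies on DataTables' plain substring search on the Type column. Should a type value ever contain another as a substring, an anchored regex search would keep the filters disjoint; a sketch:

// Sketch: exact-match variant. search(input, regex, smart) with regex=true
// and smart=false makes DataTables treat the input as an anchored pattern.
runmanagerRecords.column(1).search('^schedule$', true, false).draw();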
|
|
||||||
|
// Function to get the ID of current active filter
|
||||||
|
function getCurrentFilter() {
|
||||||
|
var activeFilter = $('input[name="filterOptions"]:checked').attr('id');
|
||||||
|
console.log("activeFilter", activeFilter)
|
||||||
|
return activeFilter;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Function to show/hide input fields based on the current filter
|
||||||
|
function updateInputFields() {
|
||||||
|
var activeFilter = getCurrentFilter();
|
||||||
|
|
||||||
|
switch (activeFilter) {
|
||||||
|
case 'filterSchedule':
|
||||||
|
$('#runmantestlist_id_div').hide();
|
||||||
|
$('#runmanbt_from_div').hide();
|
||||||
|
$('#runmanbt_to_div').hide();
|
||||||
|
|
||||||
|
$('#runmanvalid_from_div').show();
|
||||||
|
$('#runmanvalid_to_div').show();
|
||||||
|
$('#runmanstart_time_div').show();
|
||||||
|
$('#runmanstop_time_div').show();
|
||||||
|
break;
|
||||||
|
case 'filterQueue':
|
||||||
|
$('#runmantestlist_id_div').show();
|
||||||
|
$('#runmanbt_from_div').show();
|
||||||
|
$('#runmanbt_to_div').show();
|
||||||
|
|
||||||
|
$('#runmanvalid_from_div').hide();
|
||||||
|
$('#runmanvalid_to_div').hide();
|
||||||
|
$('#runmanstart_time_div').hide();
|
||||||
|
$('#runmanstop_time_div').hide();
|
||||||
|
break;
|
||||||
|
default:
|
||||||
|
//$('#inputForSchedule, #inputForQueue').hide();
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
//event handlers for runmanager table
|
||||||
|
$(document).ready(function () {
|
||||||
|
initialize_runmanagerRecords();
|
||||||
|
runmanagerRecords.ajax.reload();
|
||||||
|
disable_runmanager_buttons();
|
||||||
|
|
||||||
|
//on click on #button_refresh_sched call runmanagerRecords.ajax.reload()
|
||||||
|
$('#button_refresh_sched').click(function () {
|
||||||
|
runmanagerRecords.ajax.reload();
|
||||||
|
});
|
||||||
|
|
||||||
|
// Event listener for changes in the radio buttons
|
||||||
|
$('input[name="filterOptions"]').on('change', function() {
|
||||||
|
var selectedFilter = $(this).attr('id');
|
||||||
|
applyFilter(selectedFilter);
|
||||||
|
// Save the selected filter to local storage
|
||||||
|
localStorage.setItem('selectedFilter', selectedFilter);
|
||||||
|
});
|
||||||
|
|
||||||
|
|
||||||
|
// Load the last selected filter from local storage and apply it
|
||||||
|
var lastSelectedFilter = localStorage.getItem('selectedFilter');
|
||||||
|
if (lastSelectedFilter) {
|
||||||
|
$('#' + lastSelectedFilter).prop('checked', true).change();
|
||||||
|
}
|
||||||
|
|
||||||
|
//listen for changes on weekday enabling button
|
||||||
|
$('#runman_enable_weekdays').change(function() {
|
||||||
|
if ($(this).is(':checked')) {
|
||||||
|
$('.weekday-checkboxes').show();
|
||||||
|
} else {
|
||||||
|
$('.weekday-checkboxes').hide();
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
//selectable rows in runmanager table
|
||||||
|
$('#runmanagerTable tbody').on('click', 'tr', function () {
|
||||||
|
if ($(this).hasClass('selected')) {
|
||||||
|
//$(this).removeClass('selected');
|
||||||
|
//aadd here condition that disable is called only when there is no other selected class on tr[data-group-name]
|
||||||
|
// Check if there are no other selected rows before disabling buttons
|
||||||
|
if ($('#runmanagerTable tr.selected').length === 1) {
|
||||||
|
disable_runmanager_buttons();
|
||||||
|
}
|
||||||
|
//disable_arch_buttons()
|
||||||
|
} else {
|
||||||
|
//archiveRecords.$('tr.selected').removeClass('selected');
|
||||||
|
$(this).addClass('selected');
|
||||||
|
enable_runmanager_buttons()
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
|
||||||
|
//delete button
|
||||||
|
$('#button_delete_sched').click(function () {
|
||||||
|
row = runmanagerRecords.row('.selected').data();
|
||||||
|
window.$('#delModalRunmanager').modal('show');
|
||||||
|
$('#delidrunmanager').val(row.id);
|
||||||
|
// $('#action').val('delRecord');
|
||||||
|
// $('#save').val('Delete');
|
||||||
|
});

// Add button
$('#button_add_sched').click(function () {
    window.$('#addeditModalRunmanager').modal('show');
    $('#addeditFormRunmanager')[0].reset();
    //$("#runmanid").prop('readonly', false);
    var mode;
    if (getCurrentFilter() == 'filterQueue') {
        mode = 'queue';
    } else {
        mode = 'schedule';
    }
    // Set the modus
    $('#runmanmoddus').val(mode);
    // Update fields according to the selected type
    updateInputFields();
    updateSelectOptions(mode);
    // Initially, check the value of "batch" and enable/disable "btfrom" and "btto" accordingly
    if ($("#runmantestlist_id").val() !== "") {
        $("#runmanbt_from, #runmanbt_to").prop("disabled", true);
    } else {
        $("#runmanbt_from, #runmanbt_to").prop("disabled", false);
    }

    // Listen for changes in the "batch" input and disable/enable "btfrom" and "btto" accordingly
    $("#runmantestlist_id").on("input", function() {
        if ($(this).val() !== "") {
            // If "batch" is not empty, disable "from" and "to"
            $("#runmanbt_from, #runmanbt_to").prop("disabled", true);
        } else {
            // If "batch" is empty, enable "from" and "to"
            $("#runmanbt_from, #runmanbt_to").prop("disabled", false);
        }
    });

    $('.modal-title_run').html("<i class='fa fa-plus'></i> Add Record");
    $('#runmanagersubmit').val('Add');
    // Note: the checkbox id used everywhere else is #runman_enable_weekdays;
    // the original #runmanager_enable_weekdays looks like a typo.
    $('#runman_enable_weekdays').prop('checked', false);
    $('.weekday-checkboxes').hide();
});

// Edit button
$('#button_edit_sched').click(function () {
    var row = runmanagerRecords.row('.selected').data();
    if (row == undefined) {
        return;
    }
    window.$('#addeditModalRunmanager').modal('show');
    //set fields as readonly
    //$("#runmanid").prop('readonly', true);
    //$("#runmanmoddus").prop('readonly', true);
    console.log("original row before edit", row)
    refresh_runmanager_and_callback(row, show_edit_modal)

    function show_edit_modal(row) {
        console.log("refreshed row before edit", row);
        $('#addeditFormRunmanager')[0].reset();
        $('.modal-title_run').html("<i class='fa fa-plus'></i> Edit Record");
        $('#runmanagersubmit').val('Edit');

        // Update fields according to the selected type
        updateInputFields();
        // Set the shared attributes
        $('#runmanid').val(row.id);
        $('#runmanhistory').val(row.history);
        $('#runmanlast_processed').val(row.last_processed);
        $('#runmanstrat_id').val(row.strat_id);
        $('#runmanmode').val(row.mode);
        $('#runmanmoddus').val(row.moddus);
        $('#runmanaccount').val(row.account);
        $('#runmanstatus').val(row.status);
        $('#runmanbatch_id').val(row.batch_id);
        $('#runmanrunner_id').val(row.runner_id);
        $("#runmanilog_save").prop("checked", row.ilog_save);
        $('#runmannote').val(row.note);

        $('#runmantestlist_id').val(row.testlist_id);
        $('#runmanbt_from').val(row.bt_from);
        $('#runmanbt_to').val(row.bt_to);

        $('#runmanvalid_from').val(row.valid_from);
        $('#runmanvalid_to').val(row.valid_to);
        $('#runmanstart_time').val(row.start_time);
        $('#runmanstop_time').val(row.stop_time);

        // Initially, check the value of "batch" and enable/disable "from" and "to" accordingly
        if ($("#runmantestlist_id").val() !== "") {
            $("#runmanbt_from, #runmanbt_to").prop("disabled", true);
        } else {
            $("#runmanbt_from, #runmanbt_to").prop("disabled", false);
        }

        // Listen for changes in the "batch" input
        $("#runmantestlist_id").on("input", function() {
            if ($(this).val() !== "") {
                // If "batch" is not empty, disable "from" and "to"
                $("#runmanbt_from, #runmanbt_to").prop("disabled", true);
            } else {
                // If "batch" is empty, enable "from" and "to"
                $("#runmanbt_from, #runmanbt_to").prop("disabled", false);
            }
        });

        var type = $('#runmanmoddus').val();
        updateSelectOptions(type);

        //TODO: add weekdays_filter transformation from string "1,2,3" to array [1,2,3]
        //(see the sketch after this handler)

        // row.weekdays_filter is assumed to be available here
        var weekdayFilter = row.weekdays_filter;

        if (weekdayFilter) {
            $('#runman_enable_weekdays').prop('checked', true);
            $(".weekday-checkboxes").show();

            // Map numbers to weekday names
            var dayOfWeekMap = {
                "0": "monday",
                "1": "tuesday",
                "2": "wednesday",
                "3": "thursday",
                "4": "friday",
                "5": "saturday", // Adjust if needed for your mapping
                "6": "sunday" // Adjust if needed for your mapping
            };

            // Iterate through the selected days
            $.each(weekdayFilter, function(index, dayIndex) {
                var dayOfWeek = dayOfWeekMap[dayIndex];
                if (dayOfWeek) { // Make sure the day exists in the map
                    $("#" + dayOfWeek).prop("checked", true);
                }
            });
        }
        else {
            $('#runman_enable_weekdays').prop('checked', false);
            $(".weekday-checkboxes").hide();
        }

    }

});
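
// Hedged sketch for the TODO above (assumption): if the backend ever returns
// weekdays_filter as a string like "1,2,3" instead of an array, it could be
// normalized before the $.each loop. The function name is illustrative only.
function normalize_weekdays_filter_sketch(value) {
    if (typeof value === 'string') {
        return value.split(',').map(Number);   // "1,2,3" -> [1, 2, 3]
    }
    return value;                              // already an array (or falsy)
}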

// History button
$('#button_history_sched').click(function () {
    var row = runmanagerRecords.row('.selected').data();
    if (row == undefined) {
        return;
    }
    window.$('#historyModalRunmanager').modal('show');
    //set fields as readonly
    //$("#runmanid").prop('readonly', true);
    //$("#runmanmoddus").prop('readonly', true);
    //console.log("original row before edit", row)
    refresh_runmanager_and_callback(row, show_history_modal)

    function show_history_modal(row) {
        //console.log("refreshed row before edit", row);
        $('#historyModalRunmanagerForm')[0].reset();
        // Set the shared attributes
        $('#RunmanId').val(row.id);
        var date = new Date(row.last_processed);
        var formatted = date.toLocaleString('cs-CZ', {
            timeZone: 'America/New_York',
        })
        $('#Runmanlast_processed').val(formatted);
        $('#Runmanhistory').val(row.history);
    }
});
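
// Hedged sketch (assumption): refresh_runmanager_and_callback() is defined
// elsewhere; it plausibly re-fetches the record so the modal works with fresh
// data, then hands it to the callback. Endpoint reuse and the find-by-id step
// are assumptions; the _sketch suffix marks this as illustrative.
function refresh_runmanager_and_callback_sketch(row, callback) {
    $.ajax({
        url: "/run_manager_records/",  // same endpoint the table loads from
        method: "GET",
        beforeSend: function (xhr) { xhr.setRequestHeader('X-API-Key', API_KEY); },
        success: function (records) {
            // pick the refreshed copy of the selected record
            var fresh = records.find(function (r) { return r.id === row.id; });
            callback(fresh || row);
        }
    });
}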

});
322
v2realbot/static/js/tables/runmanager/init.js
Normal file
@ -0,0 +1,322 @@
var runmanagerRecords = null

// Equivalent of document ready
function initialize_runmanagerRecords() {

    // Runmanager table
    runmanagerRecords =
        $('#runmanagerTable').DataTable({
            ajax: {
                url: '/run_manager_records/',
                dataSrc: '',
                method: "GET",
                contentType: "application/json",
                // dataType: "json",
                beforeSend: function (xhr) {
                    xhr.setRequestHeader('X-API-Key', API_KEY);
                },
                data: function (d) {
                    return JSON.stringify(d);
                },
                error: function (xhr, status, error) {
                    //var err = eval("(" + xhr.responseText + ")");
                    //window.alert(JSON.stringify(xhr));
                    console.log(JSON.stringify(xhr));
                }
            },
            columns: [
                { data: 'id' },
                { data: 'moddus' },
                { data: 'strat_id' },
                { data: 'symbol' },
                { data: 'account' },
                { data: 'mode' },
                { data: 'note' },
                { data: 'ilog_save' },
                { data: 'bt_from' },
                { data: 'bt_to' },
                { data: 'weekdays_filter', visible: true },
                { data: 'batch_id', visible: true },
                { data: 'start_time', visible: true },
                { data: 'stop_time', visible: true },
                { data: 'status' },
                { data: 'last_processed', visible: true },
                { data: 'history', visible: false },
                { data: 'valid_from', visible: true },
                { data: 'valid_to', visible: true },
                { data: 'testlist_id', visible: true },
                { data: 'strat_running', visible: true },
                { data: 'runner_id', visible: true },
                { data: 'market', visible: true },
            ],
            paging: true,
            processing: true,
            serverSide: false,
            columnDefs: [
                { //note (expandable long text)
                    targets: [6],
                    render: function (data, type, row, meta) {
                        if (!data) return data;
                        var stateClass = 'truncated-text';
                        var uniqueId = 'note-' + row.id;

                        if (localStorage.getItem(uniqueId) === 'expanded') {
                            stateClass = 'expanded-text';
                        }

                        if (type === 'display') {
                            return '<div class="' + stateClass + '" id="' + uniqueId + '">' + data + '</div>';
                        }
                        return data;
                    },
                },
                { //ilog_save
                    targets: [7],
                    render: function (data, type, row) {
                        // If ilog_save is true, show a checkmark
                        if (data) {
                            return '<span class="material-symbols-outlined">done_outline</span>'
                        }
                        else {
                            return null
                        }
                    },
                },
                {
                    targets: [10], //weekdays
                    render: function (data, type, row) {
                        if (!data) return data;
                        // Map each number in the array to a weekday
                        var weekdays = ["monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"];
                        return data.map(function (dayNumber) {
                            return weekdays[dayNumber];
                        }).join(', ');
                    },
                },
                {
                    targets: [0, 21], //internal id, runner_id
                    render: function (data, type, row) {
                        if (!data) return data;
                        if (type === 'display') {
                            return '<div class="tdnowrap" data-bs-toggle="tooltip" data-bs-placement="top" title="'+data+'">'+data+'</div>';
                        }
                        return data;
                    },
                },
                {
                    targets: [2], //strat_id
                    render: function (data, type, row) {
                        if (type === 'display') {
                            var color = getColorForId(data);
                            return '<div class="tdnowrap" data-bs-toggle="tooltip" data-bs-placement="top" title="'+data+'"><span class="color-tag" style="background-color:' + color + ';"></span>'+data+'</div>';
                        }
                        return data;
                    },
                },
                {
                    targets: [3,12,13], //symbol, start_time, stop_time
                    render: function (data, type, row) {
                        if (type === 'display') {
                            var color = getColorForId(row.strat_id);
                            return '<span style="color:' + color + ';">'+data+'</span>';
                        }
                        return data;
                    },
                },
                {
                    targets: [16], //history
                    render: function (data, type, row) {
                        if (type === 'display') {
                            if (!data) data = "";
                            return '<div data-bs-toggle="tooltip" data-bs-placement="top" title="'+data+'">'+data+'</div>';
                        }
                        return data;
                    },
                },
                {
                    targets: [14], //status
                    render: function (data, type, row) {
                        if (type === 'display') {
                            var color = data == "active" ? "#3f953f" : "#f84c4c";
                            return '<span style="color:' + color + ';">'+data+'</span>';
                        }
                        return data;
                    },
                },
                {
                    targets: [20], //strat_running
                    render: function (data, type, row) {
                        if (type === 'display') {
                            if (!data) data = "";
                            console.log("running", data)
                            //var color = data == "active" ? "#3f953f" : "#f84c4c";
                            data = data ? "running" : ""
                            return '<div title="' + row.runner_id + '" style="color:#3f953f;">'+data+'</div>';
                        }
                        return data;
                    },
                },
                // {
                //     targets: [0,17],
                //     render: function ( data, type, row ) {
                //         if (!data) return data
                //         return '<div class="tdnowrap" title="'+data+'">'+data+'</i>'
                //     },
                // },
                {
                    targets: [15, 17, 18, 8, 9], //last_processed, valid_from, valid_to, bt_from, bt_to
                    render: function (data, type, row) {
                        if (!data) return data
                        if (type == "sort") {
                            return new Date(data).getTime();
                        }
                        var date = new Date(data);
                        var tit = date.toLocaleString('cs-CZ', {
                            timeZone: 'America/New_York',
                        })
                        return '<div title="'+tit+'">'+ format_date(data,true,false)+'</div>'
                        // if (isToday(now)) {
                        //     //return local time only
                        //     return '<div title="'+tit+'">'+ 'today ' + format_date(data,true,true)+'</div>'
                        // }
                        // else
                        // {
                        //     //return local datetime
                        //     return '<div title="'+tit+'">'+ format_date(data,true,false)+'</div>'
                        // }
                    },
                },
                // {
                //     targets: [6],
                //     render: function ( data, type, row ) {
                //         now = new Date(data)
                //         if (type == "sort") {
                //             return new Date(data).getTime();
                //         }
                //         var date = new Date(data);
                //         tit = date.toLocaleString('cs-CZ', {
                //             timeZone: 'America/New_York',
                //         })
                //
                //         if (isToday(now)) {
                //             //return local time only
                //             return '<div title="'+tit+'" class="token level comment">'+ 'today ' + format_date(data,false,true)+'</div>'
                //         }
                //         else
                //         {
                //             //return local datetime
                //             return '<div title="'+tit+'" class="token level number">'+ format_date(data,false,false)+'</div>'
                //         }
                //     },
                // },
                // {
                //     targets: [9,10],
                //     render: function ( data, type, row ) {
                //         if (type == "sort") {
                //             return new Date(data).getTime();
                //         }
                //         //console.log(data)
                //         //market datetime
                //         return data ? format_date(data, true) : data
                //     },
                // },
                // {
                //     targets: [2],
                //     render: function ( data, type, row ) {
                //         return '<div class="tdname tdnowrap" title="'+data+'">'+data+'</div>'
                //     },
                // },
                // // {
                // //     targets: [4],
                // //     render: function ( data, type, row ) {
                // //         return '<div class="tdname tdnowrap" title="'+data+'">'+data+'</div>'
                // //     },
                // // },
                // {
                //     targets: [16],
                //     render: function ( data, type, row ) {
                //         //console.log("metrics", data)
                //         try {
                //             data = JSON.parse(data)
                //         }
                //         catch (error) {
                //             //console.log(error)
                //         }
                //         var res = JSON.stringify(data)
                //         var unquoted = res.replace(/"([^"]+)":/g, '$1:')

                //         //show only a short summary when available, otherwise everything; the title always gets everything
                //         //console.log(data)
                //         short = null
                //         if ((data) && (data.profit) && (data.profit.sum)) {
                //             short = data.profit.sum
                //         }
                //         else {
                //             short = unquoted
                //         }
                //         return '<div class="tdmetrics" title="'+unquoted+'">'+short+'</div>'
                //     },
                // },
                // {
                //     targets: [4],
                //     render: function ( data, type, row ) {
                //         return '<div class="tdnote" title="'+data+'">'+data+'</div>'
                //     },
                // },
                // {
                //     targets: [13,14,15],
                //     render: function ( data, type, row ) {
                //         return '<div class="tdsmall">'+data+'</div>'
                //     },
                // },
                // {
                //     targets: [11],
                //     render: function ( data, type, row ) {
                //         //if ilog_save true
                //         if (data) {
                //             return '<span class="material-symbols-outlined">done_outline</span>'
                //         }
                //         else {
                //             return null
                //         }
                //     },
                // },
                {
                    targets: [4], //account
                    render: function (data, type, row) {
                        // Shorten the account name for display
                        var res;
                        if (data == "ACCOUNT1") {
                            res = "ACC1"
                        }
                        else if (data == "ACCOUNT2") {
                            res = "ACC2"
                        }
                        else { res = data }
                        return res
                    },
                },
                {
                    targets: [5], //mode
                    render: function (data, type, row) {
                        // Shorten the mode name for display
                        var res;
                        if (data == "backtest") {
                            res = "bt"
                        }
                        else { res = data }
                        return res
                    },
                }
            ],
            order: [[1, 'asc']],
            select: {
                info: true,
                style: 'multi',
                //selector: 'tbody > tr:not(.group-header)'
                selector: 'tbody > tr:not(.group-header)'
            }
        });

}
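
// Hedged usage sketch (assumption): the initializer above is presumably
// invoked once when the page loads, along these lines:
$(document).ready(function () {
    initialize_runmanagerRecords();
});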
195
v2realbot/static/js/tables/runmanager/modals.js
Normal file
@ -0,0 +1,195 @@
//delete modal
$("#delModalRunmanager").on('submit', '#delFormRunmanager', function(event){
    event.preventDefault();
    $('#deleterunmanager').attr('disabled','disabled');

    // Get the id from #delidrunmanager
    var id = $('#delidrunmanager').val();
    delete_runmanager_row(id);
});
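
// Hedged sketch (assumption): delete_runmanager_row() is defined elsewhere;
// by analogy with the commented POST/PATCH snippets below, it plausibly issues
// a DELETE against the same REST endpoint and reloads the table on success.
// The _sketch suffix marks this as illustrative, not the real helper.
function delete_runmanager_row_sketch(id) {
    $.ajax({
        url: "/run_manager_records/" + id,
        method: "DELETE",
        beforeSend: function (xhr) { xhr.setRequestHeader('X-API-Key', API_KEY); },
        success: function () {
            window.$('#delModalRunmanager').modal('hide');
            $('#deleterunmanager').attr('disabled', false);
            runmanagerRecords.ajax.reload();
            disable_runmanager_buttons();
        }
    });
}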

//add api
// fetch(`/run_manager_records/`, {
//     method: 'POST',
//     headers: {
//         'Content-Type': 'application/json',
//         'X-API-Key': API_KEY
//     },
//     body: JSON.stringify(newRecord)
// })

// fetch(`/run_manager_records/${recordId}`, {
//     method: 'PATCH',
//     headers: {
//         'Content-Type': 'application/json',
//         'X-API-Key': API_KEY
//     },
//     body: JSON.stringify(updatedData)
// })

function getCheckedWeekdays() {
    const checkboxes = document.querySelectorAll('input[name="weekdays_filter[]"]:checked');
    const selectedDays = Array.from(checkboxes).map(checkbox => checkbox.value);
    return selectedDays;
}
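
// Usage example: getCheckedWeekdays() returns the value attribute of every
// checked box in the weekdays_filter[] group (the exact values depend on the
// markup, which is not shown here).
var checkedDays = getCheckedWeekdays();
console.log("checked weekdays", checkedDays);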

//submit form
$("#addeditModalRunmanager").on('submit', '#addeditFormRunmanager', function(event){
    //event.preventDefault();
    //code for add
    if ($('#runmanagersubmit').val() == "Add") {

        event.preventDefault();
        //set id as editable
        $('#runmanagersubmit').attr('disabled','disabled');
        //trow = runmanagerRecords.row('.selected').data();
        //note = $('#editnote').val()

        // Handle weekdays functionality
        var weekdays = [];
        if ($('#runman_enable_weekdays').is(':checked')) {
            $('#addeditFormRunmanager input[name="weekdays"]:checked').each(function() {
                var weekday = $(this).val();
                switch(weekday) {
                    case 'monday': weekdays.push(0); break;
                    case 'tuesday': weekdays.push(1); break;
                    case 'wednesday': weekdays.push(2); break;
                    case 'thursday': weekdays.push(3); break;
                    case 'friday': weekdays.push(4); break;
                    // Add cases for Saturday and Sunday if needed
                }
            });
        }
        console.log("weekdays array", weekdays)

        var formData = $(this).serializeJSON();
        console.log("formData", formData)

        delete formData["enable_weekdays"]
        delete formData["weekdays"]

        // If checked, apply the filter; otherwise leave it unset
        if (weekdays.length > 0) {
            formData.weekdays_filter = weekdays
        }
        console.log(formData)
        if ($('#runmanilog_save').prop('checked')) {
            formData.ilog_save = true;
        }
        else
        {
            formData.ilog_save = false;
        }

        //if (formData.batch_id == "") {delete formData["batch_id"];}

        // Walk all attributes and delete the empty ("") ones; the backend supplies the defaults
        for (let key in formData) {
            if (formData.hasOwnProperty(key) && formData[key] === "") {
                delete formData[key];
            }
        }

        var jsonString = JSON.stringify(formData);
        console.log("formData json string before sending", jsonString)
        $.ajax({
            url: "/run_manager_records/",
            beforeSend: function (xhr) {
                xhr.setRequestHeader('X-API-Key', API_KEY);
            },
            method: "POST",
            contentType: "application/json",
            // dataType: "json",
            data: jsonString,
            success: function(data){
                $('#addeditFormRunmanager')[0].reset();
                window.$('#addeditModalRunmanager').modal('hide');
                $('#runmanagersubmit').attr('disabled', false);
                runmanagerRecords.ajax.reload();
                disable_runmanager_buttons();
            },
            error: function(xhr, status, error) {
                //var err = eval("(" + xhr.responseText + ")"); // avoid eval: it throws on non-JSON responses
                window.alert(JSON.stringify(xhr));
                console.log(JSON.stringify(xhr));
                $('#runmanagersubmit').attr('disabled', false);
            }
        })
    }
    //code for edit
    else {
        event.preventDefault();
        $('#runmanagersubmit').attr('disabled','disabled');
        //trow = runmanagerRecords.row('.selected').data();
        //note = $('#editnote').val()

        // Handle weekdays functionality
        var weekdays = [];
        if ($('#runman_enable_weekdays').is(':checked')) {
            $('#addeditFormRunmanager input[name="weekdays"]:checked').each(function() {
                var weekday = $(this).val();
                switch(weekday) {
                    case 'monday': weekdays.push(0); break;
                    case 'tuesday': weekdays.push(1); break;
                    case 'wednesday': weekdays.push(2); break;
                    case 'thursday': weekdays.push(3); break;
                    case 'friday': weekdays.push(4); break;
                    // Add cases for Saturday and Sunday if needed
                }
            });
        }

        var formData = $(this).serializeJSON();
        delete formData["enable_weekdays"]
        delete formData["weekdays"]

        // If checked, apply the filter; otherwise leave it unset
        if (weekdays.length > 0) {
            formData.weekdays_filter = weekdays
        }
        console.log(formData)
        if ($('#runmanilog_save').prop('checked')) {
            formData.ilog_save = true;
        }
        else
        {
            formData.ilog_save = false;
        }

        // Walk the form attributes and delete the empty ("") ones; the backend supplies the defaults, i.e. the original value is erased
        for (let key in formData) {
            if (formData.hasOwnProperty(key) && formData[key] === "") {
                delete formData[key];
            }
        }

        var jsonString = JSON.stringify(formData);
        console.log("EDIT formData json string before sending", jsonString);
        $.ajax({
            url: "/run_manager_records/" + formData.id,
            beforeSend: function (xhr) {
                xhr.setRequestHeader('X-API-Key', API_KEY);
            },
            method: "PATCH",
            contentType: "application/json",
            // dataType: "json",
            data: jsonString,
            success: function(data){
                console.log("EDIT success data", data);
                $('#addeditFormRunmanager')[0].reset();
                window.$('#addeditModalRunmanager').modal('hide');
                $('#runmanagersubmit').attr('disabled', false);
                runmanagerRecords.ajax.reload();
                disable_runmanager_buttons();
            },
            error: function(xhr, status, error) {
                //var err = eval("(" + xhr.responseText + ")"); // avoid eval: it throws on non-JSON responses
                window.alert(JSON.stringify(xhr));
                console.log(JSON.stringify(xhr));
                $('#runmanagersubmit').attr('disabled', false);
            }
        });
    }

});
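
// Hedged note (assumption): serializeJSON comes from the jquery.serializeJSON
// plugin and turns named form fields into a plain object, e.g.:
var exampleFormData = $('#addeditFormRunmanager').serializeJSON();
console.log(exampleFormData); // e.g. { id: "...", note: "...", ... } depending on the form fields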

@ -371,9 +371,10 @@ function initialize_chart() {
        }
    }

    chart = LightweightCharts.createChart(document.getElementById('chart'), chartOptions);
    chart.applyOptions({ timeScale: { visible: true, timeVisible: true, secondsVisible: true, minBarSpacing: 0.003 }, crosshair: {
        mode: LightweightCharts.CrosshairMode.Normal, labelVisible: true
    }})
    console.log("chart initialized")
}

//maybe add attributes: last value visible
@ -990,12 +991,26 @@ JSON.safeStringify = (obj, indent = 2) => {
    return retVal;
};

function isToday(someDate) {
    // Convert the input date to Eastern Time
    var dateInEastern = new Date(someDate.toLocaleString('en-US', { timeZone: 'America/New_York' }));
    //console.log("input ", someDate)
    //console.log("eastern ", dateInEastern)

    // Get today's date in Eastern Time
    var todayInEastern = new Date(new Date().toLocaleString('en-US', { timeZone: 'America/New_York' }));

    return dateInEastern.getDate() === todayInEastern.getDate() &&
        dateInEastern.getMonth() === todayInEastern.getMonth() &&
        dateInEastern.getFullYear() === todayInEastern.getFullYear();
}
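
// Usage example: isToday() compares calendar days in the America/New_York
// timezone, so a late-evening UTC timestamp may still count as "today"
// in Eastern Time.
console.log(isToday(new Date()));   // true
console.log(isToday(new Date(0)));  // false (1970-01-01)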

// function isToday(someDate) {
//     const today = new Date()
//     return someDate.getDate() == today.getDate() &&
//         someDate.getMonth() == today.getMonth() &&
//         someDate.getFullYear() == today.getFullYear()
// }

//https://www.w3schools.com/jsref/jsref_tolocalestring.asp
function format_date(datum, markettime = false, timeonly = false) {
6
v2realbot/static/js/vbt/.html
Normal file
File diff suppressed because one or more lines are too long
157
v2realbot/static/js/vbt/api/_opt_deps/index.html
Normal file
File diff suppressed because one or more lines are too long
1280
v2realbot/static/js/vbt/api/_settings/index.html
Normal file
File diff suppressed because one or more lines are too long
82
v2realbot/static/js/vbt/api/accessors/index.html
Normal file
File diff suppressed because one or more lines are too long
607
v2realbot/static/js/vbt/api/base/accessors/index.html
Normal file
File diff suppressed because one or more lines are too long
68
v2realbot/static/js/vbt/api/base/chunking/index.html
Normal file
File diff suppressed because one or more lines are too long
98
v2realbot/static/js/vbt/api/base/combining/index.html
Normal file
File diff suppressed because one or more lines are too long
13
v2realbot/static/js/vbt/api/base/decorators/index.html
Normal file
File diff suppressed because one or more lines are too long
68
v2realbot/static/js/vbt/api/base/flex_indexing/index.html
Normal file
File diff suppressed because one or more lines are too long
103
v2realbot/static/js/vbt/api/base/grouping/base/index.html
Normal file
File diff suppressed because one or more lines are too long
6
v2realbot/static/js/vbt/api/base/grouping/index.html
Normal file
File diff suppressed because one or more lines are too long
33
v2realbot/static/js/vbt/api/base/grouping/nb/index.html
Normal file
File diff suppressed because one or more lines are too long
6
v2realbot/static/js/vbt/api/base/index.html
Normal file
File diff suppressed because one or more lines are too long
115
v2realbot/static/js/vbt/api/base/indexes/index.html
Normal file
File diff suppressed because one or more lines are too long
569
v2realbot/static/js/vbt/api/base/indexing/index.html
Normal file
File diff suppressed because one or more lines are too long
75
v2realbot/static/js/vbt/api/base/merging/index.html
Normal file
File diff suppressed because one or more lines are too long
105
v2realbot/static/js/vbt/api/base/preparing/index.html
Normal file
File diff suppressed because one or more lines are too long
81
v2realbot/static/js/vbt/api/base/resampling/base/index.html
Normal file
File diff suppressed because one or more lines are too long
6
v2realbot/static/js/vbt/api/base/resampling/index.html
Normal file
File diff suppressed because one or more lines are too long
51
v2realbot/static/js/vbt/api/base/resampling/nb/index.html
Normal file
File diff suppressed because one or more lines are too long
538
v2realbot/static/js/vbt/api/base/reshaping/index.html
Normal file
File diff suppressed because one or more lines are too long
530
v2realbot/static/js/vbt/api/base/wrapping/index.html
Normal file
File diff suppressed because one or more lines are too long
831
v2realbot/static/js/vbt/api/data/base/index.html
Normal file
File diff suppressed because one or more lines are too long
70
v2realbot/static/js/vbt/api/data/custom/alpaca/index.html
Normal file
File diff suppressed because one or more lines are too long
77
v2realbot/static/js/vbt/api/data/custom/av/index.html
Normal file
File diff suppressed because one or more lines are too long
71
v2realbot/static/js/vbt/api/data/custom/bento/index.html
Normal file
File diff suppressed because one or more lines are too long
73
v2realbot/static/js/vbt/api/data/custom/binance/index.html
Normal file
File diff suppressed because one or more lines are too long
102
v2realbot/static/js/vbt/api/data/custom/ccxt/index.html
Normal file
File diff suppressed because one or more lines are too long
61
v2realbot/static/js/vbt/api/data/custom/csv/index.html
Normal file
File diff suppressed because one or more lines are too long
51
v2realbot/static/js/vbt/api/data/custom/custom/index.html
Normal file
File diff suppressed because one or more lines are too long
22
v2realbot/static/js/vbt/api/data/custom/db/index.html
Normal file
File diff suppressed because one or more lines are too long
156
v2realbot/static/js/vbt/api/data/custom/duckdb/index.html
Normal file
File diff suppressed because one or more lines are too long
51
v2realbot/static/js/vbt/api/data/custom/feather/index.html
Normal file
File diff suppressed because one or more lines are too long
58
v2realbot/static/js/vbt/api/data/custom/file/index.html
Normal file
File diff suppressed because one or more lines are too long
34
v2realbot/static/js/vbt/api/data/custom/gbm/index.html
Normal file
File diff suppressed because one or more lines are too long
35
v2realbot/static/js/vbt/api/data/custom/gbm_ohlc/index.html
Normal file
File diff suppressed because one or more lines are too long
76
v2realbot/static/js/vbt/api/data/custom/hdf/index.html
Normal file
File diff suppressed because one or more lines are too long
6
v2realbot/static/js/vbt/api/data/custom/index.html
Normal file
File diff suppressed because one or more lines are too long
22
v2realbot/static/js/vbt/api/data/custom/local/index.html
Normal file
File diff suppressed because one or more lines are too long
Some files were not shown because too many files have changed in this diff