Compare commits

...

20 Commits

SHA1        push/tag   Date                        Message
(push/tag = status of the ci/woodpecker/push and ci/woodpecker/tag pipelines; "-" = no tag pipeline ran)
2d48e87893  ok/ok      2025-03-13 10:50:11 +01:00  ntp graphs
6c1a62e09d  ok/-       2025-03-12 21:13:24 +01:00  nicer graph
a5d3b13629  ok/-       2025-03-12 20:49:44 +01:00  changes
83f71b3f81  ok/ok      2025-03-12 16:22:07 +01:00  fix, 3
730168ab61  ok/ok      2025-03-12 16:18:28 +01:00  fix, 2
8bef6d676c  ok/ok      2025-03-12 16:14:36 +01:00  fix
813265f8ee  ok/ok      2025-03-12 16:10:28 +01:00  forgotten requirement, 2
b47070cfc2  fail/fail  2025-03-12 16:08:57 +01:00  forgotten requirement
92ef3e6a85  ok/ok      2025-03-12 16:04:34 +01:00  more png
a63776fb3f  ok/ok      2025-03-12 15:43:33 +01:00  deploy names changed
e24a29e94f  ok/ok      2025-03-12 15:02:41 +01:00  fix
b3c2c7794a  ok/ok      2025-03-12 14:48:40 +01:00  pillow
7ff1b70098  ok/ok      2025-03-12 13:23:24 +01:00  routes
aa4c307048  ok/ok      2025-03-12 13:07:05 +01:00  fix
19672e6106  ok/ok      2025-03-12 12:57:45 +01:00  fix deployment
97e9d3e4e5  ok/ok      2025-03-12 12:01:07 +01:00  load route files correctly
0914a91fa0  ok/-       2025-03-12 11:57:37 +01:00  debug start script and separate routes into separate files
a972916704  ok/ok      2025-02-19 13:11:55 +01:00  switch to python 3.12
1774bb11aa  fail/ok    2025-02-13 11:37:13 +01:00  change public name
d19ac55dea  ok/ok      2025-02-13 11:09:18 +01:00  time reqs, 2
12 changed files with 384 additions and 272 deletions

Dockerfile

@@ -1,4 +1,4 @@
-FROM python:3.11-alpine3.21
+FROM python:3.12-alpine3.21
 ENV REDIS_URL=""
 ENV SECRET_KEY=""

debug-build-run.sh Executable file

@@ -0,0 +1,25 @@
#!/bin/bash
set -x
IMAGE_NAME=numberimage
docker build --progress=plain -t $IMAGE_NAME .
. load-debug-env
docker run \
-it \
--rm \
-e "REDIS_URL=$REDIS_URL" \
-e "SECRET_KEY=$SECRET_KEY" \
-e "OIDC_CLIENT_SECRETS=$OIDC_CLIENT_SECRETS" \
-e "PGHOST=$PGHOST" \
-e "PGDATABASE=$PGDATABASE" \
-e "PGSSLMODE=$PGSSLMODE" \
-e "PGUSER=$PGUSER" \
-e "PGPASSWORD=$PGPASSWORD" \
-p 8080:8080 \
$IMAGE_NAME


@@ -1,27 +1,27 @@
 apiVersion: apps/v1
 kind: Deployment
 metadata:
-  name: pv-stats
+  name: numbers
   labels:
-    app: pv-stats
+    app: numbers
   annotations:
-    secret.reloader.stakater.com/reload: pv-stats
+    secret.reloader.stakater.com/reload: numbers
 spec:
   replicas: 1
   selector:
     matchLabels:
-      app: pv-stats
+      app: numbers
   template:
     metadata:
       labels:
-        app: pv-stats
+        app: numbers
     spec:
       containers:
-      - name: pv-stats
+      - name: numbers
         image: %IMAGE%
         envFrom:
         - secretRef:
-            name: pv-stats
+            name: numbers
         ports:
         - containerPort: 8080
           protocol: TCP
@@ -29,11 +29,11 @@ spec:
 apiVersion: v1
 kind: Service
 metadata:
-  name: pv-stats
+  name: numbers
 spec:
   type: ClusterIP
   selector:
-    app: pv-stats
+    app: numbers
   ports:
   - name: http
     targetPort: 8080
@@ -42,23 +42,23 @@ spec:
 apiVersion: networking.k8s.io/v1
 kind: Ingress
 metadata:
-  name: pv-stats
+  name: numbers
   annotations:
     cert-manager.io/cluster-issuer: letsencrypt-production-http
 spec:
   tls:
   - hosts:
-    - pv-stats.hottis.de
-    secretName: pv-stats-cert
+    - numbers.hottis.de
+    secretName: numbers-cert
   rules:
-  - host: pv-stats.hottis.de
+  - host: numbers.hottis.de
     http:
       paths:
       - path: /
         pathType: Prefix
         backend:
           service:
-            name: pv-stats
+            name: numbers
             port:
               number: 80


@@ -9,7 +9,7 @@ if [ "$GPG_PASSPHRASE" == "" ]; then
   exit 1
 fi
-IMAGE_NAME=gitea.hottis.de/wn/pv-stats
+IMAGE_NAME=gitea.hottis.de/wn/numbers
 NAMESPACE=homea
 DEPLOYMENT_DIR=$PWD/deployment
@@ -25,7 +25,7 @@ kubectl create namespace $NAMESPACE \
 # rm $SECRETS_FILE
 eval "`cat secrets.asc | /usr/local/bin/decrypt-secrets.sh`"
-kubectl create secret generic pv-stats \
+kubectl create secret generic numbers \
   --dry-run=client \
   -o yaml \
   --save-config \

load-debug-env Normal file

@@ -0,0 +1,15 @@
SECRETS=`mktemp`
gpg --decrypt --passphrase $GPG_PASSPHRASE --yes --batch --output $SECRETS ./deployment/secrets.asc
. $SECRETS
rm $SECRETS
DB_NAMESPACE=database1
DB_DEPLOYNAME=database
REDIS_NAMESPACE=redis
REDIS_SERVICE_NAME=redis
PGHOST=`kubectl get services $DB_DEPLOYNAME -n $DB_NAMESPACE -o jsonpath="{.status.loadBalancer.ingress[0].ip}"`
REDISHOST=`kubectl get services $REDIS_SERVICE_NAME -n $REDIS_NAMESPACE -o jsonpath="{.status.loadBalancer.ingress[0].ip}"`
REDIS_URL=redis://$REDISHOST:6379/4

src/app.py Normal file

@@ -0,0 +1,32 @@
from flask import Flask, session, g, render_template_string
from flask_session import Session
from flask_oidc import OpenIDConnect
from werkzeug.middleware.proxy_fix import ProxyFix
from loguru import logger
import redis
import json
import os
try:
    redis_url = os.environ['REDIS_URL']
    oidc_client_secrets = os.environ['OIDC_CLIENT_SECRETS']
    secret_key = os.environ['SECRET_KEY']
except KeyError as e:
    logger.error(f"Required environment variable not set ({e})")
    raise e
app = Flask(__name__)
app.config.update({
    'SECRET_KEY': secret_key,
    'SESSION_TYPE': 'redis',
    'SESSION_REDIS': redis.from_url(redis_url),
    'OIDC_CLIENT_SECRETS': json.loads(oidc_client_secrets),
    'OIDC_SCOPES': 'openid email',
    'OIDC_USER_INFO_ENABLED': True,
    'SESSION_USE_SIGNER': True,
})
Session(app)
oidc = OpenIDConnect(app)
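Note: OIDC_CLIENT_SECRETS is read from the environment and run through json.loads, so the variable has to hold the client-secrets JSON document that flask-oidc expects. A minimal sketch of such a document, with placeholder values only (the real client id, secret, issuer, and redirect URI are not part of this diff):

    import json, os

    # Hypothetical values for illustration; flask-oidc expects the client
    # registration under a top-level "web" key.
    client_secrets = {
        "web": {
            "client_id": "numbers",                               # placeholder
            "client_secret": "change-me",                         # placeholder
            "issuer": "https://idp.example.org",                  # placeholder
            "redirect_uris": ["https://numbers.hottis.de/oidc_callback"],  # placeholder
        }
    }
    os.environ['OIDC_CLIENT_SECRETS'] = json.dumps(client_secrets)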

src/debug_routes.py Normal file

@@ -0,0 +1,16 @@
from loguru import logger
import json
from app import app
from app import oidc
@app.route('/token_debug', methods=['GET'])
@oidc.require_login
def token_debug():
    # fetch the access token from the identity provider
    access_token = oidc.get_access_token()
    return json.dumps({
        "access_token": access_token
    })
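Note: json.dumps returns a plain string, which Flask serves with a text/html content type. A sketch of the same handler using Flask's jsonify, which sets application/json automatically (shown as an alternative, not part of the diff):

    from flask import jsonify

    @app.route('/token_debug', methods=['GET'])
    @oidc.require_login
    def token_debug():
        # jsonify builds the Response and sets Content-Type: application/json
        return jsonify(access_token=oidc.get_access_token())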

src/ntp_routes.py Normal file

@@ -0,0 +1,129 @@
from flask import Flask, session, g, render_template_string, Response
from loguru import logger
import json
import matplotlib.pyplot as plt
import matplotlib.dates as mdates
from matplotlib.ticker import ScalarFormatter
import pandas as pd
import psycopg
import sqlalchemy
import time
import io
from app import app
from app import oidc
@app.route('/ntp/stratum-rootdisp.png')
def stratum_rootdisp_png():
    dbh = psycopg.connect()
    engine = sqlalchemy.create_engine("postgresql+psycopg://", creator=lambda: dbh)
    query = """
        select time_bucket('5 minutes', time) as bucket,
               attributes->>'Label' as device,
               avg(cast(values->'rootdisp'->>'value' as float)) as rootdisp,
               max(cast(values->'stratum'->>'value' as int)) as stratum
        from measurements
        where time >= date_trunc('day', now()) AND time < date_trunc('day', now()) + '1 day'::interval and
              application = 'SNMP' and attributes->>'Label' IN ('harrison', 'david')
        group by bucket, attributes->>'Label'
        order by bucket, attributes->>'Label'
    """
    df = pd.read_sql(query, con=engine)
    df['rootdisp'] = df['rootdisp'] / 1e6
    # Extract date for title
    plot_date = df['bucket'].dt.date.iloc[0] if not df.empty else "Unknown Date"
    # Create figure with two side-by-side subplots
    fig, axes = plt.subplots(1, 2, figsize=(15, 5), sharex=True)
    for i, device in enumerate(['harrison', 'david']):
        ax1 = axes[i]
        ax2 = ax1.twinx()
        device_df = df[df['device'] == device]
        ax1.plot(device_df['bucket'], device_df['rootdisp'], 'r-', label='Root Dispersion')
        ax1.set_xlabel('Time')
        ax1.set_ylabel('Root Dispersion (ms)', color='r')
        ax1.tick_params(axis='y', labelcolor='r')
        ax2.plot(device_df['bucket'], device_df['stratum'], 'b-', label='Stratum')
        ax2.set_ylabel('Stratum', color='b')
        ax2.tick_params(axis='y', labelcolor='b')
        ax2.set_yticks(range(int(device_df['stratum'].min()), int(device_df['stratum'].max()) + 1))
        ax1.set_title(f'{device.capitalize()}')
        ax1.xaxis.set_major_formatter(mdates.DateFormatter('%H:%M'))
    fig.autofmt_xdate(rotation=45)
    fig.suptitle(f'Stratum and Root Dispersion - {plot_date}')
    fig.tight_layout()
    img_io = io.BytesIO()
    plt.savefig(img_io, format='png')
    img_io.seek(0)
    plt.close(fig)
    return Response(img_io, mimetype='image/png')
@app.route('/ntp/packets-load.png')
def packets_load_png():
    dbh = psycopg.connect()
    engine = sqlalchemy.create_engine("postgresql+psycopg://", creator=lambda: dbh)
    query = """
        select time_bucket('5 minutes', time) as bucket,
               attributes->>'Label' as device,
               avg(cast(values->'load1'->>'value' as float)) as load,
               avg(cast(values->'processed-pkts'->>'value' as int)) as packets
        from measurements
        where time >= date_trunc('day', now()) AND time < date_trunc('day', now()) + '1 day'::interval and
              application = 'SNMP' and attributes->>'Label' IN ('harrison', 'david')
        group by bucket, attributes->>'Label'
        order by bucket, attributes->>'Label'
    """
    df = pd.read_sql(query, con=engine)
    # Extract date for title
    plot_date = df['bucket'].dt.date.iloc[0] if not df.empty else "Unknown Date"
    # Create figure with two side-by-side subplots
    fig, axes = plt.subplots(1, 2, figsize=(15, 5), sharex=True)
    for i, device in enumerate(['harrison', 'david']):
        ax1 = axes[i]
        ax2 = ax1.twinx()
        device_df = df[df['device'] == device]
        ax1.plot(device_df['bucket'], device_df['load'], 'r-', label='CPU Load')
        ax1.set_xlabel('Time')
        ax1.set_ylabel('Load', color='r')
        ax1.tick_params(axis='y', labelcolor='r')
        ax2.plot(device_df['bucket'], device_df['packets'], 'b-', label='Processed Packets')
        ax2.set_ylabel('Packets', color='b')
        ax2.tick_params(axis='y', labelcolor='b')
        ax1.set_title(f'{device.capitalize()}')
        ax1.xaxis.set_major_formatter(mdates.DateFormatter('%H:%M'))
    fig.autofmt_xdate(rotation=45)
    fig.suptitle(f'CPU Load and Processed Packets - {plot_date}')
    fig.tight_layout()
    img_io = io.BytesIO()
    plt.savefig(img_io, format='png')
    img_io.seek(0)
    plt.close(fig)
    return Response(img_io, mimetype='image/png')
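Note: both handlers open a psycopg connection but never close it, while pv_routes.py guards its connection with a finally block. A minimal sketch of a helper that would guarantee cleanup (run_query is hypothetical, not part of this commit):

    from contextlib import closing
    import pandas as pd
    import psycopg
    import sqlalchemy

    def run_query(query):
        # close the connection even if pandas raises
        with closing(psycopg.connect()) as dbh:
            engine = sqlalchemy.create_engine("postgresql+psycopg://", creator=lambda: dbh)
            return pd.read_sql(query, con=engine)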

src/pv_routes.py Normal file

@@ -0,0 +1,117 @@
from flask import Flask, session, g, render_template_string
from loguru import logger
import json
import plotly.express as px
import plotly.graph_objects as po
import pandas as pd
import psycopg
import sqlalchemy
import time
from app import app
from app import oidc
@app.route('/pvstats')
@oidc.require_login
def pvstats():
    try:
        stepX_time = time.time()
        dbh = psycopg.connect()
        engine = sqlalchemy.create_engine("postgresql+psycopg://", creator=lambda: dbh)
        step0_time = time.time()
        df = pd.read_sql("SELECT month, cast(year AS varchar), current_energy AS value FROM pv_energy_by_month", con=engine)
        step1_time = time.time()
        duration1 = step1_time - step0_time
        logger.info(f"{duration1=}")
        fig_1 = px.bar(df, x='month', y='value', color='year', barmode='group')
        step2_time = time.time()
        duration2 = step2_time - step1_time
        logger.info(f"{duration2=}")
        fig_1.update_layout(
            title=f"Jahreswerte Exportierte Energie {duration1:.3f}, {duration2:.3f}",
            xaxis_title="",
            yaxis_title="",
            legend_title="Jahr",
            xaxis=dict(
                tickmode="array",
                tickvals=list(range(1, 13)),  # months 1-12
                ticktext=["Jan", "Feb", "Mär", "Apr", "Mai", "Jun", "Jul", "Aug", "Sep", "Okt", "Nov", "Dez"]
            ),
            yaxis=dict(ticksuffix=" kWh")
        )
        graph_html_1 = fig_1.to_html(full_html=False, default_height='30%')
        step3_time = time.time()
        df = pd.read_sql("SELECT time_bucket('5 minutes', time) AS bucket, AVG(power) AS avg_power FROM pv_power_v WHERE time >= date_trunc('day', now()) - '1 day'::interval AND time < date_trunc('day', now()) GROUP BY bucket ORDER BY bucket", con=engine)
        step4_time = time.time()
        duration3 = step4_time - step3_time
        logger.info(f"{duration3=}")
        fig_2 = px.line(df, x='bucket', y='avg_power')
        step5_time = time.time()
        duration4 = step5_time - step4_time
        logger.info(f"{duration4=}")
        fig_2.update_layout(
            xaxis_title="",
            yaxis_title="",
            title=f"Export gestern {duration3:.3f}, {duration4:.3f}",
            yaxis=dict(ticksuffix=" W")
        )
        graph_html_2 = fig_2.to_html(full_html=False, default_height='30%')
        step6_time = time.time()
        df = pd.read_sql("SELECT time_bucket('5 minutes', time) AS bucket, AVG(power) AS avg_power FROM pv_power_v WHERE time >= date_trunc('day', now()) AND time < date_trunc('day', now()) + '1 day'::interval GROUP BY bucket ORDER BY bucket", con=engine)
        step7_time = time.time()
        duration5 = step7_time - step6_time
        logger.info(f"{duration5=}")
        fig_3 = px.line(df, x='bucket', y='avg_power')
        step8_time = time.time()
        duration6 = step8_time - step7_time
        logger.info(f"{duration6=}")
        fig_3.update_layout(
            xaxis_title="",
            yaxis_title="",
            title=f"Export heute {duration5:.3f}, {duration6:.3f}",
            yaxis=dict(ticksuffix=" W")
        )
        graph_html_3 = fig_3.to_html(full_html=False, default_height='30%')
        stepZ_time = time.time()
        duration7 = stepZ_time - stepX_time
        logger.info(f"{duration7=}")
        return render_template_string(f"""
            <html>
              <head>
                <title>Jahreswerte PV-Energie</title>
              </head>
              <body>
                {graph_html_1}
                {graph_html_2}
                {graph_html_3}
                <div style="height:9vh; background-color:lightgrey; font-family: Courier, Consolas, monospace;">
                  <table style="border-collapse: collapse;">
                    <style>
                      td.smallsep {{ padding-right: 10px }}
                      td.largesep {{ padding-right: 30px }}
                    </style>
                    <tr>
                      <td class="smallsep">Query 1:</td><td class="largesep"> {duration1:.3f} s</td><td class="smallsep">Graph 1:</td><td> {duration2:.3f} s</td>
                    </tr><tr>
                      <td class="smallsep">Query 2:</td><td class="largesep"> {duration3:.3f} s</td><td class="smallsep">Graph 2:</td><td> {duration4:.3f} s</td>
                    </tr><tr>
                      <td class="smallsep">Query 3:</td><td class="largesep"> {duration5:.3f} s</td><td class="smallsep">Graph 3:</td><td> {duration6:.3f} s</td>
                    </tr><tr>
                      <td class="smallsep">Total:</td><td> {duration7:.3f} s</td><td></td><td></td>
                    </tr>
                  </table>
                </div>
              </body>
            </html>
        """)
    except Exception as e:
        raise Exception(f"Error when querying energy export values: {e}")
    finally:
        if dbh is not None:
            dbh.close()
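One caveat with the try/finally pattern above: if psycopg.connect() itself raises, dbh was never bound, and the finally block then fails with a NameError that masks the original error. A small sketch of the usual guard:

    dbh = None  # bind the name before entering the try block
    try:
        dbh = psycopg.connect()
        # ... queries and rendering as above ...
    finally:
        if dbh is not None:
            dbh.close()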


@@ -38,3 +38,6 @@ tzdata==2025.1
 urllib3==2.3.0
 Werkzeug==3.1.3
 zipp==3.21.0
+pillow==11.1.0
+matplotlib==3.10.1

src/routes.py Normal file

@@ -0,0 +1,24 @@
from flask import abort, Response
from PIL import Image, ImageDraw
import io
from app import app
from app import oidc
@app.route('/')
def index():
    abort(404)
@app.route('/generate_image')
def generate_image():
    img = Image.new('RGB', (200, 100), color=(255, 255, 255))
    draw = ImageDraw.Draw(img)
    draw.text((50, 40), "Hello, Flask!", fill=(0, 0, 0))  # black text
    img_io = io.BytesIO()
    img.save(img_io, 'PNG')
    img_io.seek(0)  # rewind the buffer
    return Response(img_io, mimetype='image/png')


@@ -1,265 +1,16 @@
-from flask import Flask, session, g, render_template_string
-from flask_session import Session
-from flask_oidc import OpenIDConnect
-from werkzeug.middleware.proxy_fix import ProxyFix
-from loguru import logger
-import redis
-import json
-import os
-import plotly.express as px
-import plotly.graph_objects as po
-import pandas as pd
-import psycopg
-import sqlalchemy
-import time
-try:
-    redis_url = os.environ['REDIS_URL']
-    oidc_client_secrets = os.environ['OIDC_CLIENT_SECRETS']
-    secret_key = os.environ['SECRET_KEY']
-except KeyError as e:
-    logger.error(f"Required environment variable not set ({e})")
-    raise e
-app = Flask(__name__)
-app.config.update({
-    'SECRET_KEY': secret_key,
-    'SESSION_TYPE': 'redis',
-    'SESSION_REDIS': redis.from_url(redis_url),
-    'OIDC_CLIENT_SECRETS': json.loads(oidc_client_secrets),
-    'OIDC_SCOPES': 'openid email',
-    'OIDC_USER_INFO_ENABLED': True,
-    'SESSION_USE_SIGNER': True,
-})
-Session(app)
-oidc = OpenIDConnect(app)
-@app.route('/token_debug', methods=['GET'])
-@oidc.require_login
-def token_debug():
-    # fetch the access token from the identity provider
-    access_token = oidc.get_access_token()
-    return json.dumps({
-        "access_token": access_token
-    })
-@app.route('/')
-@oidc.require_login
-def index():
-    try:
-        stepX_time = time.time()
-        dbh = psycopg.connect()
-        engine = sqlalchemy.create_engine("postgresql+psycopg://", creator=lambda: dbh)
-        step0_time = time.time()
-        df = pd.read_sql("SELECT month, cast(year AS varchar), current_energy AS value FROM pv_energy_by_month", con=engine)
-        step1_time = time.time()
-        duration1 = step1_time - step0_time
-        logger.info(f"{duration1=}")
-        fig_1 = px.bar(df, x='month', y='value', color='year', barmode='group')
-        step2_time = time.time()
-        duration2 = step2_time - step1_time
-        logger.info(f"{duration2=}")
-        fig_1.update_layout(
-            title=f"Jahreswerte Exportierte Energie {duration1:.3f}, {duration2:.3f}",
-            xaxis_title="",
-            yaxis_title="",
-            legend_title="Jahr",
-            xaxis=dict(
-                tickmode="array",
-                tickvals=list(range(1, 13)),  # months 1-12
-                ticktext=["Jan", "Feb", "Mär", "Apr", "Mai", "Jun", "Jul", "Aug", "Sep", "Okt", "Nov", "Dez"]
-            ),
-            yaxis=dict(ticksuffix=" kWh")
-        )
-        graph_html_1 = fig_1.to_html(full_html=False, default_height='30%')
-        step3_time = time.time()
-        df = pd.read_sql("SELECT time_bucket('5 minutes', time) AS bucket, AVG(power) AS avg_power FROM pv_power_v WHERE time >= date_trunc('day', now()) - '1 day'::interval AND time < date_trunc('day', now()) GROUP BY bucket ORDER BY bucket", con=engine)
-        step4_time = time.time()
-        duration3 = step4_time - step3_time
-        logger.info(f"{duration3=}")
-        fig_2 = px.line(df, x='bucket', y='avg_power')
-        step5_time = time.time()
-        duration4 = step5_time - step4_time
-        logger.info(f"{duration4=}")
-        fig_2.update_layout(
-            xaxis_title="",
-            yaxis_title="",
-            title=f"Export gestern {duration3:.3f}, {duration4:.3f}",
-            yaxis=dict(ticksuffix=" W")
-        )
-        graph_html_2 = fig_2.to_html(full_html=False, default_height='30%')
-        step6_time = time.time()
-        df = pd.read_sql("SELECT time_bucket('5 minutes', time) AS bucket, AVG(power) AS avg_power FROM pv_power_v WHERE time >= date_trunc('day', now()) AND time < date_trunc('day', now()) + '1 day'::interval GROUP BY bucket ORDER BY bucket", con=engine)
-        step7_time = time.time()
-        duration5 = step7_time - step6_time
-        logger.info(f"{duration5=}")
-        fig_3 = px.line(df, x='bucket', y='avg_power')
-        step8_time = time.time()
-        duration6 = step8_time - step7_time
-        logger.info(f"{duration6=}")
-        fig_3.update_layout(
-            xaxis_title="",
-            yaxis_title="",
-            title=f"Export heute {duration5:.3f}, {duration6:.3f}",
-            yaxis=dict(ticksuffix=" W")
-        )
-        graph_html_3 = fig_3.to_html(full_html=False, default_height='30%')
-        stepZ_time = time.time()
-        duration7 = stepZ_time - stepX_time
-        logger.info(f"{duration7=}")
-        return render_template_string(f"""
-            <html>
-              <head>
-                <title>Jahreswerte PV-Energie</title>
-              </head>
-              <body>
-                {graph_html_1}
-                {graph_html_2}
-                {graph_html_3}
-                <div style="height:9vh; background-color:lightgrey; font-family: Courier, Consolas, monospace;">
-                  <table style="border-collapse: collapse;">
-                    <style>
-                      td.smallsep {{ padding-right: 10px }}
-                      td.largesep {{ padding-right: 30px }}
-                    </style>
-                    <tr>
-                      <td class="smallsep">Query 1:</td><td class="largesep"> {duration1:.3f} s</td><td class="smallsep">Graph 1:</td><td> {duration2:.3f} s</td>
-                    </tr><tr>
-                      <td class="smallsep">Query 2:</td><td class="largesep"> {duration3:.3f} s</td><td class="smallsep">Graph 2:</td><td> {duration4:.3f} s</td>
-                    </tr><tr>
-                      <td class="smallsep">Query 3:</td><td class="largesep"> {duration5:.3f} s</td><td class="smallsep">Graph 3:</td><td> {duration6:.3f} s</td>
-                    </tr><tr>
-                      <td class="smallsep">Total:</td><td> {duration7:.3f} s</td><td></td><td></td>
-                    </tr>
-                  </table>
-                </div>
-              </body>
-            </html>
-        """)
-    except Exception as e:
-        raise Exception(f"Error when querying energy export values: {e}")
-    finally:
-        if dbh is not None:
-            dbh.close()
-@app.route('/ntpserver')
-def ntpserver():
-    try:
-        dbh = psycopg.connect()
-        engine = sqlalchemy.create_engine("postgresql+psycopg://", creator=lambda: dbh)
-        query = """
-            select time_bucket('5 minutes', time) as bucket,
-                   device,
-                   avg(cast(values->'rootdisp'->>'value' as float)) as rootdisp,
-                   max(cast(values->'stratum'->>'value' as int)) as stratum
-            from measurements
-            where time >= date_trunc('day', now()) AND time < date_trunc('day', now()) + '1 day'::interval and
-                  application = 'TSM' and attributes->>'Label' = 'david'
-            group by bucket, device
-            order by bucket, device
-        """
-        df = pd.read_sql(query, con=engine)
-        fig = po.Figure()
-        fig.add_trace(po.Scatter(x=df['bucket'], y=df['rootdisp'], mode='lines', name='Root Dispersion', yaxis='y1', line=dict(color='red')))
-        fig.add_trace(po.Scatter(x=df['bucket'], y=df['stratum'], mode='lines', name='Stratum', yaxis='y2', line=dict(color='blue')))
-        fig.update_layout(
-            title='NTP Server Numbers',
-            # left y-axis
-            yaxis=dict(
-                title='Root Dispersion',
-                ticksuffix=' ms'
-            ),
-            # right y-axis
-            yaxis2=dict(
-                title='Stratum',
-                overlaying='y',     # draw the second y-axis on top of the first
-                side='right',       # place it on the right-hand side
-                tickmode='linear',  # keep the ticks at fixed intervals
-                dtick=1,            # show integer ticks only
-            ),
-            legend=dict(x=0.05, y=1)  # legend position
-        )
-        graph_html_1 = fig.to_html(full_html=False, default_height='30%')
-        query = """
-            select time_bucket('5 minutes', time) as bucket,
-                   device,
-                   avg(cast(values->'time-reqs-pkts'->>'value' as float)) as packets
-            from measurements
-            where time >= date_trunc('day', now()) AND time < date_trunc('day', now()) + '1 day'::interval and
-                  application = 'SNMP' and attributes->>'Label' = 'david'
-            group by bucket, device
-            order by bucket, device
-        """
-        df = pd.read_sql(query, con=engine)
-        fig_2 = px.line(df, x='bucket', y='packets')
-        fig_2.update_layout(
-            xaxis_title="",
-            yaxis_title="",
-            yaxis_ticksuffix="p/s",
-            title=f"Time Requests"
-        )
-        graph_html_2 = fig_2.to_html(full_html=False, default_height='30%')
-        query = """
-            select time_bucket('5 minutes', time) as bucket,
-                   device,
-                   avg(cast(values->'load1'->>'value' as float)) as loadaverage1min
-            from measurements
-            where time >= date_trunc('day', now()) AND time < date_trunc('day', now()) + '1 day'::interval and
-                  application = 'SNMP' and attributes->>'Label' = 'david'
-            group by bucket, device
-            order by bucket, device
-        """
-        df = pd.read_sql(query, con=engine)
-        fig_3 = px.line(df, x='bucket', y='loadaverage1min')
-        fig_3.update_layout(
-            xaxis_title="",
-            yaxis_title="",
-            title=f"CPU Load"
-        )
-        graph_html_3 = fig_3.to_html(full_html=False, default_height='30%')
-        return render_template_string(f"""
-            <html>
-              <head>
-                <title>NTP Server Numbers</title>
-              </head>
-              <body>
-                {graph_html_1}
-                {graph_html_2}
-                {graph_html_3}
-              </body>
-            </html>
-        """)
-    except Exception as e:
-        raise Exception(f"Error when querying NTP server values: {e}")
-    finally:
-        if dbh is not None:
-            dbh.close()
+from app import app
+import routes
+import debug_routes
+import pv_routes
+import ntp_routes
 if __name__ == '__main__':
-    app.run(port=8080)
+    app.run(host='0.0.0.0', port=8080)
 else:
     exposed_app = ProxyFix(app, x_for=1, x_host=1)
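A hedged note on that last context line: with x_for=1 and x_host=1, ProxyFix trusts X-Forwarded-For and X-Forwarded-Host from one proxy hop but not X-Forwarded-Proto. Behind a TLS-terminating ingress like the one in this deployment, x_proto=1 is the usual addition so generated URLs use https (an assumption about the intended setup, not a change contained in this diff):

    from werkzeug.middleware.proxy_fix import ProxyFix

    # sketch: additionally trust X-Forwarded-Proto from the single ingress hop
    exposed_app = ProxyFix(app, x_for=1, x_host=1, x_proto=1)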