Compare commits

32 Commits

Author | SHA1 | Message | Date
Marcel Gansfusz | 6d5c390350 | fixed error in compose.yml | 2025-11-04 21:24:38 +01:00
Marcel Gansfusz | e13d06d4a7 | fixed regular deletions of files | 2025-11-04 19:16:59 +01:00
Marcel Gansfusz | 547411ba03 | made to kill fip entrys when 1d passes | 2025-11-04 19:04:41 +01:00
Marcel Gansfusz | cdd26e0bc3 | caught exception in trying to censor | 2025-11-04 17:54:24 +01:00
Marcel Gansfusz | f7c73a0c5a | fixed js | 2025-11-04 17:17:13 +01:00
Marcel Gansfusz | 8e74848397 | fixed js | 2025-11-04 17:13:18 +01:00
Marcel Gansfusz | 8704aee82e | fixed tesseract in Dockerfile | 2025-11-04 16:56:24 +01:00
Marcel Gansfusz | 594ac1fa00 | updated censoring status logic | 2025-11-04 16:55:11 +01:00
Marcel Gansfusz | 2ee90cd0d7 | added tessercat to Dockerfile | 2025-11-04 15:45:37 +01:00
Marcel Gansfusz | d42bab5b19 | changed the fetch in js to be relative (no explicite url; just a path); removed version from docker compose | 2025-11-04 14:55:04 +01:00
Marcel Gansfusz | c3a87ceee6 | changed stryle of greeting file | 2025-10-31 17:48:40 +01:00
Marcel Gansfusz | 6f2d373292 | updated greeting file to represent new censoring mechanism | 2025-10-31 16:18:43 +01:00
Marcel Gansfusz | a37206d6a4 | added logging statement | 2025-10-30 15:48:51 +01:00
Marcel Gansfusz | 6bd75bf93f | removed .nvim; added log statements | 2025-10-30 15:31:00 +01:00
Marcel Gansfusz | 5bc24a32d5 | removed __pycache__ | 2025-10-30 15:09:16 +01:00
Marcel Gansfusz | a9233926e5 | added logging statements | 2025-10-30 14:45:53 +01:00
Marcel Gansfusz | 90235d2788 | Made the database reconnect when connection is broken | 2025-10-30 13:03:02 +01:00
Marcel Gansfusz | da316a9351 | changed from string paths tp pathlib | 2025-10-29 12:14:32 +01:00
Marcel Gansfusz | e6727daf8e | i forgor | 2025-10-28 19:32:33 +01:00
Marcel Gansfusz | d6508c739d | in between state before converting to pathlib | 2025-10-28 19:32:01 +01:00
Marcel Gansfusz | 856c401c06 | moved DOCKERFILE to Dockerfile | 2025-10-27 18:17:43 +01:00
Marcel Gansfusz | 4da77c95d1 | finished writeing compatibility with docker; untested | 2025-10-24 21:19:36 +02:00
Marcel Gansfusz | 98742107b2 | changed structure for docker usage | 2025-10-24 21:02:42 +02:00
| b9eb5e8bd4 | Merge pull request 'improve_censoring_speed' (#1) from improve_censoring_speed into main; Reviewed-on: #1 | 2025-10-23 15:43:40 +02:00
Marcel Gansfusz | 5c6a8dfba2 | fixed bug in js, that blocked showing prof suggestions when nothing is entered in the field | 2025-10-23 15:40:45 +02:00
Marcel Gansfusz | c30d69d205 | added back option to run OCR | 2025-10-23 00:06:25 +02:00
Marcel Gansfusz | 56d3468889 | changed the censoring mode to built in censoring with pymupdf | 2025-10-22 23:26:33 +02:00
Marcel Gansfusz | 26ea274023 | made the loading animation prettier | 2025-10-22 21:11:40 +02:00
Marcel Gansfusz | 0c96d04326 | Added Loading circle | 2025-10-22 18:10:56 +02:00
Marcel Gansfusz | 352540a3b1 | Added afunctionality to readout wich page is currently being censored; It probably helps with patience; It is at the moment only implemented in the backend; A pretty frontend is still nessecerry | 2025-10-22 01:12:25 +02:00
Marcel Gansfusz | 7d828a7c3b | modified the init file to work on local directory; added logging to init.py | 2025-10-21 14:48:06 +02:00
Marcel Gansfusz | 93f2c59997 | prevented overwriteing of files and modified the filename of things that require a date | 2025-10-20 15:04:34 +02:00
28 changed files with 1358 additions and 305 deletions

5  .gitignore vendored

@@ -3,4 +3,9 @@ app/files/
app/pwfile.json
app/dest
app.log
init.log
app/__pycache__/
mariadb/*
unizeug
.mypy_cache
.nvim


@@ -1,2 +0,0 @@
# remote_path="/srv/http/"
# remote_path="dev@10.0.0.25:/var/www/html/"

31  Dockerfile Normal file

@@ -0,0 +1,31 @@
FROM python:3.13-rc-alpine
WORKDIR /usr/src/
COPY requirements.txt /usr/src/requirements.txt
COPY entrypoint.sh /usr/src/entrypoint.sh
RUN apk add --no-cache \
gcc \
g++ \
musl-dev \
python3-dev \
libffi-dev \
openssl-dev \
cargo \
make \
mariadb-connector-c-dev \
jpeg-dev \
zlib-dev \
freetype-dev \
lcms2-dev \
openjpeg-dev \
tiff-dev \
tk-dev \
tcl-dev \
libwebp-dev \
tesseract-ocr \
tesseract-ocr-data-deu
RUN python -m ensurepip --upgrade
RUN pip install setuptools wheel
RUN pip install -r requirements.txt
WORKDIR /python
CMD /bin/sh /usr/src/entrypoint.sh
# ENTRYPOINT ["/usr/src/entrypoint.sh"]
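The tesseract-ocr and tesseract-ocr-data-deu packages back the OCR censoring mode added in the application code further below. As a rough illustration only (not code from this repository; the input filename is a placeholder), the PyMuPDF OCR round-trip these packages enable looks roughly like this, using the same /usr/share/tessdata/ path the application passes to Tesseract:

import pymupdf

doc = pymupdf.open("example.pdf")   # placeholder input file
page = doc[0]
pix = page.get_pixmap(dpi=400)      # rasterize the page
# pdfocr_tobytes() invokes Tesseract; "deu" matches tesseract-ocr-data-deu above
pdf_bytes = pix.pdfocr_tobytes(language="deu", tessdata="/usr/share/tessdata/")
pymupdf.Document(stream=pdf_bytes).save("example_ocr.pdf")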

Binary files not shown. Six image previews are unchanged in size; one image grew from 89 KiB to 103 KiB. One file diff was suppressed because one or more lines are too long.

164  app/index.html Normal file

@@ -0,0 +1,164 @@
<!doctype html>
<html lang="de">
<head>
<title>Unizeug uploader</title>
<link rel="stylesheet" href="static/style.css" />
<script src="https://cdnjs.cloudflare.com/ajax/libs/pdf.js/2.4.456/pdf.min.js"></script>
<script src="static/app.js" defer></script>
<script src="static/autocomplete.js" defer></script>
<script src="static/dynhide.js" defer></script>
<script src="static/filedrop.js" defer></script>
<link
rel="icon"
type="image/png"
href="/favicon/favicon-96x96.png"
sizes="96x96"
/>
<link rel="icon" type="image/svg+xml" href="/favicon/favicon.svg" />
<link rel="shortcut icon" href="/favicon/favicon.ico" />
<link
rel="apple-touch-icon"
sizes="180x180"
href="/favicon/apple-touch-icon.png"
/>
<meta name="apple-mobile-web-app-title" content="Unizeug" />
<link rel="manifest" href="/favicon/site.webmanifest" />
</head>
<body>
<!-- The Modal -->
<div id="loading" class="modal">
<!-- Modal content -->
<div class="loading-content">
<!-- <span class="close">&times;</span> -->
<div class="loader"></div>
<p id="upload_status" class="upload_status_text">Uploading</p>
</div>
</div>
<!-- main -->
<div class="main">
<div class="left" id="controldiv">
<div id="fileupload">
<form id="uploadform" enctype="multipart/form-data">
<div class="filetop">
<!-- <label for="filepicker">Choose a file</label> -->
<input
type="file"
name="files"
id="filepicker"
multiple
placeholder="Drop File"
/>
</div>
<button type="submit" id="upload" method="POST" class="fileupload">
Upload
</button>
</form>
</div>
<div id="submitdiv">
<form id="submitform" ,onsubmit="submitFile(event)">
<label for="lva">Lehrveranstaltung:</label>
<div class="autocomplete">
<input
type="text"
id="lva"
name="lva"
placeholder="Lehrveranstaltung"
autocomplete="off"
/>
</div>
<br />
<!-- <br /> -->
<label for="prof">Vortragende*r:</label>
<div class="autocomplete">
<input
type="text"
id="prof"
name="prof"
placeholder="Vortragende*r"
autocomplete="off"
/>
</div>
<br />
<!-- <br /> -->
<label for="name">Name:</label>
<input
type="text"
id="name"
name="fname"
placeholder="Prüfung"
/><br />
<label for="sem">Semester:</label>
<input type="text" id="sem" name="sem" placeholder="2024W" /><br />
<input
type="radio"
id="pruefung"
name="stype"
value="0"
checked="checked"
/>
<label for="pruefung">Prüfung</label><br />
<input type="radio" id="klausur" name="stype" value="1" />
<label for="klausur">Klausur</label><br />
<input type="radio" id="uebung" name="stype" value="2" />
<label for="uebung">Übung</label><br />
<input type="radio" id="labor" name="stype" value="3" />
<label for="labor">Labor</label><br />
<input type="radio" id="unterlagen" name="stype" value="4" />
<label for="unterlagen">Unterlagen</label><br />
<input type="radio" id="zusammenfassungen" name="stype" value="5" />
<label for="zusammenfassungen">Zusammenfassung</label><br />
<input type="radio" id="multimedia" name="stype" value="6" />
<label for="multimedia">Multimedia</label><br />
<br />
<div id="subcatdiv">
<label for="subcat">Veranstaltung</label>
<div class="autocomplete">
<input
type="text"
id="subcat"
name="subcat"
placeholder="Klausur 1"
autocomplete="off"
/>
</div>
</div>
<div id="datediv">
<label for="date">Datum</label>
<input
type="date"
id="date"
name="ex_date"
placeholder="Drop File"
/><br />
</div>
<input
type="checkbox"
name="ocr"
id="sec_censor"
value="True"
/><label for="sec_censor">OCR</label><br /><br />
<button type="submit" id="send">Senden</button>
</form>
</div>
</div>
<div class="right" id="rightdiv">
<div class="buttons" id="buttonsdiv">
<button id="prev">Prev</button><button id="next">Next</button>
<div>
<span id="npage"></span>
<span>/</span>
<span id="npages"></span>
</div>
<button id="clr">Clear Page</button><button id="ca">Claer All</button>
</div>
<div id="cnvdiv">
<div class="stack" id="cnvcont">
<canvas id="cnv"></canvas>
<canvas id="drw_cnv"></canvas>
</div>
</div>
</div>
</div>
</body>
</html>


@@ -1,10 +1,17 @@
import paramiko
from os.path import isdir
from stat import S_ISDIR, S_ISREG
import re
import pathlib
import os
# from base64 import decodebytes
import json
import mariadb
import logging
from pathlib import Path
import schedule
import time
import pytz
CATEGORIES = [
"Prüfungen",
@@ -16,9 +23,24 @@ CATEGORIES = [
"Multimedia",
]
SUBCAT_CATEGORIES = ["Klausuren", "Übungen", "Labore"]
unizeug_path = "/mnt/save/daten/Unizeug/"
unizeug_path = os.environ.get("UNIZEUG_PATH", "./unizeug")
APP_ROOT_PATH = Path(os.environ.get("APP_ROOT_PATH", "./app"))
FILES_IN_PROGRESS = APP_ROOT_PATH / "files/"
log = logging.getLogger(__name__)
logging.basicConfig(
filename="init.log",
level=logging.INFO,
format="[%(asctime)s, %(filename)s:%(lineno)s -> %(funcName)10s() ]%(levelname)s: %(message)s",
)
debug = log.debug
info = log.info
error = log.error
db = mariadb.connect(
host="localhost", user="wildserver", password="DBPassword", database="Unizeug"
host=os.environ.get("DB_HOST", "db"),
user=os.environ.get("DB_USER", "user"),
password=os.environ.get("DB_PASSWORD", "DBPASSWORD"),
database=os.environ.get("DB_DATABASE", "unizeug"),
)
c = db.cursor()
try:
@@ -49,33 +71,54 @@ except mariadb.OperationalError:
c.execute(
"CREATE TABLE SubCats(id BIGINT(20) UNSIGNED NOT NULL AUTO_INCREMENT,LId BIGINT(20),PId BIGINT(20),cat TINYINT UNSIGNED,name VARCHAR(256), PRIMARY KEY(id))"
)
try:
c.execute(
"CREATE TABLE FIP(id UUID DEFAULT(UUID()), filename VARCHAR(256), filetype VARCHAR(8),initTimeStamp DATETIME, PRIMARY KEY(id))"
)
except mariadb.OperationalError:
pass
db.commit()
def remove_old_FIP_entrys():
cur = db.cursor(dictionary=True)
cur.execute(
"SELECT id,filename FROM FIP WHERE HOUR(TIMEDIFF(NOW(),initTimeStamp)) > 24 "
)
files = cur.fetchall()
info(f"Remove Files: {files}")
for file in files:
c.execute("DELETE FROM FIP WHERE id=?", (file["id"],))
os.remove(FILES_IN_PROGRESS / file["filename"])
db.commit()
def get_dirstruct():
with open("app/pwfile.json", "r") as f:
cred = json.load(f)
ssh = paramiko.SSHClient()
print(cred["sftpurl"])
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# with open("app/pwfile.json", "r") as f:
# cred = json.load(f)
# ssh = paramiko.SSHClient()
# print(cred["sftpurl"])
# ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# key=paramiko.RSAKey(data=decodebytes(bytes(cred["key"],"utf-8")))
# ssh.get_host_keys().add(cred["sftpurl"], 'ssh-rsa', key)
ssh.connect(cred["sftpurl"], username=cred["sftpuser"], password=cred["sftpPW"])
sftp = ssh.open_sftp()
folders = sftp.listdir_attr(unizeug_path)
for entry in folders:
# ssh.connect(cred["sftpurl"], username=cred["sftpuser"], password=cred["sftpPW"])
# sftp = ssh.open_sftp()
# folders = sftp.listdir_attr(unizeug_path)
folders = pathlib.Path(unizeug_path)
for entry in folders.iterdir():
if entry is None:
continue
if not S_ISDIR(entry.st_mode):
if not entry.is_dir():
continue
fname = str(entry.filename)
fname = str(entry.name)
regex = re.compile(r"Multimedia_only")
if regex.search(fname):
continue
# print(fname)
lvid = re.search(r"[a-zA-Z0-9]{3}\.[a-zA-Z0-9]{3}", fname)
print(lvid)
# print(lvid)
if lvid is None:
error(f"Didnt Find LVA ID in Directory {fname}")
continue
lvid = lvid.group()[:3] + lvid.group()[4:]
# name = fname[:-8]
@@ -89,41 +132,38 @@ def get_dirstruct():
cur.execute("SELECT id FROM LVAs WHERE lvid=?", (lvid,))
lid = cur.fetchone()[0]
db.commit()
for profsdir in sftp.listdir_attr(unizeug_path + fname + "/"):
if profsdir is None or not S_ISDIR(profsdir.st_mode):
for profsdir in entry.iterdir():
if profsdir is None:
continue
if not profsdir.is_dir():
continue
# print(profsdir.filename)
try:
lastname, firstname = re.split(r"[_\-\s]", str(profsdir.filename))
lastname, firstname = re.split(r"[_\-\s]", str(profsdir.name))
pid = link_prof(firstname, lastname, lid)
except ValueError:
print(f"{name} is broken")
error(f"Couldnt get Profs from {fname}")
continue
for cat in sftp.listdir_attr(
unizeug_path + fname + "/" + profsdir.filename + "/"
):
if cat is None or not S_ISDIR(cat.st_mode):
for cat in profsdir.iterdir():
if cat is None:
continue
if cat.filename not in SUBCAT_CATEGORIES:
if not cat.is_dir():
continue
idx = CATEGORIES.index(cat.filename)
for subcat in sftp.listdir_attr(
unizeug_path
+ fname
+ "/"
+ profsdir.filename
+ "/"
+ cat.filename
+ "/"
):
if subcat is None or not S_ISDIR(subcat.st_mode):
if cat.name not in SUBCAT_CATEGORIES:
continue
idx = CATEGORIES.index(cat.name)
for subcat in cat.iterdir():
if subcat is None:
continue
if not subcat.is_dir():
continue
cur = db.cursor()
cur.execute(
"INSERT INTO SubCats (LId,PId,cat,name) VALUES(?,?,?,?)",
(lid, pid, idx, subcat.filename),
(lid, pid, idx, subcat.name),
)
db.commit()
remove_old_FIP_entrys()
def link_prof(firstname, lastname, lid):
@@ -150,3 +190,8 @@ def link_prof(firstname, lastname, lid):
if __name__ == "__main__":
get_dirstruct()
info("Database updated")
schedule.every().day.at("04:00", "Europe/Vienna").do(get_dirstruct)
while True:
schedule.run_pending()
time.sleep(1)

152  app/init_ssh.py Normal file

@@ -0,0 +1,152 @@
import paramiko
from stat import S_ISDIR, S_ISREG
import re
# from base64 import decodebytes
import json
import mariadb
CATEGORIES = [
"Prüfungen",
"Klausuren",
"Übungen",
"Labore",
"Unterlagen",
"Zusammenfassungen",
"Multimedia",
]
SUBCAT_CATEGORIES = ["Klausuren", "Übungen", "Labore"]
unizeug_path = "/mnt/save/daten/Unizeug/"
db = mariadb.connect(
host="localhost", user="wildserver", password="DBPassword", database="Unizeug"
)
c = db.cursor()
try:
c.execute("DROP TABLE LVAs")
except mariadb.OperationalError:
pass
c.execute(
"CREATE TABLE LVAs(id BIGINT(20) unsigned NOT NULL AUTO_INCREMENT,lvid VARCHAR(6), lvname VARCHAR(256), lvpath VARCHAR(256),PRIMARY KEY(id))"
)
try:
c.execute("DROP TABLE Profs")
except mariadb.OperationalError:
pass
c.execute(
"CREATE TABLE Profs(id BIGINT(20) unsigned NOT NULL AUTO_INCREMENT,name VARCHAR(256),PRIMARY KEY(id))"
)
try:
c.execute("DROP TABLE LPLink")
except mariadb.OperationalError:
pass
c.execute(
"CREATE TABLE LPLink(id BIGINT(20) unsigned NOT NULL AUTO_INCREMENT,LId bigint(20),PId bigint(20),PRIMARY KEY(id))"
)
try:
c.execute("DROP TABLE SubCats")
except mariadb.OperationalError:
pass
c.execute(
"CREATE TABLE SubCats(id BIGINT(20) UNSIGNED NOT NULL AUTO_INCREMENT,LId BIGINT(20),PId BIGINT(20),cat TINYINT UNSIGNED,name VARCHAR(256), PRIMARY KEY(id))"
)
db.commit()
def get_dirstruct():
with open("app/pwfile.json", "r") as f:
cred = json.load(f)
ssh = paramiko.SSHClient()
print(cred["sftpurl"])
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# key=paramiko.RSAKey(data=decodebytes(bytes(cred["key"],"utf-8")))
# ssh.get_host_keys().add(cred["sftpurl"], 'ssh-rsa', key)
ssh.connect(cred["sftpurl"], username=cred["sftpuser"], password=cred["sftpPW"])
sftp = ssh.open_sftp()
folders = sftp.listdir_attr(unizeug_path)
for entry in folders:
if entry is None:
continue
if not S_ISDIR(entry.st_mode):
continue
fname = str(entry.filename)
regex = re.compile(r"Multimedia_only")
if regex.search(fname):
continue
# print(fname)
lvid = re.search(r"[a-zA-Z0-9]{3}\.[a-zA-Z0-9]{3}", fname)
print(lvid)
if lvid is None:
continue
lvid = lvid.group()[:3] + lvid.group()[4:]
# name = fname[:-8]
name = re.sub(r"[a-zA-Z0-9]{3}\.[a-zA-Z0-9]{3}", "", fname)
# print(name)
# print(lvid)
cur = db.cursor()
cur.execute(
"INSERT INTO LVAs (lvid, lvname, lvpath) VALUES(?,?,?)", (lvid, name, fname)
)
cur.execute("SELECT id FROM LVAs WHERE lvid=?", (lvid,))
lid = cur.fetchone()[0]
db.commit()
for profsdir in sftp.listdir_attr(unizeug_path + fname + "/"):
if profsdir is None or not S_ISDIR(profsdir.st_mode):
continue
# print(profsdir.filename)
try:
lastname, firstname = re.split(r"[_\-\s]", str(profsdir.filename))
pid = link_prof(firstname, lastname, lid)
except ValueError:
print(f"{name} is broken")
continue
for cat in sftp.listdir_attr(
unizeug_path + fname + "/" + profsdir.filename + "/"
):
if cat is None or not S_ISDIR(cat.st_mode):
continue
if cat.filename not in SUBCAT_CATEGORIES:
continue
idx = CATEGORIES.index(cat.filename)
for subcat in sftp.listdir_attr(
unizeug_path
+ fname
+ "/"
+ profsdir.filename
+ "/"
+ cat.filename
+ "/"
):
if subcat is None or not S_ISDIR(subcat.st_mode):
continue
cur = db.cursor()
cur.execute(
"INSERT INTO SubCats (LId,PId,cat,name) VALUES(?,?,?,?)",
(lid, pid, idx, subcat.filename),
)
db.commit()
def link_prof(firstname, lastname, lid):
cur = db.cursor()
cur.execute("SELECT id from Profs WHERE name=?", (lastname + " " + firstname,))
res = cur.fetchone()
if res is not None:
cur.execute("INSERT INTO LPLink (LId,PId) VALUES(?,?)", (lid, res[0]))
db.commit()
return res[0]
cur.execute("SELECT id from Profs WHERE name=?", (firstname + " " + lastname,))
res = cur.fetchone()
if res is not None:
cur.execute("INSERT INTO LPLink (LId,PId) VALUES(?,?)", (lid, res[0]))
db.commit()
return res[0]
cur.execute("INSERT INTO Profs (name) VALUES(?)", (lastname + " " + firstname,))
cur.execute("SELECT id FROM Profs WHERE name=?", (lastname + " " + firstname,))
res = cur.fetchone()
cur.execute("INSERT INTO LPLink (LId,PId) VALUES(?,?)", (lid, res[0]))
db.commit()
return res[0]
if __name__ == "__main__":
get_dirstruct()


@@ -1,44 +1,62 @@
from typing import Annotated
from typing import List, Dict, Tuple, Sequence
from starlette.responses import StreamingResponse
from annotated_types import IsDigit
from fastapi import FastAPI, File, HTTPException, UploadFile, Request, Form
from fastapi.responses import FileResponse
# import multiprocessing
# import threading
# import concurrent.futures
# import asyncio
import asyncio
# import fastapi
from fastapi.staticfiles import StaticFiles
import pymupdf
# import fitz as pymupdf
import json
import re
import os
import signal
import mariadb
import sys
import filetype
import datetime
import logging
import inspect
import pathlib
from pathlib import Path
from starlette.types import HTTPExceptionHandler
log = logging.getLogger(__name__)
logging.basicConfig(
filename="app.log",
filename=os.environ.get("APP_LOG_PATH"),
level=logging.INFO,
format="[%(asctime)s, %(filename)s:%(lineno)s -> %(funcName)10s() ]%(levelname)s: %(message)s",
format="[%(asctime)s, %(filename)s:%(lineno)s -> %(funcName)10s()] %(levelname)s: %(message)s",
)
debug = log.debug
info = log.info
error = log.error
critical = log.critical
def exception_handler(etype, value, tb):
log.exception(f"Uncought Exception: {value}")
sys.excepthook = exception_handler
db = mariadb.connect(
host="localhost", user="wildserver", password="DBPassword", database="Unizeug"
host=os.environ.get("DB_HOST", "db"),
user=os.environ.get("DB_USER", "user"),
password=os.environ.get("DB_PASSWORD", "DBPASSWORD"),
database=os.environ.get("DB_DATABASE", "unizeug"),
)
info("App Started")
@@ -47,8 +65,6 @@ info("App Started")
# startup()
app = FastAPI()
app.mount("/favicon", StaticFiles(directory="./favicon"), name="favicon")
app.mount("/static", StaticFiles(directory="./static"), name="static")
CATEGORIES = [
@@ -60,15 +76,18 @@ CATEGORIES = [
"Zusammenfassungen",
"Multimedia",
]
APP_ROOT_PATH = Path(os.environ.get("APP_ROOT_PATH", "./app"))
SUBCAT_CATEGORIES = ["Klausuren", "Übungen", "Labore"]
SUBCAT_CATEGORIES_I = [1, 2, 3]
EX_DATE_CATEGORIES = ["Prüfungen", "Klausuren"]
EX_DATE_CATEGORIES_I = [0, 1]
UNIZEUG_PATH = "./app/dest/"
FILES_IN_PROGRESS = "./app/files/"
EMPTYFILE = "./app/graphics/empty.pdf"
UNSUPPORTEDFILE = "./app/graphics/unsupported.pdf"
GREETINGFILE = "./app/graphics/greeting.pdf"
UNIZEUG_PATH = Path(os.environ.get("UNIZEUG_PATH", "./app/dest"))
FILES_IN_PROGRESS = APP_ROOT_PATH / "files/"
EMPTYFILE = APP_ROOT_PATH / "graphics/empty.pdf"
UNSUPPORTEDFILE = APP_ROOT_PATH / "graphics/unsupported.pdf"
GREETINGFILE = APP_ROOT_PATH / "graphics/greeting.pdf"
FAVICON = APP_ROOT_PATH / "favicon"
STATIC_FILES = APP_ROOT_PATH / "static"
# cur = db.cursor()
@@ -76,6 +95,9 @@ GREETINGFILE = "./app/graphics/greeting.pdf"
# for l in cur:
# print(l)
# locpaths = ["./VO_Mathematik_3.pdf"] # replace this with a database
censor_status_update_events: Dict[str, asyncio.Event] = {}
censor_status_datas: Dict[str, Dict[str, int | None | str | bool]] = {}
# censor_finished_flags: Dict[str, asyncio.Event] = {}
def _sql_quarry(
@@ -105,12 +127,40 @@ def _sql_quarry(
)
def sql_connector_is_active(connector: mariadb.Connection) -> bool:
try:
connector.ping()
except mariadb.Error as e:
return False
return True
def sql_connect(connector: mariadb.Connection) -> mariadb.Connection:
try:
connector = mariadb.connect(
host=os.environ.get("DB_HOST", "db"),
user=os.environ.get("DB_USER", "user"),
password=os.environ.get("DB_PASSWORD", "DBPASSWORD"),
database=os.environ.get("DB_DATABASE", "Unizeug"),
)
except mariadb.Error as e:
critical(
f"Cannot reconnect to Database {os.environ.get('DB_DATABASE', 'Unizeug')} on {os.environ.get('DB_HOST', 'db')}. Got Mariadb Error: {e}"
)
os.kill(os.getpid(), signal.SIGTERM)
raise HTTPException(500, detail="Database failed")
return connector
def sql(
querry: str,
data: Tuple[str | int, ...] | str | int = (),
return_result: bool = True,
commit: bool = False,
) -> List[Tuple]:
global db
if not sql_connector_is_active(db):
db = sql_connect(db)
cur = db.cursor(dictionary=False)
return _sql_quarry(cur, querry, data, return_result, commit)
@@ -121,6 +171,10 @@ def sqlT(
return_result: bool = True,
commit: bool = False,
) -> List[Dict]:
global db
if not sql_connector_is_active(db):
db = sql_connect(db)
cur = db.cursor(dictionary=True)
return _sql_quarry(cur, querry, data, return_result, commit)
@@ -139,10 +193,22 @@ def sqlT(
# )
app.mount(
"/favicon",
StaticFiles(directory=os.environ.get("FAVICON_PATH", FAVICON)),
name="favicon",
)
app.mount(
"/static",
StaticFiles(directory=os.environ.get("STATIC_PATH", STATIC_FILES)),
name="static",
)
@app.get("/")
async def get_index():
"""gives the Index.html file"""
return FileResponse("./index.html")
return FileResponse(APP_ROOT_PATH / "index.html")
@app.get("/files/{file_id}")
@@ -169,7 +235,7 @@ async def get_file(file_id: str):
# status_code=500, detail="Somethings wrong with the database"
# )
# filename = cur.fetchone()[0]
return FileResponse(FILES_IN_PROGRESS + filename)
return FileResponse(FILES_IN_PROGRESS / filename)
@app.get("/search/lva")
@@ -215,6 +281,9 @@ async def search_lva(
)
# res += cur.fetchall()
res = remove_duplicates(res + zw)
info(
f"LVA Search: {searchterm}; Result: {res[: (searchlim if searchlim != 0 else -1)]}"
)
if searchlim == 0:
return res
else:
@@ -249,6 +318,9 @@ async def search_profs(
)
# res += cur.fetchall()
res = remove_duplicates(res + zw)
info(
f"Prof Search: {searchterm}; Result: {res[: (searchlim if searchlim != 0 else -1)]}"
)
if searchlim == 0:
return res
else:
@@ -289,6 +361,9 @@ async def search_subcats(
)
# res += cur.fetchall()
res = remove_duplicates(res + rest)
info(
f"Subcatrgory Search: {searchterm}; Result: {res[: (searchlim if searchlim != 0 else -1)]}"
)
if searchlim == 0:
return res
else:
@@ -345,7 +420,7 @@ async def create_upload_file(files: List[UploadFile], c2pdf: bool = True):
content = doc.tobytes()
if ft != "dir":
filename = make_filename_unique(filename)
locpath = FILES_IN_PROGRESS + filename
locpath = FILES_IN_PROGRESS / filename
# locpaths.append(locpath)
# cur = db.cursor()
# try:
@@ -405,14 +480,14 @@ async def get_submission(
pagescales: Annotated[
str, Form()
], # Scales of Pages # Annotated[List[Dict[str, float]], Form()],
censor: Annotated[str, Form()],
ocr: Annotated[str, Form()],
):
"""handles submission"""
print(
f"lva: {lva}, prof: {prof}, fname {fname}, stype: {stype}, subcat: {subcat}, sem: {sem}, ex_date: {ex_date}, rects: {rects}, pagescales: {pagescales}, censor: {censor}"
f"lva: {lva}, prof: {prof}, fname {fname}, stype: {stype}, subcat: {subcat}, sem: {sem}, ex_date: {ex_date}, rects: {rects}, pagescales: {pagescales}, ocr: {ocr}"
)
info(
f"lva: {lva}, prof: {prof}, fname {fname}, stype: {stype}, subcat: {subcat}, sem: {sem}, ex_date: {ex_date}, rects: {rects}, pagescales: {pagescales}, censor: {censor}"
f"Got Submission: lva: {lva}, prof: {prof}, fname {fname}, stype: {stype}, subcat: {subcat}, sem: {sem}, ex_date: {ex_date}, rects: {rects}, pagescales: {pagescales}, ocr: {ocr}"
)
rects_p = json.loads(rects)
scales_p = json.loads(pagescales)
@@ -429,7 +504,7 @@ async def get_submission(
error(f"User tried to upload a file without specifying the {th[1]}")
raise HTTPException(400, f"You need to specify a {th[1]}")
filepath = "./app/files/" + res[0][0]
filepath = FILES_IN_PROGRESS / res[0][0]
# except mariadb.Error as e:
# print(f"Mariadb Error: {e}")
# raise HTTPException(
@@ -441,25 +516,73 @@ async def get_submission(
except ValueError as e:
error(f"Error creating savepath: {e}")
raise HTTPException(status_code=400, detail=f"Error creation savepath: {e}")
await censor_pdf(
filepath, dest, rects_p, scales_p, False if censor == "False" else True
)
# censor_finished_flags[fileId] = asyncio.Event()
censor_status_datas[fileId] = {}
if fileId not in censor_status_update_events:
censor_status_update_events[fileId] = asyncio.Event()
if ocr == "True":
await asyncio.to_thread(
censor_pdf_ocr,
filepath,
dest,
rects_p,
scales_p,
fileId,
)
else:
await asyncio.to_thread(
censor_pdf,
filepath,
dest,
rects_p,
scales_p,
fileId,
)
# return {"done": "ok"}
# print(dest)
# await censor_finished_flags[fileId].wait()
# censor_finished_flags[fileId].clear()
info(f"Saved file {fileId} as {dest}")
delete_from_FIP(fileId)
return FileResponse(dest, content_disposition_type="inline")
async def censor_pdf(
path: str,
destpath: str,
@app.get("/get_censor_status/{file_id}")
async def get_censor_status(file_id: str):
"""Yields the currrent page being censored and the total number of pages"""
if len(sql("Select filename from FIP where id=?", (file_id,))) < 1:
raise HTTPException(
400,
detail="You are trying to get a status updater for a file that dosent exist.",
)
if file_id not in censor_status_update_events:
censor_status_update_events[file_id] = asyncio.Event()
return StreamingResponse(
yield_censor_status(file_id), media_type="text/event-stream"
)
async def yield_censor_status(file_id: str):
"""Internal function to yield updates to the stream"""
while True:
await censor_status_update_events[file_id].wait()
censor_status_update_events[file_id].clear()
yield f"event: censorUpdate\ndata: {json.dumps(censor_status_datas[file_id])}\n\n"
if censor_status_datas[file_id]["done"]:
del censor_status_update_events[file_id]
del censor_status_datas[file_id]
return
def censor_pdf(
path: os.PathLike,
destpath: os.PathLike,
rects: List[List[List[float]]],
scales: List[Dict[str, float]],
secure: bool,
file_id: str,
):
"""Censors pdf and runs OCR
If Secure is True the file is converted to Pixels and then recreated; else the censored sections are just covering the text below and can be easiliy removed with e.g. Inkscape
"""Censors pdf and saves the file to the given Destpath.
Args:
path: path to the pdf document
destpath: Path where the result is supposed to be saved to
@@ -469,15 +592,62 @@ async def censor_pdf(
Returns:
None
"""
info(f"started Censoring for file {path} to be saved to {destpath}")
doc = pymupdf.open(path)
page = doc[0]
npage = doc.page_count
for i in range(npage):
page = doc[i]
if i < len(rects) and rects[i] != []:
print(i)
wfac = page.rect.width / scales[i]["width"]
hfac = page.rect.height / scales[i]["height"]
for rect in rects[i]:
prect = pymupdf.Rect(
rect[0] * wfac,
rect[1] * hfac,
(rect[0] + rect[2]) * wfac,
(rect[1] + rect[3]) * hfac,
)
page.add_redact_annot(
prect,
fill=(0, 0, 0),
)
page.apply_redactions()
censor_status_datas[file_id]["page"] = i + 1
censor_status_datas[file_id]["pages"] = npage
censor_status_datas[file_id]["done"] = False
censor_status_update_events[file_id].set()
doc.set_metadata({})
doc.save(destpath, garbage=4, deflate=True, clean=True)
censor_status_datas[file_id]["done"] = True
censor_status_update_events[file_id].set()
def censor_pdf_ocr(
path: os.PathLike,
destpath: os.PathLike,
rects: List[List[List[float]]],
scales: List[Dict[str, float]],
file_id: str,
):
"""Censors pdf and runs OCR
The file is converted to Pixels and then recreated.
Saves the file to the given Destpath.
Args:
path: path to the pdf document
destpath: Path where the result is supposed to be saved to
rects: Coordinates of rectangles to be placed on the pdf document
scales: Scales of the rects coordinates for the pdf document
secure: weather or not the pdf document is supposed to be converted into an Image (and back) to make shure, the censoring is irreversible
Returns:
None
"""
info(f"started Censoring in OCR Mode for file {path} to be saved to {destpath}")
doc = pymupdf.open(path)
output = pymupdf.open()
page = doc[0]
# width = page.rect.width
# height = page.rect.height
# print(width, height)
npage = doc.page_count
# pages = []
# tasks = []
for i in range(npage):
page = doc[i]
if i < len(rects) and rects[i] != []:
@@ -496,38 +666,41 @@ async def censor_pdf(
color=(0, 0, 0),
fill=(0, 0, 0),
)
if secure:
# pages.append(page)
censor_status_datas[file_id]["page"] = i + 1
censor_status_datas[file_id]["pages"] = npage
censor_status_datas[file_id]["done"] = False
censor_status_update_events[file_id].set()
# THis Costs us dearly
try:
bitmap = page.get_pixmap(dpi=400)
pdf_bytes = bitmap.pdfocr_tobytes(
language="deu",
tessdata="/usr/share/tessdata/", # tesseract needs to be installed; this is the path to thetesseract files
)
output.insert_pdf(pymupdf.Document(stream=pdf_bytes))
# tasks.append(asyncio.create_task(censor_page(page)))
print(f"Page {i + 1}/{npage}: CENSORING DONE")
else:
output.insert_pdf(doc, i, i)
# if secure:
# pages_bytes: List[bytes] = []
# censor_page(pages[0])
# with multiprocessing.Pool(npage) as p:
# pages_bytes = p.map(censor_page, pages)
# pages_bytes = p.map(test_function, [1, 2, 3, 4])
# for pdf_bytes in pages_bytes:
# output.insert_pdf(pymupdf.Document(stream=pdf_bytes))
# with concurrent.futures.ThreadPoolExecutor() as executor:
# futures = []
# for page in pages:
# futures.append(executor.submit(censor_page, page))
# for future in futures:
# output.insert_pdf(pymupdf.Document(stream=future.result()))
#
# for task in tasks:
# output.insert_pdf(pymupdf.Document(stream=await task))
# print("CENSORING DONE")
except RuntimeError as e:
error(
f"Error in OCR for document: {destpath}. Error: {e}. Falling back to standard mode."
)
if i < len(rects) and rects[i] != []:
for rect in rects[i]:
prect = pymupdf.Rect(
rect[0] * wfac,
rect[1] * hfac,
(rect[0] + rect[2]) * wfac,
(rect[1] + rect[3]) * hfac,
)
page.add_redact_annot(
prect,
fill=(0, 0, 0),
)
page.apply_redactions()
output.insert_pdf(page.parent, from_page=page.number, to_page=page.number)
# End of the costly part
print(f"Page {i + 1}/{npage}: CENSORING DONE")
output.save(destpath)
censor_status_datas[file_id]["done"] = True
censor_status_update_events[file_id].set()
def test_function(i: int) -> bytes:
@@ -579,21 +752,22 @@ def make_savepath(
ex_date: str,
fname: str,
ftype: str,
) -> str:
) -> os.PathLike:
"""Generates the path, the file is saved to after the upload process is finished. It creates all nessecery directories."""
info(f"Started to make Savepath for '{fname}' in '{lva}' with prof '{prof}'.")
lv = get_lvpath(lva)
lvpath = lv[1] + "/"
lvpath = Path(lv[1])
pf = get_profpath(prof, lv[0])
pfpath = pf[1] + "/"
catpath = CATEGORIES[int(cat)] + "/"
scpath = ""
pfpath = Path(pf[1])
catpath = Path(CATEGORIES[int(cat)])
scpath: str | os.PathLike = ""
if int(cat) in SUBCAT_CATEGORIES_I and subcat != "":
sc = get_subcatpath(subcat, int(cat), pf[0], lv[0])
scpath = sc[1] + "/"
scpath = Path(sc[1])
if int(cat) == 6:
savepath = UNIZEUG_PATH + lv[1] + "_Multimedia_only/" + pfpath
savepath = UNIZEUG_PATH / (lv[1] + "_Multimedia_only/") / pfpath
else:
savepath = UNIZEUG_PATH + lvpath + pfpath + catpath + scpath
savepath = UNIZEUG_PATH / lvpath / pfpath / catpath / scpath
os.makedirs(savepath, exist_ok=True)
filename = sem + "_"
if int(cat) in EX_DATE_CATEGORIES_I:
@@ -607,9 +781,20 @@ def make_savepath(
400,
"You have not specified a date for an upload that requires a date like an exam.",
)
filename += yyyy + "_" + mm + "_" + dd + "_"
filename += fname + "." + ftype
return savepath + filename
filename = yyyy + "_" + mm + "_" + dd + "_"
filename += fname
file = filename + "." + ftype
destpath = savepath / file
i = 0
while destpath.is_file():
info(f"{destpath} already exists.")
file = filename + f"_{i}." + ftype
i += 1
destpath = savepath / file
destpath.touch()
info(f"Path for file to be saved generated as: {savepath / file}")
return savepath / file
def get_lvpath(lva: str) -> Tuple[int, str]:
@@ -802,10 +987,10 @@ async def save_files_to_folder(files: List[UploadFile]) -> str:
if filename == "":
filename = "None"
filename = make_filename_unique(filename)
os.mkdir(FILES_IN_PROGRESS + filename)
os.mkdir(FILES_IN_PROGRESS / filename)
for idx, file in enumerate(files):
fn = file.filename if file.filename is not None else "None" + str(idx)
with open(FILES_IN_PROGRESS + filename + "/" + fn, "wb") as f:
with open(FILES_IN_PROGRESS / filename / fn, "wb") as f:
f.write(await file.read())
return filename
@@ -833,13 +1018,13 @@ async def remove_old_FIP_entrys():
info(f"Remove Files: {files}")
for file in files:
sql("DELETE FROM FIP WHERE id=?", (file["id"]), return_result=False)
os.remove(FILES_IN_PROGRESS + file["filename"])
os.remove(FILES_IN_PROGRESS / file["filename"])
# sql(
# "DELETE FROM FIP WHERE HOUR(TIMEDIFF(NOW(),initTimeStamp)) > 24",
# return_result=False,
# )
db.commit()
return FileResponse("./index.html")
return FileResponse(APP_ROOT_PATH / "index.html")
def delete_from_FIP(uuid: str):
@@ -847,4 +1032,4 @@ def delete_from_FIP(uuid: str):
if len(res) < 1:
raise HTTPException(500, "I am trying to delete a file that dose not exist")
sql("DELETE FROM FIP WHERE id=?", (uuid,), return_result=False, commit=True)
os.remove(FILES_IN_PROGRESS + res[0]["filename"])
os.remove(FILES_IN_PROGRESS / res[0]["filename"])
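The /get_censor_status/{file_id} route added above streams server-sent events, which the frontend consumes with an EventSource (see the app.js diff below). For reference, a minimal sketch of reading the same stream from Python with requests; the base URL and file id are placeholders and assume the app is published on port 80 as in compose.yml:

import json
import requests

BASE_URL = "http://localhost"                      # assumption: port 80 as in compose.yml
file_id = "00000000-0000-0000-0000-000000000000"   # placeholder FIP id from the upload step

with requests.get(f"{BASE_URL}/get_censor_status/{file_id}", stream=True) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        if not line or not line.startswith("data: "):
            continue                               # skip "event:" lines and blank separators
        status = json.loads(line[len("data: "):])
        print(f"censoring page {status.get('page')}/{status.get('pages')}")
        if status.get("done"):
            break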


@@ -188,6 +188,9 @@ class PDFDocument {
}
}
var mouseIsDown = false;
var modal;
var close_loading;
var upload_status;
//var startX = 0;
//var startY = 0;
//var pdf;
@@ -273,18 +276,39 @@ function submitPdf(eve) {
formdata.append("fileId", doc.fID);
//formdata.append("filename", doc.filename);
formdata.append("ftype", doc.filetype);
if (!formdata.has("censor")) {
formdata.append("censor", "False");
if (!formdata.has("ocr")) {
formdata.append("ocr", "False");
}
console.log(formdata);
submitForm(formdata);
}
async function submitForm(formData) {
var updateEventSource = null;
try {
const response = await fetch("http://127.0.0.1:8000/submit", {
updateEventSource = new EventSource("/get_censor_status/" + doc.fID);
modal.style.display = "flex";
// console.log("http://127.0.0.1:8000/get_censor_status/" + doc.fID);
updateEventSource.addEventListener("censorUpdate", function(eve) {
console.log(eve.data);
var data = JSON.parse(eve.data);
upload_status.innerText =
"Censoring Page " + data.page + "/" + data.pages;
});
} catch {
console.error(
"Error geting eventsource for updating censoring page count: " + error,
);
}
try {
const response = await fetch("/submit/", {
method: "POST",
body: formData,
});
if (updateEventSource !== null) {
updateEventSource.close();
}
modal.style.display = "none";
//let responseJSON=await response.json();
if (response.ok) {
console.log("Submit OK");
@@ -304,7 +328,7 @@ async function submitForm(formData) {
window.alert("Error: " + (await response.json())["detail"]);
}
} catch (error) {
console.error("Error" + error);
console.error("Error submitting: " + error);
}
}
function uploadPdf(eve) {
@@ -322,7 +346,7 @@ function uploadPdf(eve) {
}
async function uploadFile(formData) {
try {
const response = await fetch("http://127.0.0.1:8000/uploadfile", {
const response = await fetch("/uploadfile/", {
method: "POST",
body: formData,
});
@@ -366,6 +390,14 @@ function initListeners() {
doc.clearAll();
});
}
function initLoading() {
modal = document.querySelector("#loading");
// close_loading = document.querySelector(".close");
upload_status = document.querySelector("#upload_status");
// close_loading.addEventListener("click", function() {
// modal.style.display = "none";
// });
}
const startPdf = () => {
// doc = new PDFDocument(
// "./files/b78c869f-e0bb-11ef-9b58-84144d05d665",
@@ -374,6 +406,7 @@ const startPdf = () => {
// );
//pdf = new PDFView("./VO_Mathematik_3.pdf");
doc = new PDFDocument("./files/greeting", "greeting", "pdf");
initLoading();
initDraw();
initUpload();
initListeners();


@@ -1,4 +1,4 @@
var url = "http://127.0.0.1:8000/search/";
var url = "/search/";
var lid = null;
var pid = null;
var activeAutocompletion = null;
@@ -21,7 +21,7 @@ function autocomplete(inp, type) {
i,
apirq,
iname,
val = this.value;
val = inp.value;
/*close any already open lists of autocompleted values*/
closeAllLists();
if (!val && type === "lva" && pid === null) {
@@ -56,7 +56,7 @@ function autocomplete(inp, type) {
a.setAttribute("id", this.id + "autocomplete-list");
a.setAttribute("class", "autocomplete-items");
/*append the DIV element as a child of the autocomplete container:*/
this.parentNode.appendChild(a);
inp.parentNode.appendChild(a);
/*for each item in the array...*/
//await response;
if (response.ok) {


@@ -238,3 +238,84 @@ input[type="file"]::file-selector-button {
width: 100%;
/* background-color: purple; */
}
/* The Modal (background) */
.modal {
display: none;
/* Hidden by default */
position: fixed;
/* Stay in place */
z-index: 1;
/* Sit on top */
left: 0;
top: 0;
width: 100%;
/* Full width */
height: 100%;
/* Full height */
overflow: auto;
/* Enable scroll if needed */
background-color: #4f5977;
/* Fallback color */
background-color: rgba(0, 0, 0, 0.4);
/* Black w/ opacity */
justify-content: center;
}
/* Modal Content/Box */
.loading-content {
background-color: #4f5977;
margin: auto;
/* 15% from the top and centered */
padding: 20px;
/* border: 1px solid #888; */
/* width: 80%; */
border-radius: 15px;
display: flex;
flex-direction: column;
/* Could be more or less, depending on screen size */
align-items: center;
text-align: center;
}
/* The Close Button */
.close {
color: #aaa;
float: right;
font-size: 28px;
font-weight: bold;
}
.close:hover,
.close:focus {
color: black;
text-decoration: none;
cursor: pointer;
}
.upload_status_text {
color: #ffffff;
font-size: 16pt;
}
.loader {
margin: auto;
border: 16px solid #f3f3f3;
/* Light grey */
border-top: 16px solid #3498db;
/* Blue */
border-radius: 50%;
width: 120px;
height: 120px;
animation: spin 2s linear infinite;
}
@keyframes spin {
0% {
transform: rotate(0deg);
}
100% {
transform: rotate(360deg);
}
}

67  compose.yml Normal file

@@ -0,0 +1,67 @@
services:
app:
container_name: python-app
# command: python -m uvicorn app.main:app --host 0.0.0.0 --port 80
build:
context: .
dockerfile: Dockerfile
volumes:
- ./app:/python
- ./unizeug:/unizeug:source
ports:
- 80:80
restart: unless-stopped
environment:
ENTRY_COMMAND: python -m uvicorn main:app --host 0.0.0.0 --port 80
APP_LOG_PATH: /python/app.log
APP_ROOT_PATH: /python
UNIZEUG_PATH: /unizeug
DB_HOST: db
DB_USER: app
DB_PASSWORD: DBPassword
DB_DATABASE: Unizeug
TZ: "Europe/Vienna"
depends_on:
- db
- scaner
db:
container_name: db
image: mariadb
restart: unless-stopped
environment:
MARIADB_ROOT_PASSWORD: DBPassword
MARIADB_USER: app
UNIZEUG_PATH: /unizeug
MARIADB_PASSWORD: DBPassword
MARIADB_DATABASE: Unizeug
TZ: "Europe/Vienna"
healthcheck:
test: ["CMD", "healthcheck.sh", "--connect", "--innodb_initialized"]
start_period: 10s
interval: 10s
timeout: 5s
retries: 3
volumes:
- ./mariadb:/var/lib/mysql
scaner:
container_name: python-scaner
# command: python /python/init.py
build:
context: .
dockerfile: Dockerfile
volumes:
- ./app:/python
- ./unizeug:/unizeug:source
restart: unless-stopped
environment:
ENTRY_COMMAND: python /python/init.py
UNIZEUG_PATH: /unizeug
APP_ROOT_PATH: /python
DB_HOST: db
DB_USER: app
DB_PASSWORD: DBPassword
DB_DATABASE: Unizeug
TZ: "Europe/Vienna"
depends_on:
- db
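A quick way to confirm that these DB_* settings line up with the mariadb service is a short connectivity check run inside the app container. This is only a sketch (not part of the project) and reuses the same environment variables read by the application code:

import os
import mariadb

conn = mariadb.connect(
    host=os.environ.get("DB_HOST", "db"),
    user=os.environ.get("DB_USER", "app"),
    password=os.environ.get("DB_PASSWORD", "DBPassword"),
    database=os.environ.get("DB_DATABASE", "Unizeug"),
)
cur = conn.cursor()
cur.execute("SELECT VERSION()")
print(cur.fetchone())   # a MariaDB version tuple when the db service is reachable
conn.close()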

4  entrypoint.sh Executable file

@@ -0,0 +1,4 @@
#!/bin/sh
if [[ -n "$ENTRY_COMMAND" ]]; then
/bin/sh -c "$ENTRY_COMMAND"
fi


@@ -1,103 +0,0 @@
<!doctype html>
<html lang="de">
<head>
<title>Unizeug uploader</title>
<link rel="stylesheet" href="static/style.css" />
<script src="https://cdnjs.cloudflare.com/ajax/libs/pdf.js/2.4.456/pdf.min.js"></script>
<script src="static/app.js" defer></script>
<script src="static/autocomplete.js" defer></script>
<script src="static/dynhide.js" defer></script>
<script src="static/filedrop.js" defer></script>
<link rel="icon" type="image/png" href="/favicon/favicon-96x96.png" sizes="96x96" />
<link rel="icon" type="image/svg+xml" href="/favicon/favicon.svg" />
<link rel="shortcut icon" href="/favicon/favicon.ico" />
<link rel="apple-touch-icon" sizes="180x180" href="/favicon/apple-touch-icon.png" />
<meta name="apple-mobile-web-app-title" content="Unizeug" />
<link rel="manifest" href="/favicon/site.webmanifest" />
</head>
<body>
<div class="main">
<div class="left" id="controldiv">
<div id="fileupload">
<form id="uploadform" enctype="multipart/form-data">
<div class="filetop">
<!-- <label for="filepicker">Choose a file</label> -->
<input type="file" name="files" id="filepicker" multiple placeholder="Drop File" />
</div>
<button type="submit" id="upload" method="POST" class="fileupload">
Upload
</button>
</form>
</div>
<div id="submitdiv">
<form id="submitform" ,onsubmit="submitFile(event)">
<label for="lva">Lehrveranstaltung:</label>
<div class="autocomplete">
<input type="text" id="lva" name="lva" placeholder="Lehrveranstaltung" autocomplete="off" />
</div>
<br />
<!-- <br /> -->
<label for="prof">Vortragende*r:</label>
<div class="autocomplete">
<input type="text" id="prof" name="prof" placeholder="Vortragende*r" autocomplete="off" />
</div>
<br />
<!-- <br /> -->
<label for="name">Name:</label>
<input type="text" id="name" name="fname" placeholder="Prüfung" /><br />
<label for="sem">Semester:</label>
<input type="text" id="sem" name="sem" placeholder="2024W" /><br />
<input type="radio" id="pruefung" name="stype" value="0" checked="checked" />
<label for="pruefung">Prüfung</label><br />
<input type="radio" id="klausur" name="stype" value="1" />
<label for="klausur">Klausur</label><br />
<input type="radio" id="uebung" name="stype" value="2" />
<label for="uebung">Übung</label><br />
<input type="radio" id="labor" name="stype" value="3" />
<label for="labor">Labor</label><br />
<input type="radio" id="unterlagen" name="stype" value="4" />
<label for="unterlagen">Unterlagen</label><br />
<input type="radio" id="zusammenfassungen" name="stype" value="5" />
<label for="zusammenfassungen">Zusammenfassung</label><br />
<input type="radio" id="multimedia" name="stype" value="6" />
<label for="multimedia">Multimedia</label><br />
<br />
<div id="subcatdiv">
<label for="subcat">Veranstaltung</label>
<div class="autocomplete">
<input type="text" id="subcat" name="subcat" placeholder="Klausur 1" autocomplete="off" />
</div>
</div>
<div id="datediv">
<label for="date">Datum</label>
<input type="date" id="date" name="ex_date" placeholder="Drop File" /><br />
</div>
<input type="checkbox" name="censor" id="sec_censor" value="True" checked /><label
for="sec_censor">Zensieren</label><br /><br />
<button type="submit" id="send">Senden</button>
</form>
</div>
</div>
<div class="right" id="rightdiv">
<div class="buttons" id="buttonsdiv">
<button id="prev">Prev</button><button id="next">Next</button>
<div>
<span id="npage"></span>
<span>/</span>
<span id="npages"></span>
</div>
<button id="clr">Clear Page</button><button id="ca">Claer All</button>
</div>
<div id="cnvdiv">
<div class="stack" id="cnvcont">
<canvas id="cnv"></canvas>
<canvas id="drw_cnv"></canvas>
</div>
</div>
</div>
</div>
</body>
</html>


@@ -44,10 +44,12 @@ pypdf==5.2.0
pytesseract==0.3.13
python-dotenv==1.0.1
python-multipart==0.0.20
pytz==2025.2
PyYAML==6.0.2
requests==2.32.3
rich==13.9.4
rich-toolkit==0.13.2
schedule==1.2.2
shellingham==1.5.4
sniffio==1.3.1
starlette==0.45.3