Compare commits

...

74 commits

Author SHA1 Message Date
597b9ba798 clarify whats needed 2023-03-31 02:33:27 +02:00
e138d6b567 liquidsoap: pin version 2023-03-31 02:23:02 +02:00
85a53b4e53 docker: move to alpine 2023-03-31 02:22:52 +02:00
ab472daf5e fix missing validator 2023-03-31 02:21:39 +02:00
03e201e6a2 fixup! Actually look at input directory 2023-03-31 02:21:14 +02:00
5117bd7782 fix detached sqlalchemy object 2023-03-31 02:20:15 +02:00
137f6e050f update requirements for python 3.11 2023-03-31 01:50:00 +02:00
c27b8e20ce remove obsolete tests 2023-03-31 01:49:39 +02:00
75291d7704 Actually look at input directory
refs #10
... still no support for HTTP!
2023-03-31 01:48:10 +02:00
boyska
14704ec7ed readme about testing 2021-09-29 00:39:21 +02:00
boyska
23144cafa7 Merge branch 'v2-ci' into v2 2021-09-29 00:37:58 +02:00
boyska
ef97c952d2 port tests to new async interface 2021-09-29 00:37:10 +02:00
boyska
c8bf2c2071 disable some checks if testing 2021-09-29 00:31:08 +02:00
boyska
d302596d73 nose → py.test 2021-09-29 00:30:31 +02:00
boyska
193d77ae09 upgrade sqlalchemy
let's hope this works; installation should be much faster (wheel)
2021-09-29 00:04:25 +02:00
boyska
993d91e5b8 refactor requirements 2021-09-29 00:01:14 +02:00
boyska
de43301785 gitlab-runner test 2021-09-28 23:56:27 +02:00
boyska
d0a5b7ed54 mypy can be run via gitlab-runner
gitlab-runner exec docker static
2021-09-28 23:49:22 +02:00
boyska
b24dd7cfe4 python version clearly pre-release 2021-09-28 23:35:51 +02:00
boyska
52564571f5 Merge branch 'feat/28' into fastapi 2021-09-22 13:27:40 +02:00
59d84eafe2 Use dedicated db volume to avoid stale files 2021-09-20 22:52:27 +02:00
51fd340cd2 Remove ror from paths 2021-09-20 22:51:48 +02:00
boyska
7ec0d6bfc8 flake, I don't care about docstrings 2021-09-17 17:38:21 +02:00
boyska
c602bb680c errors during generation are properly handled 2021-09-17 17:37:46 +02:00
boyska
40394331ef if download fails, don't try to run ffmpeg at all
error handling is still not complete: the job is still considered to be
"in progress"
2021-09-17 11:42:56 +02:00
fda08d7d0d Add test infra for local audio source 2021-09-17 11:06:05 +02:00
75c2713536 Fix logic if failing to download 2021-09-17 11:02:07 +02:00
boyska
9b10e525f0 basic auth support 2021-09-17 10:53:19 +02:00
boyska
0d83a6fcd6 help the linter a bit 2021-09-17 10:44:17 +02:00
boyska
d49c7dff00 fix the exception handling
we ♥ when exceptions create more exceptions
2021-09-17 10:43:49 +02:00
c3f6423771 Demote start control to warning 2021-09-17 10:27:13 +02:00
8f5dcccf70 Handle retrieving exceptions 2021-09-17 10:27:09 +02:00
1ee4ca8eb8 Fix unbounded variables 2021-09-17 10:27:04 +02:00
boyska
fd1e5df655 Make: allow specifying dir owner
this is useful if you need to run docker with sudo because you're not in
docker group
2021-09-17 10:23:04 +02:00
boyska
a3d20b9a35 quieter apt install in docker 2021-09-17 10:22:55 +02:00
boyska
2ee4c3b649 remove leftovers of pre-fastapi era 2021-09-17 10:21:55 +02:00
boyska
6a17e63f85 fix logging problems
I don't know why this works
2021-09-17 10:21:03 +02:00
c788289567 Avoid using non-atomic shutil.move 2021-09-15 22:02:12 +02:00
bb8e4cdbfa Generate first empty file at container startup 2021-09-15 21:53:57 +02:00
628e4d3d55 Enhance Makefile 2021-09-15 21:53:30 +02:00
c2b56cc85d Fix docker settings 2021-09-15 18:22:08 +02:00
26181d083f Fix startup checks 2021-09-15 17:54:40 +02:00
acc966b488 Do not singletonize the retriever 2021-09-15 17:54:35 +02:00
d967440a6d Fix dependencies 2021-09-15 17:54:30 +02:00
ef9842e4d2 Adding docker machinery and makefile 2021-09-15 17:54:25 +02:00
5949e79f46 Plug http retriever in current logic 2021-09-15 17:54:20 +02:00
1718c4c331 Add http retriever 2021-08-26 21:46:47 -03:00
a192501570 Please the linter
- Do not use lambda
- Replace assert with raise
- Minor fixes
2021-08-25 15:48:12 -03:00
c48efc46d2 Update requirements in setup.py 2021-08-25 12:03:58 -03:00
5124f2d3ca Black'd 2021-08-25 12:03:39 -03:00
7e99e31f43 "techrec serve" works now 2021-08-25 14:50:37 +02:00
fb79a598da verify that the file has the intended length
retries otherwise

fixes #29
2021-08-25 14:50:37 +02:00
15376a1052 initialization as fastcgi event 2021-08-25 14:50:37 +02:00
2341849e54 remove unused API call 2021-08-25 14:50:37 +02:00
fa0aec4276 fix typing 2021-08-25 14:50:37 +02:00
boyska
49008d0e93 back to the old way of running
it's simpler for us to make logging and setup work in this way
2021-08-25 11:50:34 +02:00
fbc77c47e8 type annotation in forge 2021-08-25 00:26:17 +02:00
dba069d757 moved away from github 2021-08-25 00:22:41 +02:00
ea51ad92d6 allow regeneration
... only with curl
refs #27
2021-08-25 00:00:10 +02:00
d43e655181 API change! /api/generate/{recid} 2021-08-24 23:56:29 +02:00
775d618315 clearly shows files not ready 2021-08-24 23:48:56 +02:00
f8cb5a9bce archive.js 2021-08-24 23:45:30 +02:00
ba78c78e7a drop process queue 2021-08-24 23:39:14 +02:00
53061be23e files are generated in TMPDIR
fixes #4
2021-08-24 23:32:43 +02:00
a3756ea14d readme updated 2021-08-24 23:21:31 +02:00
fe4576315a mostly works! 2021-08-24 22:29:11 +02:00
43f29e865d readme updated 2021-08-24 22:11:06 +02:00
6ef8704715 adapting APIs to fastapi
current status is still non-functional, though
2021-08-24 22:09:52 +02:00
d929839025 FIX get/post 2020-12-15 15:21:03 +01:00
c36a1ea0cc search ported to fastapi 2020-12-15 15:19:45 +01:00
97d6e65bb8 serve pages, too 2020-12-15 15:09:33 +01:00
1965c19bc4 staticfiles served 2020-12-15 14:57:06 +01:00
ac5f298c7d adapt to fastapi + reformat 2020-12-15 14:38:44 +01:00
514c600e0e start migration to fastapi 2020-12-15 14:36:39 +01:00
80 changed files with 1186 additions and 890 deletions

.gitignore (vendored, 2 lines added)

@@ -5,3 +5,5 @@ build/
dist/
rec/
*.egg-info/
/venv
/docker/output/*

.gitlab-ci.yml (new file, 21 lines)

@@ -0,0 +1,21 @@
image: python:3.7
stages:
- static
- test
mypy:
stage: static
before_script:
- pip install mypy
script:
- mypy techrec
test:
stage: test
before_script:
- pip install pytest pytest-asyncio
- pip install -r requirements.txt
- mkdir techrec/output
script:
- pytest

Dockerfile (new file, 30 lines)

@@ -0,0 +1,30 @@
FROM python:3.7-alpine
ARG hostuid=1000
ARG hostgid=1000
ENV TECHREC_CONFIG=/src/techrec/docker/config.py
ENV DEBIAN_FRONTEND=noninteractive
RUN apk update && apk add ffmpeg shadow
WORKDIR /src
COPY . /src/techrec
RUN groupadd -g ${hostgid} techrec \
&& useradd -g techrec -u ${hostuid} -m techrec \
&& mkdir -p /src/techrec \
&& mkdir -p /src/db \
&& chown -R techrec:techrec /src
USER techrec
RUN python -m venv ./venv \
&& ./venv/bin/python -m pip install wheel \
&& ./venv/bin/python -m pip install -e ./techrec
VOLUME ["/src/db"]
EXPOSE 8000
CMD ["/src/techrec/docker/run-techrec.sh"]

Makefile (new file, 61 lines)

@@ -0,0 +1,61 @@
DOCKER := docker
DOCKERC := docker-compose
PORT := 8000
VENV := venv
CONFIG := dev_config.py
PY := python
OWNER := ${USER}
docker-build:
$(DOCKERC) build \
--no-cache \
--build-arg=hostgid=$(shell id -g) \
--build-arg=hostuid=$(shell id -u) \
--build-arg=audiogid=$(shell cat /etc/group | grep audio | awk -F: '{print $$3}')
docker-build-liquidsoap:
$(DOCKER) pull savonet/liquidsoap:main
$(DOCKERC) build \
--no-cache \
--build-arg=audiogid=$(shell cat /etc/group | grep audio | awk -F: '{print $$3}') \
liquidsoap
docker-build-techrec:
$(DOCKERC) build \
--no-cache \
--build-arg=hostgid=$(shell id -g ${OWNER}) \
--build-arg=hostuid=$(shell id -u ${OWNER}) \
techrec
docker-stop:
$(DOCKERC) down -v
docker-run:
$(DOCKERC) run --rm --service-ports techrec
docker-shell-techrec:
$(eval CONTAINER = $(shell docker ps|grep techrec_run|awk '{print $$12}'))
$(DOCKER) exec -ti $(CONTAINER) bash
docker-shell-storage:
$(DOCKERC) exec storage bash
docker-shell-liquidsoap:
$(eval CONTAINER = $(shell docker ps|grep liquidsoap|awk '{print $$12}'))
$(DOCKER) exec -ti $(CONTAINER) bash
docker-logs-storage:
$(DOCKERC) logs -f storage
docker-logs-liquidsoap:
$(DOCKERC) logs -f liquidsoap
local-install:
$(PY) -m venv $(VENV)
./$(VENV)/bin/pip install -e .
local-serve:
env TECHREC_CONFIG=$(CONFIG) ./$(VENV)/bin/techrec -vv serve
.PHONY: docker-build docker-build-liquidsoap docker-build-techrec docker-stop docker-run docker-shell-techrec docker-shell-storage docker-shell-liquidsoap docker-logs-storage docker-logs-liquidsoap local-install local-serve

README.md

@@ -1,7 +1,7 @@
TechRec
=======
A Python2/Python3 web application that assist radio speakers in recording their shows.
A Python3 web application that assists radio speakers in recording their shows.
Meant to be simple to install and to maintain.
It basically takes a directory with the continuous recording and creates new
@@ -10,7 +10,7 @@ files "cutting/pasting" with ffmpeg.
Features
=========
* little system dependencies: python and ffmpeg
* little system dependencies: python3 and ffmpeg
* The interface is extremely simple to use
* Supports nested recording (ie: to record an interview inside of a whole
show)
@@ -32,16 +32,29 @@ parts of them. This can boil down to something like
ffmpeg -i concat:2014-20-01-00-00.mp3|2014-20-01-00-01.mp3 -acodec copy -ss 160 -t 1840 foo.mp3
```
This continuous recording needs to be configured so that:
- It can be split in multiple directories, but the granularity of this must be below one directory per day.
- The filename must be fully informative, without looking at the parent directories.
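As a sketch of what "fully informative" filenames buy you: with the hourly layout used elsewhere in this repo (docker/run.liq writes `/rec/%Y-%m/%d/rec-%Y-%m-%d-%H-%M-%S.mp3`), any timestamp can be mapped straight to its containing file without scanning directories. The function name here is hypothetical, not part of the techrec API:

```python
from datetime import datetime

def timefile(ts: datetime) -> str:
    # Truncate to the hour, then render the path the recorder would have
    # used for that hour (layout assumed from docker/run.liq).
    floored = ts.replace(minute=0, second=0, microsecond=0)
    return floored.strftime("/rec/%Y-%m/%d/rec-%Y-%m-%d-%H-00-00.mp3")

print(timefile(datetime(2021, 9, 17, 10, 43)))
# /rec/2021-09/17/rec-2021-09-17-10-00-00.mp3
```

This is exactly why the granularity constraint matters: if two days shared a directory, or a filename omitted the date, the mapping above would need a directory listing instead of pure arithmetic.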
How to run
===========
```sh
pip install .
env TECHREC_CONFIG=yourconfig.py techrec serve
```
Implementation details
======================
It is based on bottle, to get a minimal framework. Simple APIs are offered
through it, and the static site uses them.
It is based on [fastapi](https://fastapi.tiangolo.com/), a really nice
framework. Simple APIs are offered through it, and the static site uses them
through JS.
Jobs are not dispatched using stuff like celery, but with a thin wrapper over
`multiprocessing.Pool`; this is just to keep the installation as simple as
possible.
Jobs are not dispatched using stuff like celery, but just using
[`BackgroundTasks`](https://fastapi.tiangolo.com/tutorial/background-tasks/),
in order to keep the installation as simple as possible.
The encoding part is delegated to `ffmpeg`, but the code is really modular so
changing this is a breeze. To be quicker and avoid the quality issues related
@@ -51,7 +64,11 @@ have the same format.
testing
-----------
unit tests can be run with `python setup.py test`
```
gitlab-runner exec docker test
```
(or, `pytest-3`, assuming you have a properly configured system)
screenshots
--------------

@@ -9,7 +9,7 @@ python-pip
RUN DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends
virtualenvwrapper
RUN git clone https://github.com/boyska/techrec.git /opt/techrec
RUN git clone https://git.lattuga.net/techbloc/techrec.git /opt/techrec
RUN virtualenv --python=python2 /opt/virtualenv
RUN /opt/virtualenv/bin/pip install -r /opt/techrec/server/requirements.txt
RUN mkdir /opt/db

docker-compose.yaml (new file, 49 lines)

@@ -0,0 +1,49 @@
version: "3"
services:
liquidsoap:
build:
context: .
dockerfile: docker/Dockerfile.liquidsoap
volumes:
- ./docker/run.liq:/run.liq
- ./docker/run.sh:/run.sh
- rec:/rec
devices:
- /dev/snd:/dev/snd
entrypoint: /run.sh
depends_on:
- storageprepare
storage:
image: nginx
volumes:
- rec:/var/www/rec
- ./docker/storage.conf:/etc/nginx/conf.d/default.conf:ro
ports:
- 18080:80
depends_on:
- storageprepare
storageprepare:
image: bash
volumes:
- rec:/rec
command: chmod 777 /rec
techrec:
build: .
volumes:
- .:/src/techrec
- rec:/rec
- ./docker/output:/src/output
- db:/src/db
ports:
- 8000:8000
depends_on:
- liquidsoap
- storage
volumes:
rec:
db:

docker/Dockerfile.liquidsoap (new file, 10 lines)

@@ -0,0 +1,10 @@
FROM savonet/liquidsoap:v2.0.7
ENV audiogid=995
USER root
RUN groupadd -g ${audiogid} hostaudio \
&& usermod -a -G hostaudio liquidsoap
USER liquidsoap

docker/config.py (new file, 9 lines)

@@ -0,0 +1,9 @@
DB_URI = "sqlite:////src/db/techrec.db"
AUDIO_INPUT = "http://storage"
# decomment this if you want to test with local audio source
# AUDIO_INPUT = "/rec"
AUDIO_OUTPUT = "/src/output"
DEBUG = True
HOST = "0.0.0.0"
PORT = 8000
FFMPEG_OPTIONS = ["-loglevel", "warning"]

docker/output/.gitkeep (new empty file)

docker/run-techrec.sh (new executable file, 5 lines)

@@ -0,0 +1,5 @@
#!/bin/sh
. /src/venv/bin/activate
pip install /src/techrec
/src/venv/bin/techrec forge 20230330-210204 20230330-232100
/bin/sh

docker/run.liq (new executable file, 26 lines)

@@ -0,0 +1,26 @@
#!/usr/bin/liquidsoap
settings.log.stdout.set(true);
settings.log.file.set(false);
settings.log.level.set(3);
# settings.server.telnet.set(true);
# settings.server.telnet.bind_addr.set("127.0.0.1");
# settings.server.telnet.port.set(6666);
rorinput = input.alsa(device="default", bufferize=true);
#rorinput = input.pulseaudio( );
# rorinput = insert_metadata(id="trx",rorinput);
rorinput = rewrite_metadata([("artist","Radio OndaRossa")],rorinput);
# ESCPOST
output.file(
id="rorrec",
reopen_when={0m},
%mp3(bitrate=80, samplerate=44100, stereo=true,stereo_mode="joint_stereo"),
"/rec/%Y-%m/%d/rec-%Y-%m-%d-%H-%M-%S.mp3",
# %vorbis(quality=0.3, samplerate=44100, channels=2),
# "/rec/%Y-%m/%d/rec-%Y-%m-%d-%H-%M-%S.ogg",
rorinput
);

docker/run.sh (new executable file, 11 lines)

@@ -0,0 +1,11 @@
#!/bin/bash
set -xueo pipefail
FILEPATH="/rec/$(date +%Y-%m)/$(date +%d)/rec-$(date +%Y-%m-%d-%H)-00-00.mp3"
mkdir -p $(dirname ${FILEPATH})
if ! [[ -f ${FILEPATH} ]]; then
ffmpeg -f lavfi -i anullsrc=r=11025:cl=mono -t 3600 -acodec mp3 ${FILEPATH}
fi
/run.liq

docker/storage.conf (new file, 9 lines)

@@ -0,0 +1,9 @@
server {
listen 80 default_server;
server_name storage;
location / {
root /var/www/rec;
autoindex on;
}
}

requirements.txt (new file, 23 lines)

@@ -0,0 +1,23 @@
aiofiles==0.6.0
aiohttp==3.7.4
anyio==3.6.2
async-timeout==3.0.1
attrs==22.2.0
chardet==3.0.4
click==7.1.2
fastapi==0.95.0
greenlet==2.0.2
h11==0.11.0
idna==3.4
iniconfig==2.0.0
multidict==6.0.4
packaging==23.0
pluggy==1.0.0
pydantic==1.10.0
pytest==7.2.2
sniffio==1.3.0
SQLAlchemy==1.4.25
starlette==0.26.1
typing_extensions==4.5.0
uvicorn==0.13.1
yarl==1.8.2

server/forge.py (deleted)

@@ -1,153 +0,0 @@
from datetime import datetime, timedelta
from time import sleep
import os
from subprocess import Popen
import logging
from .config_manager import get_config
def get_timefile_exact(time):
"""
time is of type `datetime`; it is not "rounded" to match the real file;
that work is done in get_timefile(time)
"""
return os.path.join(
get_config()["AUDIO_INPUT"], time.strftime(get_config()["AUDIO_INPUT_FORMAT"])
)
def round_timefile(exact):
"""
This will round the datetime, so to match the file organization structure
"""
return datetime(exact.year, exact.month, exact.day, exact.hour)
def get_timefile(exact):
return get_timefile_exact(round_timefile(exact))
def get_files_and_intervals(start, end, rounder=round_timefile):
"""
both arguments are datetime objects
returns an iterator whose elements are (filename, start_cut, end_cut)
Cuts are expressed in seconds
"""
if end <= start:
raise ValueError("end < start!")
while start <= end:
begin = rounder(start)
start_cut = (start - begin).total_seconds()
if end < begin + timedelta(seconds=3599):
end_cut = (begin + timedelta(seconds=3599) - end).total_seconds()
else:
end_cut = 0
yield (begin, start_cut, end_cut)
start = begin + timedelta(hours=1)
def mp3_join(named_intervals):
"""
Note that these are NOT the intervals returned by get_files_and_intervals,
as they do not supply a filename, but only a datetime.
What we want in input is basically the same thing, but with get_timefile()
applied on the first element
This function make the (quite usual) assumption that the only start_cut (if
any) is at the first file, and the last one is at the last file
"""
ffmpeg = get_config()["FFMPEG_PATH"]
startskip = None
endskip = None
files = []
for (filename, start_cut, end_cut) in named_intervals:
# this happens only one time, and only at the first iteration
if start_cut:
assert startskip is None
startskip = start_cut
# this happens only one time, and only at the first iteration
if end_cut:
assert endskip is None
endskip = end_cut
assert "|" not in filename
files.append(filename)
cmdline = [ffmpeg, "-i", "concat:%s" % "|".join(files)]
cmdline += get_config()["FFMPEG_OUT_CODEC"]
if startskip is not None:
cmdline += ["-ss", str(startskip)]
else:
startskip = 0
if endskip is not None:
cmdline += ["-t", str(len(files) * 3600 - (startskip + endskip))]
return cmdline
def create_mp3(start, end, outfile, options={}, **kwargs):
intervals = [
(get_timefile(begin), start_cut, end_cut)
for begin, start_cut, end_cut in get_files_and_intervals(start, end)
]
if os.path.exists(outfile):
raise OSError("file '%s' already exists" % outfile)
for path, _s, _e in intervals:
if not os.path.exists(path):
raise OSError("file '%s' does not exist; recording system broken?" % path)
# metadata date/time formatted according to
# https://wiki.xiph.org/VorbisComment#Date_and_time
metadata = {}
if outfile.endswith(".mp3"):
metadata["TRDC"] = start.replace(microsecond=0).isoformat()
metadata["RECORDINGTIME"] = metadata["TRDC"]
metadata["ENCODINGTIME"] = datetime.now().replace(microsecond=0).isoformat()
else:
metadata["DATE"] = start.replace(microsecond=0).isoformat()
metadata["ENCODER"] = "https://github.com/boyska/techrec"
if "title" in options:
metadata["TITLE"] = options["title"]
if options.get("license_uri", None) is not None:
metadata["RIGHTS-DATE"] = start.strftime("%Y-%m")
metadata["RIGHTS-URI"] = options["license_uri"]
if "extra_tags" in options:
metadata.update(options["extra_tags"])
metadata_list = []
for tag, value in metadata.items():
if "=" in tag:
logging.error('Received a tag with "=" inside, skipping')
continue
metadata_list.append("-metadata")
metadata_list.append("%s=%s" % (tag, value))
p = Popen(
mp3_join(intervals) + metadata_list + get_config()["FFMPEG_OPTIONS"] + [outfile]
)
if get_config()["FORGE_TIMEOUT"] == 0:
p.wait()
else:
start = datetime.now()
while (datetime.now() - start).total_seconds() < get_config()["FORGE_TIMEOUT"]:
p.poll()
if p.returncode is None:
sleep(1)
else:
break
if p.returncode is None:
os.kill(p.pid, 15)
try:
os.remove(outfile)
except:
pass
raise Exception("timeout") # TODO: make a specific TimeoutError
if p.returncode != 0:
raise OSError("return code was %d" % p.returncode)
return True
def main_cmd(options):
log = logging.getLogger("forge_main")
outfile = os.path.abspath(os.path.join(options.cwd, options.outfile))
log.debug("will forge an mp3 into %s" % (outfile))
create_mp3(options.starttime, options.endtime, outfile)
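The interval arithmetic from `get_files_and_intervals` (deleted above) can be exercised standalone; this self-contained restatement keeps the same behavior, with the hourly rounder inlined instead of taken from config:

```python
from datetime import datetime, timedelta

def round_hour(dt: datetime) -> datetime:
    return datetime(dt.year, dt.month, dt.day, dt.hour)

def files_and_intervals(start: datetime, end: datetime):
    # Yields (hour_file_start, seconds_to_skip_at_head, seconds_to_drop_at_tail),
    # one tuple per hourly file covering [start, end].
    if end <= start:
        raise ValueError("end <= start")
    while start <= end:
        begin = round_hour(start)
        start_cut = (start - begin).total_seconds()
        if end < begin + timedelta(seconds=3599):
            end_cut = (begin + timedelta(seconds=3599) - end).total_seconds()
        else:
            end_cut = 0
        yield (begin, start_cut, end_cut)
        start = begin + timedelta(hours=1)

print(list(files_and_intervals(datetime(2021, 9, 17, 10, 30),
                               datetime(2021, 9, 17, 11, 15))))
# two chunks: skip 1800 s at the head of the 10:00 file,
# and cut 2699 s off the tail of the 11:00 file
```

Note the asymmetry the original docstrings rely on: a non-zero head skip can only come from the first file and a non-zero tail cut only from the last, which is what lets `mp3_join` turn them into a single `-ss`/`-t` pair for ffmpeg.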

archive.html (deleted)

@@ -1,102 +0,0 @@
<!DOCTYPE html>
<html>
<head>
<title>TechREC</title>
<link rel="icon" href="/static/img/icon.ico" />
<link rel="stylesheet" type="text/css" href="/static/css/pure-min.css" />
<link rel="stylesheet" type="text/css" href="/static/css/pure-skin-porpora.css" />
<link rel="stylesheet" type="text/css" href="/static/css/jquery-ui.min.css" />
<link rel="stylesheet" type="text/css" href="/static/css/techrec.css">
<link rel="stylesheet" type="text/css" href="/static/css/font-awesome.css" />
<script src="/static/js/jquery-1.9.1.min.js"></script>
<script src="/static/js/jquery-ui.min.js"></script>
<script src="/static/js/jquery.ui.datepicker-it.min.js"></script>
<script src="/static/js/rec.js"></script>
<script>
function delta(end, start) {
//end, start are unix timestamps
diff = parseInt(end, 10) - parseInt(start, 10); //diff is in seconds
msec = diff*1000;
var hh = Math.floor(msec / 1000 / 60 / 60);
msec -= hh * 1000 * 60 * 60;
var mm = Math.floor(msec / 1000 / 60);
msec -= mm * 1000 * 60;
var ss = Math.floor(msec / 1000);
msec -= ss * 1000;
if(hh === 0) {
if(mm === 0) {
return ss + 's';
}
return mm + 'min ' + ss + 's';
}
return hh + 'h ' + mm + 'm ' + ss + 's';
}
$(function() {
"use strict";
RecAPI.get_archive().success(function(archive) {
/* To get sorted traversal, we need to do an array containing keys */
var keys = [];
for(var prop in archive) {
keys.push(prop);
}
keys.sort(function(a,b) { return b - a; }); //descending
/* ok, now we can traverse the objects */
for(var i =0; i < keys.length; i++) {
var rec = archive[keys[i]];
console.log(rec);
var name = $('<td/>').text(rec.name);
var start = $('<td/>').text(config.date_read(
parseInt(rec.starttime, 10)).toLocaleString()
);
var duration = $('<td/>').text(delta(rec.endtime, rec.starttime));
var dl_text = $('<span/>').text(" Scarica").addClass('pure-hidden-phone');
var fn = $("<td/>").append($("<a/>").prop("href", "/output/" +
rec.filename).addClass("pure-button pure-button-small")
.html( $("<i/>").addClass("fa fa-download").css("color", "green"))
.append(dl_text));
var row = $('<tr/>').append(name).append(start).append(duration).append(fn);
$('#ongoing-recs-table tbody').append(row);
}
});
});
</script>
</head>
<body class="pure-skin-porpora">
<div class="pure-menu pure-menu-open pure-menu-horizontal">
<a href="#" class="pure-menu-heading">TechRec</a>
<ul>
<li><a href="new.html">Diretta</a></li>
<li><a href="old.html">Vecchie</a></li>
<li class="pure-menu-selected"><a href="archive.html">Archivio</a></li>
</ul>
</div>
<h1>Registrazioni gi&agrave; completate</h1>
<div id="rec-normal" class="pure-g-r">
<div class="pure-u-1-8"></div>
<div class="pure-u-3-4">
<table width="100%" class="pure-table pure-table-horizontal pure-table-striped"
id="ongoing-recs-table" style="margin-top: 3em;">
<tbody>
<tr>
<th>Nome</th>
<th>Inizio</th>
<th>Durata</th>
<th>File</th>
</tr>
</tbody>
</table>
</div>
<div class="pure-u-1-8"></div>
</div>
</body>
</html>
<!-- vim: set ts=2 sw=2 noet: -->

server/processqueue.py (deleted)

@@ -1,80 +0,0 @@
import multiprocessing
class JobQueue(object):
def __init__(self):
self.pool = multiprocessing.Pool(processes=1)
self.last_job_id = 0
self.jobs = {} # job_id: AsyncResult
def submit(self, function, *args, **kwargs):
self.last_job_id += 1
job_id = self.last_job_id
def clean_jobs(res):
"""this callback will remove the job from the queue"""
del self.jobs[job_id]
self.jobs[job_id] = self.pool.apply_async(function, args, kwargs, clean_jobs)
return job_id
def check_job(self, job_id):
"""
If the job is running, return the asyncResult.
If it has already completed, returns True.
If no such job_id exists at all, returns False
"""
if job_id <= 0:
raise ValueError("non-valid job_id")
if self.last_job_id < job_id:
return False
if job_id in self.jobs:
return self.jobs[job_id]
return True
def join(self):
self.pool.close()
self.pool.join()
self.pool = None
def simulate_long_job(recid=None, starttime=None, endtime=None, name="", filename=None):
from time import sleep
print("evviva " + name)
sleep(2)
print("lavoro su " + name)
sleep(2)
print("done su " + name)
_queue = None
def get_process_queue():
global _queue
if _queue is None:
_queue = JobQueue()
return _queue
if __name__ == "__main__":
from datetime import datetime
n = datetime.now()
def sleep(n):
import time
print("Inizio %d" % n)
time.sleep(n)
print("Finisco %d" % n)
return n
get_process_queue().submit(sleep, 3)
get_process_queue().submit(sleep, 3)
get_process_queue().join()
print(get_process_queue().jobs)
delta = (datetime.now() - n).total_seconds()
print(delta)
assert 5 < delta < 7

server/requirements.txt (deleted)

@@ -1,4 +0,0 @@
Paste==1.7.5.1
SQLAlchemy==0.8.3
bottle==0.11.6
wsgiref==0.1.2

server/server.py (deleted)

@@ -1,418 +0,0 @@
import os
import sys
from datetime import datetime
import logging
from functools import partial
import unicodedata
from bottle import Bottle, request, static_file, redirect, abort, response
import bottle
logger = logging.getLogger("server")
botlog = logging.getLogger("bottle")
botlog.setLevel(logging.INFO)
botlog.addHandler(logging.StreamHandler(sys.stdout))
bottle._stderr = lambda x: botlog.info(x.strip())
from .db import Rec, RecDB
from .processqueue import get_process_queue
from .forge import create_mp3
from .config_manager import get_config
def date_read(s):
return datetime.fromtimestamp(int(s))
def date_write(dt):
return dt.strftime("%s")
def rec_sanitize(rec):
d = rec.serialize()
d["starttime"] = date_write(d["starttime"])
d["endtime"] = date_write(d["endtime"])
return d
class DateApp(Bottle):
"""
This application will expose some date-related functions; it is intended to
be used when you need to know the server's time on the browser
"""
def __init__(self):
Bottle.__init__(self)
self.route("/help", callback=self.help)
self.route("/date", callback=self.date)
self.route("/custom", callback=self.custom)
def date(self):
n = datetime.now()
return {
"unix": n.strftime("%s"),
"isoformat": n.isoformat(),
"ctime": n.ctime(),
}
def custom(self):
n = datetime.now()
if "strftime" not in request.query:
abort(400, 'Need argument "strftime"')
response.content_type = "text/plain"
return n.strftime(request.query["strftime"])
def help(self):
response.content_type = "text/plain"
return (
"/date : get JSON dict containing multiple formats of now()\n"
+ "/custom?strftime=FORMAT : get now().strftime(FORMAT)"
)
class RecAPI(Bottle):
def __init__(self, app):
Bottle.__init__(self)
self._route()
self._app = app
self.db = RecDB(get_config()["DB_URI"])
def _route(self):
self.post("/create", callback=self.create)
self.post("/delete", callback=self.delete)
self.post("/update/<recid:int>", callback=self.update)
self.post("/generate", callback=self.generate)
self.get("/help", callback=self.help)
self.get("/", callback=self.help)
self.get("/get/search", callback=self.search)
self.get("/get/ongoing", callback=self.get_ongoing)
self.get("/get/archive", callback=self.get_archive)
self.get("/jobs", callback=self.running_jobs)
self.get("/jobs/<job_id:int>", callback=self.check_job)
def create(self):
req = dict(request.POST.decode().allitems())
ret = {}
logger.debug("Create request %s " % req)
now = datetime.now()
start = date_read(req["starttime"]) if "starttime" in req else now
name = req["name"] if "name" in req else u""
end = date_read(req["endtime"]) if "endtime" in req else now
rec = Rec(name=name, starttime=start, endtime=end)
ret = self.db.add(rec)
return self.rec_msg(
"Nuova registrazione creata! (id:%d)" % ret.id, rec=rec_sanitize(rec)
)
def delete(self):
req = dict(request.POST.decode().allitems())
logging.info("Server: request delete %s " % (req))
if "id" not in req:
return self.rec_err("No valid ID")
if self.db.delete(req["id"]):
return self.rec_msg("DELETE OK")
else:
return self.rec_err("DELETE error: %s" % (self.db.get_err()))
def update(self, recid):
req = dict(request.POST.decode().allitems())
newrec = {}
now = datetime.now()
if "starttime" not in req:
newrec["starttime"] = now
else:
newrec["starttime"] = date_read(req["starttime"])
if "endtime" not in req:
newrec["endtime"] = now
else:
newrec["endtime"] = date_read(req["endtime"])
if "name" in req:
newrec["name"] = req["name"]
try:
logger.info("prima di update")
result_rec = self.db.update(recid, newrec)
logger.info("dopo update")
except Exception as exc:
return self.rec_err("Errore Aggiornamento", exception=exc)
return self.rec_msg("Aggiornamento completato!", rec=rec_sanitize(result_rec))
def generate(self):
# prendiamo la rec in causa
recid = dict(request.POST.decode().allitems())["id"]
rec = self.db._search(_id=recid)[0]
if rec.filename is not None and os.path.exists(rec.filename):
return {
"status": "ready",
"message": "The file has already been generated at %s" % rec.filename,
"rec": rec,
}
if (
get_config()["FORGE_MAX_DURATION"] > 0
and (rec.endtime - rec.starttime).total_seconds()
> get_config()["FORGE_MAX_DURATION"]
):
response.status = 400
return {
"status": "error",
"message": "The requested recording is too long"
+ " (%d seconds)" % (rec.endtime - rec.starttime).total_seconds(),
}
rec.filename = get_config()["AUDIO_OUTPUT_FORMAT"] % {
"time": rec.starttime.strftime(
"%y%m%d_%H%M"
), # kept for retrocompatibility, should be dropped
"endtime": rec.endtime.strftime("%H%M"),
"startdt": rec.starttime.strftime("%y%m%d_%H%M"),
"enddt": rec.endtime.strftime("%y%m%d_%H%M"),
"name": "".join(
filter(
lambda c: c.isalpha(),
unicodedata.normalize("NFKD", rec.name)
.encode("ascii", "ignore")
.decode("ascii"),
)
),
}
self.db.get_session(rec).commit()
job_id = self._app.pq.submit(
create_mp3,
start=rec.starttime,
end=rec.endtime,
outfile=os.path.join(get_config()["AUDIO_OUTPUT"], rec.filename),
options={
"title": rec.name,
"license_uri": get_config()["TAG_LICENSE_URI"],
"extra_tags": get_config()["TAG_EXTRA"],
},
)
logger.debug("SUBMITTED: %d" % job_id)
return self.rec_msg(
"Aggiornamento completato!",
job_id=job_id,
result="/output/" + rec.filename,
rec=rec_sanitize(rec),
)
def check_job(self, job_id):
try:
job = self._app.pq.check_job(job_id)
except ValueError:
abort(400, "job_id not valid")
def ret(status):
return {"job_status": status, "job_id": job_id}
if job is True:
return ret("DONE")
if job is False:
abort(404, "No such job has ever been spawned")
else:
if job.ready():
try:
res = job.get()
return res
except Exception as exc:
r = ret("FAILED")
r["exception"] = str(exc)
import traceback
tb = traceback.format_exc()
logger.warning(tb)
if get_config()["DEBUG"]:
r["exception"] = "%s: %s" % (str(exc), tb)
r["traceback"] = tb
return r
return ret("WIP")
def running_jobs(self):
res = {}
res["last_job_id"] = self._app.pq.last_job_id
res["running"] = self._app.pq.jobs.keys()
return res
def search(self, args=None):
req = dict()
req.update(request.GET.allitems())
logger.debug("Search request: %s" % (req))
values = self.db._search(**req)
from pprint import pprint
logger.debug("Returned Values %s" % pprint([r.serialize() for r in values]))
ret = {}
for rec in values:
ret[rec.id] = rec_sanitize(rec)
logging.info("Return: %s" % ret)
return ret
def get_ongoing(self):
return {rec.id: rec_sanitize(rec) for rec in self.db.get_ongoing()}
def get_archive(self):
return {rec.id: rec_sanitize(rec) for rec in self.db.get_archive_recent()}
# @route('/help')
def help(self):
return "<h1>help</h1><hr/>\
<h2>/get, /get/, /get/<id> </h2>\
<h3>Get Info about rec identified by ID </h3>\
\
<h2>/search, /search/, /search/<key>/<value></h2>\
<h3>Search rec that match key/value (or get all)</h3>\
\
<h2>/delete/<id> </h2>\
<h3>Delete rec identified by ID </h3>\
<h2>/update </h2>\
<h3>Not implemented.</h3>"
# JSON UTILS
def rec_msg(self, msg, status=True, **kwargs):
d = {"message": msg, "status": status}
d.update(kwargs)
return d
def rec_err(self, msg, **kwargs):
return self.rec_msg(msg, status=False, **kwargs)
class RecServer:
def __init__(self):
self._app = Bottle()
self._app.pq = get_process_queue()
self._route()
self.db = RecDB(get_config()["DB_URI"])
def _route(self):
# Static part of the site
self._app.route(
"/output/<filepath:path>",
callback=lambda filepath: static_file(
filepath, root=get_config()["AUDIO_OUTPUT"], download=True
),
)
self._app.route(
"/static/<filepath:path>",
callback=lambda filepath: static_file(
filepath, root=get_config()["STATIC_FILES"]
),
)
self._app.route("/", callback=lambda: redirect("/new.html"))
self._app.route(
"/new.html",
callback=partial(
static_file, "new.html", root=get_config()["STATIC_PAGES"]
),
)
self._app.route(
"/old.html",
callback=partial(
static_file, "old.html", root=get_config()["STATIC_PAGES"]
),
)
self._app.route(
"/archive.html",
callback=partial(
static_file, "archive.html", root=get_config()["STATIC_PAGES"]
),
)
class DebugAPI(Bottle):
"""
This application is useful for testing the webserver itself
"""
def __init__(self):
Bottle.__init__(self)
self.route("/sleep/:milliseconds", callback=self.sleep)
self.route("/cpusleep/:howmuch", callback=self.cpusleep)
self.route("/big/:exponent", callback=self.big)
def sleep(self, milliseconds):
import time
time.sleep(int(milliseconds) / 1000.0)
return "ok"
def cpusleep(self, howmuch):
out = ""
for i in xrange(int(howmuch) * (10 ** 3)):
if i % 11234 == 0:
out += "a"
return out
def big(self, exponent):
"""
returns a 2**n -1 string
"""
for i in xrange(int(exponent)):
yield str(i) * (2 ** i)
def help(self):
response.content_type = "text/plain"
return """
/sleep/<int:milliseconds> : sleep, than say "ok"
/cpusleep/<int:howmuch> : busysleep, than say "ok"
/big/<int:exponent> : returns a 2**n -1 byte content
"""
class PasteLoggingServer(bottle.PasteServer):
def run(self, handler): # pragma: no cover
from paste import httpserver
from paste.translogger import TransLogger
handler = TransLogger(handler, **self.options["translogger_opts"])
del self.options["translogger_opts"]
httpserver.serve(handler, host=self.host, port=str(self.port), **self.options)
bottle.server_names["pastelog"] = PasteLoggingServer
def main_cmd(*args):
"""meant to be called from argparse"""
c = RecServer()
c._app.mount("/date", DateApp())
c._app.mount("/api", RecAPI(c._app))
if get_config()["DEBUG"]:
c._app.mount("/debug", DebugAPI())
server = get_config()["WSGI_SERVER"]
if server == "pastelog":
from paste.translogger import TransLogger
get_config()["WSGI_SERVER_OPTIONS"]["translogger_opts"] = get_config()[
"TRANSLOGGER_OPTS"
]
c._app.run(
server=server,
host=get_config()["HOST"],
port=get_config()["PORT"],
debug=get_config()["DEBUG"],
quiet=True, # this is to hide access.log style messages
**get_config()["WSGI_SERVER_OPTIONS"]
)
if __name__ == "__main__":
from cli import common_pre
common_pre()
logger.warn("Usage of server.py is deprecated; use cli.py")
main_cmd()
# vim: set ts=4 sw=4 et ai ft=python:

setup.cfg Normal file

@ -0,0 +1,8 @@
[flake8]
max-line-length=89
ignore=D
[mypy]
show_error_codes = True
python_version = 3.7
pretty = True


@ -1,28 +1,25 @@
#!/usr/bin/env python
from setuptools import setup
from distutils.core import setup
with open("requirements.txt") as buf:
REQUIREMENTS = [line.strip() for line in buf if line.strip()]
setup(
name="techrec",
version="1.2.0",
description="A Python2 web application "
version="2.0.0a1.dev1",
description="A Python3 web application "
"that assist radio speakers in recording their shows",
long_description=open("README.md").read(),
long_description_content_type="text/markdown",
author="boyska",
author_email="piuttosto@logorroici.org",
packages=["techrec"],
package_dir={"techrec": "server"},
install_requires=["Paste~=3.2", "SQLAlchemy==0.8.3", "bottle~=0.12"],
classifiers=[
"Programming Language :: Python :: 2.7",
"Programming Language :: Python :: 3.7",
],
package_dir={"techrec": "techrec"},
install_requires=REQUIREMENTS,
classifiers=["Programming Language :: Python :: 3.7"],
entry_points={"console_scripts": ["techrec = techrec.cli:main"]},
zip_safe=False,
install_package_data=True,
package_data={"techrec": ["static/**/*", "pages/*.html"]},
test_suite="nose.collector",
setup_requires=["nose>=1.0"],
tests_requires=["nose>=1.0"],
)


@ -1,32 +1,44 @@
import logging
import os
import os.path
import sys
from argparse import ArgumentParser, Action
from argparse import Action, ArgumentParser
from datetime import datetime
import logging
import urllib.request
from . import forge, maint, server
from .config_manager import get_config
logging.basicConfig(stream=sys.stdout)
logger = logging.getLogger("cli")
CWD = os.getcwd()
OK_CODES = [200, 301, 302]
from . import forge
from . import maint
from .config_manager import get_config
from . import server
def is_writable(d):
return os.access(d, os.W_OK)
def check_remote_store(url: str) -> None:
try:
with urllib.request.urlopen(url) as req:
if req.code not in OK_CODES:
logger.warning(f"Audio input {url} not responding")
except Exception as e:
logger.warning(f"Audio input {url} not accessible: {e}")
def pre_check_permissions():
def is_writable(d):
return os.access(d, os.W_OK)
if is_writable(get_config()["AUDIO_INPUT"]):
yield "Audio input '%s' writable" % get_config()["AUDIO_INPUT"]
if not os.access(get_config()["AUDIO_INPUT"], os.R_OK):
yield "Audio input '%s' unreadable" % get_config()["AUDIO_INPUT"]
sys.exit(10)
if is_writable(os.getcwd()):
audio_input = get_config()["AUDIO_INPUT"]
if audio_input.startswith("http://") or audio_input.startswith("https://"):
check_remote_store(audio_input)
else:
if is_writable(audio_input):
yield "Audio input '%s' writable" % audio_input
if not os.access(audio_input, os.R_OK):
yield "Audio input '%s' unreadable" % audio_input
sys.exit(10)
if is_writable(CWD):
yield "Code writable"
if not is_writable(get_config()["AUDIO_OUTPUT"]):
yield "Audio output '%s' not writable" % get_config()["AUDIO_OUTPUT"]
@ -63,8 +75,14 @@ class DateTimeAction(Action):
setattr(namespace, self.dest, parsed_val)
def common_pre():
prechecks = [pre_check_user, pre_check_permissions, pre_check_ffmpeg]
code_dir = os.path.dirname(os.path.realpath(__file__))
def common_pre(nochecks=False):
if nochecks:
prechecks = []
else:
prechecks = [pre_check_user, pre_check_permissions, pre_check_ffmpeg]
configs = ["default_config.py"]
if "TECHREC_CONFIG" in os.environ:
for conf in os.environ["TECHREC_CONFIG"].split(":"):
@ -72,13 +90,14 @@ def common_pre():
continue
path = os.path.realpath(conf)
if not os.path.exists(path):
logger.warn("Configuration file '%s' does not exist; skipping" % path)
logger.warn(
"Configuration file '%s' does not exist; skipping" % path)
continue
configs.append(path)
if getattr(sys, 'frozen', False):
if getattr(sys, "frozen", False):
os.chdir(sys._MEIPASS)
else:
os.chdir(os.path.dirname(os.path.realpath(__file__)))
os.chdir(code_dir)
for conf in configs:
get_config().from_pyfile(conf)


@ -2,15 +2,22 @@
This module contains DB logic
"""
from __future__ import print_function
import logging
import sys
from datetime import datetime, timedelta
import sys
from sqlalchemy import create_engine, Column, Integer, String, DateTime, inspect
from sqlalchemy.orm import sessionmaker
from sqlalchemy import (
Column,
DateTime,
Boolean,
Integer,
String,
create_engine,
inspect,
)
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
from .config_manager import get_config
@ -28,6 +35,8 @@ class Rec(Base):
starttime = Column(DateTime, nullable=True)
endtime = Column(DateTime, nullable=True)
filename = Column(String, nullable=True)
ready = Column(Boolean, default=False)
error = Column(String, nullable=True, default=None)
def __init__(self, name="", starttime=None, endtime=None, filename=None):
self.name = name
@ -43,6 +52,7 @@ class Rec(Base):
"starttime": self.starttime,
"endtime": self.endtime,
"filename": self.filename,
"ready": self.ready,
}
def __repr__(self):
@ -56,6 +66,14 @@ class Rec(Base):
contents += ",Filename: '%s'" % self.filename
return "<Rec(%s)>" % contents
@property
def status(self) -> str:
if self.error is not None:
return 'ERROR'
if self.ready:
return 'DONE'
return 'WIP'
class RecDB:
def __init__(self, uri):
@ -64,7 +82,8 @@ class RecDB:
self.log = logging.getLogger(name=self.__class__.__name__)
logging.getLogger("sqlalchemy.engine").setLevel(logging.FATAL)
logging.getLogger("sqlalchemy.engine.base.Engine").setLevel(logging.FATAL)
logging.getLogger(
"sqlalchemy.engine.base.Engine").setLevel(logging.FATAL)
logging.getLogger("sqlalchemy.dialects").setLevel(logging.FATAL)
logging.getLogger("sqlalchemy.pool").setLevel(logging.FATAL)
logging.getLogger("sqlalchemy.orm").setLevel(logging.FATAL)
@ -169,7 +188,7 @@ class RecDB:
return query.filter(Rec.filename == None)
def _query_saved(self, query=None):
"""Still not saved"""
"""saved, regardless of status"""
if query is None:
query = self.get_session().query(Rec)
return query.filter(Rec.filename != None)


@ -4,24 +4,18 @@ import sys
HOST = "localhost"
PORT = "8000"
# pastelog is just "paste", but customized to accept logging options
WSGI_SERVER = "pastelog"
# these are pastelog-specific options for logging engine
TRANSLOGGER_OPTS = {
"logger_name": "accesslog",
"set_logger_level": logging.WARNING,
"setup_console_handler": False,
}
WSGI_SERVER_OPTIONS = {}
DEBUG = True
DB_URI = "sqlite:///techrec.db"
AUDIO_OUTPUT = "output/"
AUDIO_INPUT = "rec/"
AUDIO_INPUT_BASICAUTH = None # Could be a ("user", "pass") tuple instead
AUDIO_INPUT_FORMAT = "%Y-%m/%d/rec-%Y-%m-%d-%H-%M-%S.mp3"
AUDIO_OUTPUT_FORMAT = "techrec-%(startdt)s-%(endtime)s-%(name)s.mp3"
FORGE_TIMEOUT = 20
FORGE_MAX_DURATION = 3600 * 5
FORGE_VERIFY = False
FORGE_VERIFY_THRESHOLD = 3
FFMPEG_OUT_CODEC = ["-acodec", "copy"]
FFMPEG_OPTIONS = ["-loglevel", "warning", "-n"]
FFMPEG_PATH = "ffmpeg"
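AUDIO_INPUT_FORMAT doubles as both directory layout and filename pattern: forge.py expands it with strftime relative to AUDIO_INPUT. A minimal sketch of that expansion, using the defaults above (`time_to_uri` here mirrors the method of that name in forge.py):

```python
import os
from datetime import datetime

AUDIO_INPUT_FORMAT = "%Y-%m/%d/rec-%Y-%m-%d-%H-%M-%S.mp3"

def time_to_uri(base: str, t: datetime) -> str:
    # strftime expands date fields inside both the directory and file parts
    return os.path.join(base, t.strftime(AUDIO_INPUT_FORMAT))

uri = time_to_uri("rec", datetime(2014, 5, 30, 20, 0, 0))
# rec/2014-05/30/rec-2014-05-30-20-00-00.mp3
```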
@ -33,7 +27,7 @@ TAG_LICENSE_URI = None
# defaults
STATIC_FILES = "static/"
STATIC_PAGES = "pages/"
if getattr(sys, 'frozen', False): # pyinstaller
if getattr(sys, "frozen", False): # pyinstaller
STATIC_FILES = os.path.join(sys._MEIPASS, STATIC_FILES)
STATIC_PAGES = os.path.join(sys._MEIPASS, STATIC_PAGES)
else:

techrec/forge.py Normal file

@ -0,0 +1,226 @@
import asyncio
from aiofiles.os import os as async_os
import logging
import tempfile
import os
from datetime import datetime, timedelta
from subprocess import Popen
from time import sleep
from typing import Callable, Optional
from techrec.config_manager import get_config
from techrec.http_retriever import download
logger = logging.getLogger("forge")
Validator = Callable[[datetime, datetime, str], bool]
def round_timefile(exact: datetime) -> datetime:
"""
This will round the datetime, so to match the file organization structure
"""
return datetime(exact.year, exact.month, exact.day, exact.hour)
def get_files_and_intervals(start, end, rounder=round_timefile):
"""
both arguments are datetime objects
returns an iterator whose elements are (filename, start_cut, end_cut)
Cuts are expressed in seconds
"""
if end <= start:
raise ValueError("end < start!")
while start <= end:
begin = rounder(start)
start_cut = (start - begin).total_seconds()
if end < begin + timedelta(seconds=3599):
end_cut = (begin + timedelta(seconds=3599) - end).total_seconds()
else:
end_cut = 0
yield (begin, start_cut, end_cut)
start = begin + timedelta(hours=1)
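The interval arithmetic above is easy to sanity-check in isolation. A self-contained copy of the logic (same code, no config dependency) cutting a request from 20:50 to 21:10:

```python
from datetime import datetime, timedelta

def round_timefile(exact):
    # hourly files: drop minutes and seconds
    return datetime(exact.year, exact.month, exact.day, exact.hour)

def files_and_intervals(start, end, rounder=round_timefile):
    if end <= start:
        raise ValueError("end < start!")
    while start <= end:
        begin = rounder(start)
        start_cut = (start - begin).total_seconds()
        if end < begin + timedelta(seconds=3599):
            end_cut = (begin + timedelta(seconds=3599) - end).total_seconds()
        else:
            end_cut = 0
        yield (begin, start_cut, end_cut)
        start = begin + timedelta(hours=1)

# 20:50 to 21:10 spans two hourly chunks: skip the first 3000 s of the
# first chunk, drop the last 2999 s of the second
res = list(files_and_intervals(datetime(2014, 5, 30, 20, 50),
                               datetime(2014, 5, 30, 21, 10)))
```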
class InputBackend:
def __init__(self, basepath):
self.base = basepath
self.log = logging.getLogger(self.__class__.__name__)
async def search_files(self, start, end):
# assumption: a day is not split in multiple folder
start_dir = self.parent_dir(self.time_to_uri(start))
end_dir = self.parent_dir(self.time_to_uri(end))
files = {
fpath
for directory in {start_dir, end_dir}
for fpath in await self.list_dir(directory)
}
files_date = [] # tuple of str, datetime
for fpath in files:
try:
dt = self.uri_to_time(fpath)
except Exception as exc:
self.log.debug("Skipping %s: %s", fpath, exc)
continue
if dt > end:
continue
files_date.append((fpath, dt))
# The first file in the list will now be the last chunk to be added.
files_date.sort(key=lambda fpath_dt: fpath_dt[1], reverse=True)
final_files = []
need_to_exit = False
for fpath, dt in files_date:
if need_to_exit:
break
if dt < start:
need_to_exit = True
final_files.insert(0, fpath)
self.log.info("Relevant files: %s", ", ".join(final_files))
return final_files
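The selection logic of search_files can be sketched without any filesystem: given (path, chunk-start) pairs, keep chunks that start before the requested end, walk them newest-first, and stop right after the first chunk that starts before the requested start (it still covers it). `pick_relevant` is a hypothetical condensation of the loop above:

```python
from datetime import datetime

def pick_relevant(files_date, start, end):
    # files_date: (path, chunk_start_datetime) pairs, any order
    candidates = [(f, dt) for f, dt in files_date if dt <= end]
    candidates.sort(key=lambda fd: fd[1], reverse=True)
    final = []
    for f, dt in candidates:
        final.insert(0, f)
        if dt < start:  # this chunk already covers `start`; stop here
            break
    return final

hour = lambda h: datetime(2014, 5, 30, h)
files = [("a19", hour(19)), ("a20", hour(20)), ("a21", hour(21)), ("a22", hour(22))]
picked = pick_relevant(files, datetime(2014, 5, 30, 20, 50),
                       datetime(2014, 5, 30, 21, 10))
# ['a20', 'a21']
```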
async def list_dir(self, path):
raise NotImplementedError()
def parent_dir(self, path):
return os.path.dirname(path)
def time_to_uri(self, time: datetime) -> str:
return os.path.join(
str(self.base),
time.strftime(get_config()["AUDIO_INPUT_FORMAT"])
)
def uri_to_time(self, fpath: str) -> datetime:
return datetime.strptime(
os.path.basename(fpath),
get_config()["AUDIO_INPUT_FORMAT"].split('/')[-1])
async def get_file(self, uri: str) -> str:
return uri
class DirBackend(InputBackend):
def uri_to_relative(self, fpath: str) -> str:
return os.path.relpath(fpath, str(self.base))
async def list_dir(self, path):
# plain os.listdir: the aiofiles alias imported above is not awaited here
# anyway, so use the stdlib call directly
return [os.path.join(path, f) for f in os.listdir(path)]
class HttpBackend(InputBackend):
async def get_file(self, uri: str) -> str:
self.log.info(f"downloading: {uri}")
local = await download(
uri,
basic_auth=get_config()['AUDIO_INPUT_BASICAUTH'],
)
return local
def get_ffmpeg_cmdline(fpaths: list, backend, start: datetime, end: datetime) -> list:
ffmpeg = get_config()["FFMPEG_PATH"]
cmdline = [ffmpeg, "-i", "concat:%s" % "|".join(fpaths)]
cmdline += get_config()["FFMPEG_OUT_CODEC"]
startskip = (start - backend.uri_to_time(fpaths[0])).total_seconds()
if startskip > 0:
cmdline += ["-ss", "%d" % startskip]
cmdline += ["-t", "%d" % (end - start).total_seconds()]
return cmdline
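Combining the interval logic with this function, the resulting cut command for a request spanning two hourly chunks looks like this (a sketch with hypothetical file names, not the real path layout):

```python
from datetime import datetime

FFMPEG = "ffmpeg"
OUT_CODEC = ["-acodec", "copy"]  # the default FFMPEG_OUT_CODEC

def cut_cmdline(fpaths, first_chunk_start, start, end):
    # same shape as get_ffmpeg_cmdline, with config inlined
    cmdline = [FFMPEG, "-i", "concat:%s" % "|".join(fpaths)] + OUT_CODEC
    startskip = (start - first_chunk_start).total_seconds()
    if startskip > 0:
        cmdline += ["-ss", "%d" % startskip]
    cmdline += ["-t", "%d" % (end - start).total_seconds()]
    return cmdline

cmd = cut_cmdline(["rec-20.mp3", "rec-21.mp3"],
                  datetime(2014, 5, 30, 20),
                  datetime(2014, 5, 30, 20, 50),
                  datetime(2014, 5, 30, 21, 10))
# ffmpeg -i concat:rec-20.mp3|rec-21.mp3 -acodec copy -ss 3000 -t 1200
```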
async def create_mp3(
start: datetime,
end: datetime,
outfile: str,
options={},
validator: Optional[Validator] = None,
**kwargs,
):
be = DirBackend(get_config()['AUDIO_INPUT'])
fpaths = await be.search_files(start, end)
# metadata date/time formatted according to
# https://wiki.xiph.org/VorbisComment#Date_and_time
metadata = {}
if outfile.endswith(".mp3"):
metadata["TRDC"] = start.replace(microsecond=0).isoformat()
metadata["RECORDINGTIME"] = metadata["TRDC"]
metadata["ENCODINGTIME"] = datetime.now().replace(
microsecond=0).isoformat()
else:
metadata["DATE"] = start.replace(microsecond=0).isoformat()
metadata["ENCODER"] = "https://git.lattuga.net/techbloc/techrec"
if "title" in options:
metadata["TITLE"] = options["title"]
if options.get("license_uri", None) is not None:
metadata["RIGHTS-DATE"] = start.strftime("%Y-%m")
metadata["RIGHTS-URI"] = options["license_uri"]
if "extra_tags" in options:
metadata.update(options["extra_tags"])
metadata_list = []
for tag, value in metadata.items():
if "=" in tag:
logger.error('Received a tag with "=" inside, skipping')
continue
metadata_list.append("-metadata")
metadata_list.append("%s=%s" % (tag, value))
prefix, suffix = os.path.basename(outfile).split(".", 1)
tmp_file = tempfile.NamedTemporaryFile(
suffix=".%s" % suffix,
prefix="forge-%s" % prefix,
delete=False,
# This is needed to avoid errors with the rename across different mounts
dir=os.path.dirname(outfile),
)
cmd = (
get_ffmpeg_cmdline(fpaths, be, start, end)
+ metadata_list
+ ["-y"]
+ get_config()["FFMPEG_OPTIONS"]
+ [tmp_file.name]
)
logger.info("Running %s", " ".join(cmd))
p = Popen(cmd)
if get_config()["FORGE_TIMEOUT"] == 0:
p.wait()
else:
# don't shadow the `start` argument: the validator below still needs it
waitstart = datetime.now()
while (datetime.now() - waitstart).total_seconds() < get_config()["FORGE_TIMEOUT"]:
p.poll()
if p.returncode is None:
sleep(1)
else:
break
if p.returncode is None:
os.kill(p.pid, 15)
try:
os.remove(tmp_file.name)
except Exception:
pass
raise Exception("timeout") # TODO: make a specific TimeoutError
if p.returncode != 0:
raise OSError("return code was %d" % p.returncode)
if validator is not None and not validator(start, end, tmp_file.name):
os.unlink(tmp_file.name)
return False
os.rename(tmp_file.name, outfile)
return True
def main_cmd(options):
log = logging.getLogger("forge_main")
outfile = os.path.abspath(os.path.join(options.cwd, options.outfile))
log.debug("will forge an mp3 into %s" % (outfile))
asyncio.run(create_mp3(options.starttime, options.endtime, outfile))

techrec/http_retriever.py Normal file

@ -0,0 +1,52 @@
# -*- encoding: utf-8 -*-
import os
from typing import Optional, Tuple
from tempfile import mkdtemp
from logging import getLogger
import aiohttp # type: ignore
CHUNK_SIZE = 2 ** 12
log = getLogger("http")
async def download(
remote: str,
staging: Optional[str] = None,
basic_auth: Optional[Tuple[str, str]] = None,
) -> str:
"""
This will download to AUDIO_STAGING the remote file and return the local
path of the downloaded file
"""
_, filename = os.path.split(remote)
if staging:
base = staging
else:
# if no staging is specified, and you want to clean the storage
# used by techrec: rm -rf /tmp/techrec*
base = mkdtemp(prefix="techrec-", dir="/tmp")
local = os.path.join(base, filename)
session_args = {}
if basic_auth is not None:
session_args["auth"] = aiohttp.BasicAuth(
login=basic_auth[0], password=basic_auth[1], encoding="utf-8"
)
log.debug("Downloading %s with %s options", remote, ",".join(session_args.keys()))
async with aiohttp.ClientSession(**session_args) as session:
async with session.get(remote) as resp:
if resp.status != 200:
raise ValueError(
"Could not download %s: error %d" % (remote, resp.status)
)
with open(local, "wb") as f:
while True:
chunk = await resp.content.read(CHUNK_SIZE)
if not chunk:
break
f.write(chunk)
log.debug("Downloading %s complete", remote)
return local
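The chunked-copy loop is the heart of this coroutine; the same pattern works synchronously, which makes it easy to exercise without a server (`save_stream` is a hypothetical helper, not part of the module):

```python
import io
import os
import tempfile

CHUNK_SIZE = 2 ** 12

def save_stream(reader, local):
    # same chunked-copy pattern as the aiohttp loop above, sync for clarity
    with open(local, "wb") as f:
        while True:
            chunk = reader.read(CHUNK_SIZE)
            if not chunk:
                break
            f.write(chunk)
    return local

staging = tempfile.mkdtemp(prefix="techrec-")
path = save_stream(io.BytesIO(b"\x00" * 10000), os.path.join(staging, "chunk.mp3"))
```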


@ -1,6 +1,7 @@
from __future__ import print_function
import sys
import logging
import sys
from sqlalchemy import inspect


@ -0,0 +1,53 @@
<!DOCTYPE html>
<html>
<head>
<title>TechREC</title>
<link rel="icon" href="/static/img/icon.ico" />
<link rel="stylesheet" type="text/css" href="/static/css/pure-min.css" />
<link rel="stylesheet" type="text/css" href="/static/css/pure-skin-porpora.css" />
<link rel="stylesheet" type="text/css" href="/static/css/jquery-ui.min.css" />
<link rel="stylesheet" type="text/css" href="/static/css/techrec.css">
<link rel="stylesheet" type="text/css" href="/static/css/font-awesome.css" />
<script src="/static/js/jquery-1.9.1.min.js"></script>
<script src="/static/js/jquery-ui.min.js"></script>
<script src="/static/js/jquery.ui.datepicker-it.min.js"></script>
<script src="/static/js/rec.js"></script>
<script src="/static/js/archive.js"></script>
</head>
<body class="pure-skin-porpora">
<div class="pure-menu pure-menu-open pure-menu-horizontal">
<a href="#" class="pure-menu-heading">TechRec</a>
<ul>
<li><a href="new.html">Diretta</a></li>
<li><a href="old.html">Vecchie</a></li>
<li class="pure-menu-selected"><a href="archive.html">Archivio</a></li>
</ul>
</div>
<h1>Registrazioni gi&agrave; completate</h1>
<div id="rec-normal" class="pure-g-r">
<div class="pure-u-1-8"></div>
<div class="pure-u-3-4">
<table width="100%" class="pure-table pure-table-horizontal pure-table-striped"
id="ongoing-recs-table" style="margin-top: 3em;">
<tbody>
<tr>
<th>Nome</th>
<th>Inizio</th>
<th>Durata</th>
<th>File</th>
</tr>
</tbody>
</table>
</div>
<div class="pure-u-1-8"></div>
</div>
</body>
</html>
<!-- vim: set ts=2 sw=2 noet: -->

techrec/requirements.txt Normal file

@ -0,0 +1,8 @@
SQLAlchemy==0.8.3
click==7.1.2
fastapi==0.62.0
h11==0.11.0
pydantic==1.7.3
starlette==0.13.6
typing-extensions==3.7.4.3
uvicorn==0.13.1

techrec/server.py Normal file

@ -0,0 +1,386 @@
#!/usr/bin/env python3
import sys
import logging
import time
import os
import unicodedata
from datetime import datetime
from typing import Optional
from subprocess import check_output
from fastapi import FastAPI, HTTPException, Request, Response, BackgroundTasks
from fastapi.responses import FileResponse, RedirectResponse, JSONResponse
from fastapi.staticfiles import StaticFiles
from pydantic import BaseModel, Field
from .cli import common_pre
from .config_manager import get_config
from .db import Rec, RecDB
from .forge import create_mp3, Validator
logger = logging.getLogger("server")
common_pre(nochecks=('pytest' in sys.argv[0]))
app = FastAPI()
db = None
def date_read(s):
return datetime.fromtimestamp(int(s))
def date_write(dt):
return dt.strftime("%s")
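Note that strftime("%s") is a glibc extension rather than standard Python; int(dt.timestamp()) is the portable spelling of the same epoch conversion and round-trips cleanly with date_read (`date_write_portable` is a hypothetical variant for illustration):

```python
from datetime import datetime

def date_read(s):
    return datetime.fromtimestamp(int(s))

def date_write_portable(dt):
    # int(dt.timestamp()) works on every platform; strftime("%s") does not
    return str(int(dt.timestamp()))

dt = datetime(2021, 9, 29, 0, 30)
roundtrip = date_read(date_write_portable(dt))
```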
def rec_sanitize(rec):
d = rec.serialize()
d["starttime"] = date_write(d["starttime"])
d["endtime"] = date_write(d["endtime"])
return d
@app.on_event("startup")
async def startup_event():
global db
common_pre()
if get_config()["DEBUG"]:
logging.basicConfig(level=logging.DEBUG)
db = RecDB(get_config()["DB_URI"])
@app.get("/date/date")
def date():
n = datetime.now()
return {"unix": n.strftime("%s"), "isoformat": n.isoformat(), "ctime": n.ctime()}
def TextResponse(text: str):
return Response(content=text, media_type="text/plain")
def abort(code, text):
raise HTTPException(status_code=code, detail=text)
@app.get("/date/custom")
def custom(strftime: str = ""):
n = datetime.now()
if not strftime:
abort(400, 'Need argument "strftime"')
return TextResponse(n.strftime(strftime))
@app.get("/date/help")
def help():
return TextResponse(
"/date : get JSON dict containing multiple formats of now()\n"
+ "/custom?strftime=FORMAT : get now().strftime(FORMAT)"
)
class CreateInfo(BaseModel):
starttime: Optional[str] = None
endtime: Optional[str] = None
name: str = ""
@app.post("/api/create")
async def create(req: Optional[CreateInfo] = None):
ret = {}
logger.debug("Create request %s " % req)
if req is None:
req = CreateInfo()
now = datetime.now()
start = date_read(req.starttime) if req.starttime is not None else now
name = req.name
end = date_read(req.endtime) if req.endtime is not None else now
rec = Rec(name=name, starttime=start, endtime=end)
ret = db.add(rec)
return rec_msg(
"Nuova registrazione creata! (id:%d)" % ret.id, rec=rec_sanitize(rec)
)
class DeleteInfo(BaseModel):
id: int
@app.post("/api/delete")
def delete(req: DeleteInfo):
if db.delete(req.id):
return rec_msg("DELETE OK")
else:
return rec_err("DELETE error: %s" % (db.get_err()))
def timefield_factory():
return int(time.time())
TimeField = Field(default_factory=timefield_factory)
class UpdateInfo(BaseModel):
name: str = ""
starttime: int = Field(default_factory=timefield_factory)
endtime: int = Field(default_factory=timefield_factory)
filename: Optional[str] = None
@app.post("/api/update/{recid}")
async def update(recid: int, req: UpdateInfo):
global db
newrec = {}
newrec["starttime"] = date_read(req.starttime)
newrec["endtime"] = date_read(req.endtime)
if req.name:
newrec["name"] = req.name
try:
logger.info("before update")
result_rec = db.update(recid, newrec)
session = db.get_session(result_rec)
session.refresh(result_rec)
logger.info("after update")
except Exception as exc:
return rec_err("Errore Aggiornamento", exception=exc)
return rec_msg("Aggiornamento completato!", rec=rec_sanitize(result_rec))
class GenerateInfo(BaseModel):
id: int
class GenerateResponse(BaseModel):
status: str
message: str
@app.post("/api/generate/{recid}")
async def generate(recid: int, response: Response, background_tasks: BackgroundTasks):
global db
# fetch the rec in question
rec = db._search(_id=recid)[0]
session = db.get_session(rec)
session.refresh(rec)
if rec.ready:
return {
"status": "ready",
"message": "The file has already been generated at %s" % rec.filename,
"rec": rec,
}
if (
get_config()["FORGE_MAX_DURATION"] > 0
and (rec.endtime - rec.starttime).total_seconds()
> get_config()["FORGE_MAX_DURATION"]
):
return JSONResponse(
status_code=400,
content={
"status": "error",
"message": "The requested recording is too long"
+ " (%d seconds)" % (rec.endtime - rec.starttime).total_seconds(),
},
)
rec.filename = get_config()["AUDIO_OUTPUT_FORMAT"] % {
"time": rec.starttime.strftime(
"%y%m%d_%H%M"
), # kept for retrocompatibility, should be dropped
"endtime": rec.endtime.strftime("%H%M"),
"startdt": rec.starttime.strftime("%y%m%d_%H%M"),
"enddt": rec.endtime.strftime("%y%m%d_%H%M"),
"name": "".join(
filter(
lambda c: c.isalpha(),
unicodedata.normalize("NFKD", rec.name)
.encode("ascii", "ignore")
.decode("ascii"),
)
),
}
db.get_session(rec).commit()
background_tasks.add_task(
generate_mp3,
db_id=recid,
start=rec.starttime,
end=rec.endtime,
outfile=os.path.join(get_config()["AUDIO_OUTPUT"], rec.filename),
options={
"title": rec.name,
"license_uri": get_config()["TAG_LICENSE_URI"],
"extra_tags": get_config()["TAG_EXTRA"],
},
)
logger.debug("SUBMITTED: %d" % recid)
return rec_msg(
"Aggiornamento completato!",
job_id=rec.id,
result="/output/" + rec.filename,
rec=rec_sanitize(rec),
)
def get_duration(fname) -> float:
lineout = check_output(
[
"ffprobe",
"-v",
"error",
"-show_entries",
"format=duration",
"-i",
fname,
]
).split(b"\n")
duration = next(l for l in lineout if l.startswith(b"duration="))
value = duration.split(b"=")[1]
return float(value)
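The parsing half of get_duration can be split from the subprocess call, which also makes the expected ffprobe output explicit (`parse_duration` is a hypothetical helper for illustration):

```python
def parse_duration(probe_output: bytes) -> float:
    # same parsing as get_duration, minus the ffprobe invocation
    lines = probe_output.split(b"\n")
    line = next(l for l in lines if l.startswith(b"duration="))
    return float(line.split(b"=")[1])

# typical `ffprobe -show_entries format=duration` output
out = b"[FORMAT]\nduration=3599.913469\n[/FORMAT]\n"
```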
def get_validator(expected_duration_s: float, error_threshold_s: float) -> Validator:
def validator(start, end, fpath):
try:
duration = get_duration(fpath)
except Exception as exc:
logger.exception("Error determining duration of %s", fpath)
return False
logger.debug(
"expect %s to be %.1f±%.1fs, is %.1f",
fpath,
expected_duration_s,
error_threshold_s,
duration,
)
if duration > expected_duration_s + error_threshold_s:
return False
if duration < expected_duration_s - error_threshold_s:
return False
return True
return validator
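The validator contract boils down to "duration within threshold of the requested length". A condensed sketch with a stubbed-out duration probe (`make_validator` is hypothetical, mirroring get_validator above):

```python
def make_validator(expected_s, threshold_s, probe):
    # `probe` stands in for the ffprobe-based get_duration
    def validator(start, end, fpath):
        try:
            duration = probe(fpath)
        except Exception:
            return False
        # equivalent to the two one-sided checks above
        return abs(duration - expected_s) <= threshold_s
    return validator

ok = make_validator(3600.0, 3.0, lambda _fpath: 3598.5)
bad = make_validator(3600.0, 3.0, lambda _fpath: 3590.0)
```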
async def generate_mp3(db_id: int, **kwargs):
"""creates and mark it as ready in the db"""
if get_config()["FORGE_VERIFY"]:
validator = get_validator(
(kwargs["end"] - kwargs["start"]).total_seconds(),
get_config()["FORGE_VERIFY_THRESHOLD"],
)
retries = 10
else:
validator = None
retries = 1
for i in range(retries):
try:
result = await create_mp3(validator=validator, **kwargs)
except Exception as exc:
logger.error("Error creating audio for %d -> %s", db_id, str(exc))
rec = db._search(_id=db_id)[0]
rec.error = str(exc)
db.get_session(rec).commit()
return False
logger.debug("Create mp3 for %d -> %s", db_id, result)
if result:
break
elif i < retries - 1:
logger.debug("waiting %d", i + 1)
import asyncio  # local import, matching the style used elsewhere here
await asyncio.sleep(i + 1)  # waiting time increases at each retry; don't block the event loop
else:
logger.warning("Could not create mp3 for %d: validation failed", db_id)
return False
rec = db._search(_id=db_id)[0]
rec.ready = True
db.get_session(rec).commit()
return True
@app.get("/api/ready/{recid}")
def check_job(recid: int):
rec = db._search(_id=recid)[0]
out = {"job_id": recid, "job_status": rec.status}
return out
@app.get("/api/get/ongoing")
def get_ongoing():
return {rec.id: rec_sanitize(rec) for rec in db.get_ongoing()}
@app.get("/api/get/archive")
def get_archive():
return {rec.id: rec_sanitize(rec) for rec in db.get_archive_recent()}
@app.get("/api/help")
@app.get("/api")
def api_help():
return Response(
media_type="text/html",
content="""
<h1>help</h1><hr/>
<h2>/get, /get/, /get/{id} </h2>
<h3>Get Info about rec identified by ID </h3>
<h2>/search, /search/, /search/{key}/{value}</h2>
<h3>Search rec that match key/value (or get all)</h3>
<h2>/delete/{id} </h2>
<h3>Delete rec identified by ID </h3>
<h2>/update/{id} </h2>
<h3>Not implemented.</h3>
""",
)
# JSON UTILS
def rec_msg(msg, status=True, **kwargs):
d = {"message": msg, "status": status}
d.update(kwargs)
return d
def rec_err(msg, **kwargs):
return rec_msg(msg, status=False, **kwargs)
app.mount("/output", StaticFiles(directory=get_config()["AUDIO_OUTPUT"]))
app.mount("/static", StaticFiles(directory=get_config()["STATIC_FILES"]))
@app.get("/")
def home():
return RedirectResponse("/new.html")
@app.route("/new.html")
@app.route("/old.html")
@app.route("/archive.html")
def serve_pages(request: Request):
page = request.url.path[1:]
fpath = os.path.join(get_config()["STATIC_PAGES"], page)
return FileResponse(fpath)
def main_cmd(options):
import uvicorn
uvicorn.run(app, host=get_config()["HOST"], port=int(get_config()["PORT"]))
if __name__ == "__main__":
logger.warning("Usage of server.py is not supported anymore; use cli.py")
import sys
sys.exit(1)
# vim: set ts=4 sw=4 et ai ft=python:


@ -0,0 +1,55 @@
function delta(end, start) {
//end, start are unix timestamps
var diff = parseInt(end, 10) - parseInt(start, 10); //diff is in seconds
var msec = diff*1000;
var hh = Math.floor(msec / 1000 / 60 / 60);
msec -= hh * 1000 * 60 * 60;
var mm = Math.floor(msec / 1000 / 60);
msec -= mm * 1000 * 60;
var ss = Math.floor(msec / 1000);
msec -= ss * 1000;
if(hh === 0) {
if(mm === 0) {
return ss + 's';
}
return mm + 'min ' + ss + 's';
}
return hh + 'h ' + mm + 'm ' + ss + 's';
}
$(function() {
"use strict";
RecAPI.get_archive().success(function(archive) {
/* To get sorted traversal, we need to do an array containing keys */
var keys = [];
for(var prop in archive) {
keys.push(prop);
}
keys.sort(function(a,b) { return b - a; }); //descending
/* ok, now we can traverse the objects */
for(var i =0; i < keys.length; i++) {
var rec = archive[keys[i]];
console.log(rec);
var name = $('<td/>').text(rec.name);
var start = $('<td/>').text(config.date_read(
parseInt(rec.starttime, 10)).toLocaleString()
);
var duration = $('<td/>').text(delta(rec.endtime, rec.starttime));
var dl_text = $('<span/>').text(" Scarica").addClass('pure-hidden-phone');
var fn = $("<td/>")
if(rec.ready) {
fn.append($("<a/>").prop("href", "/output/" + rec.filename)
.addClass("pure-button pure-button-small")
.html( $("<i/>").addClass("fa fa-download").css("color", "green"))
.append(dl_text));
} else {
fn.html("<small>File not found</small>")
}
var row = $('<tr/>').append(name).append(start).append(duration).append(fn);
row.data('id', rec.id)
$('#ongoing-recs-table tbody').append(row);
}
});
});


@ -19,31 +19,46 @@ var RecAPI = {
create: function () {
return $.ajax('/api/create', {
method: 'POST',
contentType: 'application/json',
data: "{}",
dataType: 'json'
})
},
stop: function (rec) {
return $.post('/api/update/' + rec.id, {
starttime: rec.starttime
})
return $.ajax('/api/update/' + rec.id,
{
method: 'POST',
contentType: 'application/json',
data: JSON.stringify({
starttime: parseInt(rec.starttime, 10)
})
})
},
update: function (id, data) {
return $.post('/api/update/' + id, data)
return $.ajax(
'/api/update/' + data.id, {
method: 'POST',
dataType: 'json',
contentType: 'application/json',
data: JSON.stringify(data)
})
},
fullcreate: function (name, start, end) {
return $.ajax(
'/api/create', {
method: 'POST',
dataType: 'json',
data: { name: name,
contentType: 'application/json',
data: JSON.stringify({ name: name,
starttime: config.date_write(start),
endtime: config.date_write(end)
}
})
})
},
generate: function (rec) {
return $.post('/api/generate', {
id: rec.id
return $.ajax('/api/generate/' + rec.id, {
method: 'POST',
dataType: 'json',
})
},
get_archive: function () {
@ -55,7 +70,7 @@ var RecAPI = {
}
function poll_job (job_id, callback) {
$.getJSON('/api/jobs/' + job_id)
$.getJSON('/api/ready/' + job_id)
.done(function (data) {
if (data.job_status !== 'WIP') {
console.log('polling completed for job[' + job_id + ']', data)


@ -1,13 +1,11 @@
from datetime import datetime, timedelta
from nose.tools import raises, eq_
import pytest
from pytest import raises
from .forge import (
get_files_and_intervals,
get_timefile_exact,
round_timefile,
get_timefile,
mp3_join,
)
from .config_manager import get_config
@ -21,6 +19,10 @@ get_config()["FFMPEG_PATH"] = "ffmpeg"
get_config()["FFMPEG_OUT_CODEC"] = ["-acodec", "copy"]
def eq_(a, b):
assert a == b, "%r != %r" % (a, b)
def minutes(n):
return timedelta(minutes=n)
@ -29,16 +31,8 @@ def seconds(n):
return timedelta(seconds=n)
# timefile
def test_timefile_exact():
eq_(get_timefile_exact(eight), "2014-05/30/2014-05-30-20-00-00.mp3")
# Rounding
def test_rounding_similarity():
eq_(round_timefile(eight), round_timefile(eight + minutes(20)))
assert round_timefile(eight) != round_timefile(nine)
@ -49,28 +43,17 @@ def test_rounding_value():
eq_(round_timefile(eight + minutes(20)), eight)
# Rounding + timefile
def test_timefile_alreadyround():
eq_(get_timefile(eight), "2014-05/30/2014-05-30-20-00-00.mp3")
def test_timefile_toround():
eq_(get_timefile(eight + minutes(20)), "2014-05/30/2014-05-30-20-00-00.mp3")
# Intervals
@raises(ValueError)
def test_intervals_same():
tuple(get_files_and_intervals(eight, eight))
with raises(ValueError):
tuple(get_files_and_intervals(eight, eight))
@raises(ValueError)
def test_intervals_before():
tuple(get_files_and_intervals(nine, eight))
with raises(ValueError):
tuple(get_files_and_intervals(nine, eight))
def test_intervals_full_1():
@ -163,39 +146,3 @@ def test_intervals_left_2():
eq_(res[1][2], 3599)
# MP3 Join
def test_mp3_1():
eq_(" ".join(mp3_join((("a", 0, 0),))), "ffmpeg -i concat:a -acodec copy")
def test_mp3_1_left():
eq_(" ".join(mp3_join((("a", 160, 0),))), "ffmpeg -i concat:a -acodec copy -ss 160")
def test_mp3_1_right():
eq_(
" ".join(mp3_join((("a", 0, 1600),))), "ffmpeg -i concat:a -acodec copy -t 2000"
)
def test_mp3_1_leftright():
eq_(
" ".join(mp3_join((("a", 160, 1600),))),
"ffmpeg -i concat:a -acodec copy -ss 160 -t 1840",
)
def test_mp3_2():
eq_(
" ".join(mp3_join((("a", 0, 0), ("b", 0, 0)))),
"ffmpeg -i concat:a|b -acodec copy",
)
def test_mp3_2_leftright():
eq_(
" ".join(mp3_join((("a", 1000, 0), ("b", 0, 1600)))),
"ffmpeg -i concat:a|b -acodec copy -ss 1000 -t 4600",
)