From 811a385be8914c3e26d4aa4a530043b1b1714b5f Mon Sep 17 00:00:00 2001
From: Lea
Date: Sun, 1 Oct 2023 14:42:49 +0200
Subject: [PATCH] forgot the example

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 8f2b363..cd28943 100644
--- a/README.md
+++ b/README.md
@@ -63,7 +63,7 @@ On Linux, real cameras are UVC devices and `mjpeg-streamer` can work with these
 ## The solution
 
 1. Let's just replace `mjpeg-streamer`. After searching for a bit, I found [`mjpeg-server`](https://github.com/blueimp/mjpeg-server) which does exactly what I need. \
-Instead of using a video device as input, it simply executes a command (for example) and streams the output from that. \
+Instead of using a video device as input, it simply executes a command (for example ffmpeg) and streams the output from that. \
 The one downside to using this is that there is no way to capture a single frame, which means that timelapses will no longer be usable. I might look into forking it and adding snapshot support in the future.
 2. Instead of ffmpeg, I opted to use gstreamer. After consulting a good friend of mine (ChatGPT), I managed to construct a gstreamer pipeline that transcodes the video to an MJPEG stream - While using a hardware accelerated transcoder!
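
For context, a minimal sketch of the setup the patched README describes: `mjpeg-server` executing a gstreamer pipeline and serving its output. None of the specifics below appear in the patch itself; the device path `/dev/video0`, the resolution caps, the boundary string, and the `vaapijpegenc` element (from gstreamer-vaapi, for hardware-accelerated JPEG encoding via VA-API) are all assumptions.

```sh
# Hypothetical invocation: mjpeg-server runs the given command and streams
# its stdout to HTTP clients. The gst-launch pipeline captures raw frames
# from a UVC camera, JPEG-encodes them on the GPU, and wraps each frame in
# a multipart boundary. -q keeps gst-launch's status messages off stdout so
# only the stream is written there. The boundary value is a placeholder and
# would need to match what the server advertises in its Content-Type header.
mjpeg-server -- gst-launch-1.0 -q \
  v4l2src device=/dev/video0 \
  ! video/x-raw,width=1280,height=720,framerate=30/1 \
  ! videoconvert \
  ! vaapijpegenc \
  ! multipartmux boundary=mjpegboundary \
  ! fdsink fd=1
```

On hardware without VA-API, swapping `vaapijpegenc` for the software `jpegenc` element keeps the same pipeline shape, just without the hardware acceleration the README is after.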