diff --git a/README.md b/README.md
index 8f2b363..cd28943 100644
--- a/README.md
+++ b/README.md
@@ -63,7 +63,7 @@ On Linux, real cameras are UVC devices and `mjpeg-streamer` can work with these
 ## The solution
 1. Let's just replace `mjpeg-streamer`. After searching for a bit, I found [`mjpeg-server`](https://github.com/blueimp/mjpeg-server) which does exactly what I need. \
-Instead of using a video device as input, it simply executes a command (for example) and streams the output from that. \
+Instead of using a video device as input, it simply executes a command (for example ffmpeg) and streams the output from that. \
 The one downside to using this is that there is no way to capture a single frame, which means that timelapses will no longer be usable. I might look into forking it and adding snapshot support in the future.
 2. Instead of ffmpeg, I opted to use gstreamer. After consulting a good friend of mine (ChatGPT), I managed to construct a gstreamer pipeline that transcodes the video to an MJPEG stream - While using a hardware accelerated transcoder!
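
The actual gstreamer pipeline is not included in this diff. As an illustration only, a software-encoded variant of the idea — mjpeg-server executing a `gst-launch-1.0` command and streaming its stdout — might look like the sketch below. The device path `/dev/video0`, the boundary string, and the use of `multipartmux` are assumptions; the hardware-accelerated JPEG encoder element differs per platform (e.g. `v4l2jpegenc` or `vaapijpegenc` instead of the software `jpegenc`), and the exact framing mjpeg-server expects on stdin should be checked against its README.

```shell
# Hypothetical sketch: run a gstreamer pipeline under mjpeg-server.
# -q keeps gst-launch's progress messages off stdout so only the
# stream reaches mjpeg-server; fdsink fd=1 writes frames to stdout.
mjpeg-server gst-launch-1.0 -q \
  v4l2src device=/dev/video0 \
  ! videoconvert \
  ! jpegenc \
  ! multipartmux boundary=gstreamer-mjpeg \
  ! fdsink fd=1
```

On hardware with an accelerated encoder, swapping `jpegenc` for the platform's element is the only change to the pipeline itself.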