forgot the example
parent 7b5c842b8f, commit 811a385be8
@@ -63,7 +63,7 @@ On Linux, real cameras are UVC devices and `mjpeg-streamer` can work with these

## The solution

1. Let's just replace `mjpeg-streamer`. After searching for a bit, I found [`mjpeg-server`](https://github.com/blueimp/mjpeg-server) which does exactly what I need. \
-   Instead of using a video device as input, it simply executes a command (for example) and streams the output from that. \
+   Instead of using a video device as input, it simply executes a command (for example ffmpeg) and streams the output from that. \
   The one downside to using this is that there is no way to capture a single frame, which means that timelapses will no longer be usable. I might look into forking it and adding snapshot support in the future (an example invocation is sketched after this list).
2. Instead of ffmpeg, I opted to use gstreamer. After consulting a good friend of mine (ChatGPT), I managed to construct a gstreamer pipeline that transcodes the video to an MJPEG stream, while using a hardware-accelerated transcoder! A possible shape of that pipeline is sketched below.
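
To make the idea concrete, here is a rough sketch of option 1 with ffmpeg as the capture command. This is my illustration, not the setup from the post: the camera URL and quality settings are placeholders, and I'm assuming the `mjpeg-server <command> [args...]` invocation form, so check the project's README for its exact flags (listen address, path, and so on).

```sh
# Hypothetical example (placeholders, not from the post): run ffmpeg as the
# capture command and let mjpeg-server serve whatever it writes to stdout.
# ffmpeg re-encodes the source to JPEG frames and wraps them with the mpjpeg
# muxer, which writes a multipart MJPEG stream to stdout ("-").
mjpeg-server ffmpeg \
  -loglevel error \
  -i rtsp://camera.local/stream \
  -c:v mjpeg -q:v 5 \
  -f mpjpeg -
```

On a small board, that software MJPEG encode is exactly the CPU cost that motivates the gstreamer variant below.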
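
And a guess at what the hardware-accelerated gstreamer pipeline could look like. The post doesn't show the actual pipeline, so every element choice here is an assumption: `v4l2src` as the source, `v4l2jpegenc` as the V4L2 M2M hardware JPEG encoder (available on boards like the Raspberry Pi; fall back to the software `jpegenc` if it isn't exposed), and `multipartmux` plus `fdsink` to write a multipart MJPEG stream to stdout for mjpeg-server.

```sh
# Hypothetical pipeline (my sketch, not the author's): capture raw video,
# JPEG-encode it on the hardware encoder, wrap the frames in a multipart
# stream and write it to stdout (fd 1) so mjpeg-server can serve it.
# -q keeps gst-launch's progress messages off stdout; the boundary string
# mimics ffmpeg's mpjpeg output and may need adjusting for mjpeg-server.
mjpeg-server gst-launch-1.0 -q \
  v4l2src device=/dev/video0 \
  ! videoconvert \
  ! v4l2jpegenc \
  ! multipartmux boundary=ffmpeg \
  ! fdsink fd=1
```

Whether `v4l2jpegenc` (or another accelerated encoder) is actually available depends on the kernel and the installed GStreamer plugins, so `gst-inspect-1.0 | grep jpeg` is a reasonable first check.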