add setup / descriptions

parent b2f3c7a521
commit 7b5c842b8f

README.md

@@ -1,8 +1,144 @@
# Droidcam with OctoPrint

This expects that you're using OctoPrint's Raspberry Pi image; for other systems you'll probably have to adapt a bit.

Also, timelapses will not work, as `mjpeg-server` doesn't currently have a way to fetch a single snapshot.

## Why?

OctoPrint ships with `mjpeg-streamer`, which takes a video device (e.g. `/dev/video0`) as input and runs an HTTP server that serves that video as an MJPEG stream. \
For this, the camera must support MJPEG natively. Most normal cameras do, but DroidCam doesn't:

<details>
<summary>List of supported output formats for the DroidCam virtual video device (click to expand)</summary>

```bash
$ sudo v4l2-ctl -d /dev/video1 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'YU12' (Planar YUV 4:2:0)
        Size: Discrete 1280x720
            Interval: Discrete 0.033s (30.000 fps)
```
</details>

The only output option is `YU12`, which we will need to transcode to an MJPEG sequence.

<details>
<summary>For comparison, this is the list of supported output formats on my Logitech webcam.</summary>

```bash
sudo v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'YUYV' (YUYV 4:2:2)
        Size: Discrete 1920x960
            Interval: Discrete 0.019s (54.000 fps)
            Interval: Discrete 0.022s (45.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
    [1]: 'MJPG' (Motion-JPEG, compressed)
        Size: Discrete 1920x960
            Interval: Discrete 0.019s (54.000 fps)
            Interval: Discrete 0.022s (45.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
```

Note the `MJPG` option. This is what we need. This camera would work out of the box with OctoPrint's `mjpeg-streamer`.
</details>

## What didn't work

So, the solution seemed simple, right? Just create a v4l2 loopback device (let's call it `/dev/video1`), run ffmpeg with the DroidCam camera stream (let's say `/dev/video0`) as input, and transcode its output into `/dev/video1`.

In theory, this would work. In practice, however, there are two issues:

1. `mjpeg-streamer` is garbage. It is written in C, is (from what I can tell) not actively maintained, and has weird issues with v4l2 video devices. \
   On Linux, real cameras are UVC devices, and `mjpeg-streamer` works with these just fine. However, emulated video devices (created with v4l2loopback) appear to behave ever so slightly differently, meaning `mjpeg-streamer` can't read from our ffmpeg output (or from the DroidCam output, even if it did support MJPEG directly).
2. ffmpeg obliterates the Pi 4's CPU. Transcoding a 720p (or even 480p) stream in the background resulted in frequent video freezes and sometimes caused OctoPrint itself to freeze for a few seconds. This is most likely because ffmpeg doesn't have a hardware MJPEG encoder that works on the Pi.

## The solution

1. Let's just replace `mjpeg-streamer`. After searching for a bit, I found [`mjpeg-server`](https://github.com/blueimp/mjpeg-server), which does exactly what I need. \
   Instead of reading from a video device, it simply executes a command (a gstreamer pipeline, for example) and streams that command's output. \
   The one downside is that there is no way to capture a single frame, which means that timelapses will no longer be usable. I might look into forking it and adding snapshot support in the future.
2. Instead of ffmpeg, I opted for gstreamer. After consulting a good friend of mine (ChatGPT), I managed to construct a gstreamer pipeline that transcodes the video to an MJPEG stream - while using a hardware-accelerated encoder! (For comparison, the old ffmpeg command is shown after this list.)

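
For the curious, this is roughly what the ffmpeg variant looked like - it's the old command that this commit removes (as a comment) from `droidcam-streamer.service` further down:

```bash
# Old approach: ffmpeg reads the loopback device and writes an mpjpeg stream to
# stdout for mjpeg-server to serve. It works, but it hammers the Pi 4's CPU.
/home/pi/go/bin/mjpeg-server -a ":8080" -- ffmpeg -i /dev/video1 -fflags nobuffer -an -r 15 -f mpjpeg -q 2 -
```
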
## How to set it up

Disable the existing `mjpeg-streamer` service so we can run our own instead:

```bash
sudo systemctl disable --now webcamd.service
```

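
If you want to double-check that the stock streamer is really gone before continuing (service name as above):

```bash
# Should report the unit as disabled and "inactive (dead)"
systemctl status webcamd.service
```
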
Clone and compile the DroidCam client. DroidCam doesn't provide arm64 builds, so we need to compile it ourselves.

```bash
# Install dependencies
sudo apt install libturbojpeg0{,-dev} libavutil-dev libswscale-dev libasound2-dev libspeex-dev libusbmuxd-dev libplist-dev usbmuxd

# Clone and build
cd ~
git clone https://github.com/dev47apps/droidcam
cd droidcam
make droidcam-cli
sudo cp ./droidcam-cli /usr/bin/droidcam-cli
```

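
A quick sanity check that the binary landed where we expect and was built for the right architecture (path as in the `cp` above):

```bash
# Should report an ARM aarch64 ELF executable on the Pi 4
file /usr/bin/droidcam-cli
```
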
Set up v4l2loopback, which DroidCam requires.

```bash
sudo apt install v4l2loopback-dkms

# Configure the module to load on startup
echo "v4l2loopback" | sudo tee -a /etc/modules
echo "options v4l2loopback devices=1 video_nr=1" | sudo tee /etc/modprobe.d/v4l2loopback.conf

# Load the module now with the same configuration
sudo modprobe v4l2loopback devices=1 video_nr=1
```

Note how we configure it to provide the emulated video device `/dev/video1`, not `/dev/video0` - this should prevent it from breaking should a real camera be plugged in at some point.

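
To confirm the loopback device came up where we configured it (device number as above):

```bash
# The v4l2loopback device should show up, usually labelled something like "Dummy video device"
v4l2-ctl --list-devices
ls -l /dev/video1
```
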
Next, install mjpeg-server.

```bash
sudo apt install golang
go get github.com/blueimp/mjpeg-server

# Verify that it was built and installed correctly.
# If you get "No such file or directory", run the `go get` command again.
~/go/bin/mjpeg-server --help
```

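
Note: on newer Go versions (1.17 and later), `go get` no longer builds and installs binaries. If your distro ships a newer Go and the command above refuses to install anything, the module-aware equivalent should work instead:

```bash
go install github.com/blueimp/mjpeg-server@latest
```
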
You should now have everything you need. Let's verify:

> These instructions use DroidCam on an iOS device over USB. If you're using an Android phone or want to stream wirelessly, run `droidcam-cli --help` and adapt the command accordingly. \
> Unless your setup matches mine exactly, you will also have to modify the DroidCam command in a systemd service file later on. \
> Tip: You can change the port in DroidCam's settings so that it matches my instructions.

In one terminal, run DroidCam. If using the phone wired, you might have to reattach it once.

```bash
# Resolutions higher than 1280x720 may work, but for me they resulted in messed-up graphics at the full refresh rate.
# You'd need to adapt the gstreamer pipeline to lower the framerate in that case, but I prefer smoothness over resolution.
# /dev/video1 is the v4l2loopback device we created earlier; unless you used a different device number, you don't need to change this.
droidcam-cli -nocontrols -dev=/dev/video1 -size=1280x720 ios 57192
```

While the first command is running, open a second terminal and run this:

```bash
# It is important that we use port 8080 since OctoPrint proxies localhost:8080 to `/webcam` on the OctoPrint HTTP server.
# For `-b gaysex` and `multipartmux boundary=gaysex`, "gaysex" can be changed to any other string, as long as they are identical for both sides. This is just used to signal where the next JPEG frame starts.
/home/pi/go/bin/mjpeg-server -a ":8080" -b gaysex -- gst-launch-1.0 -v v4l2src device=/dev/video1 ! videoconvert ! video/x-raw,format=I420 ! v4l2jpegenc ! multipartmux boundary=gaysex ! filesink location=/dev/stdout
```

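
A quick way to check the stream from the Pi itself without opening the OctoPrint UI (the port matches the `-a ":8080"` argument above; that the stream is served at the root path is my assumption about mjpeg-server's defaults):

```bash
# Dump the response headers for a couple of seconds; you should see a
# multipart/x-mixed-replace content type with the boundary configured above.
curl -s --max-time 2 -D - -o /dev/null http://localhost:8080/
```
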
Now, if you open the OctoPrint UI, you should see your camera appear! Didn't work? Good luck. Otherwise, let's finish this up.

Follow the instructions in [services/](./services/) to install the systemd service files. If you've changed the `droidcam-cli` command above, make sure to also modify it in `droidcam.service`. You most likely don't need to change `droidcam-streamer.service`.

Finally, reboot the system for good measure and check that everything works! If using an iPhone as the DroidCam client, I recommend setting up [Guided Access](https://support.apple.com/en-us/HT202612) to make sure the screen stays on even when it's not actively streaming and to prevent others from messing with the phone. On Android, [App Pinning](https://duck.com/?q=android+app+pinning) can be used instead.

You should also enable dimming in DroidCam's settings so that the screen doesn't have to stay at full brightness all the time.

### Links

@@ -20,3 +156,9 @@ v4l2loopback
```
options v4l2loopback devices=1 video_nr=1
```

# Extra: Phone mount

Should you be in need of a mount to prop your phone up, I have included the OpenSCAD files for the mount I designed for myself. It works out of the box with my iPhone 11 in my specific rubber case; you'll probably have to configure it differently.

The files and instructions on what values to change are in [phone-mount/](./phone-mount/).

phone-mount/.assets/back-details.jpg (new binary file, 603 KiB)
phone-mount/.assets/back.jpg (new binary file, 2.1 MiB)
phone-mount/.assets/front-details.jpg (new binary file, 120 KiB)
phone-mount/.assets/front.jpg (new binary file, 1.6 MiB)

phone-mount/README.md

@@ -0,0 +1,32 @@
# OpenSCAD files for my phone mount

<img src="./.assets/front.jpg" height=600 /> <img src="./.assets/back.jpg" height=600 />

### phone mount.scad

This is the part that clips onto your phone. Adjust the following variables to fit your phone (use calipers to measure distances):

- depth
- height
- angle

<img src="./.assets/back-details.jpg" height=600 />

### base mount.scad

This is the piece that goes on your table. Adjust the following variables:

- table_thickness
- grabber_length

<img src="./.assets/front-details.jpg" height=600 />

I also recommend taping this down so it won't fall off if you bump it.

### rod.scad

The rod that connects the two pieces. Ideally, you'd use something you already have instead of printing a long tube.

In the previous two files, you can adjust `hole_depth` to fit whatever you're using as a rod. Make sure to add 1-2 mm of tolerance.

I'm not using this specific model because I already had a fitting rod from a previous project, but including it might save you the two minutes of modeling one yourself.

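
If you'd rather render the STLs from the command line than from the OpenSCAD GUI, something like this should do it (file names as in this directory):

```bash
openscad -o "phone mount.stl" "phone mount.scad"
openscad -o "base mount.stl" "base mount.scad"
openscad -o rod.stl rod.scad
```
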

phone-mount/base mount.scad

@@ -0,0 +1,30 @@
$fn = 25;

table_thickness = 38;
holder_width = 50;
grabber_length = 40;
thickness = 4;
hole_diameter = 21;
hole_depth = 30;

// C-shaped clamp that grabs the table edge: a vertical back plate plus two grabber plates
cube([holder_width, thickness, table_thickness + thickness*2]);
cube([holder_width, grabber_length + thickness, thickness]);
translate([0, 0, table_thickness + thickness]) {
    cube([holder_width, grabber_length + thickness, thickness]);
}

// Hole (socket) that the rod slots into
// Honestly not sure where some of these values come from
translate([holder_width / 2, -(hole_diameter + thickness) / 2, table_thickness - hole_depth/2 + 2*thickness]) {
    difference() {
        group() {
            cylinder(d = hole_diameter + thickness, h = hole_depth, center = true);
            translate([-(hole_diameter + thickness) / 2, 0, -hole_depth / 2]) {
                cube([hole_diameter + thickness, hole_diameter / 2 + thickness + 0.5, hole_depth]);
            }
        }
        translate([0, 0, thickness]) {
            cylinder(d = hole_diameter, h = hole_depth, center = true);
        }
    }
}

phone-mount/phone mount.scad

@@ -0,0 +1,62 @@
$fn = 25;

width = 153;
height = 80;
depth = 10;
holder_width = 50;
thickness = 2;
hole_diameter = 21;
hole_depth = 30;
angle = 7.5;

// stolen from openscad wiki
module prism(l, w, h) {
    polyhedron(//pt 0 1 2 3 4 5
        points=[[0,0,0], [l,0,0], [l,w,0], [0,w,0], [0,w,h], [l,w,h]],
        faces=[[0,1,2,3],[5,4,3,2],[0,4,5,1],[0,3,4],[5,2,1]]
    );
}

// iphone - uncomment the next line to preview the phone's outline while tuning the variables
// # cube([width, depth, height]);

translate([holder_width / 2, -thickness, -thickness]) {
    cube([holder_width, thickness, height + 2*thickness]);

    // Top grabber
    translate([0, 0, height + thickness]) {
        cube([holder_width, depth + 2*thickness, thickness]);
        translate([0, depth + thickness, -thickness]) {
            cube([holder_width, thickness, 2*thickness]);
        }
    }

    // Bottom grabber
    translate([0, 0, 0]) {
        cube([holder_width, depth + 2*thickness, thickness]);
        translate([0, depth + thickness, 0]) {
            cube([holder_width, thickness, 2*thickness]);
        }
    }

    // Hole (socket) for the rod, tilted by `angle`
    w = (hole_depth) * sin(angle) / sin(90 - angle);
    rotate([-angle, 0, 0])
    translate([holder_width / 2, -(hole_diameter + thickness*2) / 2 - w, hole_depth / 2]) {
        difference() {
            group() {
                cylinder(d = hole_diameter + thickness*2, h = hole_depth, center = true);
                rotate([180, 180, 0])
                translate([-(hole_diameter + thickness*2) / 2, -(hole_diameter + thickness*2) / 2, -hole_depth / 2]) {
                    cube([hole_diameter + thickness*2, (hole_diameter + thickness*2) / 2, hole_depth]);
                    translate([0, -w, 0]) {
                        prism(l = hole_diameter + thickness*2, w = w, h = 30);
                    }
                }
            }
            translate([0, 0, -thickness*2]) {
                cylinder(d = hole_diameter, h = hole_depth, center = true);
            }
        }
    }
}

phone-mount/rod.scad

@@ -0,0 +1,10 @@
$fn = 25;

diameter = 20;
height = 170;
thickness = 2;

// Hollow tube: outer cylinder minus an inner cylinder
difference() {
    cylinder(d = diameter, h = height, center = true);
    cylinder(d = diameter - thickness, h = height + 1, center = true);
}
@ -1,3 +1,12 @@
|
|||
# systemd service files

Copy these into `/etc/systemd/user`. Unless your DroidCam setup matches mine exactly, you'll have to adapt the command in [`droidcam.service`](./droidcam.service).

Run the following commands (not as root):

```bash
systemctl --user daemon-reload
systemctl --user enable --now droidcam.service
systemctl --user enable --now droidcam-streamer.service
loginctl enable-linger # Enables the services to run when you're not actively logged in
```

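
To confirm the two user services actually came up (unit names as above):

```bash
systemctl --user status droidcam.service droidcam-streamer.service
```
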

services/droidcam-streamer.service

@@ -10,8 +10,5 @@ ExecStart=/home/pi/go/bin/mjpeg-server -a ":8080" -b gaysex -- bash -c 'while [[
Restart=always
RestartSec=5

# Old transcoding command, ffmpeg hammered the CPU way too much and doesn't have support for the v4l2 mjpeg encoder. gstreamer performs leagues better
# ExecStart=/home/pi/go/bin/mjpeg-server -a ":8080" -- ffmpeg -i /dev/video1 -fflags nobuffer -an -r 15 -f mpjpeg -q 2 -

[Install]
WantedBy=default.target