
iOS 12.2 issue #50

Closed
jeremieb opened this issue Mar 5, 2019 · 57 comments

Comments

@jeremieb

jeremieb commented Mar 5, 2019

Hi all!

I'm currently using iOS 12.2 beta 4 (I was on beta 3 until today) and I'm no longer able to see any stream. The snapshot still works, but the live stream doesn't.

It's still working on my other devices, like my Mac and my iPad. I don't know if it's because the beta is buggy or if Apple is going to change something soon on the streaming side.

Just wanted to keep you updated. I'll comment if anything changes.

Cheers!

J.

@gerarddvb

I am also on beta 4 and the livestream is working without issues

@SleepyPrince

I have a similar problem, but somehow my hub device (iPad Air) can stream the video while my iPhone cannot.

@ptiboubou

iPhone updated to the final release of iOS 12.2, and I have exactly the same problem: snapshot OK, no live stream...

@magnip

magnip commented Mar 28, 2019

Same as above: iPad mini still streaming, but the iPhone only shows a snapshot. UK; updated to the latest iOS on both devices.

@10w73

10w73 commented Mar 29, 2019

Same here. After 12.2 my iPhone 8 won't stream video anymore. 😟

@h0lz

h0lz commented Mar 29, 2019

Same here, iPhone 8; iOS 12.2 - only preview, no Live Stream.

macOS 10.14.3 - everything is fine.

@RaymondMouthaan

RaymondMouthaan commented Apr 1, 2019

Same here,

  • no live stream on iPhone X running iOS 12.2
  • runs fine on iPad Air with iOS 12.1
  • after upgrading the iPad Air to iOS 12.2, the live stream no longer works on it either

@windskipper

Same issue here.

@moritzmhmk
Owner

Thanks for reporting this!

I will not be able to continue working on this project until June/July.

If someone figures out how to fix this issue feel free to create a pull-request 😉

@SleepyPrince

I gave up on this plug-in and went back to homebridge-camera-ffmpeg and it's working nicely. (Except I had to disable bitrate limit to get better video quality)

@10w73

10w73 commented Apr 7, 2019

> I gave up on this plug-in and went back to homebridge-camera-ffmpeg and it's working nicely. (Except I had to disable bitrate limit to get better video quality)

@SleepyPrince, could you elaborate on what you did, exactly? I tried to get it working but also only got stills. Which ffmpeg did you use? How did you establish a stream for homebridge-camera-ffmpeg to grab? I should mention I use a Pi Zero W.

@SleepyPrince

SleepyPrince commented Apr 7, 2019

I followed this guide on my rpi 3A+, with the stock ffmpeg
https://github.crookster.org/Adding-A-Webcam-To-HomeKit/

I think pi zero should be similar

@10w73

10w73 commented Apr 7, 2019

> I followed this guide on my rpi 3A+, with the stock ffmpeg
> https://github.crookster.org/Adding-A-Webcam-To-HomeKit/
>
> I think pi zero should be similar

@SleepyPrince, thx! Do you also have a USB cam attached? I'm trying to use the official internal Pi cam. Do you know where to get its stream from? Like in

"source": "-f v4l2 -r 30 -s 1280x720 -i /dev/video0"

I think it won't be the same, will it?

@SleepyPrince

SleepyPrince commented Apr 7, 2019

I think if you have bcm2835-v4l2 in /etc/modules, the source should be the same (/dev/video# depends how many devices you have)
I got one with usb cam and one with pi cam

@10w73

10w73 commented Apr 7, 2019

> I think if you have bcm2835-v4l2 in /etc/modules, the source should be the same (/dev/video# depends how many devices you have)
> I got one with usb cam and one with pi cam

@SleepyPrince, I was able to set it up, but I still only get stills. No video. The output is
[Camera-ffmpeg] ERROR: FFmpeg exited with code 1
when trying to watch the stream in the Home app.

@SleepyPrince

Have you tried using the debug option to see what's wrong?

@h0lz

h0lz commented Apr 7, 2019

Had a similar problem yesterday. Only stills, no stream from /dev/video0.
Debug isn't very helpful at all. It just says "stopped streaming" (or so).
Returned to this plugin for now.

Besides: maybe it's an iPhone/HomeKit bug?
Seems likely imho.
Every other non iPhone device seems to work.

In my case, two iPhone 8s on 12.2 aren't working.
A 6s on a previous iOS works, up-to-date macOS as well.
Any experience with other iPhones on 12.2?
Like 8+/X/XS/XR

@10w73

10w73 commented Apr 7, 2019

@SleepyPrince I get:

ffmpeg started on 2019-04-07 at 20:16:01
Report written to "ffmpeg-20190407-201601.log"
Command line:
ffmpeg -loglevel debug -report -an -y -f v4l2 -r 15 -s 640x360 -i /dev/video0 -t 1 -s 480x270 -f image2 -
ffmpeg version N-93486-gc0b6e4cb6d Copyright (c) 2000-2019 the FFmpeg developers
built with gcc 6.3.0 (Raspbian 6.3.0-18+rpi1+deb9u1) 20170516
configuration: --arch=arm --target-os=linux --enable-gpl --enable-omx --enable-omx-rpi --enable-nonfree
libavutil 56. 26.100 / 56. 26.100
libavcodec 58. 47.106 / 58. 47.106
libavformat 58. 26.101 / 58. 26.101
libavdevice 58. 7.100 / 58. 7.100
libavfilter 7. 48.100 / 7. 48.100
libswscale 5. 4.100 / 5. 4.100
libswresample 3. 4.100 / 3. 4.100
libpostproc 55. 4.100 / 55. 4.100
Splitting the commandline.
Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
Reading option '-an' ... matched as option 'an' (disable audio) with argument '1'.
Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'.
Reading option '-f' ... matched as option 'f' (force format) with argument 'v4l2'.
Reading option '-r' ... matched as option 'r' (set frame rate (Hz value, fraction or abbreviation)) with argument '15'.
Reading option '-s' ... matched as option 's' (set frame size (WxH or abbreviation)) with argument '640x360'.
Reading option '-i' ... matched as input url with argument '/dev/video0'.
Reading option '-t' ... matched as option 't' (record or transcode "duration" seconds of audio/video) with argument '1'.
Reading option '-s' ... matched as option 's' (set frame size (WxH or abbreviation)) with argument '480x270'.
Reading option '-f' ... matched as option 'f' (force format) with argument 'image2'.
Reading option '-' ... matched as output url.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option loglevel (set logging level) with argument debug.
Applying option report (generate a report) with argument 1.
Applying option y (overwrite output files) with argument 1.
Successfully parsed a group of options.
Parsing a group of options: input url /dev/video0.
Applying option an (disable audio) with argument 1.
Applying option f (force format) with argument v4l2.
Applying option r (set frame rate (Hz value, fraction or abbreviation)) with argument 15.
Applying option s (set frame size (WxH or abbreviation)) with argument 640x360.
Successfully parsed a group of options.
Opening an input file: /dev/video0.
[video4linux2,v4l2 @ 0x1b6c470] fd:4 capabilities:85200005
[video4linux2,v4l2 @ 0x1b6c470] Current input_channel: 0, input_name: Camera 0, input_std: 0
[video4linux2,v4l2 @ 0x1b6c470] Setting time per frame to 1/15
[video4linux2,v4l2 @ 0x1b6c470] All info found
Input #0, video4linux2,v4l2, from '/dev/video0':
Duration: N/A, start: 2813.815119, bitrate: 41472 kb/s
Stream #0:0, 1, 1/1000000: Video: rawvideo, 1 reference frame (I420 / 0x30323449), yuv420p, 640x360, 0/1, 41472 kb/s, 15 fps, 15 tbr, 1000k tbn, 1000k tbc
Successfully opened the file.
Parsing a group of options: output url -.
Applying option t (record or transcode "duration" seconds of audio/video) with argument 1.
Applying option s (set frame size (WxH or abbreviation)) with argument 480x270.
Applying option f (force format) with argument image2.
Successfully parsed a group of options.
Opening an output file: -.
Successfully opened the file.
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
Press [q] to stop, [?] for help
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
[rawvideo @ 0x1b6f040] PACKET SIZE: 345600, STRIDE: 960
detected 1 logical cores
[graph 0 input from stream 0:0 @ 0x1b76050] Setting 'video_size' to value '640x360'
[graph 0 input from stream 0:0 @ 0x1b76050] Setting 'pix_fmt' to value '0'
[graph 0 input from stream 0:0 @ 0x1b76050] Setting 'time_base' to value '1/15'
[graph 0 input from stream 0:0 @ 0x1b76050] Setting 'pixel_aspect' to value '0/1'
[graph 0 input from stream 0:0 @ 0x1b76050] Setting 'sws_param' to value 'flags=2'
[graph 0 input from stream 0:0 @ 0x1b76050] Setting 'frame_rate' to value '15/1'
[graph 0 input from stream 0:0 @ 0x1b76050] w:640 h:360 pixfmt:yuv420p tb:1/15 fr:15/1 sar:0/1 sws_param:flags=2
[scaler_out_0_0 @ 0x1b76660] Setting 'w' to value '480'
[scaler_out_0_0 @ 0x1b76660] Setting 'h' to value '270'
[scaler_out_0_0 @ 0x1b76660] Setting 'flags' to value 'bicubic'
[scaler_out_0_0 @ 0x1b76660] w:480 h:270 flags:'bicubic' interl:0
[format @ 0x1b768d0] Setting 'pix_fmts' to value 'yuvj420p|yuvj422p|yuvj444p'
[AVFilterGraph @ 0x1b75f70] query_formats: 6 queried, 5 merged, 0 already done, 0 delayed
[scaler_out_0_0 @ 0x1b76660] picking yuvj420p out of 3 ref:yuv420p alpha:0
[swscaler @ 0x1b77230] deprecated pixel format used, make sure you did set range correctly
[scaler_out_0_0 @ 0x1b76660] w:640 h:360 fmt:yuv420p sar:0/1 -> w:480 h:270 fmt:yuvj420p sar:0/1 flags:0x4
[mjpeg @ 0x1b72bd0] Forcing thread count to 1 for MJPEG encoding, use -thread_type slice or a constant quantizer if you want to use multiple cpu cores
[mjpeg @ 0x1b72bd0] intra_quant_bias = 96 inter_quant_bias = 0
Output #0, image2, to 'pipe:':
Metadata:
encoder : Lavf58.26.101
Stream #0:0, 0, 1/15: Video: mjpeg, 1 reference frame, yuvj420p(pc), 480x270, 0/1, q=2-31, 200 kb/s, 15 fps, 15 tbn, 15 tbc
Metadata:
encoder : Lavc58.47.106 mjpeg
Side data:
cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
Clipping frame in rate conversion by 0.000008
[image2 @ 0x1b6f4e0] Opening 'pipe:' for writing
[pipe @ 0x1c00e20] Setting default whitelist 'crypto'
[AVIOContext @ 0x1c0a230] Statistics: 0 seeks, 1 writeouts
[rawvideo @ 0x1b6f040] PACKET SIZE: 345600, STRIDE: 960
[image2 @ 0x1b6f4e0] Could not get frame filename number 2 from pattern 'pipe:'. Use '-frames:v 1' for a single image, or '-update' option, or use a pattern such as %03d within the filename.
av_interleaved_write_frame(): Invalid argument
No more output streams to write to, finishing.
frame= 2 fps=0.0 q=1.6 Lsize=N/A time=00:00:00.13 bitrate=N/A speed=0.724x
video:10kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Input file #0 (/dev/video0):
Input stream #0:0 (video): 2 packets read (691200 bytes); 2 frames decoded;
Total: 2 packets (691200 bytes) demuxed
Output file #0 (pipe:):
Output stream #0:0 (video): 2 frames encoded; 2 packets muxed (10708 bytes);
Total: 2 packets (10708 bytes) muxed
2 frames successfully decoded, 0 decoding errors
Conversion failed!

and

ffmpeg started on 2019-04-07 at 20:16:04
Report written to "ffmpeg-20190407-201604.log"
Command line:
ffmpeg -loglevel debug -report -an -y -f v4l2 -r 15 -s 640x360 -i /dev/video0 -map 0:0 -vcodec h264_omx -pix_fmt yuv420p -r 15 -f rawvideo -tune zerolatency -vf "scale=640:360" -b:v 125k -bufsize 125k -maxrate 125k -payload_type 99 -ssrc 7644652 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params 8yKSgCKbSriUMQzk7ME5vZdvlX1vp+uaD370ChjN "srtp://192.168.178.3:50280?rtcpport=50280&localrtcpport=50280&pkt_size=1316" -loglevel debug
ffmpeg version N-93486-gc0b6e4cb6d Copyright (c) 2000-2019 the FFmpeg developers
built with gcc 6.3.0 (Raspbian 6.3.0-18+rpi1+deb9u1) 20170516
configuration: --arch=arm --target-os=linux --enable-gpl --enable-omx --enable-omx-rpi --enable-nonfree
libavutil 56. 26.100 / 56. 26.100
libavcodec 58. 47.106 / 58. 47.106
libavformat 58. 26.101 / 58. 26.101
libavdevice 58. 7.100 / 58. 7.100
libavfilter 7. 48.100 / 7. 48.100
libswscale 5. 4.100 / 5. 4.100
libswresample 3. 4.100 / 3. 4.100
libpostproc 55. 4.100 / 55. 4.100
Splitting the commandline.
Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
Reading option '-an' ... matched as option 'an' (disable audio) with argument '1'.
Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'.
Reading option '-f' ... matched as option 'f' (force format) with argument 'v4l2'.
Reading option '-r' ... matched as option 'r' (set frame rate (Hz value, fraction or abbreviation)) with argument '15'.
Reading option '-s' ... matched as option 's' (set frame size (WxH or abbreviation)) with argument '640x360'.
Reading option '-i' ... matched as input url with argument '/dev/video0'.
Reading option '-map' ... matched as option 'map' (set input stream mapping) with argument '0:0'.
Reading option '-vcodec' ... matched as option 'vcodec' (force video codec ('copy' to copy stream)) with argument 'h264_omx'.
Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'yuv420p'.
Reading option '-r' ... matched as option 'r' (set frame rate (Hz value, fraction or abbreviation)) with argument '15'.
Reading option '-f' ... matched as option 'f' (force format) with argument 'rawvideo'.
Reading option '-tune' ...Unrecognized option 'tune'.
Error splitting the argument list: Option not found

For me it isn't working on macOS 10.14.4 or on iOS 12.2.
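For what it's worth, the two failures in the logs above each point to a concrete fix. The sketch below is only a guess based on ffmpeg's own error messages (paths and sizes copied from the logs, bitrate made up): the snapshot command fails because the image2 muxer writing to a pipe expects a single frame, and the stream command fails because -tune is a libx264 encoder option and this build (no --enable-libx264 in its configuration line) does not include libx264.

```shell
#!/bin/sh
# Hedged sketch, not a verified fix; commands are only echoed here.

# Snapshot: add '-frames:v 1' so image2 writes exactly one JPEG to the pipe,
# as suggested by ffmpeg's own error message in the log above.
SNAPSHOT_ARGS="-an -y -f v4l2 -r 15 -s 640x360 -i /dev/video0 -frames:v 1 -s 480x270 -f image2 -"

# Stream: drop '-tune zerolatency' (a libx264 option; this build has no
# libx264, hence "Unrecognized option 'tune'"). The 300k bitrate is made up.
STREAM_ARGS="-an -y -f v4l2 -r 15 -s 640x360 -i /dev/video0 -map 0:0 -vcodec h264_omx -pix_fmt yuv420p -r 15 -b:v 300k -maxrate 300k -bufsize 300k -f rtp"

echo "ffmpeg $SNAPSHOT_ARGS"
echo "ffmpeg $STREAM_ARGS srtp://..."
```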

@SleepyPrince
Copy link

It seems to be having an issue with "-tune zerolatency". I had this issue before on homebridge-camera-rpi, but not currently with my 3A+ homebridge-camera-ffmpeg setup.
I may try experimenting with my spare pi zero later.

@SleepyPrince
Copy link

SleepyPrince commented Apr 7, 2019

@10w73 I just tested on my pi zero w from scratch and it's working with a pi cam.
Try running this script after enabling the camera in raspi-config:

sudo apt update
sudo apt upgrade -y
sudo apt-get install libavahi-compat-libdnssd-dev ffmpeg -y
cd ~/
wget https://nodejs.org/dist/v10.15.3/node-v10.15.3-linux-armv6l.tar.gz
tar -xzf node-v10.15.3-linux-armv6l.tar.gz
sudo rm node-v10.15.3-linux-armv6l.tar.gz
cd node-v10.15.3-linux-armv6l
sudo cp -R * /usr/local/
sudo npm install -g --unsafe-perm homebridge homebridge-camera-ffmpeg
sudo usermod -aG video pi
sudo usermod -aG video homebridge
grep "bcm2835-v4l2" /etc/modules || echo "bcm2835-v4l2" >> /etc/modules
sudo modprobe bcm2835-v4l2
cd ~/
mkdir .homebridge
chmod 0755 .homebridge
cd ~/.homebridge

and here's my config.json

{
	"bridge": {
		"name": "Homebridge",
		"username": "CC:22:3D:E3:CE:3C",
		"port": 51826,
		"pin": "031-45-154"
	},

	"platforms": [{
	  "platform": "Camera-ffmpeg",
	  "cameras": [
		{
		  "name": "Test Cam",
		  "videoConfig": {
			  "source": "-f v4l2 -r 30 -s 1280x720 -i /dev/video0",
			  "maxBitrate": 1024,
			  "maxStreams": 2,
			  "maxWidth": 1280,
			  "maxHeight": 720,
			  "maxFPS": 30,
			  "vcodec": "h264_omx",
			  "debug": true
		  }
		}
	  ]
	}]
}

@10w73

10w73 commented Apr 7, 2019

@SleepyPrince I need to start from scratch on another SD card to do so. My setup is a little more complex, with automated encrypted backups to a NAS drive and a REST API to remote-control various stuff. I don't want to destroy that. Maybe I'll find time next weekend. I'll let you know! Thanks!

@10w73

10w73 commented Apr 8, 2019

@SleepyPrince I couldn't wait … I started from scratch and now I got it working with homebridge-camera-ffmpeg. Thx.

@Supereg

Supereg commented Apr 12, 2019

Could it be that older devices stream just fine on the latest iOS and only the newer models struggle?
I noticed this pattern on my own set of devices, and the experiences here strengthen this belief.

2016 MacBook Pro running macOS 10.14.4: streams just fine
iPad (6th Generation) running iOS 12.2: won't stream
iPhone X running iOS 12.2: won't stream
old iPhone 6 running iOS 12.2 (actually ran beta software of 12.2): streams just fine

The one difference between those devices is hardware encoding for H.265 (?)

Did any of you have success in reconfiguring/repairing the plugin or getting it to work again in other ways?



> It seems to be having an issue with "-tune zerolatency". I had this issue before on homebridge-camera-rpi, but not currently with my 3A+ homebridge-camera-ffmpeg setup.
> I may try experimenting with my spare pi zero later.

I don't know where the "-tune zerolatency" comes from, but it's definitely not in the code

@Supereg

Supereg commented Apr 15, 2019

Okay so, it seems that it won't work on newer devices (iPhone 8 and later, probably in combination with iOS 12.2) when you copy from the v4l2 camera using the -vcodec copy option (code).
When you change the vcodec option and transcode the input (for example with -vcodec h264_omx) it works fine. That's a workaround which should be fine as long as you have hardware-accelerated h264 transcoding.

It still would be interesting what exactly causes the problem when using the original video stream from the v4l2 devices.

And you should probably add -b:v ${bitrate}k -bufsize ${bitrate}k -maxrate ${bitrate}k options (or something like that) after the -vcodec, otherwise you get pretty crappy quality.

Edit: The one thing that is different in the log output is that, when changing from simply copying the video stream to transcoding it, the following ffmpeg warning no longer appears: Apr 15 15:00:41 node[4002]: [rtp @ 0x37c5b40] Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly. Any chance that this affects anything?
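Put together, the workaround above amounts to swapping the codec and pinning the bitrate. A rough sketch (the 300k value and the surrounding command shape are illustrative, not quoted from the plugin):

```shell
#!/bin/sh
# Sketch: replace '-vcodec copy' with hardware h264 encoding plus explicit
# bitrate limits, as suggested in the comment above. Values are illustrative.
BITRATE=300
VCODEC_ARGS="-vcodec h264_omx -b:v ${BITRATE}k -bufsize ${BITRATE}k -maxrate ${BITRATE}k"
echo "ffmpeg -f v4l2 -r 15 -s 640x360 -i /dev/video0 ${VCODEC_ARGS} -payload_type 99 -f rtp srtp://..."
```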

@Supereg

Supereg commented Apr 16, 2019

The following error messages appear (error logs from the iPhone, gathered via Console.app on the Mac) when trying to stream on an iPhone X with iOS 12.2, with ffmpeg configured to only copy the video from the RPi camera:

Is anyone able to make use of this?

fehler	20:05:33.069314 +0200	mediaserverd	 [ERROR] VideoPacketBuffer_ScheduleFrames:986 VideoPacketBuffer[0x104488000] scheduling incomplete frame fecProtected:0 seq:3269 deadlineTimestamp:0 frameTimestamp:1923130184 diff:1923130184 !
fehler	20:05:33.069807 +0200	mediaserverd	 [ERROR] decoderFrameCallback:814 frame decode error -12909
fehler	20:05:33.115082 +0200	mediaserverd	 [ERROR] VideoPacketBuffer_ScheduleFrames:986 VideoPacketBuffer[0x104488000] scheduling incomplete frame fecProtected:0 seq:3271 deadlineTimestamp:0 frameTimestamp:1923133184 diff:1923133184 !
fehler	20:05:33.116769 +0200	mediaserverd	 [ERROR] decoderFrameCallback:814 frame decode error -12909
fehler	20:05:33.140550 +0200	mediaserverd	 [ERROR] VideoPacketBuffer_ScheduleFrames:986 VideoPacketBuffer[0x104488000] scheduling incomplete frame fecProtected:0 seq:3273 deadlineTimestamp:0 frameTimestamp:1923136183 diff:1923136183 !
fehler	20:05:33.141177 +0200	mediaserverd	 [ERROR] decoderFrameCallback:814 frame decode error -12909
fehler	20:05:33.170981 +0200	mediaserverd	 [ERROR] VideoPacketBuffer_ScheduleFrames:986 VideoPacketBuffer[0x104488000] scheduling incomplete frame fecProtected:0 seq:3275 deadlineTimestamp:0 frameTimestamp:1923139183 diff:1923139183 !
fehler	20:05:33.172143 +0200	mediaserverd	 [ERROR] decoderFrameCallback:814 frame decode error -12909

@milmber

milmber commented Apr 16, 2019

It definitely looks like there is a timestamp issue with the /dev/video0 raw stream.

However it seems to have been fixed in the video4linux drivers as per this GitHub issue - raspberrypi/linux#1836.

I have tried the -fflags +genpts and -use_wallclock_as_timestamps 1 ffmpeg switches to generate/add timestamps to the stream with no luck.

Unfortunately the only solution seems to be to re-encode the stream with -vcodec h264_omx -pix_fmt yuv420p -r 30 -f rawvideo -tune zerolatency and the associated CPU overhead.

@Supereg

Supereg commented Apr 17, 2019

That issue is pretty old, and all its commits should already be present in current kernels/kernel modules. However, the issue does not really state that the problem was fixed. It seems pretty messed up.

I don’t know if those flags have any effect when you just copy the input stream(?)

It currently seems to be best practice to re-encode the video stream with ffmpeg. Instructions should definitely be added on how to compile ffmpeg with hardware acceleration for h264 (do all Pi models support that?). Otherwise the CPU is going to run pretty hot.
And maybe the ffmpeg command could also be improved by getting the raw video stream from the camera instead of the pre-compressed h264 stream.

@tux-o-matic

tux-o-matic commented Apr 17, 2019

Tested on a Pi 2, I'm at around 130% CPU (4 cores). Just below 90% on a Zero W.
Remember that if you're updating Raspbian after several months to get the improved ffmpeg, you'll have to run npm rebuild against the updated Node.js packages.

@Supereg

Supereg commented Apr 17, 2019

Out of interest, did anyone try streaming from an Apple Watch?
When I try to start the stream, it briefly displays 'Connecting...' and then 'Not available' (behaviour is unchanged with the update). When debugging on the HAP-NodeJS side, the watch queries the StreamingStatus characteristic, which returns 'available', and then nothing happens.

Edit: nvm, re-paired it with HomeKit and it works like a charm now

@tux-o-matic

tux-o-matic commented Apr 17, 2019

Streaming to my Apple Watch actually works now that you mention it. I never got anything other than a snapshot until now. The ffmpeg command I've been testing is slightly different from your PR.

-f video4linux2 -input_format h264 -video_size ${width}x${height} -framerate ${fps} -i /dev/video0 \
-vcodec h264_omx -pix_fmt yuv420p -tune zerolatency -an -payload_type 99 -ssrc ${ssrc} -f rtp \
-srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params ${srtp} \
srtp://${address}:${port}?rtcpport=${port}&localrtcpport=${port}&pkt_size=1378
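For clarity, the `${...}` placeholders above are JavaScript template-literal variables. Expanded with made-up values (the address, port, key and sizes below are purely illustrative), the command line looks roughly like this:

```shell
#!/bin/sh
# Made-up values, purely to show the expanded command; do not reuse the key.
width=1280; height=720; fps=30
ssrc=7644652
srtp="8yKSgCKbSriUMQzk7ME5vZdvlX1vp+uaD370ChjN"
address=192.168.1.10
port=50280

# Note: '-tune zerolatency' may be rejected by ffmpeg builds without libx264
# (see the "Unrecognized option 'tune'" error earlier in this thread).
CMD="ffmpeg -f video4linux2 -input_format h264 -video_size ${width}x${height} -framerate ${fps} -i /dev/video0 -vcodec h264_omx -pix_fmt yuv420p -tune zerolatency -an -payload_type 99 -ssrc ${ssrc} -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params ${srtp} \"srtp://${address}:${port}?rtcpport=${port}&localrtcpport=${port}&pkt_size=1378\""
echo "$CMD"
```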

@moritzmhmk
Owner

I plan to continue working on this project in July.

If h264_omx works ootb in Raspbian (and buildroot) and the CPU usage is not increased, this seems to be the best solution. Otherwise, fixing the timestamp would be the alternative. The timestamp in the v4l2 driver is based on the uptime of the system, but iOS seems to expect it to start with the streaming (i.e. at 0). Maybe the actual initial timestamp can be communicated via SRTP (the streaming protocol), or the timestamp could be modified.

@Supereg

Supereg commented Apr 19, 2019

What I don’t quite get is: why are only the newer devices affected? If this is indeed a timestamp issue, why is it only present on newer devices, and only from iOS 12.2 onwards?

Did anyone test the current version with the latest iOS 12.3 beta?

@tux-o-matic

@Supereg, it's not just newer devices; iOS 12.2 on an iPhone SE was affected as well, until changing vcodec to h264.

@fredericfdran

> Tested on a Pi 2, I'm at around 130% CPU (4 cores). Just below 90% on a Zero W.

Definitely not the workaround I was hoping for as far as Zeros are concerned as I get 8 FPS at best...

@Supereg

Supereg commented Apr 27, 2019

Does the Pi Zero not support h264 hardware acceleration?
Is ffmpeg correctly compiled with omx?

@tux-o-matic

@Supereg, it does. But maybe, as pointed out in this post from one of the developers working on the GPU of the Pi, using the mmal backend with ffmpeg might be the way to go.

@Supereg

Supereg commented Apr 29, 2019

@tux-o-matic As far as I know h264_mmal is only for decoding (?). Am I wrong? Although I compiled FFmpeg with mmal enabled, listing encoders with ffmpeg -encoders | grep h264 only lists omx, and the other way around for ffmpeg -decoders | grep h264.

@fredericfdran could you please post the output of ffmpeg -version

@tux-o-matic

There is a new kid on the block, HKCam, providing a ready-to-use setup for using an RPi Zero W as a HomeKit camera, with a bridge written in Go.
Without even trying their ffmpeg settings, a single issue points toward the same FPS problem on the old CPU of the Zero.

@magnip

magnip commented May 24, 2019

Yes, I have been using it for a fortnight now. It's very stable, but as you mentioned the quality is still poor. The only improvement I've made so far was to allocate 256 MB to the GPU memory, which helped a little.

@RaymondMouthaan

@magnip, exactly the same here

@magnip

magnip commented May 24, 2019

Also check this out https://www.reddit.com/r/RASPBERRY_PI_PROJECTS/comments/breooc/lowlatency_video_from_rpi_to_browser/?utm_source=share&utm_medium=ios_app&utm_name=ios_share_flow_optimization&utm_term=control_2

It's not HomeKit compatible, but proof that the Pi Zero is capable.

@RaymondMouthaan

@magnip, that's awesomely smooth!

Reminds me of https://github.com/ccrisan/motioneyeos/wiki which is smooth in the browser too

@moritzmhmk
Owner

7b1cbb9 should add support for iOS 12.2 and later :)

It is not yet published on npm but can be tested by following the "standalone" instructions. I will publish a new version on npm later after some more testing ;)

@moritzmhmk
Owner

v0.0.5 is now available via npm

@moritzmhmk
Owner

New problem: the video stops working (i.e. it seems to be working but actually only the first frame is shown) when the Pi has been up and running for some time. This is caused by the timestamp being generated from the kernel time (system uptime). Using -start_at_zero will fix it for macOS, but then it again does not work on iOS 12.2.

@moritzmhmk
Owner

This should be fixed by bdef766

@tux-o-matic

@moritzmhmk, if the issue is now solved using the original encoding with a timestamp correction for iOS 12.2, can the issue be closed?

@milmber

milmber commented Jun 26, 2019

bdef766 and the -timestamps abs ffmpeg switch breaks video for me. Using -copyts instead works for me across multiple iOS 12.2 and iOS 13 (beta) devices (iPhone, Apple Watch and iPad).

@moritzmhmk
Owner

-copyts is still used - it's just an additional -timestamps abs.

For me, without the latter it stops working after some uptime (I checked after 8 hours) on the two Pi Zero W, one Pi 2 and one Pi 3 that I have been testing. Using absolute timestamps, the results are now independent of the uptime! With the addition of -timestamps abs it worked for iOS (even after long uptimes), but (as I just realised) it does not work on macOS.

I believe that using h264_omx is the best solution for anyone NOT using a Pi Zero! I will continue this project with the aim of making it work on a Pi Zero - where h264_omx for some reason does not perform well (it causes additional CPU load even though one would think it runs on the GPU...).

I will not be able to do much testing for another month now so feel free to experiment and post your results here and I will look into it in August :)
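For anyone experimenting in the meantime, the flags under discussion combine roughly as follows. A sketch only: `timestamps` (values default/abs/mono2abs) is an option of ffmpeg's video4linux2 input device, and the device path and output shape are assumed.

```shell
#!/bin/sh
# Sketch of the two timestamp variants discussed above; commands are only
# echoed here, not run.

# Variant A: absolute (wall-clock based) input timestamps, plus -copyts to
# carry them through to the RTP output.
VARIANT_A="-f v4l2 -timestamps abs -i /dev/video0 -vcodec copy -copyts -f rtp"

# Variant B: -copyts alone (reported to work at first, but to stop working
# after longer uptimes, since v4l2 timestamps are based on uptime).
VARIANT_B="-f v4l2 -i /dev/video0 -vcodec copy -copyts -f rtp"

echo "ffmpeg $VARIANT_A srtp://..."
echo "ffmpeg $VARIANT_B srtp://..."
```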

@milmber

milmber commented Jun 30, 2019

> -copyts is still used - it's just an additional -timestamps abs.
>
> For me without the latter it stops working after some uptime (I checked after 8 hours) on two Pi zero W, one Pi 2 and one Pi 3 that I have been testing. Using absolute timestamps the results are now independent of the uptime! With the addition of -timestamps abs it worked (even after long uptimes) for iOS but it (as I just realised) does not work on macOS.

I can confirm the same behaviour. However, with -timestamps abs I get no live streaming; removing it fixes that. But after a certain amount of uptime on my Raspberry Pi, streaming stops working and I have to add -timestamps abs back in order to fix live streaming. This behaviour seems to be generic (i.e. I can replicate exactly the same behaviour across different Raspberry Pis and iOS/macOS devices).

@tux-o-matic

tux-o-matic commented Sep 16, 2019

No news in a while.

  • The current master branch will freeze after a little while.
  • The other fix (which I use) is very CPU intensive and means big video lag on the Pi Zero.

Would a switch to gstreamer (also has HW acceleration support on the Pi) be an alternative?

@kshala

kshala commented Sep 16, 2019

I ended up using https://github.com/mpromonet/v4l2rtspserver with VLC on my iPhone. Its performance is surprisingly fast. I know it's quite a different concept, but maybe someone with more knowledge can find the trick we can use here.
