
HaishinKit (formerly lf)

Camera and Microphone streaming library via RTMP and HLS, for iOS and macOS.

Features

RTMP

  • Authentication
  • Publish and Recording (H264/AAC)
  • Playback
  • AMF0
  • AMF3
  • SharedObject
  • RTMPS
    • Native (RTMP over SSL/TLS)
    • Tunneled (RTMPT over SSL/TLS)
  • RTMPT (Technical Preview)
  • ReplayKit Live as a Broadcast Upload Extension (Technical Preview)

HLS

  • HTTPService
  • HLS Publish

Others

  • Hardware acceleration for H264 video encoding and AAC audio encoding
  • Objective-C Bridging

Requirements

Version  iOS   OSX     Xcode  Swift  CocoaPods  Carthage
0.5.0    8.0+  10.11+  8.0+   3.0    1.1.0      #80
0.4.0    8.0+  10.11+  7.3+   2.3    1.0.0      #80
0.3.0    8.0+  10.11+  7.3+   2.3    1.0.0      -
0.2.0    8.0+  -       ?      2.3    0.39.0     -

Cocoa Keys

iOS 10.0+ requires the following usage-description keys in your app's Info.plist (a runtime permission check is sketched after the list):

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription
  • NSPhotoLibraryUsageDescription
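
These keys only declare why the app needs access; checking that the user actually granted it is a separate step. A minimal sketch using plain AVFoundation (nothing here is specific to this library), to run before attaching devices:

import AVFoundation

// Ask for camera and microphone access before attaching capture devices.
AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo) { granted in
    print("camera access granted: \(granted)")
}
AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeAudio) { granted in
    print("microphone access granted: \(granted)")
}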

Installation

CocoaPods

source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
    pod 'lf', '~> 0.5.0'
end

target 'Your Target'  do
    platform :ios, '8.0'
    import_pods
end
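
After running pod install, open the generated .xcworkspace and import the module in each file that uses the snippets below. The module name lf (matching the pod name) is an assumption here; adjust it if your setup differs:

import lf
import AVFoundation
// VideoToolbox is only needed for the kVTProfileLevel_* constants in the settings example
import VideoToolbox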

RTMP Usage

Real Time Messaging Protocol (RTMP).

var rtmpConnection:RTMPConnection = RTMPConnection()
var rtmpStream:RTMPStream = RTMPStream(connection: rtmpConnection)
rtmpStream.attachAudio(AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio))
rtmpStream.attachCamera(DeviceUtil.device(withPosition: .back))

var lfView:LFView = LFView(frame: view.bounds)
lfView.videoGravity = AVLayerVideoGravityResizeAspectFill
lfView.attachStream(rtmpStream)

// add the view to ViewController#view
view.addSubview(lfView)

rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.publish("streamName")
// if you want to record a stream.
// rtmpStream.publish("streamName", type: .localRecord)
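
When the broadcast ends, the stream and connection should be shut down again. A minimal teardown sketch, assuming close() on RTMPStream and RTMPConnection behaves as expected:

// stop publishing, then drop the connection
rtmpStream.close()
rtmpConnection.close()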

Settings

let sampleRate:Double = 44_100

// see: #58
#if os(iOS)
do {
    try AVAudioSession.sharedInstance().setPreferredSampleRate(sampleRate)
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
    try AVAudioSession.sharedInstance().setMode(AVAudioSessionModeVideoChat)
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print(error)
}
#endif

var rtmpStream:RTMPStream = RTMPStream(connection: rtmpConnection)
// pass false as the 2nd argument so the manual AVAudioSession settings above are kept
rtmpStream.attachAudio(AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio), automaticallyConfiguresApplicationAudioSession: false)

rtmpStream.captureSettings = [
    "fps": 30, // FPS
    "sessionPreset": AVCaptureSessionPresetMedium, // input video width/height
    "continuousAutofocus": false, // use camera autofocus mode
    "continuousExposure": false, //  use camera exposure mode
]
rtmpStream.audioSettings = [
    "muted": false, // mute audio
    "bitrate": 32 * 1024,
    "sampleRate": sampleRate, 
]
rtmpStream.videoSettings = [
    "width": 640, // video output width
    "height": 360, // video output height
    "bitrate": 160 * 1024, // video output bitrate
    // "dataRateLimits": [160 * 1024 / 8, 1], optional kVTCompressionPropertyKey_DataRateLimits property
    "profileLevel": kVTProfileLevel_H264_Baseline_3_1, // H264 Profile require "import VideoToolbox"
    "maxKeyFrameIntervalDuration": 2, // key frame / sec
]
// "0" means the same of input
rtmpStream.recorderSettings = [
    AVMediaTypeAudio: [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 0,
        AVNumberOfChannelsKey: 0,
    ],
    AVMediaTypeVideo: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoHeightKey: 0,
        AVVideoWidthKey: 0,
    ],
]
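
The same settings dictionaries can also be assigned again later, for example to lower the bitrate when connectivity degrades; whether the encoder picks up the change mid-stream in this version is an assumption here:

// example: reduce bandwidth while live (assumed to apply mid-stream)
rtmpStream.videoSettings = [
    "bitrate": 96 * 1024,
]
rtmpStream.audioSettings = [
    "bitrate": 24 * 1024,
]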

Authentication

var rtmpConnection:RTMPConnection = RTMPConnection()
rtmpConnection.connect("rtmp://username:password@localhost/appName/instanceName")

Screen Capture

// iOS
rtmpStream.attachScreen(ScreenCaptureSession())
// macOS
rtmpStream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))

HTTP Usage

HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. A basic snippet follows. You can view the stream at http://ip.address:8080/hello/playlist.m3u8

var httpStream:HTTPStream = HTTPStream()
httpStream.attachCamera(DeviceUtil.device(withPosition: .back))
httpStream.attachAudio(AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio))
httpStream.publish("hello")

var lfView:LFView = LFView(frame: view.bounds)
lfView.attachStream(httpStream)

var httpService:HTTPService = HTTPService(domain: "", type: "_http._tcp", name: "lf", port: 8080)
httpService.startRunning()
httpService.addHTTPStream(httpStream)

// add the view to ViewController#view
view.addSubview(lfView)
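
On another device on the same network, the playlist can be opened with any HLS-capable player. A minimal AVFoundation/AVKit playback sketch for the viewer side; 192.168.1.10 is a placeholder for the broadcasting device's address:

import AVFoundation
import AVKit

// play the published HLS stream; replace 192.168.1.10 with the broadcasting device's IP
let player = AVPlayer(url: URL(string: "http://192.168.1.10:8080/hello/playlist.m3u8")!)
let playerViewController = AVPlayerViewController()
playerViewController.player = player
present(playerViewController, animated: true) {
    player.play()
}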

License

New BSD

Donation

Bitcoin

1FXxyfegCfhLcCAHV6jYu2Uska6JaAVux9

