
HaishinKit (formerly lf)


  • Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.
  • Please use English or Japanese when filing Issues!

Features

RTMP

  • Authentication
  • Publish and Recording (H264/AAC)
  • Playback (Technical Preview)
  • Adaptive bitrate streaming
    • Handling (see also #126)
    • Automatic drop frames
  • Action Message Format
    • AMF0
    • AMF3
  • SharedObject
  • RTMPS (see the connection sketch after this list)
    • Native (RTMP over SSL/TLS)
    • Tunneled (RTMPT over SSL/TLS) (Technical Preview)
  • RTMPT (Technical Preview)
  • ReplayKit Live as a Broadcast Upload Extension (Technical Preview)
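
For native RTMPS, pass an rtmps:// URI to the connection. A minimal sketch, where the host and stream names are placeholders:

import HaishinKit

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)
// The rtmps:// scheme selects the TLS transport (Native RTMPS).
rtmpConnection.connect("rtmps://example.com/appName/instanceName")
rtmpStream.publish("streamName")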

HLS

  • HTTPService
  • HLS Publish

Rendering

|              | HKView                     | GLHKView  | MTHKView |
| ------------ | -------------------------- | --------- | -------- |
| Engine       | AVCaptureVideoPreviewLayer | OpenGL ES | Metal    |
| Publish      | ○                          | ○         | ◯        |
| Playback     | ×                          | ○         | ◯        |
| VisualEffect | ×                          | ○         | ◯        |
| Condition    | Stable                     | Stable    | Beta     |
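
All three views expose the same attachStream API, so the Metal-backed MTHKView can stand in for the HKView used in the RTMP Usage section below. A minimal sketch, assuming rtmpStream and view are already set up:

import HaishinKit

// Metal-backed rendering view (Beta, per the table above);
// attachStream is assumed to mirror the HKView usage shown later.
let mtView = MTHKView(frame: view.bounds)
mtView.attachStream(rtmpStream)
view.addSubview(mtView)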

Others

  • Support tvOS 10.2+ (Technical Preview)
    • tvOS can't publish from the Camera and Microphone. Only the playback feature is available.
  • Hardware acceleration for H264 video encoding, AAC audio encoding
  • Support "Allow app extension API only" option
  • Support GPUImage framework (~> 0.5.12)
  • Objective-C Bridging

Requirements

| Version | iOS  | OSX    | tvOS  | Xcode | Swift | CocoaPods | Carthage |
| ------- | ---- | ------ | ----- | ----- | ----- | --------- | -------- |
| 0.10.0  | 8.0+ | 10.11+ | 10.2+ | 10.0+ | 4.2   | 1.5.0+    | 0.29.0+  |
| 0.9.0   | 8.0+ | 10.11+ | 10.2+ | 9.3   | 4.1   | 1.5.0+    | 0.29.0+  |

Cocoa Keys

Please add the following keys to your Info.plist.

iOS 10.0+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription

macOS 10.14+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription

Installation

Please set up your project for Swift 4.2.

CocoaPods

source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
    pod 'HaishinKit', '~> 0.10.1'
end

target 'Your Target' do
    platform :ios, '8.0'
    import_pods
end

Carthage

github "shogo4405/HaishinKit.swift" ~> 0.10.1

License

BSD-3-Clause

Donation

Paypal

Bitcoin

1LP7Jo4VwAFdEisJSykBAtUyAusZjozSpw

Prerequisites

Make sure you set up and activate your AVAudioSession.

import AVFoundation
let session: AVAudioSession = AVAudioSession.sharedInstance()
do {
    try session.setPreferredSampleRate(44_100)
    // https://stackoverflow.com/questions/51010390/avaudiosession-setcategory-swift-4-2-ios-12-play-sound-on-silent
    if #available(iOS 10.0, *) {
        try session.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
    } else {
        session.perform(NSSelectorFromString("setCategory:withOptions:error:"), with: AVAudioSession.Category.playAndRecord, with:  [AVAudioSession.CategoryOptions.allowBluetooth])
    }
    try session.setMode(.default)
    try session.setActive(true)
} catch {
}

RTMP Usage

Real Time Messaging Protocol (RTMP).

import AVFoundation
import HaishinKit

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio)) { error in
    // print(error)
}
rtmpStream.attachCamera(DeviceUtil.device(withPosition: .back)) { error in
    // print(error)
}

let hkView = HKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)

// add ViewController#view
view.addSubview(hkView)

rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.publish("streamName")
// if you want to record a stream.
// rtmpStream.publish("streamName", type: .localRecord)
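
When you are finished, stop publishing and drop the connection. A minimal sketch; close() is assumed to be the counterpart of the publish/connect calls above:

// Tear down when streaming is done.
rtmpStream.close()
rtmpConnection.close()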

Settings

import AVFoundation
import HaishinKit
import VideoToolbox

let sampleRate: Double = 44_100

// see: #58
#if os(iOS)
let session: AVAudioSession = AVAudioSession.sharedInstance()
do {
    try session.setPreferredSampleRate(44_100)
    // https://stackoverflow.com/questions/51010390/avaudiosession-setcategory-swift-4-2-ios-12-play-sound-on-silent
    if #available(iOS 10.0, *) {
        try session.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
    } else {
        session.perform(NSSelectorFromString("setCategory:withOptions:error:"), with: AVAudioSession.Category.playAndRecord, with:  [AVAudioSession.CategoryOptions.allowBluetooth])
    }
    try session.setActive(true)
} catch {
}
#endif

var rtmpStream = RTMPStream(connection: rtmpConnection)

rtmpStream.captureSettings = [
    "fps": 30, // FPS
    "sessionPreset": AVCaptureSession.Preset.medium.rawValue, // input video width/height
    "continuousAutofocus": false, // use camera autofocus mode
    "continuousExposure": false, //  use camera exposure mode
]
rtmpStream.audioSettings = [
    "muted": false, // mute audio
    "bitrate": 32 * 1024,
    "sampleRate": sampleRate, 
]
rtmpStream.videoSettings = [
    "width": 640, // video output width
    "height": 360, // video output height
    "bitrate": 160 * 1024, // video output bitrate
    // "dataRateLimits": [160 * 1024 / 8, 1], optional kVTCompressionPropertyKey_DataRateLimits property
    "profileLevel": kVTProfileLevel_H264_Baseline_3_1, // H264 Profile require "import VideoToolbox"
    "maxKeyFrameIntervalDuration": 2, // key frame / sec
]
// "0" means the same of input
rtmpStream.recorderSettings = [
    AVMediaType.audio: [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 0,
        AVNumberOfChannelsKey: 0,
        // AVEncoderBitRateKey: 128000,
    ],
    AVMediaType.video: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoHeightKey: 0,
        AVVideoWidthKey: 0,
        /*
        AVVideoCompressionPropertiesKey: [
            AVVideoMaxKeyFrameIntervalDurationKey: 2,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
            AVVideoAverageBitRateKey: 512000
        ]
        */
    ],
]

// Set the 2nd argument to false.
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio), automaticallyConfiguresApplicationAudioSession: false)
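
You can also re-attach capture devices while streaming, for example to switch cameras. A sketch reusing the attachCamera API shown in the RTMP Usage section:

// Switch to the front camera; the previous capture device is replaced.
rtmpStream.attachCamera(DeviceUtil.device(withPosition: .front)) { error in
    print(error)
}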

Authentication

var rtmpConnection:RTMPConnection = RTMPConnection()
rtmpConnection.connect("rtmp://username:password@localhost/appName/instanceName")

Screen Capture

// iOS
rtmpStream.attachScreen(ScreenCaptureSession(shared: UIApplication.shared))
// macOS
rtmpStream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))

HTTP Usage

HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. Basic snippet. You can watch the stream at http://ip.address:8080/hello/playlist.m3u8

import AVFoundation
import HaishinKit

var httpStream = HTTPStream()
httpStream.attachCamera(DeviceUtil.device(withPosition: .back))
httpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio))
httpStream.publish("hello")

var lfView:LFView = LFView(frame: view.bounds)
lfView.attachStream(httpStream)

var httpService:HLSService = HLSService(domain: "", type: "_http._tcp", name: "lf", port: 8080)
httpService.startRunning()
httpService.addHTTPStream(httpStream)

// add ViewController#view
view.addSubview(lfView)
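
To stop serving, tear things down in reverse order. A sketch; removeHTTPStream and stopRunning are assumed to be the counterparts of the start-up calls above:

// Stop serving the HLS playlist (method names assumed from the start-up calls).
httpService.removeHTTPStream(httpStream)
httpService.stopRunning()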

FAQ

How can I run the example project?

Please run the carthage update command. HaishinKit depends on the Logboard module via Carthage.

carthage update

Do you provide support via email?

Yes. The consulting fee is $50 per incident. I don't recommend it; please consider using Issues instead.
