React Native Livestream Component
api.video is the video infrastructure for product builders. Lightning fast video APIs for integrating, scaling, and managing on-demand & low latency live streaming features in your app.
Project description
This module is made for broadcasting RTMP live streams from your smartphone's camera.
Getting started
:warning: The React Native Live Stream SDK is designed for React Native 0.69.1. Using the SDK with a React Native version newer than 0.69.1 can cause unexpected behaviour.
Installation
npm install @api.video/react-native-livestream
or
yarn add @api.video/react-native-livestream
Note: if you are on iOS, you will need two extra steps:
- Don't forget to install the native dependencies with CocoaPods:
cd ios && pod install
- This project contains Swift code, and if it's your first dependency with Swift code, you need to create an empty Swift file in your project (with the bridging header) from Xcode; Xcode will offer to create the bridging header when you add the file.
Permissions
To be able to broadcast, you must:
- On Android: ask for internet, camera and microphone permissions:
Your application must dynamically require android.permission.CAMERA and android.permission.RECORD_AUDIO.
- On iOS: update your Info.plist with a usage description for the camera and microphone:
  - NSCameraUsageDescription: Your own description of the purpose
  - NSMicrophoneUsageDescription: Your own description of the purpose
- In your React Native code, you must handle the permission requests before starting your live stream (see the sketch after this list). If permissions are not granted, you will not be able to broadcast.
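As an illustration, below is a minimal sketch of requesting the Android runtime permissions with React Native's built-in PermissionsAndroid module before you start broadcasting. The helper name is hypothetical and not part of this SDK; on iOS the system prompts automatically based on your Info.plist usage descriptions.

```ts
import { PermissionsAndroid, Platform } from 'react-native';

// Hypothetical helper: requests camera and microphone permissions on Android.
// Returns true only if both permissions were granted.
const requestLiveStreamPermissions = async (): Promise<boolean> => {
  if (Platform.OS !== 'android') {
    // iOS prompts on first use, driven by the Info.plist usage descriptions.
    return true;
  }
  const results = await PermissionsAndroid.requestMultiple([
    PermissionsAndroid.PERMISSIONS.CAMERA,
    PermissionsAndroid.PERMISSIONS.RECORD_AUDIO,
  ]);
  return Object.values(results).every(
    (status) => status === PermissionsAndroid.RESULTS.GRANTED
  );
};
```

Only call startStreaming once this (or your own equivalent) resolves to true.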
Code sample
```jsx
import React, { useRef, useState } from 'react';
import { View, TouchableOpacity } from 'react-native';
import { LiveStreamView } from '@api.video/react-native-livestream';

const App = () => {
  // Ref used to control the stream (startStreaming / stopStreaming)
  const ref = useRef(null);
  const [streaming, setStreaming] = useState(false);

  return (
    <View style={{ flex: 1, alignItems: 'center' }}>
      <LiveStreamView
        style={{ flex: 1, backgroundColor: 'black', alignSelf: 'stretch' }}
        ref={ref}
        camera="back"
        enablePinchedZoom={true}
        video={{
          fps: 30,
          resolution: '720p',
          bitrate: 2 * 1024 * 1024, // 2 Mbps
          gopDuration: 1, // 1 second
        }}
        audio={{
          bitrate: 128000,
          sampleRate: 44100,
          isStereo: true,
        }}
        isMuted={false}
        onConnectionSuccess={() => {
          // do what you want
        }}
        onConnectionFailed={(e) => {
          // do what you want
        }}
        onDisconnect={() => {
          // do what you want
        }}
      />
      <View style={{ position: 'absolute', bottom: 40 }}>
        {/* Round button that toggles the live stream on and off */}
        <TouchableOpacity
          style={{
            borderRadius: 50,
            backgroundColor: streaming ? 'red' : 'white',
            width: 50,
            height: 50,
          }}
          onPress={() => {
            if (streaming) {
              ref.current?.stopStreaming();
              setStreaming(false);
            } else {
              ref.current?.startStreaming('YOUR_STREAM_KEY');
              setStreaming(true);
            }
          }}
        />
      </View>
    </View>
  );
};

export default App;
```
Documentation
Props & Methods
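The package's published TypeScript definitions are the authoritative reference. As a quick orientation only, here is a hedged sketch of the props and ref methods used in the code sample above; the type names and optionality below are inferred from that sample, not taken from the library.

```ts
// Sketch inferred from the code sample above; check the package's own
// TypeScript definitions for the real prop and method signatures.
interface LiveStreamViewSketchProps {
  style?: object;
  camera?: 'front' | 'back';
  enablePinchedZoom?: boolean;
  video?: { fps: number; resolution: string; bitrate: number; gopDuration: number };
  audio?: { bitrate: number; sampleRate: number; isStereo: boolean };
  isMuted?: boolean;
  onConnectionSuccess?: () => void;
  onConnectionFailed?: (error: unknown) => void;
  onDisconnect?: () => void;
}

// Methods exposed through the component ref, as used in the sample.
interface LiveStreamViewSketchMethods {
  startStreaming(streamKey: string): void;
  stopStreaming(): void;
}
```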
Example App
You can try our example app; feel free to test it.
Setup
Be sure to follow the React Native installation steps before anything else.
- Open a new terminal
- Clone the repository and go into it
git clone https://github.com/apivideo/api.video-reactnative-live-stream.git livestream_example_app && cd livestream_example_app
Android
Install the packages and launch the application
yarn && yarn example android
iOS
- Install the packages
yarn install
- Go into example/ios and install the Pods
cd example/ios && pod install
- Sign your application
Open Xcode, click on "Open a project or file" and open the Example.xcworkspace file.
You can find it in YOUR_PROJECT_NAME/example/ios.
Click on Example, go to the Signing & Capabilities tab, add your team and create a unique
bundle identifier.
- Launch the application from the root of your project
yarn example ios
Plugins
The api.video live stream library uses external native libraries for broadcasting:
| Plugin | README |
| --- | --- |
| StreamPack | StreamPack |
| HaishinKit | HaishinKit |
FAQ
If you have any questions, ask us here: https://community.api.video. Or use GitHub Issues.