Custom video and audio sources
By default, Video SDK uses the basic audio and video modules on the device your app runs on. However, there are certain scenarios where you want to integrate a custom audio or video source into your app, such as:
- Your app has its own audio or video module.
- You want to use a non-camera source, such as recorded screen data.
- You need to process the captured audio or video with a pre-processing library for audio or image enhancement.
- You need flexible device resource allocation to avoid conflicts with other services.
Understand the tech
To set an external audio or video source, you configure the Agora Engine before joining a channel. To manage the capture and processing of audio and video frames, you use methods from outside the Video SDK that are specific to your custom source. Video SDK enables you to push processed audio and video data to the subscribers in a channel.
Prerequisites
To follow this procedure, you must have implemented the project in Get Started with Voice Calling.
Project setup
To create the environment necessary to integrate custom audio and video into your app, open the project you created in Get Started with Voice Calling.
Integrate custom audio or video
To stream from a custom source, you convert the data stream into a suitable format and push this data using Video SDK.
Implement a custom video source
In this section, you create the basic framework required to push video frames from a custom source. Depending on the type of your source, you add your own code to this framework to convert the source data to VideoFrame data. To create the basic framework, take the following steps:
- Import the required Agora and Android libraries

  You use the Android TextureView and SurfaceTexture objects for rendering custom video. The video data from the SurfaceTexture is converted to a VideoFrame before it is pushed to the channel. To use these libraries in your app, add the following statements after the last import statement in /app/java/com.example.<projectname>/MainActivity.
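  A minimal sketch of these import statements, assuming Video SDK v4.x package names; adjust them to match the SDK version your project uses:

  ```java
  import android.graphics.SurfaceTexture;
  import android.view.TextureView;

  // Video SDK class that wraps external video data
  import io.agora.base.VideoFrame;
  ```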
- Define variables to process and push video data

  In /app/java/com.example.<projectname>/MainActivity, add the following declarations to the MainActivity class:
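  A sketch of these declarations; the variable names are illustrative rather than required by the SDK:

  ```java
  // TextureView that renders the custom video preview in the UI
  private TextureView previewTextureView;
  // Receives frames from the custom video source
  private SurfaceTexture previewSurfaceTexture;
  // Preview state flags
  private boolean previewing = false;
  private boolean textureDestroyed = false;
  ```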
- Enable custom video track publishing

  When a user presses Join, you configure ChannelMediaOptions to enable publishing of the captured video from a custom source. You set the external video source, and set up a TextureView for the custom video preview. To do this:

  - Add the following lines to the joinChannel(View view) method in the MainActivity class after ChannelMediaOptions options = new ChannelMediaOptions();, as sketched after this list.

  - In the joinChannel(View view) method, remove the lines from Get Started with Voice Calling that set up and start the local camera preview; the custom source replaces the camera.
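  A sketch of the lines for the first sub-step, assuming Video SDK v4.x; the setExternalVideoSource signature varies between SDK versions, so check the API reference for yours:

  ```java
  // Publish the custom video track instead of the camera track
  options.publishCameraTrack = false;
  options.publishCustomVideoTrack = true;

  // Tell the SDK to expect external video frames
  rtcEngine.setExternalVideoSource(true, true,
          Constants.ExternalVideoSourceType.VIDEO_FRAME);

  // Display the custom video preview (method defined in the next step)
  setupCustomLocalVideoPreview();
  ```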
- Set up a TextureView for the custom video

  Create a new TextureView object, and add a SurfaceTextureListener to it. The listener triggers the onSurfaceTextureAvailable callback when a SurfaceTexture becomes available. You add the TextureView to the FrameLayout container to display it in the UI. To do this, add the following method to the MainActivity class:
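  A sketch of this method; the container ID R.id.local_video_view_container and the 320 x 240 preview size are assumptions based on a typical Get Started layout, so substitute your own values:

  ```java
  private void setupCustomLocalVideoPreview() {
      // Create a TextureView for the custom video preview
      previewTextureView = new TextureView(getBaseContext());
      // Register the listener defined in the next step
      previewTextureView.setSurfaceTextureListener(surfaceTextureListener);
      // Add the TextureView to the local video container in the UI
      FrameLayout container = findViewById(R.id.local_video_view_container);
      container.addView(previewTextureView, 320, 240);
  }
  ```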
- Define the SurfaceTextureListener

  When a SurfaceTexture becomes available, you create a previewSurfaceTexture and set its onFrameAvailableListener listener. You set up and configure your custom video source, set its SurfaceTexture to the previewSurfaceTexture, and start the preview. To do this, add the following definition of the surfaceTextureListener to the MainActivity class:
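  A sketch of the listener; new SurfaceTexture(true) uses the single-buffer-mode constructor (API 26+), and the comments mark where your source-specific code goes:

  ```java
  private final TextureView.SurfaceTextureListener surfaceTextureListener =
          new TextureView.SurfaceTextureListener() {
      @Override
      public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
          if (previewing) return; // Already previewing the custom video
          textureDestroyed = false;

          // Create the SurfaceTexture that receives frames from the custom source
          previewSurfaceTexture = new SurfaceTexture(true);
          previewSurfaceTexture.setOnFrameAvailableListener(onFrameAvailableListener);

          // Add your own code here to configure the custom video source, set its
          // SurfaceTexture to previewSurfaceTexture, and start the preview

          previewing = true;
      }

      @Override
      public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
      }

      @Override
      public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
          textureDestroyed = true;
          return true;
      }

      @Override
      public void onSurfaceTextureUpdated(SurfaceTexture surface) {
      }
  };
  ```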
- Push the video frames

  The onFrameAvailableListener callback is triggered when a new video frame is available. In the callback, you convert the SurfaceTexture data to a Video SDK VideoFrame and push the frame to the channel. To do this, add the following OnFrameAvailableListener to the MainActivity class:
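  A sketch of the listener; the conversion to a VideoFrame depends on your source format and is left as a placeholder, and isJoined is assumed to be the joined-state flag from Get Started:

  ```java
  private final SurfaceTexture.OnFrameAvailableListener onFrameAvailableListener =
          new SurfaceTexture.OnFrameAvailableListener() {
      @Override
      public void onFrameAvailable(SurfaceTexture surfaceTexture) {
          if (isJoined) {
              // Add your own code here to convert the surfaceTexture data
              // to a VideoFrame, then replace null with the converted frame
              VideoFrame videoFrame = null;
              // Push the frame to the channel
              rtcEngine.pushExternalVideoFrame(videoFrame);
          }
      }
  };
  ```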
Implement a custom audio source
To push audio from a custom source to a channel, take the following steps:
- Import the required Android and Java libraries

  You use an InputStream to read the contents of the custom audio source. The app starts a separate thread to read and push the audio data, and uses the android.os.Process class to set that thread's priority. To use these libraries in your app, add the following statements after the last import statement in /app/java/com.example.<projectname>/MainActivity.
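  A minimal sketch of these imports:

  ```java
  import java.io.IOException;
  import java.io.InputStream;

  // Used to raise the priority of the audio pushing thread
  import android.os.Process;
  ```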
- Define variables to manage and push the audio stream

  In /app/java/com.example.<projectname>/MainActivity, add the following declarations to the MainActivity class:
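  A sketch of these declarations; the file name and parameter values are placeholders that you update in the next step to match your own audio file:

  ```java
  // Name of the raw audio file placed in app/src/main/assets (placeholder)
  private static final String AUDIO_FILE = "applause.wav";
  // Audio file parameters; make sure these match your file
  private static final int SAMPLE_RATE = 44100;
  private static final int SAMPLE_NUM_OF_CHANNEL = 2;
  private static final int BITS_PER_SAMPLE = 16;
  // Samples pushed per frame, with the derived buffer size and push interval (ms)
  private static final int SAMPLES = 441;
  private static final int BUFFER_SIZE = SAMPLES * BITS_PER_SAMPLE / 8 * SAMPLE_NUM_OF_CHANNEL;
  private static final int PUSH_INTERVAL = SAMPLES * 1000 / SAMPLE_RATE;

  private InputStream inputStream; // Reads the audio file
  private Thread pushingTask = new Thread(new PushingTask()); // Pushes audio frames
  private volatile boolean pushing = false; // Controls the pushing loop
  ```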
- Add a raw audio file to the project

  In this example, you use an audio file as the source of your custom audio data. To add the audio file to your Android project, create a folder app\src\main\assets and add a sample audio file in *.wav or *.raw format to this folder. Update the value of the AUDIO_FILE variable to the name of your audio file. Also make sure that the values of the audio file parameters in your code match the audio file you placed in the assets folder.
- Enable custom audio track publishing

  When a user presses Join, you set the ChannelMediaOptions to disable the microphone audio track and enable the custom audio track. You also enable custom audio local playback and set the external audio source. To do this, add the following lines to the joinChannel(View view) method in the MainActivity class after options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER;:
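  A sketch of these lines; the enableCustomAudioLocalPlayback and setExternalAudioSource signatures have changed across v4.x releases, so check the API reference for your SDK version:

  ```java
  // Publish the custom audio track instead of the microphone track
  options.publishMicrophoneTrack = false;
  options.publishCustomAudioTrack = true;
  options.enableAudioRecordingOrPlayout = true;

  // Play the custom audio locally as well as publishing it
  rtcEngine.enableCustomAudioLocalPlayback(0, true);
  // Configure the external audio source (early v4.x signature assumed)
  rtcEngine.setExternalAudioSource(true, SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL, 2, false, true);
  ```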
- Open the audio file

  When the app starts, you open the audio file. To do this, add the following lines at the bottom of the onCreate method:
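  A sketch of these lines, reading the file from the assets folder created earlier:

  ```java
  try {
      // Open the bundled audio file for reading
      inputStream = getAssets().open(AUDIO_FILE);
  } catch (IOException e) {
      e.printStackTrace();
  }
  ```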
- Start the task to push audio frames

  When a user successfully joins a channel, you start the task that pushes audio frames. To do this, add the following lines at the bottom of the onJoinChannelSuccess callback in the MainActivity class:
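  A sketch of these lines, using the pushing flag and pushingTask thread defined earlier:

  ```java
  // Start pushing audio frames to the channel
  pushing = true;
  pushingTask.start();
  ```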
- Read the input stream into a buffer

  You read data from the input stream into a buffer. To do this, add the following method to the MainActivity class:
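  A sketch of this method; when the end of the file is reached, it rewinds the stream so the audio loops:

  ```java
  private byte[] readBuffer() {
      byte[] buffer = new byte[BUFFER_SIZE];
      try {
          if (inputStream.read(buffer) < 0) {
              // End of file: rewind and read again to loop the audio
              inputStream.reset();
              return readBuffer();
          }
      } catch (IOException e) {
          e.printStackTrace();
      }
      return buffer;
  }
  ```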
- Push the audio frames

  You push the data in the buffer as an audio frame on a separate thread. To do this, define the following Runnable class in the MainActivity class:
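  A sketch of the Runnable; it paces the pushes to the frame interval, and the pushExternalAudioFrame call assumes the early v4.x signature:

  ```java
  class PushingTask implements Runnable {
      @Override
      public void run() {
          // Raise the thread priority for smooth audio delivery
          Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO);
          while (pushing) {
              long before = System.currentTimeMillis();
              // Push one buffer of audio data to the channel
              rtcEngine.pushExternalAudioFrame(readBuffer(), 0);
              // Sleep for the remainder of the push interval
              long elapsed = System.currentTimeMillis() - before;
              if (elapsed < PUSH_INTERVAL) {
                  try {
                      Thread.sleep(PUSH_INTERVAL - elapsed);
                  } catch (InterruptedException e) {
                      e.printStackTrace();
                  }
              }
          }
      }
  }
  ```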
- Close the audio file

  When the app is closed, you close the audio file. To do this, add the following lines at the bottom of the onDestroy method:
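  A sketch of these lines; stopping the pushing loop before closing the stream is an added safeguard, not part of the original steps:

  ```java
  // Stop the pushing task, then release the audio file
  pushing = false;
  try {
      inputStream.close();
  } catch (IOException e) {
      e.printStackTrace();
  }
  ```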
Test your implementation
To ensure that you have implemented streaming from a custom source into your app:
- Generate a temporary token in Agora Console.

- Add authentication data to the web demo

  In your browser, navigate to the Agora web demo and update App ID, Channel, and Token with the values for your temporary token, then click Join.

- In Android Studio, open app/java/com.example.<projectname>/MainActivity, and update appId, channelName, and token with the values for your temporary token.

- Connect a physical Android device to your development device.

- In Android Studio, click Run app. A moment later you see the project installed on your device. If this is the first time you run the project, grant microphone and camera access to your app.

- Test the custom video source

  Add code to the basic framework presented above to do the following:

  - In onSurfaceTextureAvailable, enable the video source and set its parameters.
  - In onSurfaceTextureAvailable, set the SurfaceTexture of the custom video source to previewSurfaceTexture.
  - In onFrameAvailable, convert the surfaceTexture data to a VideoFrame.

- Test the custom audio source

  Press Join. You hear the audio file streamed to the web demo app.

  To use this code for streaming data from your particular custom audio source, modify the readBuffer() method to read the audio data from your source instead of a raw audio file.
Reference
This section contains additional information that completes the content of this page, or points you to documentation that explains other aspects of this product.