
Audio and voice effects

Video SDK makes it simple for you to publish audio captured through the microphone to subscribers in a channel. In some cases, users want to modify the captured audio to add sound effects, mix in pre-recorded audio, or change the voice quality before the audio is published. Video SDK provides several options that enable you to add sound effects, mix in pre-recorded audio, apply voice effects, and set the audio route. This page shows you how to implement these features in your channel.

Understand the tech

Using Video SDK you can implement the following audio features:

  • Audio effects and audio mixing

    Some real-time communication applications, such as online gaming and karaoke, are built around audio mixing features. Playing a sound effect at the appropriate time, or mixing background music with microphone audio, is essential to such applications. Video SDK provides APIs that enable you to implement:

    • Audio Effects: Play short audio clips, such as applause, cheers, or gunshots. You can play multiple audio effects at the same time.

    • Audio Mixing: Play a longer audio file, such as background music. With audio mixing, you can play only one file at a time.

  • Voice effects

    Voice modifying effects such as chat beautifier, singing beautifier, and voice changer are gaining popularity in social interaction and entertainment scenarios. To help you quickly integrate voice effects into your project, Video SDK provides pre-configured effects. You can choose from the following types of effects:

    • Preset voice beautifiers: Chat beautifier, singing beautifier, and timbre transformation.

    • Preset audio effects: Voice changer, style transformation, room acoustics, and pitch correction.

    • Preset voice conversion: Changes the original voice to make it unrecognizable.

    • Customized audio effects: Controls the voice pitch, equalization, and reverberation.

  • Audio route

    When audio is played back on a device, it can be routed to one of several hardware components such as the earpiece, headphones, speakerphone, or Bluetooth devices. The audio route is changed by the device user when they add or remove an external device. Video SDK enables you to change the default audio route or temporarily set an audio route.

The following figure shows the workflow you need to implement to add audio and voice effects to your app and set the audio route:

Audio and Voice Effects

Prerequisites

To follow this procedure, you must have:

  • Android Studio 4.1 or higher.
  • Android SDK API Level 24 or higher.
  • A mobile device that runs Android 4.1 or higher.
  • An Agora account and project.

  • A computer with Internet access.

    Ensure that no firewall is blocking your network communication.

Project setup

To create the environment necessary to implement audio and voice effects in your app, open the project you created in Get Started with Video Calling.

Implement audio and voice effects and set the audio route

When a user plays audio in the channel, or chooses to apply a voice effect from the available options, your app calls the appropriate Video SDK method to implement these choices.

This section shows how to use audio mixing, play sound effects, apply voice effects, and change the audio route, step-by-step.

Implement the user interface

To enable the user to play sound and voice effects and modify the audio route, add the following elements to the user interface:

  • A button to start and stop audio mixing.
  • A button to play the sound effect.
  • A button to apply various voice effects.
  • A switch to turn the speakerphone on or off.

In /app/res/layout/activity_main.xml add the following lines before </RelativeLayout>:


<Button
    android:id="@+id/AudioMixingButton"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:layout_below="@id/JoinButton"
    android:layout_alignEnd="@+id/LeaveButton"
    android:layout_alignStart="@id/JoinButton"
    android:text="Start Audio Mixing"
    android:onClick="audioMixing"
    />

<Button
    android:id="@+id/PlayAudioEffect"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_below="@+id/AudioMixingButton"
    android:layout_alignEnd="@id/LeaveButton"
    android:layout_alignStart="@id/JoinButton"
    android:onClick="playSoundEffect"
    android:text="Play audio effect" />

<Button
    android:id="@+id/ApplyVoiceEffect"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_below="@+id/PlayAudioEffect"
    android:layout_alignEnd="@id/LeaveButton"
    android:layout_alignStart="@id/JoinButton"
    android:onClick="applyVoiceEffect"
    android:text="Apply voice effect" />

<androidx.appcompat.widget.SwitchCompat
    android:id="@+id/SwitchSpeakerphone"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_centerHorizontal="true"
    android:checked="true"
    android:text="Speakerphone"
    android:layout_below="@id/ApplyVoiceEffect"/>

Handle the system logic

This section describes the steps required to use the relevant libraries, declare the necessary variables, and set up access to the UI elements.

  1. Import the required Agora and Android libraries

    To access and use the Button and Switch objects, and to integrate the Video SDK audio effect manager, add the following statements after the last import statement in /app/java/com.example.<projectname>/MainActivity.


    import io.agora.rtc2.IAudioEffectManager;
    import android.widget.Button;
    import androidx.appcompat.widget.SwitchCompat;
    import android.widget.CompoundButton;

  2. Define variables to manage audio effects and access the UI elements

    In /app/java/com.example.<projectname>/MainActivity, add the following declarations to the MainActivity class:


    private IAudioEffectManager audioEffectManager;
    private final int soundEffectId = 1; // Unique identifier for the sound effect file
    private String soundEffectFilePath = "https://www.soundjay.com/human/applause-01.mp3"; // URL or path to the sound effect
    private int soundEffectStatus = 0;
    private int voiceEffectIndex = 0;
    private boolean audioPlaying = false; // Manage the audio mixing state
    private String audioFilePath = "https://www.kozco.com/tech/organfinale.mp3"; // URL or path to the audio mixing file

    private Button playEffectButton, voiceEffectButton;
    private SwitchCompat speakerphoneSwitch;

  3. Set up access to the speakerphone switch

    In /app/java/com.example.<projectname>/MainActivity, add the following lines to the onCreate method after setupVideoSDKEngine();


    // Assign the class field (avoid shadowing it with a local variable)
    speakerphoneSwitch = findViewById(R.id.SwitchSpeakerphone);
    speakerphoneSwitch.setOnCheckedChangeListener(onCheckedChanged());

Implement sound and voice effects

To add sound and voice effects to your app, take the following steps:

  1. Enable the user to start and stop audio mixing

    Add the following method to the MainActivity class:


    public void audioMixing(View view) {
        Button startStopButton = (Button) findViewById(R.id.AudioMixingButton);
        audioPlaying = !audioPlaying;

        if (audioPlaying) {
            startStopButton.setText("Stop playing audio");
            try {
                agoraEngine.startAudioMixing(audioFilePath, false, 1, 0);
                showMessage("Audio playing");
            } catch (Exception e) {
                showMessage("Exception playing audio" + "\n" + e.toString());
            }
        } else {
            agoraEngine.stopAudioMixing();
            startStopButton.setText("Play Audio");
        }
    }
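
    To let users balance the mixed track against the microphone signal, you can also expose a volume control. The following is a minimal sketch, not part of the original steps; it assumes a hypothetical SeekBar with the id MixingVolumeSeekBar in your layout and requires import android.widget.SeekBar;. The adjustAudioMixingVolume value ranges from 0 to 100, where 100 is the original volume.

    // Minimal sketch (assumption): adjust the audio mixing volume from a SeekBar.
    SeekBar mixingVolumeSeekBar = findViewById(R.id.MixingVolumeSeekBar); // Hypothetical UI element
    mixingVolumeSeekBar.setMax(100);
    mixingVolumeSeekBar.setProgress(100); // 100 represents the original volume
    mixingVolumeSeekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
        @Override
        public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
            // Adjust the playback and publishing volume of the mixed audio file (0-100)
            agoraEngine.adjustAudioMixingVolume(progress);
        }
        @Override
        public void onStartTrackingTouch(SeekBar seekBar) {}
        @Override
        public void onStopTrackingTouch(SeekBar seekBar) {}
    });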

  2. Get the effects manager and pre-load sound effects

    To set up playing sound effects, get the audioEffectManager and pre-load the sound effects. In setupVideoSDKEngine, add the following lines after agoraEngine.enableVideo();:


    // Set up the audio effects manager
    audioEffectManager = agoraEngine.getAudioEffectManager();
    // Pre-load sound effects to improve performance
    audioEffectManager.preloadEffect(soundEffectId, soundEffectFilePath);
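
    When the sound effect is no longer needed, for example when your app cleans up in onDestroy, you can release the pre-loaded file. A minimal sketch, assuming your SDK version exposes unloadEffect on IAudioEffectManager:

    // Release the pre-loaded sound effect when it is no longer needed,
    // for example in onDestroy() or after leaving the channel.
    if (audioEffectManager != null) {
        audioEffectManager.stopEffect(soundEffectId);   // Stop the effect if it is still playing
        audioEffectManager.unloadEffect(soundEffectId); // Free the pre-loaded resource
    }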

  3. Play, pause, or resume playing the sound effect

    When a user presses the button, the sound effect is played. When they press the button again, the effect is paused and resumed alternately. To do this, add the following method to the MainActivity class:


    public void playSoundEffect(View view) {
        if (playEffectButton == null) playEffectButton = (Button) view;
        if (soundEffectStatus == 0) { // Stopped
            audioEffectManager.playEffect(
                    soundEffectId,       // The ID of the sound effect file.
                    soundEffectFilePath, // The path of the sound effect file.
                    0,    // The number of sound effect loops. -1 means an infinite loop. 0 means once.
                    1,    // The pitch of the audio effect. 1 represents the original pitch.
                    0.0,  // The spatial position of the audio effect. 0.0 represents that the audio effect plays in the front.
                    100,  // The volume of the audio effect. 100 represents the original volume.
                    true, // Whether to publish the audio effect to remote users.
                    0     // The playback starting position of the audio effect file in ms.
            );
            playEffectButton.setText("Pause audio effect");
            soundEffectStatus = 1;
        } else if (soundEffectStatus == 1) { // Playing
            audioEffectManager.pauseEffect(soundEffectId);
            soundEffectStatus = 2;
            playEffectButton.setText("Resume audio effect");
        } else if (soundEffectStatus == 2) { // Paused
            audioEffectManager.resumeEffect(soundEffectId);
            soundEffectStatus = 1;
            playEffectButton.setText("Pause audio effect");
        }
    }
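
    You can also control how loudly the effect plays. A minimal sketch of a helper you could add to the MainActivity class, assuming your SDK version exposes setVolumeOfEffect on IAudioEffectManager (volume range 0 to 100, where 100 is the original volume):

    // Hypothetical helper: set the playback volume of the pre-loaded sound effect.
    private void setSoundEffectVolume(int volume) {
        // volume: 0-100, where 100 represents the original volume
        audioEffectManager.setVolumeOfEffect(soundEffectId, volume);
    }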

  4. Update the UI after the effect finishes playing

    When Video SDK has finished playing the sound effect, the onAudioEffectFinished event is fired. To handle this event, in the MainActivity class, add the following after private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {:


    // Occurs when the audio effect playback finishes.
    @Override
    public void onAudioEffectFinished(int soundId) {
        super.onAudioEffectFinished(soundId);
        showMessage("Audio effect finished");
        audioEffectManager.stopEffect(soundId);
        soundEffectStatus = 0; // Stopped
        runOnUiThread(() -> playEffectButton.setText("Play audio effect"));
    }

  5. Set an audio profile

    For applications that require special audio performance, set a suitable audio profile and audio scenario. In setupVideoSDKEngine, add the following lines after agoraEngine.enableVideo();:


    // Specify the audio scenario and audio profile
    agoraEngine.setAudioProfile(Constants.AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO);
    agoraEngine.setAudioScenario(Constants.AUDIO_SCENARIO_GAME_STREAMING);

  6. Apply various voice and audio effects

    When a user presses the button, apply a new voice effect and change the text on the button to describe the effect. To do this, add the following method to the MainActivity class:


    public void applyVoiceEffect(View view) {
        if (voiceEffectButton == null) voiceEffectButton = (Button) view;
        voiceEffectIndex++;
        // Turn off all previous effects
        agoraEngine.setVoiceBeautifierPreset(Constants.VOICE_BEAUTIFIER_OFF);
        agoraEngine.setAudioEffectPreset(Constants.AUDIO_EFFECT_OFF);
        agoraEngine.setVoiceConversionPreset(Constants.VOICE_CONVERSION_OFF);

        if (voiceEffectIndex == 1) {
            agoraEngine.setVoiceBeautifierPreset(Constants.CHAT_BEAUTIFIER_MAGNETIC);
            voiceEffectButton.setText("Voice effect: Chat Beautifier");
        } else if (voiceEffectIndex == 2) {
            agoraEngine.setVoiceBeautifierPreset(Constants.SINGING_BEAUTIFIER);
            voiceEffectButton.setText("Voice effect: Singing Beautifier");
        } else if (voiceEffectIndex == 3) {
            agoraEngine.setAudioEffectPreset(Constants.VOICE_CHANGER_EFFECT_HULK);
            voiceEffectButton.setText("Voice effect: Hulk");
        } else if (voiceEffectIndex == 4) {
            agoraEngine.setVoiceConversionPreset(Constants.VOICE_CHANGER_BASS);
            voiceEffectButton.setText("Voice effect: Voice Changer");
        } else if (voiceEffectIndex == 5) {
            // Sets the local voice equalization.
            // The first parameter sets the band frequency. The value ranges between 0 and 9.
            // Each value represents the center frequency of the band:
            // 31, 62, 125, 250, 500, 1k, 2k, 4k, 8k, and 16k Hz.
            // The second parameter sets the gain of each band between -15 and 15 dB.
            // The default value is 0.
            agoraEngine.setLocalVoiceEqualization(Constants.AUDIO_EQUALIZATION_BAND_FREQUENCY.fromInt(4), 3);
            agoraEngine.setLocalVoicePitch(0.5);
            voiceEffectButton.setText("Voice effect: Voice Equalization");
        } else if (voiceEffectIndex > 5) { // Remove all effects
            voiceEffectIndex = 0;
            agoraEngine.setLocalVoicePitch(1.0);
            agoraEngine.setLocalVoiceEqualization(Constants.AUDIO_EQUALIZATION_BAND_FREQUENCY.fromInt(4), 0);
            voiceEffectButton.setText("Apply voice effect");
        }
    }
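
    The preset lists above also mention room acoustics and pitch correction effects, which this example does not cycle through. If your SDK version includes the Constants.ROOM_ACOUSTICS_KTV and Constants.PITCH_CORRECTION presets (an assumption; check the Constants class of your SDK), you could add them to the MainActivity class in a similar way:

    // Hypothetical extension: two more audio effect presets from the conceptual overview.
    // The constant names are assumptions; verify them against your SDK version.
    private void applyExtraAudioEffectPreset(int index) {
        if (index == 6) {
            agoraEngine.setAudioEffectPreset(Constants.ROOM_ACOUSTICS_KTV); // Room acoustics preset
            voiceEffectButton.setText("Voice effect: KTV Room Acoustics");
        } else if (index == 7) {
            agoraEngine.setAudioEffectPreset(Constants.PITCH_CORRECTION);   // Pitch correction preset
            voiceEffectButton.setText("Voice effect: Pitch Correction");
        }
    }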

  7. Set the audio route

    When the user toggles the switch, enable or disable the speakerphone. To do this, add the following method to the MainActivity class:


    private CompoundButton.OnCheckedChangeListener onCheckedChanged() {
        return new CompoundButton.OnCheckedChangeListener() {
            @Override
            public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
                agoraEngine.setDefaultAudioRoutetoSpeakerphone(false); // Disables the default audio route.
                agoraEngine.setEnableSpeakerphone(isChecked); // Enables or disables the speakerphone temporarily.
            }
        };
    }
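
    If you prefer to make the speakerphone the default route rather than a temporary override, you can set it once during setup instead. A minimal sketch, placed for example in setupVideoSDKEngine before joining a channel:

    // Optional: make the speakerphone the default audio route for this session.
    // A user plugging in an external device, or a later setEnableSpeakerphone call,
    // still overrides the current route.
    agoraEngine.setDefaultAudioRoutetoSpeakerphone(true);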

Test your implementation

To ensure that you have implemented audio and voice effects in your app:

  1. Generate a temporary token in Agora Console.

  2. In your browser, navigate to the Agora web demo and update App ID, Channel, and Token with the values for your temporary token, then click Join.

  3. In Android Studio, open app/java/com.example.<projectname>/MainActivity, and update appId, channelName, and token with the values for your temporary token.

  4. Connect a physical Android device to your development device.

  5. In Android Studio, click Run app. A moment later you see the project installed on your device.

     If this is the first time you run the project, grant microphone and camera access to your app.

  6. To join as a host, select Host and click Join.

  7. Press Start Audio Mixing.

     You hear the audio file played in the channel, mixed with the microphone audio. Press the button again to stop audio mixing.

  8. Press Play audio effect.

     You hear the audio file being played. Press the button again to pause and resume the audio. When the audio has finished playing, you see a toast message.

  9. Press Apply voice effect.

     Put on headphones connected to the device running the web demo and speak into the microphone of your Android device. You hear your voice through the headphones with the effect named on the button applied. Press the button again to test each of the implemented voice effects in turn.

  10. Tap the Speakerphone switch to turn it off. Speak into the microphone connected to your browser app. The audio is routed to the earpiece on your Android device.

      Tap the switch again to turn the speakerphone on. When you speak into the browser app microphone, the audio is routed to the speakerphone on your Android device.

Reference

This section contains additional information that completes this page, or points you to documentation that explains other aspects of this product.

Default audio routes

Depending on the scenario, Video SDK uses the following default audio routes:

  • Voice call: Earpiece
  • Audio broadcast: Speakerphone
  • Video call: Speakerphone
  • Video broadcast: Speakerphone

Audio route change workflow

The audio route is changed in the following ways:

  • User: Add or remove an external device such as headphones or a Bluetooth audio device.
  • Developer:
    • setDefaultAudioRoutetoSpeakerphone - change the default audio route.
    • setEnableSpeakerphone - temporarily change the audio route.

The principles for audio route change are:

  • User behaviour has the highest priority:

    • When a user connects an external device, the audio route automatically changes to the external device.
    • If the user connects multiple external devices in sequence, the audio route automatically changes to the last connected device.
  • Developers can implement the following functionality:

    • Call setDefaultAudioRoutetoSpeakerphone to change the default and current setting:

      The workflow is:

      1. The app calls setDefaultAudioRoutetoSpeakerphone(true). The audio route changes to the speakerphone.
      2. The user plugs in headphones. The audio route changes to the headphones.
      3. The app calls setDefaultAudioRoutetoSpeakerphone(true). The audio route remains the headphones, because setDefaultAudioRoutetoSpeakerphone affects only the default route between the earpiece and the speakerphone; it does not override an external device.
      4. The user unplugs the headphones. The audio route changes to the speakerphone.
    • Call setEnableSpeakerphone to temporarily set the audio route to the speakerphone or earpiece. Because setEnableSpeakerphone is a transient API, any user behaviour or audio-related API call might change the audio route set by setEnableSpeakerphone.

      The workflow is:

      1. A user joins an interactive live streaming channel. The audio route is the speakerphone.
      2. The user plugs in headphones. The audio route changes to the headphones.
      3. The app calls setEnableSpeakerphone(true). On Android, the audio route changes to the speakerphone. On iOS, the audio route remains the headphones, because once the mobile device is connected to headphones or a Bluetooth audio device, you cannot change the audio route to the speakerphone.
    • Any change to the audio route triggers the onAudioRouteChanged (Android) or didAudioRouteChanged (iOS) callback. You can use this callback to get the current audio route.
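
      For example, on Android you could log the new route from the existing event handler. A minimal sketch, added inside the IRtcEngineEventHandler implementation (mRtcEventHandler in this project):

      // Minimal sketch: observe audio route changes on Android.
      @Override
      public void onAudioRouteChanged(int routing) {
          super.onAudioRouteChanged(routing);
          // routing is an SDK-defined code identifying the current route,
          // for example earpiece, speakerphone, wired headset, or Bluetooth.
          showMessage("Audio route changed: " + routing);
      }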
