
Symbl.ai

This guide is provided by Symbl.ai. Agora is planning a documentation upgrade program for all extensions on the marketplace. Please stay tuned.

This extension allows you to embed real-time conversation intelligence into your mobile application, providing contextual analysis of conversation data along with speech recognition, without any upfront data training requirements.

Prerequisites

  • Agora Credentials: Make sure you have the Agora credentials and required information (App ID, Channel Name, temporary token, and Meeting ID). You can retrieve this information from the Agora Console.

Learn more about the required Agora credentials in Secure authentication.

  • Symbl API Credentials: You also need the Symbl API credentials, which are available on the Symbl Conversation Intelligence Extension product details page. Navigate to the Agora Extensions Marketplace and click the Symbl card.

Activate the Symbl Conversation Intelligence Extension.

After activating the Symbl Conversation Intelligence Extension, click the View button under the Credentials column to retrieve the required Symbl credentials (App ID and App Secret).
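
If you want to verify that the App ID and App Secret work before wiring up the extension, you can request an access token directly from the Symbl token API used later in this guide (https://api.symbl.ai/oauth2/token:generate). The helper below is a minimal, hypothetical sketch using OkHttp (already a dependency in the integration steps); the request body format assumes Symbl's standard application-token flow, so treat it as a sanity check rather than part of the extension setup.

    // Hypothetical credential check; not required by the extension, which generates tokens itself.
    // Call this from a background thread: network calls on the Android main thread throw an exception.
    private void checkSymblCredentials(String appId, String appSecret) {
        okhttp3.OkHttpClient client = new okhttp3.OkHttpClient();
        okhttp3.MediaType json = okhttp3.MediaType.parse("application/json; charset=utf-8");
        // Assumed body format for Symbl's application-token flow
        String body = "{\"type\":\"application\",\"appId\":\"" + appId + "\",\"appSecret\":\"" + appSecret + "\"}";
        okhttp3.Request request = new okhttp3.Request.Builder()
                .url("https://api.symbl.ai/oauth2/token:generate")
                .post(okhttp3.RequestBody.create(json, body))
                .build();
        try (okhttp3.Response response = client.newCall(request).execute()) {
            // A 200 response containing an accessToken means the credentials are valid.
            android.util.Log.d("SymblCredentialCheck", "HTTP " + response.code() + ": " + response.body().string());
        } catch (java.io.IOException e) {
            android.util.Log.e("SymblCredentialCheck", "Token request failed", e);
        }
    }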

Integrate Symbl Conversation Intelligence

This section walks you through the steps necessary to set up the Symbl Conversation Intelligence extension in your mobile application.

  1. Add the following dependencies to your module-level build.gradle file:


    implementation 'com.squareup.okhttp3:okhttp:3.10.0'
    implementation 'org.java-websocket:Java-WebSocket:1.5.1'
    implementation 'ai.symbl:android.extension:0.0.2'

  2. Implement the io.agora.rtc2.IMediaExtensionObserver interface:


    public class MainActivity extends AppCompatActivity implements io.agora.rtc2.IMediaExtensionObserver {

  3. Add the following method to set all the information necessary to initialize the Symbl configuration. Descriptions of the parameters used are listed in the table that follows the code:


    private void setSymblPluginConfigs(JSONObject pluginParams) throws JSONException {
        try {
            // Symbl main extension config object
            SymblPluginConfig symblParams = new SymblPluginConfig();

            // Set the Symbl API configuration
            ApiConfig apiConfig = new ApiConfig();
            apiConfig.setAppId("<symbl_app_id>");
            apiConfig.setAppSecret("<symbl_app_secret>");
            apiConfig.setTokenApi("https://api.symbl.ai/oauth2/token:generate");
            apiConfig.setSymblPlatformUrl("api-agora-1.symbl.ai");
            symblParams.setApiConfig(apiConfig);

            // Set the Symbl confidence threshold and language code
            RealtimeStartRequest realtimeFlowInitRequest = new RealtimeStartRequest();
            RealtimeAPIConfig realtimeAPIConfig = new RealtimeAPIConfig();
            realtimeAPIConfig.setConfidenceThreshold(Double.parseDouble("<symbl_confidence_threshold>"));
            realtimeAPIConfig.setLanguageCode("en-US");

            // Set the speaker information
            Speaker speaker = new Speaker();
            speaker.setName("<symbl_meeting_user_Name>");
            speaker.setUserId("<symbl_meeting_UserId>");
            realtimeFlowInitRequest.setSpeaker(speaker);

            // Set the meeting audio encoding and sample rate (hertz)
            SpeechRecognition speechRecognition = new SpeechRecognition();
            speechRecognition.setEncoding("<symbl_meeting_encoding>");
            speechRecognition.setSampleRateHertz(Double.parseDouble("<symbl_meeting_sampleRateHertz>"));
            realtimeAPIConfig.setSpeechRecognition(speechRecognition);

            // Set the redaction content values
            Redaction redaction = new Redaction();
            redaction.setIdentifyContent(true);
            redaction.setRedactContent(true);
            redaction.setRedactionString("*****");
            realtimeAPIConfig.setRedaction(redaction);

            // Set the Tracker (custom business intent) information
            realtimeFlowInitRequest.setConfig(realtimeAPIConfig);
            Tracker tracker1 = new Tracker();
            tracker1.setName("Budget");
            List<String> vocabulary = new ArrayList<>();
            vocabulary.add("budget");
            vocabulary.add("budget decision");
            tracker1.setVocabulary(vocabulary);
            List<Tracker> trackerList = new ArrayList<>();
            trackerList.add(tracker1);

            // Set the Symbl conversation parameters
            realtimeFlowInitRequest.setTrackers(trackerList);
            realtimeFlowInitRequest.setType("start_request");
            realtimeFlowInitRequest.setId("<symbl_unique_meetingId>");
            realtimeFlowInitRequest.setSentiments(true);
            realtimeFlowInitRequest.setInsightTypes(Arrays.asList("action_item", "question", "follow_up"));
            symblParams.setRealtimeStartRequest(realtimeFlowInitRequest);

            Gson gson = new Gson();
            pluginParams.put("inputRequest", gson.toJson(symblParams));
        } catch (Exception ex) {
            Log.e(TAG, "ERROR while setting Symbl extension configuration", ex);
        }
    }

The following table lists the parameters and their descriptions used in the sample code above:

| Parameter | Description |
| --- | --- |
| symbl_meeting_UserId | Used to identify the user in the real-time meeting. |
| symbl_meeting_user_Name | The name of the user attending the real-time meeting. |
| symbl_unique_meetingId | Unique identifier for the meeting. |
| symbl_platform_url | The dedicated URL for the Symbl platform. Use api-agora-1.symbl.ai. |
| symbl_app_id | The Symbl App ID. |
| symbl_app_secret | The Symbl App Secret. |
| symbl_meeting_language_code | The language code. Currently, en-US (English US) is the only supported language. |
| symbl_token_api | The URL for secure token generation. |
| symbl_meeting_encoding | The audio encoding in which the audio is sent over the WebSocket connection. |
| symbl_meeting_sampleRateHertz | The sample rate of the incoming audio stream, in hertz. |
| symbl_confidence_threshold | The minimum confidence score for an insight to be considered valid. Must be between 0.5 and 1.0, inclusive. Default value is 0.5. |

Call this method during initialization of the Agora engine in your application.
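
For reference, the wiring looks like the following sketch, excerpted from the complete sample at the end of this page (here using the single-argument setSymblPluginConfigs() from step 3; ExtensionManager is provided by the Symbl extension package):

    // Register the Symbl extension when creating the engine
    RtcEngineConfig config = new RtcEngineConfig();
    config.mContext = this;
    config.mAppId = AGORA_CUSTOMER_APP_ID;
    config.addExtension(ExtensionManager.EXTENSION_NAME);
    config.mExtensionObserver = this;
    mRtcEngine = RtcEngine.create(config);
    mRtcEngine.enableExtension(ExtensionManager.EXTENSION_VENDOR_NAME, ExtensionManager.EXTENSION_FILTER_NAME, true);

    // After joining the channel, build the Symbl configuration and pass it to the extension
    JSONObject pluginParams = new JSONObject();
    setSymblPluginConfigs(pluginParams);
    mRtcEngine.setExtensionProperty(ExtensionManager.EXTENSION_VENDOR_NAME, ExtensionManager.EXTENSION_FILTER_NAME, "init", pluginParams.toString());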

In your onEvent() method, you receive all the transcription information and the analyzed conversation events, and you can parse the JSON responses there.

For a basic speech-to-text (recognition_result) response, you should expect the following payload:


{
  "type": "recognition_result",
  "isFinal": true,
  "payload": {
    "raw": {
      "alternatives": [{
        "words": [{
          "word": "Hello",
          "startTime": {
            "seconds": "3",
            "nanos": "800000000"
          },
          "endTime": {
            "seconds": "4",
            "nanos": "200000000"
          }
        }, {
          "word": "world.",
          "startTime": {
            "seconds": "4",
            "nanos": "200000000"
          },
          "endTime": {
            "seconds": "4",
            "nanos": "800000000"
          }
        }],
        "transcript": "Hello world.",
        "confidence": 0.9128385782241821
      }]
    }
  },
  "punctuated": {
    "transcript": "Hello world"
  },
  "user": {
    "userId": "emailAddress",
    "name": "John Doe",
    "id": "23681108-355b-4fc3-9d94-ed47dd39fa56"
  }
}
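
To give an idea of how this payload can be consumed, here is a minimal, hedged sketch that pulls out the punctuated transcript. It assumes the JSON above arrives as the value string passed to onEvent(), and uses the org.json and android.util.Log classes already imported by the full sample below:

    // Sketch: extract the punctuated transcript from a recognition_result payload.
    private String extractTranscript(String value) {
        try {
            JSONObject response = new JSONObject(value);
            if ("recognition_result".equals(response.optString("type"))) {
                JSONObject punctuated = response.optJSONObject("punctuated");
                if (punctuated != null) {
                    return punctuated.optString("transcript", "");
                }
            }
        } catch (JSONException e) {
            Log.e(TAG, "Could not parse recognition_result payload", e);
        }
        return "";
    }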

For a Topic type of response, you should expect the following payload:


[
  {
    "id": "e69a5556-6729-XXXX-XXXX-2aee2deabb1b",
    "messageReferences": [
      {
        "id": "0df44422-0248-47e9-8814-e87f63404f2c",
        "relation": "text instance"
      }
    ],
    "phrases": "auto insurance",
    "rootWords": [
      {
        "text": "auto"
      }
    ],
    "score": 0.9,
    "type": "topic"
  }
]

For a Tracker type of response, you should expect the following payload:


{
  "type": "tracker_response",
  "trackers": [
    {
      "name": "budget",
      "matches": [
        {
          "type": "vocabulary",
          "value": "budget",
          "messageRefs": [
            {
              "id": "0f3275d7-d534-48e2-aee2-685bd3167f4b",
              "text": "I would like to set a budget for this month.",
              "offset": 22
            }
          ],
          "insightRefs": []
        }
      ]
    }
  ],
  "sequenceNumber": 0
}
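
Similarly, the following is a minimal sketch for walking a tracker_response, logging each vocabulary match together with the message it was found in. It makes the same assumption that the JSON arrives as the value string in onEvent(), and it needs an org.json.JSONArray import in addition to those in the sample below:

    // Sketch: list every tracker match and the message it was found in.
    private void logTrackerMatches(String value) throws JSONException {
        JSONObject response = new JSONObject(value);
        JSONArray trackers = response.optJSONArray("trackers");
        if (trackers == null) return;
        for (int i = 0; i < trackers.length(); i++) {
            JSONObject tracker = trackers.getJSONObject(i);
            JSONArray matches = tracker.getJSONArray("matches");
            for (int j = 0; j < matches.length(); j++) {
                JSONObject match = matches.getJSONObject(j);
                JSONArray messageRefs = match.getJSONArray("messageRefs");
                for (int k = 0; k < messageRefs.length(); k++) {
                    Log.i(TAG, "Tracker '" + tracker.getString("name") + "' matched \""
                            + match.getString("value") + "\" in: "
                            + messageRefs.getJSONObject(k).getString("text"));
                }
            }
        }
    }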

The following is a complete sample of the MainActivity.java class, which includes the Agora RTC and Symbl settings.


package agoramarketplace.symblai.cai;

import androidx.annotation.NonNull;
import androidx.annotation.RequiresApi;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import android.Manifest;
import android.content.Context;
import android.content.pm.PackageManager;
import android.os.Build;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceView;
import android.view.View;
import android.view.inputmethod.InputMethodManager;
import android.widget.Button;
import android.widget.EditText;
import android.widget.FrameLayout;
import android.widget.TextView;

import com.google.gson.Gson;
import ai.symbl.android.extension.model.request.ApiConfig;
import ai.symbl.android.extension.model.request.RealtimeAPIConfig;
import ai.symbl.android.extension.model.request.RealtimeStartRequest;
import ai.symbl.android.extension.model.request.Redaction;
import ai.symbl.android.extension.model.request.Speaker;
import ai.symbl.android.extension.model.request.SpeechRecognition;
import ai.symbl.android.extension.model.request.SymblPluginConfig;
import ai.symbl.android.extension.model.request.Tracker;
import ai.symbl.android.extension.model.response.SymblResponse;
import org.json.JSONException;
import org.json.JSONObject;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.UUID;
import ai.symbl.android.extension.ExtensionManager;
import ai.symbl.android.extension.SymblAIFilterManager;
import io.agora.rtc2.Constants;
import io.agora.rtc2.IRtcEngineEventHandler;
import io.agora.rtc2.RtcEngine;
import io.agora.rtc2.RtcEngineConfig;
import io.agora.rtc2.UserInfo;
import io.agora.rtc2.video.VideoCanvas;
import io.agora.rtc2.video.VideoEncoderConfiguration;

public class MainActivity extends AppCompatActivity implements io.agora.rtc2.IMediaExtensionObserver {

    private final static String TAG = "Agora_SymblTag java :";
    private static final int PERMISSION_REQ_ID = 22;

    // Agora platform configuration
    private final static String AGORA_CUSTOMER_CHANNEL_NAME = "Demo";
    private static final String AGORA_CUSTOMER_APP_ID = "XXXXXXXXXX";
    public static final String TOKEN_VALUE = "XXXXXXXXXX";
    public static final String AGORA_MEETING_ID = "Sample Meeting";

    // Symbl configuration, set in code or optionally in "strings.xml"
    public static final String symbl_meeting_UserId = "user@mail.com";
    public static final String symbl_meeting_user_Name = "userName";

    private FrameLayout localVideoContainer;
    private FrameLayout remoteVideoContainer;
    private RtcEngine mRtcEngine;
    private SurfaceView mRemoteView;
    private TextView infoTextView;
    private Button wmButton;
    private EditText et_watermark;

    private static final String[] REQUESTED_PERMISSIONS = {
            Manifest.permission.RECORD_AUDIO,
            Manifest.permission.CAMERA
    };

    @RequiresApi(api = Build.VERSION_CODES.M)
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        Log.d(TAG, "onCreate");
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        initUI();
        checkPermission();
        infoTextView.setText("Symbl Sample Conversation Application");
        wmButton.setEnabled(true);
    }

    @Override
    protected void onDestroy() {
        Log.d(TAG, "onDestroy");
        super.onDestroy();
        mRtcEngine.leaveChannel();
        RtcEngine.destroy();
    }

    @RequiresApi(api = Build.VERSION_CODES.M)
    private void checkPermission() {
        Log.d(TAG, "checkPermission");
        if (checkSelfPermission(REQUESTED_PERMISSIONS[0], PERMISSION_REQ_ID) &&
                checkSelfPermission(REQUESTED_PERMISSIONS[1], PERMISSION_REQ_ID)) {
            initAgoraEngine();
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (requestCode == PERMISSION_REQ_ID && grantResults.length > 0 &&
                grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            initAgoraEngine();
        }
    }

    private void initUI() {
        localVideoContainer = findViewById(R.id.view_container);
        remoteVideoContainer = findViewById(R.id.remote_video_view_container);
        infoTextView = findViewById(R.id.infoTextView);
        et_watermark = findViewById(R.id.et_watermark);
        wmButton = findViewById(R.id.enable_button);
        wmButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                InputMethodManager imm = (InputMethodManager) getSystemService(Context.INPUT_METHOD_SERVICE);
                imm.hideSoftInputFromWindow(et_watermark.getWindowToken(), 0);
                if (wmButton.isSelected()) {
                    wmButton.setSelected(false);
                    wmButton.setText(R.string.string_enable);
                    disableEffect();
                } else {
                    wmButton.setSelected(true);
                    wmButton.setText(R.string.string_disable);
                    enableEffect();
                }
            }
        });
    }

    private boolean checkSelfPermission(String permission, int requestCode) {
        if (ContextCompat.checkSelfPermission(this, permission) !=
                PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(this, REQUESTED_PERMISSIONS, requestCode);
            return false;
        }
        return true;
    }

    private void initAgoraEngine() {
        try {
            RtcEngineConfig config = new RtcEngineConfig();
            config.mContext = this;
            config.mAppId = AGORA_CUSTOMER_APP_ID;
            config.addExtension(ExtensionManager.EXTENSION_NAME);
            config.mExtensionObserver = this;
            config.mEventHandler = new IRtcEngineEventHandler() {
                @Override
                public void onJoinChannelSuccess(String s, int i, int i1) {
                    Log.d(TAG, "on Join Channel Success");
                    mRtcEngine.startPreview();
                    JSONObject pluginParams = new JSONObject();
                    try {
                        setSymblPluginConfigs(pluginParams, AGORA_CUSTOMER_CHANNEL_NAME);
                        mRtcEngine.setExtensionProperty(ExtensionManager.EXTENSION_VENDOR_NAME, ExtensionManager.EXTENSION_FILTER_NAME, "init", pluginParams.toString());
                    } catch (JSONException e) {
                        Log.d(TAG, "ERROR while joining channel " + e.getMessage());
                    }
                }

                @Override
                public void onFirstRemoteVideoDecoded(final int i, int i1, int i2, int i3) {
                    super.onFirstRemoteVideoDecoded(i, i1, i2, i3);
                    runOnUiThread(new Runnable() {
                        @Override
                        public void run() {
                            setupRemoteVideo(i);
                        }
                    });
                }

                @Override
                public void onUserJoined(int i, int i1) {
                    super.onUserJoined(i, i1);
                }

                @Override
                public void onUserOffline(final int i, int i1) {
                    super.onUserOffline(i, i1);
                    runOnUiThread(new Runnable() {
                        @Override
                        public void run() {
                            onRemoteUserLeft();
                        }
                    });
                }
            };
            mRtcEngine = RtcEngine.create(config);
            mRtcEngine.enableExtension(ExtensionManager.EXTENSION_VENDOR_NAME, ExtensionManager.EXTENSION_FILTER_NAME, true);
            setupLocalVideo();
            VideoEncoderConfiguration configuration = new VideoEncoderConfiguration(640, 360,
                    VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_24,
                    VideoEncoderConfiguration.STANDARD_BITRATE,
                    VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_FIXED_PORTRAIT);
            mRtcEngine.setVideoEncoderConfiguration(configuration);
            mRtcEngine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);
            mRtcEngine.enableLocalAudio(true);
            mRtcEngine.setEnableSpeakerphone(true);
            mRtcEngine.setAudioProfile(1);
            mRtcEngine.enableAudio();
            mRtcEngine.addHandler(new IRtcEngineEventHandler() {
                @Override
                public void onUserInfoUpdated(int uid, UserInfo userInfo) {
                    super.onUserInfoUpdated(uid, userInfo);
                }

                @Override
                public void onUserJoined(int uid, int elapsed) {
                    super.onUserJoined(uid, elapsed);
                }

                @Override
                public void onUserOffline(int uid, int reason) {
                    super.onUserOffline(uid, reason);
                }

                @Override
                public void onConnectionStateChanged(int state, int reason) {
                    super.onConnectionStateChanged(state, reason);
                }

                @Override
                public void onConnectionInterrupted() {
                    super.onConnectionInterrupted();
                }

                @Override
                public void onConnectionLost() {
                    super.onConnectionLost();
                }

                @Override
                public void onApiCallExecuted(int error, String api, String result) {
                    super.onApiCallExecuted(error, api, result);
                }

                @Override
                public void onTokenPrivilegeWillExpire(String token) {
                    super.onTokenPrivilegeWillExpire(token);
                }

                @Override
                public void onRequestToken() {
                    super.onRequestToken();
                }

                @Override
                public void onActiveSpeaker(int uid) {
                    super.onActiveSpeaker(uid);
                }

                @Override
                public void onFirstLocalAudioFramePublished(int elapsed) {
                    super.onFirstLocalAudioFramePublished(elapsed);
                }

                @Override
                public void onAudioPublishStateChanged(String channel, STREAM_PUBLISH_STATE oldState, STREAM_PUBLISH_STATE newState, int elapseSinceLastState) {
                    super.onAudioPublishStateChanged(channel, oldState, newState, elapseSinceLastState);
                }

                @Override
                public void onAudioSubscribeStateChanged(String channel, int uid, STREAM_SUBSCRIBE_STATE oldState, STREAM_SUBSCRIBE_STATE newState, int elapseSinceLastState) {
                    super.onAudioSubscribeStateChanged(channel, uid, oldState, newState, elapseSinceLastState);
                }

                @Override
                public void onAudioRouteChanged(int routing) {
                    super.onAudioRouteChanged(routing);
                }

                @Override
                public void onAudioQuality(int uid, int quality, short delay, short lost) {
                    super.onAudioQuality(uid, quality, delay, lost);
                }

                @Override
                public void onRtcStats(RtcStats stats) {
                    super.onRtcStats(stats);
                }

                @Override
                public void onLocalAudioStats(LocalAudioStats stats) {
                    super.onLocalAudioStats(stats);
                }

                @Override
                public void onAudioMixingFinished() {
                    super.onAudioMixingFinished();
                }

                @Override
                public void onLocalAudioStateChanged(LOCAL_AUDIO_STREAM_STATE state, LOCAL_AUDIO_STREAM_ERROR error) {
                    super.onLocalAudioStateChanged(state, error);
                }

                @Override
                public void onEncryptionError(ENCRYPTION_ERROR_TYPE errorType) {
                    super.onEncryptionError(errorType);
                }

                @Override
                public void onPermissionError(PERMISSION permission) {
                    super.onPermissionError(permission);
                }
            });
            mRtcEngine.joinChannel(TOKEN_VALUE, AGORA_CUSTOMER_CHANNEL_NAME, AGORA_MEETING_ID, 0);
            mRtcEngine.startPreview();
        } catch (Exception e) {
            e.printStackTrace();
            Log.e(TAG, " ERROR:: RTC engine startup error " + e.getMessage());
        }
    }

    private void setupLocalVideo() {
        SurfaceView view = RtcEngine.CreateRendererView(this);
        view.setZOrderMediaOverlay(true);
        localVideoContainer.addView(view);
        mRtcEngine.setupLocalVideo(new VideoCanvas(view, VideoCanvas.RENDER_MODE_HIDDEN, 0));
        mRtcEngine.setLocalRenderMode(Constants.RENDER_MODE_HIDDEN);
    }

    private void setupRemoteVideo(int uid) {
        // Only one remote video view is available for this sample application.
        int count = remoteVideoContainer.getChildCount();
        View view = null;
        for (int i = 0; i < count; i++) {
            View v = remoteVideoContainer.getChildAt(i);
            if (v.getTag() instanceof Integer && ((int) v.getTag()) == uid) {
                view = v;
            }
        }

        if (view != null) {
            return;
        }

        Log.d(TAG, " setupRemoteVideo uid = " + uid);
        mRemoteView = RtcEngine.CreateRendererView(getBaseContext());
        remoteVideoContainer.addView(mRemoteView);
        mRtcEngine.setupRemoteVideo(new VideoCanvas(mRemoteView, VideoCanvas.RENDER_MODE_HIDDEN, uid));
        mRtcEngine.setRemoteRenderMode(uid, Constants.RENDER_MODE_HIDDEN);
        mRemoteView.setTag(uid);
    }

    private void onRemoteUserLeft() {
        removeRemoteVideo();
    }

    private void removeRemoteVideo() {
        if (mRemoteView != null) {
            remoteVideoContainer.removeView(mRemoteView);
        }
        mRemoteView = null;
    }

    private void enableEffect() {
        JSONObject pluginParams = new JSONObject();
        try {
            setSymblPluginConfigs(pluginParams, AGORA_CUSTOMER_CHANNEL_NAME);
            mRtcEngine.setExtensionProperty(ExtensionManager.EXTENSION_VENDOR_NAME, ExtensionManager.EXTENSION_FILTER_NAME, "start", pluginParams.toString());
        } catch (JSONException e) {
            e.printStackTrace();
        }
    }

    /**
     * Sets the Symbl configuration.
     *
     * @param pluginParams
     * @param channelName
     * @throws JSONException
     */
    private void setSymblPluginConfigs(JSONObject pluginParams, String channelName) throws JSONException {
        try {
            String symbl_unique_meetingId = UUID.randomUUID().toString();
            pluginParams.put("secret", getString(R.string.symbl_app_secret));
            pluginParams.put("appKey", getString(R.string.symbl_app_id));

            pluginParams.put("meetingId", symbl_unique_meetingId);

            pluginParams.put("userId", symbl_meeting_UserId);
            pluginParams.put("name", symbl_meeting_user_Name);
            pluginParams.put("languageCode", "en-US");

            // Symbl main extension config object
            SymblPluginConfig symplParams = new SymblPluginConfig();

            // Set the Symbl API configuration
            ApiConfig apiConfig = new ApiConfig();
            apiConfig.setAppId(getString(R.string.symbl_app_id));
            apiConfig.setAppSecret(getString(R.string.symbl_app_secret));
            apiConfig.setTokenApi(getString(R.string.symbl_token_api));
            apiConfig.setSymblPlatformUrl(getString(R.string.symbl_platform_url));
            symplParams.setApiConfig(apiConfig);

            // Set the Symbl confidence threshold and language code
            RealtimeStartRequest realtimeFlowInitRequest = new RealtimeStartRequest();
            RealtimeAPIConfig realtimeAPIConfig = new RealtimeAPIConfig();
            realtimeAPIConfig.setConfidenceThreshold(Double.parseDouble(getString(R.string.symbl_confidence_threshold)));
            realtimeAPIConfig.setLanguageCode(getString(R.string.symbl_meeting_language_code));

            // Set the speaker information
            Speaker speaker = new Speaker();
            speaker.setName(symbl_meeting_user_Name);
            speaker.setUserId(symbl_meeting_UserId);
            realtimeFlowInitRequest.setSpeaker(speaker);

            // Set the meeting audio encoding and sample rate (hertz)
            SpeechRecognition speechRecognition = new SpeechRecognition();
            speechRecognition.setEncoding(getString(R.string.symbl_meeting_encoding));
            speechRecognition.setSampleRateHertz(Double.parseDouble(getString(R.string.symbl_meeting_sampleRateHertz)));
            realtimeAPIConfig.setSpeechRecognition(speechRecognition);

            // Set the redaction content values
            Redaction redaction = new Redaction();
            redaction.setIdentifyContent(true);
            redaction.setRedactContent(true);
            redaction.setRedactionString("*****");
            realtimeAPIConfig.setRedaction(redaction);

            // Set the Tracker (custom business intent) information
            realtimeFlowInitRequest.setConfig(realtimeAPIConfig);
            Tracker tracker1 = new Tracker();
            tracker1.setName("Budget");
            List<String> vocabulary = new ArrayList<>();
            vocabulary.add("budgeted");
            vocabulary.add("budgeted decision");
            tracker1.setVocabulary(vocabulary);
            List<Tracker> trackerList = new ArrayList<>();
            trackerList.add(tracker1);

            // Set the Symbl conversation parameters
            realtimeFlowInitRequest.setTrackers(trackerList);
            realtimeFlowInitRequest.setType("start_request");
            realtimeFlowInitRequest.setId(symbl_unique_meetingId);
            realtimeFlowInitRequest.setSentiments(true);
            realtimeFlowInitRequest.setInsightTypes(Arrays.asList("action_item", "question", "follow_up"));
            symplParams.setRealtimeStartRequest(realtimeFlowInitRequest);
            Gson gson = new Gson();
            pluginParams.put("inputRequest", gson.toJson(symplParams));
        } catch (Exception ex) {
            Log.e(TAG, "ERROR while setting Symbl extension configuration", ex);
        }
    }

    private void disableEffect() {
        JSONObject o = new JSONObject();
        mRtcEngine.setExtensionProperty(ExtensionManager.EXTENSION_VENDOR_NAME, ExtensionManager.EXTENSION_FILTER_NAME, "stop", o.toString());
    }

    @Override
    public void onEvent(String vendor, String extension, String key, String value) {
        Log.i(TAG, "Symbl conversation event \n \n " + vendor + " extension: " + extension + " key: " + key + " value: " + value);
        final StringBuilder sb = new StringBuilder();
        sb.append(value);
        if ("result".equals(key)) {
            try {
                Gson json = new Gson();
                SymblResponse symblResponse = json.fromJson(value, SymblResponse.class);
                if (symblResponse.getEvent() != null && symblResponse.getEvent().length() > 0) {
                    // This is the conversation response from the Symbl platform
                    switch (symblResponse.getEvent()) {
                        case SymblAIFilterManager.SYMBL_START_PLUGIN_REQUEST:
                            break;
                        case SymblAIFilterManager.SYMBL_ON_MESSAGE:
                            try {
                                if (symblResponse.getErrorMessage() != null && symblResponse.getErrorMessage().length() > 0) {
                                    // Handle message-level errors here
                                } else {
                                    // Handle transcription and conversation messages here
                                }
                            } catch (Exception ex) {
                                Log.e(TAG, "ERROR while transforming Symbl message " + ex.getMessage());
                            }
                            break;
                        case SymblAIFilterManager.SYMBL_CONNECTION_ERROR:
                            if (symblResponse != null)
                                Log.i(TAG, "SYMBL_CONNECTION_ERROR error code " + symblResponse.getErrorCode() + ", error message " + symblResponse.getErrorMessage());
                            break;
                        case SymblAIFilterManager.SYMBL_WEBSOCKETS_CLOSED:
                            if (symblResponse != null)
                                Log.i(TAG, "SYMBL_WEBSOCKETS_CLOSED " + symblResponse.getErrorCode());
                            break;
                        case SymblAIFilterManager.SYMBL_TOKEN_EXPIRED:
                            break;
                        case SymblAIFilterManager.SYMBL_STOP_REQUEST:
                            break;
                        case SymblAIFilterManager.SYMBL_ON_CLOSE:
                            break;
                        case SymblAIFilterManager.SYMBL_SEND_REQUEST:
                            break;
                        case SymblAIFilterManager.SYMBL_ON_ERROR:
                            break;
                    }
                } else { // All error cases are handled here
                    if (symblResponse != null) {
                        sb.append("\n Symbl event :" + symblResponse.getEvent());
                        sb.append("\n Error Message :" + symblResponse.getErrorMessage());
                        sb.append("\n Error code :" + symblResponse.getErrorCode());
                    }
                }

            } catch (Exception exception) {
                Log.e(TAG, "result parse error ", exception);
            }
        }
        this.runOnUiThread(new Runnable() {
            @Override
            public void run() {
                infoTextView.setText(sb.toString());
            }
        });
    }
}

Run the demo

The following sample project provides an Android mobile app that uses the Agora Video SDK and the Symbl.ai extension, and you can use it as a reference. Follow the instructions in the README file to set up, configure, and run the sample app on your own device. See Sample Android App Project.

Reference

Find comprehensive information about the Symbl REST APIs in the API Reference section.
