How to Adjust the Volume or Mute the Audio During a Video Call on Android Using the Agora SDK

Agora offers many features that improve the quality and convenience of video calls, including muting a call and adjusting its volume, two features that every video calling app needs.

This tutorial shows you how to adjust the volume or mute the audio during a video call on Android using the Agora SDK.


Prerequisites

  • Familiarity with building a live-streaming Android app with Agora.
  • Basic knowledge of Android development.
  • Android Studio.
  • An Android device.


Add the Dependencies in Gradle

Before writing any code, add the following dependencies to the app module's build.gradle file so that Gradle downloads the Agora SDK and the permission library. The versions below were current at the time of writing; be sure to use the latest available version of the Agora SDK:

implementation 'com.yanzhenjie:permission:2.0.3'
implementation 'io.agora.rtc:full-sdk:3.5.0'


Add Permissions in the AndroidManifest.xml File

Add the following permissions to the AndroidManifest.xml file. Note that CAMERA, RECORD_AUDIO, and WRITE_EXTERNAL_STORAGE are dangerous permissions, so on Android 6.0 and above they must also be requested at runtime, as the full example later in this tutorial does:

<uses-permission android:name="android.permission.CAMERA" />
 <uses-permission android:name="android.permission.INTERNET" />
 <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
 <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
 <uses-permission android:name="android.permission.RECORD_AUDIO" />

Create an RtcEngine Instance

First, we create an RtcEngine instance by passing the App ID and an IRtcEngineEventHandler to the RtcEngine.create() method. IRtcEngineEventHandler is an abstract class that provides default implementations of the SDK's event callbacks. We will use this engine instance later to call adjustPlaybackSignalVolume(), the method that adjusts the playback volume of the video call.

try {
    mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
} catch (Exception e) {
    Log.e(LOG_TAG, Log.getStackTraceString(e));
    throw new RuntimeException("fatal error\n" + Log.getStackTraceString(e));
}
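
Because IRtcEngineEventHandler only declares empty default implementations, we override just the callbacks we need. As a quick illustration (not part of the full example that follows later), a minimal handler could override nothing more than onJoinChannelSuccess to log the join:

private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
    // Every callback we don't override falls back to the empty default implementation.
    @Override
    public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
        Log.i(LOG_TAG, "Joined channel " + channel + " as uid " + uid);
    }
};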

Set the Video Call Mode

Next, we add the following code to set the channel profile, video orientation, and video encoder configuration. We set the channel profile to CHANNEL_PROFILE_COMMUNICATION, the mode in which both participants in a one-to-one video call can send and receive audio and video:

mRtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_COMMUNICATION);
mRtcEngine.enableVideo();
mRtcEngine.setVideoEncoderConfiguration(new VideoEncoderConfiguration(
        VideoEncoderConfiguration.VD_640x480,
        VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_30,
        VideoEncoderConfiguration.STANDARD_BITRATE,
        VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_FIXED_PORTRAIT));
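
The volume controls below only matter once the user has actually joined a channel. As a minimal sketch (the channel name and optional info are placeholders, and passing null as the token assumes an App ID created without an App Certificate), the join call used in the full example later looks like this:

// Join a placeholder channel; a uid of 0 lets Agora assign the user ID.
mRtcEngine.joinChannel(null, "test-channel", "Extra Optional Data", 0);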

Adjust the Volume with a SeekBar During the Video Call

A SeekBar is a progress bar that the user can drag left or right with a finger. We use it to adjust the volume of the video call or of an audio file that is being played.

First, we get a SeekBar object and call setOnSeekBarChangeListener() on it:

SeekBar seekBar = (SeekBar) findViewById(R.id.seekBar);
seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {


Then we pass the SeekBar's progress value to adjustPlaybackSignalVolume(), which adjusts the playback volume of the video call. The volume value used here ranges from 0 to 100, and the default is 100 (the original volume).

When the user drags the SeekBar, the onProgressChanged() callback reports the change:

@Override
public void onProgressChanged(SeekBar seekBar, int progress,
                              boolean fromUser) {
    mRtcEngine.adjustPlaybackSignalVolume(progress);
}


Next, we implement the remaining methods required by the SeekBar.OnSeekBarChangeListener interface.

onStartTrackingTouch() notifies us that the user has started touching the SeekBar. We don't need any extra behavior here, so we leave it empty:

@Override
public void onStartTrackingTouch(SeekBar seekBar) {
}


Likewise, onStopTrackingTouch() notifies us that the user has stopped touching the SeekBar, and it is also left empty:

@Override
public void onStopTrackingTouch(SeekBar seekBar) {
}
});
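
One small detail worth handling is the SeekBar's initial state. Assuming the SeekBar in your layout uses a maximum of 100 (this initialization is not part of the original tutorial), you can start the slider at 100 so the UI matches the SDK's default playback volume before the user touches it:

// Assumption: R.id.seekBar uses android:max="100".
seekBar.setMax(100);       // make the slider range match the 0-100 volume range
seekBar.setProgress(100);  // start at the default (original) playback volume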

Mute the Video Call

Before or after joining the video call, we can mute it by calling adjustPlaybackSignalVolume() and adjustAudioMixingVolume() with a volume of 0:

public void muteAudio(View view) {
    mRtcEngine.adjustAudioMixingVolume(0);
    mRtcEngine.adjustPlaybackSignalVolume(0);
}
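
To undo the mute, set both volumes back to 100. The restoreAudio() helper below is not part of the original tutorial; it is just a sketch of the reverse operation:

// Hypothetical companion to muteAudio(): restore the original volumes.
public void restoreAudio(View view) {
    mRtcEngine.adjustAudioMixingVolume(100);    // 100 = original mixing volume
    mRtcEngine.adjustPlaybackSignalVolume(100); // 100 = original playback volume
}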

Integrating Mute and Volume Control with the Agora Video Call SDK

By now you know how to mute a call or adjust its volume with the Agora SDK methods. The Mute class below puts everything together so you can integrate mute and volume control into your video streaming app:

public class Mute extends AppCompatActivity{
    private RtcEngine mRtcEngine;
    int volume=0;
    private IAudioEffectManager audioEffectManager;
    SeekBar seekBar;

    // Permissions
    private static final int PERMISSION_REQ_ID = 22;
    private static final String[] REQUESTED_PERMISSIONS = {Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA};

    private static final String LOG_TAG = Mute.class.getSimpleName();

    // Handle SDK Events
    private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
        @Override
        public void onUserJoined(final int uid, int elapsed) {
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    // set first remote user to the main bg video container
                    setupRemoteVideoStream(uid);
                }
            });
        }

        // remote user has left channel
        @Override
        public void onUserOffline(int uid, int reason) {
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    onRemoteUserLeft();
                }
            });
        }

        // remote user has toggled their video
        @Override
        public void onRemoteVideoStateChanged(final int uid, final int state, int reason, int elapsed) {
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    onRemoteUserVideoToggle(uid, state);
                }
            });
        }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_mute);


        if (checkSelfPermission(REQUESTED_PERMISSIONS[0], PERMISSION_REQ_ID) &&
                checkSelfPermission(REQUESTED_PERMISSIONS[1], PERMISSION_REQ_ID)) {
            initAgoraEngine();
        }

        seekBar = (SeekBar) findViewById(R.id.seekBar);
        seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
            @Override
            public void onProgressChanged(SeekBar seekBar, int progress,
                                          boolean fromUser) {
                // The SeekBar progress (0-100) is passed directly as the playback volume.
                mRtcEngine.adjustPlaybackSignalVolume(progress);
            }

            @Override
            public void onStartTrackingTouch(SeekBar seekBar) {

            }

            @Override
            public void onStopTrackingTouch(SeekBar seekBar) {

            }
        });

    }

    private void initAgoraEngine() {
        try {
            mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
        } catch (Exception e) {
            Log.e(LOG_TAG, Log.getStackTraceString(e));

            throw new RuntimeException("fatal error\n" + Log.getStackTraceString(e));
        }
        setupSession();
    }

    private void setupSession() {
        mRtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_COMMUNICATION);

        mRtcEngine.enableVideo();

        mRtcEngine.setVideoEncoderConfiguration(new VideoEncoderConfiguration(VideoEncoderConfiguration.VD_640x480, VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_30,
                VideoEncoderConfiguration.STANDARD_BITRATE,
                VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_FIXED_PORTRAIT));
    }

    private void setupLocalVideoFeed() {

        // setup the container for the local user
        FrameLayout videoContainer = findViewById(R.id.floating_video_container);
        SurfaceView videoSurface = RtcEngine.CreateRendererView(getBaseContext());
        videoSurface.setZOrderMediaOverlay(true);
        videoContainer.addView(videoSurface);
        mRtcEngine.setupLocalVideo(new VideoCanvas(videoSurface, VideoCanvas.RENDER_MODE_FIT, 0));
    }

    private void setupRemoteVideoStream(int uid) {
        // setup ui element for the remote stream
        FrameLayout videoContainer = findViewById(R.id.bg_video_container);
        // ignore any new streams that join the session
        if (videoContainer.getChildCount() >= 1) {
            return;
        }

        SurfaceView videoSurface = RtcEngine.CreateRendererView(getBaseContext());
        videoContainer.addView(videoSurface);
        mRtcEngine.setupRemoteVideo(new VideoCanvas(videoSurface, VideoCanvas.RENDER_MODE_FIT, uid));
        mRtcEngine.setRemoteSubscribeFallbackOption(Constants.STREAM_FALLBACK_OPTION_AUDIO_ONLY);

    }



    // join the channel when user clicks UI button
    public void onjoinChannelClicked(View view) {
        mRtcEngine.joinChannel(null, "test-channel", "Extra Optional Data", 0); // if you do not specify the uid, Agora will assign one.
        setupLocalVideoFeed();
        findViewById(R.id.joinBtn).setVisibility(View.GONE); // set the join button hidden

    }



    private void leaveChannel() {
        mRtcEngine.leaveChannel();
    }

    private void removeVideo(int containerID) {
        FrameLayout videoContainer = findViewById(containerID);
        videoContainer.removeAllViews();
    }

    private void onRemoteUserVideoToggle(int uid, int state) {
        FrameLayout videoContainer = findViewById(R.id.bg_video_container);

        SurfaceView videoSurface = (SurfaceView) videoContainer.getChildAt(0);
        videoSurface.setVisibility(state == 0 ? View.GONE : View.VISIBLE);

        // add an icon to let the other user know remote video has been disabled
        if(state == 0){
            ImageView noCamera = new ImageView(this);
            noCamera.setImageResource(R.drawable.video_disabled);
            videoContainer.addView(noCamera);
        } else {
            ImageView noCamera = (ImageView) videoContainer.getChildAt(1);
            if(noCamera != null) {
                videoContainer.removeView(noCamera);
            }
        }
    }

    private void onRemoteUserLeft() {
        removeVideo(R.id.bg_video_container);
    }



    public boolean checkSelfPermission(String permission, int requestCode) {
        Log.i(LOG_TAG, "checkSelfPermission " + permission + " " + requestCode);
        if (ContextCompat.checkSelfPermission(this,
                permission)
                != PackageManager.PERMISSION_GRANTED) {

            ActivityCompat.requestPermissions(this,
                    REQUESTED_PERMISSIONS,
                    requestCode);
            return false;
        }
        return true;
    }


    @Override
    public void onRequestPermissionsResult(int requestCode,
                                           @NonNull String permissions[], @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        Log.i(LOG_TAG, "onRequestPermissionsResult " + grantResults[0] + " " + requestCode);

        switch (requestCode) {
            case PERMISSION_REQ_ID: {
                if (grantResults[0] != PackageManager.PERMISSION_GRANTED || grantResults[1] != PackageManager.PERMISSION_GRANTED) {
                    Log.i(LOG_TAG, "Need permissions " + Manifest.permission.RECORD_AUDIO + "/" + Manifest.permission.CAMERA);
                    break;
                }

                initAgoraEngine();
                break;
            }
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();

        leaveChannel();
        RtcEngine.destroy();
        mRtcEngine = null;
    }


    public void muteAudio(View view) {
        mRtcEngine.adjustAudioMixingVolume(0);
        mRtcEngine.adjustPlaybackSignalVolume(0);

    }

    private void preloadAudioEffect(){
        // Gets the global audio effect manager.
        audioEffectManager = mRtcEngine.getAudioEffectManager();
        int id = 0;
        audioEffectManager.preloadEffect(id++, Environment.getExternalStorageDirectory().getPath()+"/Song/Caiiro.mp3");
        audioEffectManager.playEffect(
                0,  // The sound ID of the audio effect file to be played.
                Environment.getExternalStorageDirectory().getPath()+"/Song/Caiiro.mp3",  // The file path of the audio effect file.
                -1,   // The number of playback loops. -1 means an infinite loop.
                1,    // The pitch of the audio effect, from 0.5 to 2. The default is 1 (no change); lower values lower the pitch.
                0.0,  // Sets the spatial position of the effect. 0 means the effect shows ahead.
                volume,  // Sets the volume. The value ranges between 0 and 100. 100 is the original volume.
                true, // Sets whether to publish the audio effect.
                0 // Start position
        );
        // Pauses all audio effects.
        audioEffectManager.pauseAllEffects();
    }

}

If you don't know how to build a one-to-one video call app with the Agora SDK, check out the tutorial written by Hermes on GitHub. The code above follows the same approach as that tutorial.


Recap

In this tutorial we covered:

  • How to track the user's touch with a SeekBar
  • How to adjust the volume of an Agora video call
  • How to mute the audio during a video call


Conclusion

Great work! You now know how to use the Agora SDK to adjust the volume of a video call and to mute it. Thanks for reading this far!

You can learn more about adjusting volume here, and you can explore more Agora features on GitHub here.

If you want to copy or reference the SDK I used, you can find it here on GitHub.


Other Resources

If you run into any problems while following along, check the official Agora documentation.


About the author: Boemo Wame Mmopelwa is a software developer who enjoys exploring innovative approaches. He likes digging into complex ideas and then explaining them in a simple, fun, and easy-to-understand way.
Original article: https://www.agora.io/en/blog/how-to-mute-audio-and-adjust-volume-during-a-video-call-in-android-using-the-agora-sdk/