From aed345a08405c807a9358f263a1d76d69018fa1b Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Mon, 17 Jul 2023 18:14:18 +0200 Subject: [PATCH 001/184] Get started for ReactJS --- shared/common/no-uikit.mdx | 8 +- shared/common/prerequities.mdx | 5 + shared/common/project-setup/android.mdx | 49 +++++++++ shared/common/project-setup/electron.mdx | 22 ++++ shared/common/project-setup/flutter.mdx | 47 ++++++++ shared/common/project-setup/index.mdx | 21 ++++ shared/common/project-setup/ios.mdx | 7 ++ shared/common/project-setup/macos.mdx | 8 ++ shared/common/project-setup/react-js.mdx | 20 ++++ shared/common/project-setup/react-native.mdx | 61 +++++++++++ shared/common/project-setup/swift.mdx | 56 ++++++++++ shared/common/project-setup/unity.mdx | 21 ++++ shared/common/project-setup/web.mdx | 37 +++++++ shared/common/project-setup/windows.mdx | 59 ++++++++++ shared/variables/platform.js | 6 +- shared/video-sdk/_get-started-sdk.mdx | 2 +- shared/video-sdk/_get-started-uikit.mdx | 8 +- .../project-implementation/index.mdx | 2 + .../project-implementation/react-js.mdx | 103 ++++++++++++++++++ .../project-setup/react-js.mdx | 5 + .../get-started-sdk/project-test/index.mdx | 2 + .../get-started-sdk/project-test/react-js.mdx | 14 +++ .../get-started-sdk/reference/react-js.mdx | 5 + 23 files changed, 554 insertions(+), 14 deletions(-) create mode 100644 shared/common/project-setup/android.mdx create mode 100644 shared/common/project-setup/electron.mdx create mode 100644 shared/common/project-setup/flutter.mdx create mode 100644 shared/common/project-setup/index.mdx create mode 100644 shared/common/project-setup/ios.mdx create mode 100644 shared/common/project-setup/macos.mdx create mode 100644 shared/common/project-setup/react-js.mdx create mode 100644 shared/common/project-setup/react-native.mdx create mode 100644 shared/common/project-setup/swift.mdx create mode 100644 shared/common/project-setup/unity.mdx create mode 100644 shared/common/project-setup/web.mdx create 
mode 100644 shared/common/project-setup/windows.mdx create mode 100644 shared/video-sdk/get-started/get-started-sdk/project-implementation/react-js.mdx create mode 100644 shared/video-sdk/get-started/get-started-sdk/project-setup/react-js.mdx create mode 100644 shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx create mode 100644 shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx diff --git a/shared/common/no-uikit.mdx b/shared/common/no-uikit.mdx index 24a00e19b..956c8a440 100644 --- a/shared/common/no-uikit.mdx +++ b/shared/common/no-uikit.mdx @@ -1,12 +1,6 @@ import * as data from '@site/data/variables'; - -**Currently, there is no for this platform.** - - -**Currently, there is no for this platform.** - - + **Currently, there is no for this platform.** diff --git a/shared/common/prerequities.mdx b/shared/common/prerequities.mdx index 03570b8f3..bf681ea76 100644 --- a/shared/common/prerequities.mdx +++ b/shared/common/prerequities.mdx @@ -37,6 +37,11 @@ - If you are developing a desktop application for Windows, macOS or Linux, make sure your development device meets the [Flutter desktop development requirements](https://docs.flutter.dev/development/platform-integration/desktop). + + +- A [supported browser](../reference/supported-platforms#browsers). +- Physical media input devices, such as a camera and a microphone. +- A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). - React Native 0.60 or later. For more information, see [Setting up the development environment](https://reactnative.dev/docs/environment-setup). diff --git a/shared/common/project-setup/android.mdx b/shared/common/project-setup/android.mdx new file mode 100644 index 000000000..d856c9aa3 --- /dev/null +++ b/shared/common/project-setup/android.mdx @@ -0,0 +1,49 @@ + + + 1. 
In Android Studio, create a new **Phone and Tablet**, **Java** [Android project](https://developer.android.com/studio/projects/create-project) with an **Empty Activity**.

      After creating the project, Android Studio automatically starts a Gradle sync. Ensure that the sync succeeds before you continue.

  2. Integrate the into your Android project:

      These steps are for a package install. If you prefer to install manually, follow the [installation instructions](../reference/downloads#manual-installation).

      1. In `/Gradle Scripts/build.gradle (Module: .app)`, add the following line under `dependencies`:

          ```groovy
          dependencies {
              ...
              implementation 'io.agora.rtc:<artifact-id>:<version>'
              ...
          }
          ```

      2. Replace `<artifact-id>` and `<version>` with appropriate values for the latest release. For example, `io.agora.rtc:full-sdk:4.0.1`.

          You can obtain the latest `<artifact-id>` and `<version>` information using [Maven Central Repository Search](https://search.maven.org/search?q=io.agora.rtc).

      3. Add permissions for network and device access.

          In `/app/Manifests/AndroidManifest.xml`, add the following permissions after `</application>`:

          ```xml
          <uses-permission android:name="android.permission.READ_PHONE_STATE" />
          <uses-permission android:name="android.permission.INTERNET" />
          <uses-permission android:name="android.permission.RECORD_AUDIO" />
          <uses-permission android:name="android.permission.CAMERA" />
          <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
          <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
          <uses-permission android:name="android.permission.BLUETOOTH" />
          <uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
          <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
          <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
          <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
          ```

      4. To prevent obfuscating the code in , add the following line to `/Gradle Scripts/proguard-rules.pro`:

          ```java
          -keep class io.agora.**{*;}
          ```

diff --git a/shared/common/project-setup/electron.mdx b/shared/common/project-setup/electron.mdx
new file mode 100644
index 000000000..23cb0aff0
--- /dev/null
+++ b/shared/common/project-setup/electron.mdx
@@ -0,0 +1,22 @@

  1. Take the following steps to set up a new Electron project:

      1. Open a terminal window and navigate to the directory where you want to create the project.

      2. Execute the following command in the terminal:

          ```bash
          git clone https://github.com/electron/electron-quick-start
          ```
          This command clones the Electron quick-start project that you use to implement .

  2. 
Install the 

      Open a terminal window in your project folder and execute the following command to download and install the .

      ```bash
      npm i agora-electron-sdk
      ```
      Make sure the path to your project folder does not contain any spaces; spaces can cause errors during the installation.

diff --git a/shared/common/project-setup/flutter.mdx b/shared/common/project-setup/flutter.mdx
new file mode 100644
index 000000000..d91a3b496
--- /dev/null
+++ b/shared/common/project-setup/flutter.mdx
@@ -0,0 +1,47 @@

1. **Set up a Flutter environment for a project**

    In the terminal, run the following command:

    ```bash
    flutter doctor
    ```
    Flutter checks your development device and helps you [set up](https://docs.flutter.dev/get-started/editor) your local development environment. Make sure that your system passes all the checks.

1. **Create a new Flutter **

    In the IDE of your choice, create a [Flutter Application project](https://docs.flutter.dev/development/tools/android-studio#creating-a-new-project).

    You can also create a new project from the terminal using the command:

    ```bash
    flutter create <--insert project name-->
    ```

1. **Add to your project**

    Add the following lines to `pubspec.yaml` under `dependencies`:

    ```yaml
    dependencies:
      ...
      # For x.y.z, fill in a specific SDK version number. For example, 6.0.0
      agora_rtc_engine: ^x.y.z
      permission_handler: ^9.2.0
      ...
    ```

    You can get the latest version number from [pub.dev](https://pub.dev/packages/agora_rtc_engine).

1. 
**Use the Flutter framework to download dependencies to your ** + + Open a terminal window and execute the following command in the project folder: + + ```bash + flutter pub get + ``` + + + + \ No newline at end of file diff --git a/shared/common/project-setup/index.mdx b/shared/common/project-setup/index.mdx new file mode 100644 index 000000000..2d8fac272 --- /dev/null +++ b/shared/common/project-setup/index.mdx @@ -0,0 +1,21 @@ +import Android from './android.mdx'; +import Ios from './ios.mdx'; +import MacOS from './macos.mdx'; +import Web from './web.mdx'; +import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; +import Electron from './electron.mdx'; +import Flutter from './flutter.mdx'; +import Unity from './unity.mdx'; +import Windows from './windows.mdx'; + + + + + + + + + + + diff --git a/shared/common/project-setup/ios.mdx b/shared/common/project-setup/ios.mdx new file mode 100644 index 000000000..f21f0f8b6 --- /dev/null +++ b/shared/common/project-setup/ios.mdx @@ -0,0 +1,7 @@ +import Source from './swift.mdx'; + + + + + + diff --git a/shared/common/project-setup/macos.mdx b/shared/common/project-setup/macos.mdx new file mode 100644 index 000000000..74e608cdd --- /dev/null +++ b/shared/common/project-setup/macos.mdx @@ -0,0 +1,8 @@ +import Source from './swift.mdx'; + + + + + + + \ No newline at end of file diff --git a/shared/common/project-setup/react-js.mdx b/shared/common/project-setup/react-js.mdx new file mode 100644 index 000000000..1e84a7966 --- /dev/null +++ b/shared/common/project-setup/react-js.mdx @@ -0,0 +1,20 @@ + + + +1. Clone the [ sample code repository](https://github.com/AgoraIO/video-sdk-samples-reactjs) to + `` in your development environment: + + ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-reactjs + ``` + +1. Install the dependencies. 
Open Terminal in the root directory of the cloned repository and run the following command:

    ```bash
    npm i agora-rtc-react
    ```
    By default, is installed automatically. However, you can also [install manually](../reference/downloads#through-the-agora-website).

\ No newline at end of file
diff --git a/shared/common/project-setup/react-native.mdx b/shared/common/project-setup/react-native.mdx
new file mode 100644
index 000000000..e45dcb562
--- /dev/null
+++ b/shared/common/project-setup/react-native.mdx
@@ -0,0 +1,61 @@

1. **Set up a React Native environment for your project**

    In the terminal, run the following command:

    ```bash
    npx react-native init ProjectName --template react-native-template-typescript
    ```

    `npx` creates a new boilerplate project in the `ProjectName` folder.

    For Android projects, enable the project to use the Android SDK. In the `android` folder of your project, set `sdk.dir` in the `local.properties` file. For example:

    ```bash
    sdk.dir=C:\\PATH\\TO\\ANDROID\\SDK
    ```

1. **Test the setup**

    Launch your Android or iOS simulator and run your project by executing the following commands:

    1. Run `npx react-native start` in the root of your project to start Metro.
    1. Open another terminal in the root of your project and run `npx react-native run-android` to start the Android app, or run `npx react-native run-ios` to start the iOS app.

    You see your new app running in your Android or iOS simulator. You can also run your project on a physical Android or iOS device. For detailed instructions, see [Running on device](https://reactnative.dev/docs/running-on-device).

1. **Integrate and configure **

    To integrate on React Native 0.60.0 or later:

    1. Navigate to the root folder of your project in the terminal and integrate with either:
        - npm

          ```bash
          npm i --save react-native-agora
          ```

        - yarn

          ```bash
          // Install yarn.
npm install -g yarn
          // Download the Agora React Native SDK using yarn.
          yarn add react-native-agora
          ```

    Do not link native modules manually; React Native 0.60.0 and later support [Autolinking](https://github.com/react-native-community/cli/blob/main/docs/autolinking.md).

    1. If your target platform is iOS, use CocoaPods to install :

        ```bash
        npx pod-install
        ```

    1. Because uses Swift in native modules, your project must support compiling Swift. To create `File.swift`:

        1. In Xcode, open `ios/ProjectName.xcworkspace`.
        1. Click **File** > **New** > **File**, select **iOS** > **Swift File**, then click **Next** > **Create**.

\ No newline at end of file
diff --git a/shared/common/project-setup/swift.mdx b/shared/common/project-setup/swift.mdx
new file mode 100644
index 000000000..ad7e41278
--- /dev/null
+++ b/shared/common/project-setup/swift.mdx
@@ -0,0 +1,56 @@

1. [Create a new project](https://help.apple.com/xcode/mac/current/#/dev07db0e578) for this using the **App** template. Select the **Storyboard** Interface and **Swift** Language.

    If you have not already added team information, click **Add account…**, input your Apple ID, then click **Next**.

1. [Enable automatic signing](https://help.apple.com/xcode/mac/current/#/dev23aab79b4) for your project.

    [Set the target devices](https://help.apple.com/xcode/mac/current/#/deve69552ee5) to deploy your iOS to an iPhone or iPad.

1. Add project permissions for microphone and camera usage:

    1. Open **Info** in the project navigation panel, then add the following properties to the [Information Property List](https://help.apple.com/xcode/mac/current/#/dev3f399a2a6):

        | Key | Type | Value |
        |------------------------------|--------|------------------------|
        | NSMicrophoneUsageDescription | String | Access the microphone. |
        | NSCameraUsageDescription | String | Access the camera. |

    2. 
Add sandbox and runtime capabilities to your project:

        Open the target for your project in the project navigation properties, then add the following capabilities in **Signing & Capabilities**.
        - **App Sandbox**:
            - Incoming Connections (Server)
            - Outgoing Connections (Client)
            - Camera
            - Audio Input
        - **Hardened Runtime**:
            - Camera
            - Audio Input

1. Integrate into your project:

    These steps are for a package install. If you prefer to use **CocoaPods** or install manually, follow the [installation instructions](../reference/downloads#manual-installation).

    1. In Xcode, click **File** > **Add Packages**, then paste the following link in the search:

        ```
        https://github.com/AgoraIO/AgoraRtcEngine_macOS.git
        ```

        ```
        https://github.com/AgoraIO/AgoraRtcEngine_iOS.git
        ```

        You see the available packages. Add the **** package and any other functionality that you want to integrate into your app. For example, _AgoraAINoiseSuppressionExtension_. Choose a version later than 4.0.0.

    1. Click **Add Package**. In the new window, click **Add Package**.

        You see **AgoraRtcKit** in **Package Dependencies** for your project.

diff --git a/shared/common/project-setup/unity.mdx b/shared/common/project-setup/unity.mdx
new file mode 100644
index 000000000..ee489df11
--- /dev/null
+++ b/shared/common/project-setup/unity.mdx
@@ -0,0 +1,21 @@

1. **Create a Unity project**:

    1. In Unity Hub, select **Projects**, then click **New Project**.

    1. In **All templates**, select **3D**. Set the **Project name** and **Location**, then click **Create Project**.

    1. In **Projects**, double-click the project you created. Your project opens in Unity.

1. **Integrate **:

    1. Go to [SDKs](/sdks), download the latest version of the Agora , and unzip the downloaded SDK to a local folder.

    1. In Unity, click **Assets > Import Package > Custom Package**.

    1. Navigate to the package and click **Open**.

    1. 
In **Import Unity Package**, click **Import**.

\ No newline at end of file
diff --git a/shared/common/project-setup/web.mdx b/shared/common/project-setup/web.mdx
new file mode 100644
index 000000000..515d82599
--- /dev/null
+++ b/shared/common/project-setup/web.mdx
@@ -0,0 +1,37 @@

1. Create a new project using [Vite](https://vitejs.dev/):

    1. Open a terminal window and navigate to the directory where you want to create the project.

    2. Execute the following command in the terminal:

        ```bash
        npm create vite@latest agora_project --template vanilla
        ```

        When prompted to select a framework, choose `Vanilla`, and when prompted to select a variant, choose `JavaScript`.
        A directory named *agora\_project* is created, which contains the project files. You update the following files in the directory:

        - *index.html*: The visual interface with the user.

        - *main.js*: The programmable interface used to implement the logic.

2. Install the dependencies:

    In the terminal, navigate to the *agora\_project* directory, and execute the following command.

    ```bash
    npm install
    ```

3. Install the :

    Execute the following command in the terminal to download and install the .

    ```bash
    npm i agora-rtc-sdk-ng
    ```

    These steps are for a package install. If you prefer to install manually, follow the [installation instructions](../reference/downloads#manual-installation).

diff --git a/shared/common/project-setup/windows.mdx b/shared/common/project-setup/windows.mdx
new file mode 100644
index 000000000..45ca55387
--- /dev/null
+++ b/shared/common/project-setup/windows.mdx
@@ -0,0 +1,59 @@

1. **Create an MFC dialog-based application**

    1. From the main menu, choose **File** > **New** > **Project**.

    1. Enter **MFC** into the search box and then choose **MFC App** from the result list.

    1. In **Project Name**, input `AgoraImplementation` and press **Create** to open the **MFC Application Wizard**.

    1. 
In **MFC Application**, under **Application type**, select **Dialog based**, then click **Finish**. Your project opens in Visual Studio.

1. **Integrate **

    To integrate the into your project:

    1. Unzip the latest version of [](/sdks) in a local directory.

    1. Copy the `sdk` directory of the downloaded SDK package to the root of your project, ``.

    1. Create a new directory `/Debug`.

    1. Copy the files from `` to `/Debug`.

1. **Configure your project properties**

    Right-click the project name in **Solution Explorer**, then click **Properties** to configure the following project properties, and click **OK**.

    1. In **AgoraImplementation Property Pages**, select **Win32** from the **Platform** dropdown list.

        If your target platform is `x64`, select `x64` from the **Platform** dropdown list.

    1. Go to the **C/C++** > **General** > **Additional Include Directories** menu, click **Edit**, and input the following string in the pop-up window:

        ```
        ;$(SolutionDir)sdk\high_level_api\include;$(ProjectDir)
        ```

    1. Go to the **Linker** > **General** > **Additional Library Directories** menu, click **Edit**, and input the following string in the pop-up window:

        ```
        ;$(SolutionDir)sdk\x86;
        ```
        The sample code uses the `x86` platform. If you are using `x64`, add the following string to the **Additional Library Directories** field instead:

        ```
        ;$(SolutionDir)sdk\x86_64
        ```

    1. 
Go to the **C/C++** > **Preprocessor** > **Preprocessor Definitions** menu, click **Edit**, and input the following path in the pop-up window: + + ``` + ;_CRT_SECURE_NO_WARNINGS + ``` + + \ No newline at end of file diff --git a/shared/variables/platform.js b/shared/variables/platform.js index 75c4bbaff..014f41744 100644 --- a/shared/variables/platform.js +++ b/shared/variables/platform.js @@ -30,7 +30,11 @@ const data = { PATH: 'react-native', CLIENT: 'app' }, - + 'react-js': { + NAME: 'ReactJS', + PATH: 'react-js', + CLIENT: 'app' + }, 'electron': { NAME: 'Electron', PATH: 'electron', diff --git a/shared/video-sdk/_get-started-sdk.mdx b/shared/video-sdk/_get-started-sdk.mdx index e863a5e61..07679ddfc 100644 --- a/shared/video-sdk/_get-started-sdk.mdx +++ b/shared/video-sdk/_get-started-sdk.mdx @@ -1,6 +1,6 @@ import * as data from '@site/data/variables'; import Prerequisites from '@docs/shared/common/prerequities.mdx'; -import ProjectSetup from '@docs/shared/video-sdk/get-started/get-started-sdk/project-setup/index.mdx'; +import ProjectSetup from '@docs/shared/common/project-setup/index.mdx'; import ProjectImplement from '@docs/shared/video-sdk/get-started/get-started-sdk/project-implementation/index.mdx'; import ProjectTest from '@docs/shared/video-sdk/get-started/get-started-sdk/project-test/index.mdx'; import Reference from '@docs/shared/video-sdk/get-started/get-started-sdk/reference/index.mdx'; diff --git a/shared/video-sdk/_get-started-uikit.mdx b/shared/video-sdk/_get-started-uikit.mdx index d65688db4..904f58574 100644 --- a/shared/video-sdk/_get-started-uikit.mdx +++ b/shared/video-sdk/_get-started-uikit.mdx @@ -9,9 +9,8 @@ import NoUIKit from '@docs/shared/common/no-uikit.mdx'; makes it easy to add to your in minutes. is an Open Source project that includes best practices for business logic, as well as a pre-built video UI. Every piece is customizable, so the developer has full control over how the video call looks and feels. 
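The `platform.js` change earlier in this patch registers a `react-js` entry in the docs-site variables map. As a rough sketch of how such a map is typically consumed (the `platformName` helper is hypothetical, not part of the docs site; the `CLIENT` value for `electron` is assumed):

```javascript
// Sketch of the platform-variables map extended by this patch.
// The 'react-js' entry values come from the diff above.
const data = {
  'react-js': { NAME: 'ReactJS', PATH: 'react-js', CLIENT: 'app' },
  'electron': { NAME: 'Electron', PATH: 'electron', CLIENT: 'app' }, // CLIENT assumed
};

// Hypothetical helper: look up the display name for a platform key.
function platformName(key) {
  const entry = data[key];
  return entry ? entry.NAME : 'unknown platform';
}

console.log(platformName('react-js')); // ReactJS
console.log(platformName('cobol'));    // unknown platform
```

A lookup that falls through to a default keeps shared MDX templates rendering even when a platform key has not been registered yet.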
This page outlines the minimum code you need to integrate high-quality, low-latency functionality into your with a customizable UI.

@@ -73,5 +72,4 @@ This section contains information that completes the information in this page, o

- To ensure communication security in a test or production environment, using a token server to generate tokens is recommended. See [Implement the authentication workflow](../get-started/authentication-workflow).

diff --git a/shared/video-sdk/get-started/get-started-sdk/project-implementation/index.mdx b/shared/video-sdk/get-started/get-started-sdk/project-implementation/index.mdx
index c08dcd230..3cc84fd32 100644
--- a/shared/video-sdk/get-started/get-started-sdk/project-implementation/index.mdx
+++ b/shared/video-sdk/get-started/get-started-sdk/project-implementation/index.mdx
@@ -3,6 +3,7 @@ import Ios from './ios.mdx';
import MacOs from './macos.mdx';
import Web from './web.mdx';
import ReactNative from './react-native.mdx';
+import ReactJS from './react-js.mdx';
import Electron from './electron.mdx';
import Flutter from './flutter.mdx';
import Unity from './unity.mdx';
@@ -13,6 +14,7 @@ import Windows from './windows.mdx';
diff --git a/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-js.mdx b/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-js.mdx
new file mode 100644
index 000000000..3fd445b3f
--- /dev/null
+++ b/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-js.mdx
@@ -0,0 +1,103 @@

The following figure shows the API call sequence.

![Interface](/images/video-sdk/video-call-logic-web.png)

![Interface](/images/video-sdk/ils-call-logic-web.svg)

Best practice is to separate the workflows from your UI implementation. The sample project implements the business logic in the `AgoraManager` object. 
This class encapsulates the `agoraEngine`, an instance of the `AgoraRTC` client, and core functionality such as logging in to , joining a channel, listening for events from other users, and logging out.

The following code examples show how to implement these steps in your :

1. Create the variables you need to handle , local hardware and tracks, remote tracks, and user events:

    ```javascript
    const [agoraEngine, setAgoraEngine] = useState(null);
    const [microphoneAndCameraTracks, setMicrophoneAndCameraTracks] = useState(null);
    const [localAudioTrack, setLocalAudioTrack] = useState(null);
    const [localVideoTrack, setLocalVideoTrack] = useState(null);
    const [remoteVideoTrack, setRemoteVideoTrack] = useState(null);
    const [remoteUid, setRemoteUid] = useState(null);
    const [joined, setJoined] = useState(false);
    const [showVideo, setShowVideo] = useState(false);
    ```

1. Create an instance of the , and handle local hardware and remote tracks:

    ```javascript
    const setupVideoSDKEngine = async () => {
      const engine = AgoraRTC.createClient({ mode: "rtc", codec: "vp8" });
      const tracks = await AgoraRTC.createMicrophoneAndCameraTracks();
      if (engine && tracks) {
        setAgoraEngine(engine);
        setMicrophoneAndCameraTracks(tracks);
        engine.on("user-published", async (user, mediaType) => {
          await engine.subscribe(user, mediaType);
          if (mediaType === "video") {
            setRemoteVideoTrack(user.videoTrack);
            setRemoteUid(user.uid);
          }
        });

        engine.on("user-unpublished", (user, mediaType) => {
          if (mediaType === "video" && user.uid === remoteUid) {
            setRemoteVideoTrack(null);
            setRemoteUid(null);
          }
        });
      }
      return engine;
    };
    ```

1. 
Join and leave a channel:

    ```javascript
    const joinCall = async () => {
      try {
        await agoraEngine.join(appId, channelName, token, 0);
        setLocalAudioTrack(microphoneAndCameraTracks[0]);
        setLocalVideoTrack(microphoneAndCameraTracks[1]);
        await agoraEngine.publish([microphoneAndCameraTracks[0], microphoneAndCameraTracks[1]]);
        setJoined(true);
        setShowVideo(true);
      } catch (error) {
        console.error("Failed to join or publish:", error);
      }
    };

    const leaveCall = async () => {
      try {
        await agoraEngine.unpublish([localAudioTrack, localVideoTrack]);
        await agoraEngine.leave();
        setJoined(false);
        setShowVideo(false);
      } catch (error) {
        console.error("Failed to unpublish or leave:", error);
      }
    };
    ```

1. Set up your `AgoraManager` instance with your security information:

    ```javascript
    const GetStartedComponent = (props) => {
      const agoraManager = AgoraManager({
        appId: props.appId || appId,
        channelName: props.channelName || channelName,
        token: props.token || token
      });
      const [initialized, setInitialized] = useState(false);
    ```

\ No newline at end of file
diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/react-js.mdx b/shared/video-sdk/get-started/get-started-sdk/project-setup/react-js.mdx
new file mode 100644
index 000000000..b6728a09a
--- /dev/null
+++ b/shared/video-sdk/get-started/get-started-sdk/project-setup/react-js.mdx
@@ -0,0 +1,5 @@

\ No newline at end of file
diff --git a/shared/video-sdk/get-started/get-started-sdk/project-test/index.mdx b/shared/video-sdk/get-started/get-started-sdk/project-test/index.mdx
index a722df263..4a5a0125c 100644
--- a/shared/video-sdk/get-started/get-started-sdk/project-test/index.mdx
+++ b/shared/video-sdk/get-started/get-started-sdk/project-test/index.mdx
@@ -3,6 +3,7 @@ import Ios from './ios.mdx';
import MacOs from './macos.mdx'
import Web from './web.mdx';
import ReactNative from './react-native.mdx'
import 
Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import Unity from './unity.mdx'; @@ -13,6 +14,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx b/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx new file mode 100644 index 000000000..f147c1dcb --- /dev/null +++ b/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx @@ -0,0 +1,14 @@ + + +3. In `src/App.js`, update the values of `appID`, `channelName`, and `token` with the values for your temporary token. + +1. Open Terminal in the root directory of the cloned repository and run the following command: + ```terminal + npm start + ``` + + The project opens in your default browser. + +1. Select an item from the dropdown to test the sample code. + + \ No newline at end of file diff --git a/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx b/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx new file mode 100644 index 000000000..b6728a09a --- /dev/null +++ b/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx @@ -0,0 +1,5 @@ + + + + + \ No newline at end of file From 9024111e43577d2fb404c86685510ed9c4e490e8 Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Fri, 28 Jul 2023 18:30:35 +0200 Subject: [PATCH 002/184] Last change. 
--- .../reference/api-reference/index.mdx | 3 + .../react-js/components-en.react.mdx | 262 ++++++++ .../react-js/data-types-en.react.mdx | 49 ++ .../api-reference/react-js/hooks-en.react.mdx | 611 ++++++++++++++++++ .../api-reference/react-js/index.mdx | 19 + video-calling/reference/api-reference.mdx | 19 + 6 files changed, 963 insertions(+) create mode 100644 shared/video-sdk/reference/api-reference/index.mdx create mode 100644 shared/video-sdk/reference/api-reference/react-js/components-en.react.mdx create mode 100644 shared/video-sdk/reference/api-reference/react-js/data-types-en.react.mdx create mode 100644 shared/video-sdk/reference/api-reference/react-js/hooks-en.react.mdx create mode 100644 shared/video-sdk/reference/api-reference/react-js/index.mdx create mode 100644 video-calling/reference/api-reference.mdx diff --git a/shared/video-sdk/reference/api-reference/index.mdx b/shared/video-sdk/reference/api-reference/index.mdx new file mode 100644 index 000000000..0efba9773 --- /dev/null +++ b/shared/video-sdk/reference/api-reference/index.mdx @@ -0,0 +1,3 @@ +import ReactJS from './react-js/index.mdx' + + \ No newline at end of file diff --git a/shared/video-sdk/reference/api-reference/react-js/components-en.react.mdx b/shared/video-sdk/reference/api-reference/react-js/components-en.react.mdx new file mode 100644 index 000000000..6810d9c7d --- /dev/null +++ b/shared/video-sdk/reference/api-reference/react-js/components-en.react.mdx @@ -0,0 +1,262 @@ + +## Components + +### AgoraRTCProvider + +This component is a [context provider](https://react.dev/learn/passing-data-deeply-with-context), which lets all of the components inside `children` read the `client` prop you pass. 
#### Props

| Prop | Type | Default value | Description |
| ---------- | ----------------- | ------------- | ----------- |
| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) | None | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
| `children` | `ReactNode` | None | The React nodes to be rendered. |

#### Sample code

```jsx
import { AgoraRTCProvider } from "agora-rtc-react";
import AgoraRTC from "agora-rtc-sdk-ng";

function App({ children }) {
  const [client] = useState(() => AgoraRTC.createClient({ mode: "rtc", codec: "vp8" }));
  return <AgoraRTCProvider client={client}>{children}</AgoraRTCProvider>;
}
```

### AgoraRTCScreenShareProvider

This component is a [context provider](https://react.dev/learn/passing-data-deeply-with-context), which lets all of the components inside `children` read the `client` prop you pass for screen sharing.

#### Props

| Prop | Type | Default value | Description |
| ---------- | ----------------- | ------------- | ----------- |
| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) | None | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
| `children` | `ReactNode` | None | The React nodes to be rendered. |

#### Caveats

You can use `AgoraRTCScreenShareProvider` and `AgoraRTCProvider` together, but they do not share the `client` prop. 
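As a sketch of why the two providers keep separate clients: each provider passes its own `client` down its own context, so nesting them does not merge the two. A minimal stand-in in plain JavaScript (no SDK, no React — all names here are hypothetical, for illustration only):

```javascript
// Hypothetical stand-in for two independent context providers.
// Each provider carries its own client; nesting them does not merge the two.
function createProvider() {
  let current = null;
  return {
    provide(client, render) {
      const prev = current;
      current = client;        // make `client` visible to everything rendered inside
      const out = render();
      current = prev;          // restore on the way out, like React context
      return out;
    },
    use() { return current; }, // what a child component would read
  };
}

const RTCContext = createProvider();
const ScreenShareContext = createProvider();

const rtcClient = { id: "camera-client" };
const screenClient = { id: "screen-share-client" };

// Nest the providers: children read each context independently.
const result = RTCContext.provide(rtcClient, () =>
  ScreenShareContext.provide(screenClient, () => ({
    rtc: RTCContext.use().id,
    screen: ScreenShareContext.use().id,
  }))
);

console.log(result); // { rtc: 'camera-client', screen: 'screen-share-client' }
```

Because each context is its own container, a component that needs both clients must read them from both providers, which is why the two real providers are used together rather than interchangeably.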
+ +#### Sample code + +```jsx +import { AgoraRTCScreenShareProvider } from "agora-rtc-react"; +import AgoraRTC from "agora-rtc-sdk-ng"; + +function App({ children }) { + const [client] = useState(() => AgoraRTC.createClient({ mode: "rtc", codec: "vp8" })); + return {children}; +} +``` + +### LocalAudioTrack + +This component plays the local audio track using the playback device selected by the user in the browser. + +If you need the capability to set the microphone device, use the Web SDK's [`IMicrophoneAudioTrack.setDevice`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/imicrophoneaudiotrack.html#setdevice) method. + +#### Props + +| Prop | Type | Default value | Description | +| ---------- | ------------------ | ------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| `track` | `ILocalAudioTrack` | None | The local audio track to be played. Call [`useLocalMicrophoneTrack`](#uselocalmicrophonetrack) to create a local audio track. | +| `play` | `boolean` | `false` |
  • `true`: Play the track.
  • `false`: Stop playing the track.
  |
+| `volume` | `number` | None | The volume. The value ranges from 0 (mute) to 1000 (maximum). A value of 100 is the original volume. When set to above 100, the SDK applies volume amplification using the [Web Audio API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API). |
+| `disabled` | `boolean` | `false` |
  • `true`: Disable the track. When disabled, the SDK stops playing and publishing the track.
  • `false`: Enable the track.
  |
+| `muted` | `boolean` | `false` |
  • `true`: Pause sending media data of the track.
  • `false`: Resume sending media data of the track.
  |
+| `children` | `ReactNode` | None | The React nodes to be rendered. |
+
+#### Caveats
+
+Setting the `disabled` and `muted` props invokes the Web SDK's [`setEnabled`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/ilocaltrack.html#setenabled) and [`setMuted`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/ilocaltrack.html#setmuted) methods, respectively. Therefore:
+
+- Compared to setting the `disabled` prop, setting the `muted` prop has a faster response time and does not affect the audio capture state.
+- Do not set `muted` and `disabled` together.
+
+#### Sample code
+
+```jsx
+import { LocalAudioTrack, useLocalAudioTrack } from "agora-rtc-react";
+
+function App() {
+  const audioTrack = useLocalAudioTrack();
+  return <LocalAudioTrack track={audioTrack} play />;
+}
+```
+
+### LocalVideoTrack
+
+This component plays the local video track using the playback device selected by the user in the browser.
+
+If you need the capability to set the camera device, use the Web SDK's [`ICameraVideoTrack.setDevice`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/icameravideotrack.html#setdevice) method.
+
+#### Props
+
+| Prop | Type | Default value | Description |
+| ---------- | ------------------ | ------------- | ----------- |
+| `track` | `ILocalVideoTrack` | None | The local video track to be played. Call [`useLocalCameraTrack`](#uselocalcameratrack) or the Web SDK's [`IAgoraRTC.createScreenVideoTrack`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createscreenvideotrack) method to create a local video track. |
+| `play` | `boolean` | `false` |
  • `true`: Play the track.
  • `false`: Stop playing the track.
  |
+| `disabled` | `boolean` | `false` |
  • `true`: Disable the track. When disabled, the SDK stops playing and publishing the track.
  • `false`: Enable the track.
  |
+| `muted` | `boolean` | `false` |
  • `true`: Pause sending media data of the track.
  • `false`: Resume sending media data of the track.
  |
+
+#### Caveats
+
+Setting the `disabled` and `muted` props invokes the Web SDK's [`setEnabled`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/ilocaltrack.html#setenabled) and [`setMuted`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/ilocaltrack.html#setmuted) methods, respectively. Therefore:
+
+- Compared to setting the `disabled` prop, setting the `muted` prop has a faster response time and does not affect the video capture state.
+- Do not set `muted` and `disabled` together.
+
+#### Sample code
+
+```jsx
+import { LocalVideoTrack, useLocalCameraTrack } from "agora-rtc-react";
+
+function App() {
+  const videoTrack = useLocalCameraTrack();
+  return <LocalVideoTrack track={videoTrack} play />;
+}
+```
+
+### LocalUser
+
+This component plays the camera video track and the microphone audio track of the local user using the playback devices selected by the user in the browser.
+
+When the video track stops playing, this component shows a cover image.
+
+If you need the capability to set the microphone or camera device, use the Web SDK's [`IMicrophoneAudioTrack.setDevice`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/imicrophoneaudiotrack.html#setdevice) or [`ICameraVideoTrack.setDevice`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/icameravideotrack.html#setdevice) method.
+
+#### Props
+
+| Prop | Type | Default value | Description |
+| ------------ | ----------------------- | ------------- | ----------- |
+| `micOn` | `boolean` | `false` |
  • `true`: Enable the local user's microphone.
  • `false`: Disable the local user's microphone.
  |
+| `cameraOn` | `boolean` | `false` |
  • `true`: Enable the local user's camera.
  • `false`: Disable the local user's camera.
  |
+| `audioTrack` | `IMicrophoneAudioTrack` | None | The microphone audio track to be played, which can be created by calling [`useLocalMicrophoneTrack`](#uselocalmicrophonetrack). |
+| `videoTrack` | `ICameraVideoTrack` | None | The camera video track to be played, which can be created by calling [`useLocalCameraTrack`](#uselocalcameratrack). |
+| `playAudio` | `boolean` | `false` |
  • `true`: Play the local user's audio track.
  • `false`: Stop playing the local user's audio track.
  |
+| `playVideo` | `boolean` | `false` |
  • `true`: Play the local user's video track.
  • `false`: Stop playing the local user's video track.
  |
+| `volume` | `number` | None | The volume. The value ranges from 0 (mute) to 1000 (maximum). A value of 100 is the original volume. When set to above 100, the SDK applies volume amplification using the [Web Audio API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API). |
+| `cover` | `string` | None | The cover image to be displayed when `playVideo` is `false`, replacing the video frame. You can pass the URL of an online image or the relative path of a local image. |
+| `children` | `ReactNode` | None | The React nodes to be rendered. |
+
+#### Sample code
+
+```jsx
+import { LocalUser, useLocalAudioTrack, useLocalCameraTrack } from "agora-rtc-react";
+
+function App() {
+  const audioTrack = useLocalAudioTrack();
+  const videoTrack = useLocalCameraTrack();
+
+  return (
+    <LocalUser
+      audioTrack={audioTrack}
+      videoTrack={videoTrack}
+      micOn
+      cameraOn
+      playAudio
+      playVideo
+    />
+  );
+}
+```
+
+### RemoteAudioTrack
+
+This component plays the audio track of a remote user with the playback device you specify.
+
+#### Props
+
+| Prop | Type | Default value | Description |
+| ------------------ | ------------------- | ------------- | ----------- |
+| `track` | [`IRemoteAudioTrack`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iremoteaudiotrack.html) | None | The remote audio track to be played. |
+| `play` | `boolean` | None |
  • `true`: Play the track.
  • `false`: Stop playing the track.
  |
+| `playbackDeviceId` | `string` | None | The ID of the playback device, such as a speaker. The device ID can be obtained using [`IAgoraRTC.getPlaybackDevices`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#getplaybackdevices). This property is only supported in the desktop version of the Chrome browser. Modifying the value of this property in other browsers throws a `NOT_SUPPORTED` error. |
+| `volume` | `number` | None | The volume. The value ranges from 0 (mute) to 100 (the original volume). |
+| `children` | `ReactNode` | None | The React nodes to be rendered. |
+
+#### Sample code
+
+```jsx
+import { RemoteAudioTrack, useRemoteAudioTracks, useRemoteUsers } from "agora-rtc-react";
+
+function App() {
+  const remoteUsers = useRemoteUsers();
+  const { audioTracks } = useRemoteAudioTracks(remoteUsers);
+
+  return (
+    <>
+      {audioTracks.map(track => (
+        <RemoteAudioTrack key={track.getTrackId()} play track={track} />
+      ))}
+    </>
+  );
+}
+```
+
+### RemoteVideoTrack
+
+This component plays the video track of a remote user and does not support specifying the playback device.
+
+#### Props
+
+| Prop | Type | Default value | Description |
+| -------- | ------------------- | ------------- | ----------- |
+| `track` | [`IRemoteVideoTrack`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iremotevideotrack.html) | None | The remote video track object. |
+| `play` | `boolean` | None |
  • `true`: Play the track.
  • `false`: Stop playing the track.
  |
+
+#### Sample code
+
+```jsx
+import { RemoteVideoTrack, useRemoteUsers, useRemoteVideoTracks } from "agora-rtc-react";
+
+function App() {
+  const remoteUsers = useRemoteUsers();
+  const { videoTracks } = useRemoteVideoTracks(remoteUsers);
+
+  return (
+    <>
+      {videoTracks.map(track => (
+        <RemoteVideoTrack key={track.getTrackId()} play track={track} />
+      ))}
+    </>
+  );
+}
+```
+
+### RemoteUser
+
+This component plays the video and audio tracks of a remote user and supports specifying the audio device to use. Specifying the video playback device is not supported.
+
+#### Props
+
+| Prop | Type | Default value | Description |
+| ------------------ | ----------------------------- | ------------- | ----------- |
+| `user` | [`IAgoraRTCRemoteUser`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcremoteuser.html) | None | The remote user object. |
+| `playVideo` | `boolean` | The value of [user.hasVideo](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcremoteuser.html#hasvideo) |
  • `true`: Play the video track of the remote user.
  • `false`: Stop playing the video track of the remote user.
  |
+| `playAudio` | `boolean` | The value of [user.hasAudio](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcremoteuser.html#hasaudio) |
  • `true`: Play the audio track of the remote user.
  • `false`: Stop playing the audio track of the remote user.
  |
+| `playbackDeviceId` | `string` | None | The ID of the playback device, such as a speaker. The device ID can be obtained using [`IAgoraRTC.getPlaybackDevices`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#getplaybackdevices). This property is only supported in the desktop version of the Chrome browser. Modifying the value of this property in other browsers throws a `NOT_SUPPORTED` error. |
+| `volume` | `number` | None | The volume. The value ranges from 0 (mute) to 100 (the original volume). |
+| `cover` | `string` \| `() => ReactNode` | None | The cover image or custom component to be displayed when `playVideo` is `false`, replacing the video frame. You can pass the URL of an online image or the relative path of a local image. |
+| `children` | `ReactNode` | None | The React nodes to be rendered. |
+
+#### Sample code
+
+```jsx
+import { RemoteUser, useRemoteUsers } from "agora-rtc-react";
+
+function App() {
+  const remoteUsers = useRemoteUsers();
+
+  return (
+    <>
+      {remoteUsers.map(user => (
+        <RemoteUser key={user.uid} user={user} />
+      ))}
+    </>
+  );
+}
+```
diff --git a/shared/video-sdk/reference/api-reference/react-js/data-types-en.react.mdx b/shared/video-sdk/reference/api-reference/react-js/data-types-en.react.mdx
new file mode 100644
index 000000000..a199dc357
--- /dev/null
+++ b/shared/video-sdk/reference/api-reference/react-js/data-types-en.react.mdx
@@ -0,0 +1,49 @@
+
+## Interfaces and classes
+
+This page provides descriptions for all interfaces and classes.
+
+### NetworkQuality
+
+The last-mile network quality.
+
+
+| Property | Type | Required | Description |
+|----------|--------------|----------|-------------|
+| `uplink` | `0` \| `1` \| `2` \| `3` \| `4` \| `5` \| `6` | Yes | The uplink network quality.
It is calculated based on the uplink transmission bitrate, uplink packet loss rate, RTT (round-trip time) and jitter.
    • 0: The quality is unknown.
    • 1: The quality is excellent.
    • 2: The quality is good, but the bitrate is less than optimal.
    • 3: Users experience slightly impaired communication.
    • 4: Users can communicate with each other, but not very smoothly.
    • 5: The quality is so poor that users can barely communicate.
    • 6: The network is disconnected and users cannot communicate.
    |
+| `downlink` | `0` \| `1` \| `2` \| `3` \| `4` \| `5` \| `6` | Yes | The downlink network quality. It is calculated based on the downlink transmission bitrate, downlink packet loss rate, RTT (round-trip time) and jitter.
    • 0: The quality is unknown.
    • 1: The quality is excellent.
    • 2: The quality is good, but the bitrate is less than optimal.
    • 3: Users experience slightly impaired communication.
    • 4: Users can communicate with each other, but not very smoothly.
    • 5: The quality is so poor that users can barely communicate.
    • 6: The network is disconnected and users cannot communicate.
    | +| `delay` | `number` | Yes | The average Round-Trip Time (RTT) from the SDK to the Agora edge server, measured in milliseconds (ms). | + +### JoinOptions + +Parameters used to join a channel. + + +| Property | Type | Required | Description | +| -------- | ---------------- | -------- | ---- | +| `appid` | `string` | Yes | The App ID of your Agora project. | +| `channel` | `string` | Yes | The name of the channel to join. See [`IAgoraRTCClient.join`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html#join) for details. | +| `token` | `string` | `null` | Yes | The token used for authentication. If token-based authentication is enabled for your project, a valid token must be provided. If token-based authentication is not enabled, you can pass `null`. See [`IAgoraRTCClient.join`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html#join) for details. | +| `uid` | `UID` |`null` | No | The user ID. If not provided, the Agora server assigns a number `uid` for you. See [`IAgoraRTCClient.join`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html#join) for details. | + +### AgoraRTCReactError + +Thrown errors. + +`AgoraRTCReactError` extends the browser's [Error object](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error). When you directly print the `AgoraRTCReactError` object, you can see the error message. + +| Property | Type | Required | Description | +| -------- | ---------------- | -------- | ---- | +| `rtcMethod` | `string` | Yes | The Web SDK method that throws the error, which helps you determine the corresponding Hook where the error occurred. See the table below. 
 |
+
+**Mapping of `rtcMethod` to Hooks**
+
+| `rtcMethod` | Corresponding Hook |
+| ------------------------------------------------------------ | ----------------------- |
+| `"IAgoraRTCClient.join"` | `useJoin` |
+| `"IAgoraRTC.createCameraVideoTrack"` | `useLocalCameraTrack` |
+| `"IAgoraRTC.createMicrophoneAudioTrack"` | `useLocalMicrophoneTrack` |
+| `"IAgoraRTCClient.unsubscribe"` or `"IAgoraRTCClient.subscribe"` | `useRemoteUserTrack` |
+| `"IAgoraRTCClient.publish"` | `usePublish` |
+| `"IAgoraRTCClient.unsubscribe"`, `"IAgoraRTCClient.subscribe"`, or `"IAgoraRTCClient.massUnsubscribe"` | `useRemoteVideoTracks` |
+
diff --git a/shared/video-sdk/reference/api-reference/react-js/hooks-en.react.mdx b/shared/video-sdk/reference/api-reference/react-js/hooks-en.react.mdx
new file mode 100644
index 000000000..9b0ab6506
--- /dev/null
+++ b/shared/video-sdk/reference/api-reference/react-js/hooks-en.react.mdx
@@ -0,0 +1,611 @@
+
+## Hooks
+
+### useConnectionState
+
+Returns the detailed connection state of the SDK.
+
+#### Parameters
+
+| Parameter | Type | Required | Description |
+| --------- | ---------------- | -------- | ---- |
+| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) \| `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
+
+#### Returns
+
+| Type | Description |
+| ----------------- | ---------------------|
+| `ConnectionState` | The connection state between the SDK and Agora's edge server. See [`ConnectionState`](https://api-ref.agora.io/en/voice-sdk/web/4.x/globals.html#connectionstate) for details. |
+
+#### Sample code
+
+```jsx
+import { useConnectionState } from "agora-rtc-react";
+
+function App() {
+  const connectionState = useConnectionState();
+
+  return <div>{connectionState}</div>
    ; +} +``` + +### useIsConnected + +Returns whether the SDK is connected to Agora's server. + +#### Parameters + +| Parameter | Type | Required | Description | +| --------- | ---------------- | -------- | ---- | +| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) | `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. | + +#### Returns + +| Type | Description | +| ----------------- | ---------------------| +| `boolean` |
  • `true`: The SDK is connected to the server.
  • `false`: The SDK is not connected to the server.
  |
+
+#### Sample code
+
+```jsx
+import { useIsConnected } from "agora-rtc-react";
+
+function App() {
+  const isConnected = useIsConnected();
+
+  return <div>{isConnected}</div>
    ;
+}
+```
+
+### useCurrentUID
+
+Returns the current user ID.
+
+#### Parameters
+
+| Parameter | Type | Required | Description |
+| --------- | ---------------- | -------- | ---- |
+| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) \| `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
+
+#### Returns
+
+| Type | Description |
+| ----------------- | ---------------------|
+| `UID` \| `undefined` | The user ID of the current user. If the current user has not joined any channel, `undefined` is returned. |
+
+#### Sample code
+
+```jsx
+import { useCurrentUID } from "agora-rtc-react";
+
+function App() {
+  const uid = useCurrentUID();
+
+  return <div>{uid}</div>
    ;
+}
+```
+
+### useNetworkQuality
+
+Returns the network quality of the local user.
+
+#### Parameters
+
+| Parameter | Type | Required | Description |
+| --------- | ---------------- | -------- | ---- |
+| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) \| `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
+
+#### Returns
+
+| Type | Description |
+| ----------------- | ---------------------|
+| `NetworkQuality` | The network quality of the local user. See [`NetworkQuality`](#networkquality) for details. |
+
+#### Sample code
+
+```jsx
+import { useNetworkQuality } from "agora-rtc-react";
+
+function App() {
+  const networkQuality = useNetworkQuality();
+
+  return <div>{networkQuality.uplink}</div>
    ;
+}
+```
+
+### useVolumeLevel
+
+Returns the volume level of an audio track at a frequency of once per second.
+
+#### Parameters
+
+| Parameter | Type | Required | Description |
+| -------- | ---------------- | -------- | ---- |
+| `audioTrack` | `IRemoteAudioTrack` \| `ILocalAudioTrack` \| `undefined` | No | The local or remote audio track. The local audio track can be created by calling [`useLocalMicrophoneTrack`](#uselocalmicrophonetrack). If undefined, the volume level is 0. |
+
+#### Returns
+
+| Type | Description |
+| -------- | ----------------------------------- |
+| `number` | The volume level. The value range is [0,1]. 1 is the highest volume level. Usually, a user with a volume level above 0.6 is speaking. |
+
+#### Sample code
+
+```jsx
+import { useVolumeLevel, useLocalMicrophoneTrack } from "agora-rtc-react";
+
+function App() {
+  const { localMicrophoneTrack } = useLocalMicrophoneTrack();
+  const volumeLevel = useVolumeLevel(localMicrophoneTrack);
+
+  return <div>{volumeLevel}</div>
    ; +} +``` + +### useRTCClient + +Returns the `IAgoraRTCClient` object. + +#### Parameters + +| Parameter | Type | Required | Description | +| --------- | ---------------- | -------- | ---- | +| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) | `null` | No | If provided, the passed `IAgoraRTCClient` object is returned. If not provided, the `IAgoraRTCClient` object obtained from the [parent component's context](./components#agorartcprovider) is returned. | + +#### Returns + +| Type | Description | +| ----------------- | ------------------------ | +| [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) | The `IAgoraRTCClient` client. | + +#### Sample code + +```jsx +import { useRTCClient } from "agora-rtc-react"; + +function App() { + const client = useRTCClient(); + + return <>; +} +``` + +### useJoin + +This hook lets a user automatically join a channel when the component is ready and automatically leaves the channel when the component is unmounted. + +You can customize the conditions required to join a channel using `fetchArgs`. For example, generating a token and other asynchronous operations can be performed before joining the channel. + +#### Parameters + +| Parameter | Type | Required | Description | +| -------- | ---------------- | -------- | ---- | +| `fetchArgs` | `JoinOptions` | `(() => Promise)` | Required | The parameters or asynchronous function required to join the channel. See [`JoinOptions`](#joinoptions) for details. | +| `ready` | `boolean` | Optional | Whether the user is ready to join the channel. The default value is `true`. | +| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. 
|
+
+#### Returns
+
+Returns an object containing the following properties:
+
+| Property | Type | Description |
+| --------- | ---------------------- | ---------------------------------------------- |
+| `data` | `UID` | The user ID if the user successfully joins the channel. If you do not specify a `uid` when passing `fetchArgs`, the default value `0` is returned. |
+| `isLoading` | `boolean`|
  • `true`: The hook is performing operations related to joining the channel.
  • `false`: The hook completes operations related to joining the channel, but it does not indicate a successful result.
  |
+| `isConnected` | `boolean` |
  • `true`: The SDK is connected to the server, indicating that the user successfully joins the channel.
  • `false`: The SDK is not connected to the server.
  |
+| `error` | `AgoraRTCReactError` \| `null` | Returns `null` if the user successfully joins the channel, otherwise throws an error. See [`AgoraRTCReactError`](#agorartcreacterror) for details. |
+
+#### Sample code
+
+```jsx
+import { useJoin } from "agora-rtc-react";
+
+function App() {
+  // Example: passing a function as the first argument
+  // useJoin(async () => {
+  //   // Fetch the token before joining the channel. The resolved value must match the JoinOptions type.
+  //   const getData = await getToken();
+  //   return getData;
+  // }, ready);
+
+  useJoin(
+    {
+      appid: YOUR_APPID,
+      channel: YOUR_CHANNEL,
+      token: YOUR_TOKEN,
+    },
+    ready,
+  );
+
+  return <></>;
+}
+```
+
+### usePublish
+
+This hook lets you publish the local tracks when the component is ready and unpublish them when the component is unmounted.
+
+#### Parameters
+
+| Parameter | Type | Required | Description |
+| -------- | ---------------- | -------- | ---- |
+| `tracks` | `(ILocalTrack \| null)[]` | Yes | The list of local tracks. |
+| `readyToPublish` | `boolean` | No | Whether the local tracks are ready to publish. The default value is `true`. |
+| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) \| `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
+
+#### Returns
+
+Returns an object containing the following properties:
+
+| Property | Type | Description |
+| ----------------------| ---------------------- | ------------------------ |
+| `isLoading` | `boolean` |
  • `true`: The hook is performing operations related to publishing the tracks.
  • `false`: The hook completes operations related to publishing the tracks, but it does not indicate a successful result.
  |
+| `error` | `AgoraRTCReactError` \| `null` | Returns `null` if the tracks are successfully published, otherwise throws an error. See [`AgoraRTCReactError`](#agorartcreacterror) for details. |
+
+#### Sample code
+
+```jsx
+import { useLocalMicrophoneTrack, useLocalCameraTrack, usePublish } from "agora-rtc-react";
+
+function App() {
+  // Get the audio and video tracks before publishing.
+  const { localMicrophoneTrack } = useLocalMicrophoneTrack();
+  const { localCameraTrack } = useLocalCameraTrack();
+  usePublish([localMicrophoneTrack, localCameraTrack]);
+
+  return <></>;
+}
+```
+
+### useLocalMicrophoneTrack
+
+This hook lets you create a local microphone audio track.
+
+- The hook can only create the audio track once before the component is destroyed.
+- After the component is unmounted, the audio track created by the hook stops publishing.
+
+#### Parameters
+
+| Parameter | Type | Required | Description |
+| -------- | ---------------- | -------- | ---- |
+| `ready` | `boolean` | No | Whether it is ready to create the track. The default value is `true`. |
+| `audioTrackConfig` | `MicrophoneAudioTrackInitConfig` | No | Configurations for initializing the microphone audio track. The default is `{ ANS: true, AEC: true }`. See [`MicrophoneAudioTrackInitConfig`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/microphoneaudiotrackinitconfig.html) for details. |
+| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) \| `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
+
+#### Returns
+
+Returns an object containing the following properties:
+
+| Property | Type | Description |
+| ----------------------| ---------------------- | ------------------------ |
+| `localMicrophoneTrack` | `IMicrophoneAudioTrack` \| `null` | The created microphone audio track. |
+| `isLoading` | `boolean` |
  • `true`: The hook is performing operations related to creating the track.
  • `false`: The hook completes operations related to creating the track, but it does not indicate a successful result.
  |
+| `error` | `AgoraRTCReactError` \| `null` | Returns `null` if the track is successfully created, otherwise throws an error. See [`AgoraRTCReactError`](#agorartcreacterror) for details. |
+
+#### Sample code
+
+```jsx
+import { useLocalMicrophoneTrack } from "agora-rtc-react";
+
+function App() {
+  const { localMicrophoneTrack } = useLocalMicrophoneTrack(true, { ANS: true, AEC: true });
+
+  return <></>;
+}
+```
+
+### useLocalCameraTrack
+
+This hook lets you create a local camera video track.
+
+- The hook can only create the video track once before the component is destroyed.
+- After the component is unmounted, the video track created by the hook stops publishing.
+
+#### Parameters
+
+| Parameter | Type | Required | Description |
+| -------- | ---------------- | -------- | ---- |
+| `ready` | `boolean` | No | Whether it is ready to create the track. The default value is `true`. |
+| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) \| `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
+
+#### Returns
+
+Returns an object containing the following properties:
+
+| Property | Type | Description |
+| ----------------------| ---------------------- | ------------------------ |
+| `localCameraTrack` | `ICameraVideoTrack` \| `null` | The created camera video track. |
+| `isLoading` | `boolean` |
  • `true`: The hook is performing operations related to creating the track.
  • `false`: The hook completes operations related to creating the track, but it does not indicate a successful result.
  |
+| `error` | `AgoraRTCReactError` \| `null` | Returns `null` if the track is successfully created, otherwise throws an error. See [`AgoraRTCReactError`](#agorartcreacterror) for details. |
+
+#### Sample code
+
+```jsx
+import { useLocalCameraTrack } from "agora-rtc-react";
+
+function App() {
+  const { localCameraTrack } = useLocalCameraTrack();
+
+  return <></>;
+}
+```
+
+### useRemoteVideoTracks
+
+This hook lets you automatically subscribe to and retrieve remote users' video tracks.
+
+- When the component is unmounted, the hook stops subscribing to the video tracks of the specified `users`.
+- The hook updates the subscribed video tracks when the `users` parameter changes.
+
+#### Parameters
+
+| Parameter | Type | Required | Description |
+| -------- | ---------------- | -------- | ---- |
+| `users` | `IAgoraRTCRemoteUser[]` \| `undefined` | Yes | The list of remote users. |
+| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) \| `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
+
+#### Returns
+
+Returns an object containing the following properties:
+
+| Property | Type | Description |
+| ----------------------| ---------------------- | ------------------------ |
+| `videoTracks` | `IRemoteVideoTrack[]` | The list of subscribed video tracks from remote users. |
+| `isLoading` | `boolean` |
  • `true`: The hook is performing operations related to subscribing to the tracks.
  • `false`: The hook completes operations related to subscribing to the tracks, but it does not indicate a successful result.
  |
+| `error` | `AgoraRTCReactError` \| `null` | Returns `null` if the tracks are successfully subscribed, otherwise throws an error. See [`AgoraRTCReactError`](#agorartcreacterror) for details. |
+
+#### Sample code
+
+```jsx
+import { useRemoteUsers, useRemoteVideoTracks } from "agora-rtc-react";
+
+function App() {
+  const remoteUsers = useRemoteUsers();
+  const { videoTracks } = useRemoteVideoTracks(remoteUsers);
+
+  return <></>;
+}
+```
+
+### useRemoteAudioTracks
+
+This hook lets you automatically subscribe to and retrieve remote users' audio tracks.
+
+- When the component is unmounted, the hook stops subscribing to the audio tracks of the specified `users`.
+- The hook updates the subscribed audio tracks when the `users` parameter changes.
+
+#### Parameters
+
+| Parameter | Type | Required | Description |
+| -------- | ---------------- | -------- | ---- |
+| `users` | `IAgoraRTCRemoteUser[]` \| `undefined` | Yes | The list of remote users. |
+| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) \| `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
+
+#### Returns
+
+Returns an object containing the following properties:
+
+| Property | Type | Description |
+| ----------------------| ---------------------- | ------------------------ |
+| `audioTracks` | `IRemoteAudioTrack[]` | The list of subscribed audio tracks from remote users. |
+| `isLoading` | `boolean` |
  • `true`: The hook is performing operations related to subscribing to the tracks.
  • `false`: The hook has completed operations related to subscribing to the tracks, but this does not indicate a successful result.
  • |
+| `error` | `AgoraRTCReactError \| null` | Returns `null` if the tracks are successfully subscribed, otherwise throws an error. See [`AgoraRTCReactError`](#agorartcreacterror) for details. |
+
+#### Sample code
+
+```jsx
+import { useRemoteUsers, useRemoteAudioTracks } from "agora-rtc-react";
+
+function App() {
+  // Get the remote user list.
+  const remoteUsers = useRemoteUsers();
+  const audioTracks = useRemoteAudioTracks(remoteUsers);
+
+  return <>;
+}
+```
+
+### useRemoteUserTrack
+
+This hook lets you retrieve the audio or video track of a remote user.
+
+#### Parameters
+
+| Parameter | Type | Required | Description |
+| -------- | ---------------- | -------- | ---- |
+| `user` | `IAgoraRTCRemoteUser \| undefined` | Yes | The remote user. |
+| `mediaType` | `"video"` \| `"audio"` | Yes | The media type. Pass `"video"` or `"audio"`. |
+| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) \| `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. |
+
+#### Returns
+
+Returns an object containing the following properties:
+
+| Property | Type | Description |
+| ----------------------| ---------------------- | ------------------------ |
+| `track` | `IRemoteVideoTrack` \| `IRemoteAudioTrack` \| `undefined` | The audio or video track of the remote user (depending on the `mediaType` you specify). If the remote user or track does not exist, `undefined` is returned. |
+| `isLoading` | `boolean` |
  • `true`: The hook is performing operations related to retrieving the track.
  • `false`: The hook has completed operations related to retrieving the track, but this does not indicate a successful result.
  • | +| `error` | `AgoraRTCReactError` | `null` | Returns `null` if the track is successfully retrieved, otherwise throws an error. See [`AgoraRTCReactError`](#agorartcreacterror) for details.| + +#### Sample code + +```jsx +import { useRemoteUsers, useRemoteUserTrack } from "agora-rtc-react"; + +function App() { + const remoteUsers = useRemoteUsers(); + + const videoTrack = useRemoteUserTrack(remoteUsers[0], "video"); + const audioTrack = useRemoteUserTrack(remoteUsers[0], "audio"); + + return <>; +} +``` + +### useRemoteUsers + +This hook lets you retrieve the list of remote users. + +The return value of this hook is updated in the following cases: +- When a remote user joins or leaves the channel. +- When the role of a remote user changes (for example, from broadcaster to audience). +- When a remote user publishes or unpublishes the audio or video track. + +#### Parameters + +| Parameter | Type | Required | Description | +| --------- | ---------------- | -------- | ---- | +| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) | `null` | No | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. | + +#### Returns + +| Type | Description | +| ----------------------- | -------------- | +| `IAgoraRTCRemoteUser[]` | The list of remote users. | + +#### Sample code + +```jsx +import { useRemoteUsers } from "agora-rtc-react"; + +function App() { + const remoteUsers = useRemoteUsers(); + + return <>; +} +``` + +### useAutoPlayVideoTrack + +This hook lets you automatically play a local or remote video track. + +- When the component is mounted, the hook determines whether to automatically play the track according to the `play` parameter. +- When the component is unmounted, the hook stops playing the `track`. 
+ +#### Parameters + +| Parameter | Type | Required | Description | +| -------- | ---------------- | -------- | ---- | +| `track` | `IRemoteVideoTrack` | `ILocalVideoTrack` | Yes | The local or remote video track. | +| `play` | `boolean` | No |
  • `true`: Play the track.
  • `false`: Stop playing the track.
  • | +| `div` | `HTMLElement` | `null` | No | The HTML element used to render the video track. The video automatically plays within this element only if `play` is `true` and `div` is provided. Otherwise, the video does not play automatically. | + +#### Returns + +None. + +#### Sample code + +```jsx +import { useAutoPlayVideoTrack, useLocalCameraTrack } from "agora-rtc-react"; + +function App() { + const videoTrack = useLocalCameraTrack(); + useAutoPlayVideoTrack(track, play, div); + + return <>; +} +``` + +### useAutoPlayAudioTrack + +This hook lets you automatically play a local or remote audio track. + +- When the component is mounted, the hook determines whether to automatically play the track according to the `play` parameter. +- When the component is unmounted, the hook stops playing the `track`. + +#### Parameters + +| Parameter | Type | Required | Description | +| -------- | ---------------- | -------- | ---- | +| `track` | `IRemoteAudioTrack` | `ILocalAudioTrack` | Yes | The local or remote audio track. | +| `play` | `boolean` | No |
  • `true`: Play the track.
  • `false`: Stop playing the track.
  • | + +#### Returns + +None. + +#### Sample code + +```jsx +import { useAutoPlayAudioTrack, useLocalMicrophoneTrack } from "agora-rtc-react"; + +function App() { + const audioTrack = useLocalMicrophoneTrack(); + useAutoPlayAudioTrack(track, play); + + return <>; +} +``` + +### useClientEvent + +This hook lets you listen to specific events of the `IAgoraRTCClient` object. + +- When the component is mounted, the hook registers the corresponding event listener. +- When the component is unmounted, the hook destroys the corresponding event listener. +#### Parameters + +| Parameter | Type | Required | Description | +| -------- | ---------------- | -------- | ---- | +| `client` | [`IAgoraRTCClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html) | Yes | Created using the Web SDK's [`IAgoraRTC.createClient`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartc.html#createclient) method. | +| `event` | `string` | Yes | The event name. Supported values can be found in [`IAgoraRTCClient.on`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html?platform=All%20Platforms#on). | +| `listener` | `Function` | Yes | The callback function to run when the event is triggered. Supported values can be found in [`IAgoraRTCClient.on`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iagorartcclient.html?platform=All%20Platforms#on). | + +#### Returns + +None. + +#### Sample code + +```jsx +import { useRTCClient, useClientEvent } from "agora-rtc-react"; + +function App() { + const client = useRTCClient(); + useClientEvent(client, "connection-state-change", () => {}); + + return <>; +} +``` + +### useTrackEvent + +This hook lets you listen to specific events of the local or remote track. + +- When the component is mounted, the hook registers the corresponding event listener. +- When the component is unmounted, the hook destroys the corresponding event listener. 
+ +#### Parameters + +| Parameter | Type | Required | Description | +| -------- | ---------------- | -------- | ---- | +| `track` | `ITrack` | Yes | The local or remote track object. | +| `event` | `string` | Yes | The event name. | +| `listener` | `Function` | Yes | The callback function to run when the event is triggered. | + +Different `track` objects support different `event` and `listener` combinations. The supported combinations are as follows: + +| `track` | `event` and `listener` | +| -------- | ---------------- | +| `ILocalTrack` | `ILocalVideoTrack` | `null` | See [`ILocalTrack.on`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/ilocaltrack.html#on). | +| `IBufferSourceAudioTrack` | `null` |See [`IBufferSourceAudioTrack.on`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/ibuffersourceaudiotrack.html#on). | +| `ILocalVideoTrack` | `null` | See [`ILocalVideoTrack.on`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/ilocalvideotrack.html#on). | +| `IRemoteTrack` | `null` | See [`IRemoteTrack.on`](https://api-ref.agora.io/en/voice-sdk/web/4.x/interfaces/iremotetrack.html#event_first_frame_decoded). | + +#### Returns + +None. 
+ +#### Sample code + +```jsx +import { useRTCClient, useLocalCameraTrack, useTrackEvent } from "agora-rtc-react"; + +function App() { + const videoTrack = useLocalCameraTrack(); + useTrackEvent(client, "video-element-visible-status", () => {}); + + return <>; +} +``` \ No newline at end of file diff --git a/shared/video-sdk/reference/api-reference/react-js/index.mdx b/shared/video-sdk/reference/api-reference/react-js/index.mdx new file mode 100644 index 000000000..cfa78993e --- /dev/null +++ b/shared/video-sdk/reference/api-reference/react-js/index.mdx @@ -0,0 +1,19 @@ +import Components from './components-en.react.mdx'; +import DataTypes from './data-types-en.react.mdx'; +import Hooks from './hooks-en.react.mdx' + + for ReactJS contains the following objects: + +- Component: parts of the user interface that have their own logic and appearance. You use components to split +your UI into independently reusable pieces of code. You have function and class components. +- Hook: connect external functions such as state and life cycle methods to components. + +If you are not familiar with React, see [React official documentation](https://react.dev/learn). + + for ReactJS is based on for Web. For example, the `useLocalMicrophoneTrack` +Hook, calls createMicrophoneAudioTrack. 
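+To make this mapping concrete, the following sketch contrasts the two layers. This is a minimal illustration, not part of the reference app: the component name is invented, and the hook's return shape follows the `useLocalMicrophoneTrack` entry in this reference.
+
+```jsx
+import AgoraRTC from "agora-rtc-sdk-ng";
+import { useLocalMicrophoneTrack } from "agora-rtc-react";
+
+// Web SDK: you create the track and manage its lifecycle yourself.
+async function createTrackManually() {
+  const track = await AgoraRTC.createMicrophoneAudioTrack();
+  // ...use the track, then release it explicitly when done:
+  track.close();
+}
+
+// ReactJS SDK: the hook calls createMicrophoneAudioTrack for you and
+// ties the track's lifecycle to the component.
+function MicrophoneStatus() {
+  const { isLoading, localMicrophoneTrack } = useLocalMicrophoneTrack();
+  return isLoading ? <div>Starting microphone...</div> : <div>Microphone ready</div>;
+}
+```
+
+Because the hook ties the track to the component lifecycle, you do not call `close()` yourself when you use the ReactJS SDK.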
+ + + + + diff --git a/video-calling/reference/api-reference.mdx b/video-calling/reference/api-reference.mdx new file mode 100644 index 000000000..34e246f60 --- /dev/null +++ b/video-calling/reference/api-reference.mdx @@ -0,0 +1,19 @@ +--- +title: 'Video SDK API reference' +sidebar_position: 3 +type: docs +description: > + Links to the API reference for your platform +--- + +import ReactJS from '@docs/shared/video-sdk/reference/api-reference/index.mdx'; + +export const toc = [{}]; + + + See [API reference](/en/api-reference) + + + + + \ No newline at end of file From 891872adb75eaf0397eda98a930a845d8ab48a65 Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Fri, 28 Jul 2023 19:18:12 +0200 Subject: [PATCH 003/184] Latest changes. --- shared/common/project-setup/react-js.mdx | 10 ++++++---- .../get-started-sdk/project-test/react-js.mdx | 15 +++++++++++---- .../manual-install/index copy.mdx | 19 ------------------- .../manual-install/index.mdx | 2 ++ .../manual-install/react-js.mdx | 14 ++++++++++++++ 5 files changed, 33 insertions(+), 27 deletions(-) delete mode 100644 shared/video-sdk/reference/app-size-optimization/manual-install/index copy.mdx create mode 100644 shared/video-sdk/reference/app-size-optimization/manual-install/react-js.mdx diff --git a/shared/common/project-setup/react-js.mdx b/shared/common/project-setup/react-js.mdx index 1e84a7966..240230e42 100644 --- a/shared/common/project-setup/react-js.mdx +++ b/shared/common/project-setup/react-js.mdx @@ -8,12 +8,14 @@ git clone https://github.com/AgoraIO/video-sdk-samples-reactjs ``` -1. Install the dependencies. Open Terminal in the root directory of the cloned repository and run the following command: +1. Install the dependencies: - ```bash - npm i agora-rtc-react + In Terminal, navigate to `video-sdk-samples-reactjs`, and execute the following command. + + ``` bash + npm install ``` - By default is installed automatically. 
However, you can also [Install manually](../reference/downloads#through-the-agora-website). + is installed automatically. However, you can also [Install manually](../reference/downloads#through-the-agora-website). diff --git a/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx b/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx index f147c1dcb..b3fb23ee1 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx @@ -1,14 +1,21 @@ -3. In `src/App.js`, update the values of `appID`, `channelName`, and `token` with the values for your temporary token. +3. In `src/config.json`, update the values of `appID`, `channelName`, and `token` with the values for your temporary token. + +1. In Terminal run the following command to start a proxy server: + + ```bash + node ./utils/proxy.js + ``` + +1. In another Terminal instance run the following command: -1. Open Terminal in the root directory of the cloned repository and run the following command: ```terminal - npm start + yarn dev ``` The project opens in your default browser. -1. Select an item from the dropdown to test the sample code. +1. In the dropdown, select a sample you want to run and test the code. 
\ No newline at end of file diff --git a/shared/video-sdk/reference/app-size-optimization/manual-install/index copy.mdx b/shared/video-sdk/reference/app-size-optimization/manual-install/index copy.mdx deleted file mode 100644 index be501f963..000000000 --- a/shared/video-sdk/reference/app-size-optimization/manual-install/index copy.mdx +++ /dev/null @@ -1,19 +0,0 @@ -import Android from './android.mdx'; -import Ios from './ios.mdx'; -import Web from './web.mdx'; -import ReactNative from './react-native.mdx'; -import Electron from './electron.mdx'; -import Flutter from './flutter.mdx'; -import Unity from './unity.mdx'; -import MacOS from './macos.mdx'; -import Windows from './windows.mdx'; - - - - - - - - - - \ No newline at end of file diff --git a/shared/video-sdk/reference/app-size-optimization/manual-install/index.mdx b/shared/video-sdk/reference/app-size-optimization/manual-install/index.mdx index be501f963..381c66782 100644 --- a/shared/video-sdk/reference/app-size-optimization/manual-install/index.mdx +++ b/shared/video-sdk/reference/app-size-optimization/manual-install/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import Unity from './unity.mdx'; @@ -13,6 +14,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/reference/app-size-optimization/manual-install/react-js.mdx b/shared/video-sdk/reference/app-size-optimization/manual-install/react-js.mdx new file mode 100644 index 000000000..cd12979b8 --- /dev/null +++ b/shared/video-sdk/reference/app-size-optimization/manual-install/react-js.mdx @@ -0,0 +1,14 @@ + + + +You use NPM to install . To do this: + +1. 
Open a terminal window in your project folder and execute the following command: + + ``` sh + npm i agora-rtc-react + ``` + +See [SDKs](../../sdks) for the latest ReactJS downloads. + + From 2c1eec17cd3fcf6b6b3ec66875549cb100b92aa4 Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Mon, 31 Jul 2023 20:36:50 +0200 Subject: [PATCH 004/184] Update on review. --- .../video-sdk/video-call-logic-reactjs.puml | 35 +++++ .../video-sdk/video-call-logic-reactjs.svg | 1 + shared/common/project-setup/react-js.mdx | 3 +- shared/common/project-test/android.mdx | 49 ++++++ shared/common/project-test/electron.mdx | 22 +++ shared/common/project-test/flutter.mdx | 47 ++++++ shared/common/project-test/index.mdx | 21 +++ shared/common/project-test/ios.mdx | 7 + shared/common/project-test/macos.mdx | 8 + shared/common/project-test/react-js.mdx | 32 ++++ shared/common/project-test/react-native.mdx | 61 ++++++++ shared/common/project-test/swift.mdx | 56 +++++++ shared/common/project-test/unity.mdx | 21 +++ shared/common/project-test/web.mdx | 37 +++++ shared/common/project-test/windows.mdx | 59 ++++++++ shared/video-sdk/_authentication-workflow.mdx | 3 + shared/video-sdk/_get-started-sdk.mdx | 7 +- .../project-implementation/index.mdx | 2 + .../project-implementation/react-js.mdx | 62 ++++++++ .../project-test/index.mdx | 2 + .../project-test/react-js.mdx | 28 ++++ .../reference/index.mdx | 2 + .../reference/react-js.mdx | 7 + .../project-test/index.mdx | 2 + .../cloud-proxy/project-test/index.mdx | 2 + .../project-test/index.mdx | 2 + .../project-test/index.mdx | 2 + .../project-test/index.mdx | 2 + .../develop/geofencing/project-test/index.mdx | 2 + .../project-test/index.mdx | 2 + .../develop/play-media/project-test/index.mdx | 2 + .../product-workflow/project-test/index.mdx | 2 + .../project-test/index.mdx | 2 + .../project-implementation/react-js.mdx | 143 ++++++++---------- .../get-started-sdk/project-test/react-js.mdx | 7 +- .../get-started-sdk/reference/index.mdx | 2 + 
.../get-started-sdk/reference/react-js.mdx | 3 +- 37 files changed, 663 insertions(+), 84 deletions(-) create mode 100644 assets/images/video-sdk/video-call-logic-reactjs.puml create mode 100644 assets/images/video-sdk/video-call-logic-reactjs.svg create mode 100644 shared/common/project-test/android.mdx create mode 100644 shared/common/project-test/electron.mdx create mode 100644 shared/common/project-test/flutter.mdx create mode 100644 shared/common/project-test/index.mdx create mode 100644 shared/common/project-test/ios.mdx create mode 100644 shared/common/project-test/macos.mdx create mode 100644 shared/common/project-test/react-js.mdx create mode 100644 shared/common/project-test/react-native.mdx create mode 100644 shared/common/project-test/swift.mdx create mode 100644 shared/common/project-test/unity.mdx create mode 100644 shared/common/project-test/web.mdx create mode 100644 shared/common/project-test/windows.mdx create mode 100644 shared/video-sdk/authentication-workflow/project-implementation/react-js.mdx create mode 100644 shared/video-sdk/authentication-workflow/project-test/react-js.mdx create mode 100644 shared/video-sdk/authentication-workflow/reference/react-js.mdx diff --git a/assets/images/video-sdk/video-call-logic-reactjs.puml b/assets/images/video-sdk/video-call-logic-reactjs.puml new file mode 100644 index 000000000..8757e6e33 --- /dev/null +++ b/assets/images/video-sdk/video-call-logic-reactjs.puml @@ -0,0 +1,35 @@ +@startuml video-call-logic-web +!include agora_skin.iuml + +actor "User" as USR + +box "Your app" + +participant "Video SDK" as APP + +end box + +box "Agora" + +participant "SD-RTN™" as API + +end box + +USR -> APP: Open App +APP -> APP: Setup app to handle local hardware and streaming. 
+group User +USR -> APP: Start call +APP -> APP: Create the agoraEngine\nconst agoraEngine = useRTCClient(AgoraRTC.createClient +APP -> APP: Retrieve authentication token to join channel +APP -> API: Join a channel:\n useJoin +API -> APP : Join accepted +APP -> APP: Create local media tracks :\nconst { isLoading: isLoadingCam, localCameraTrack } = useLocalCameraTrack();\nconst { isLoading: isLoadingMic, localMicrophoneTrack } = useLocalMicroph +APP -> API: Push local media tracks to the channel:\n usePublish([localMicrophoneTrack, localCameraTrack]); +API -> APP: Retrieve streaming from the other user: \n +API <-> APP: Receive and send data streams +end +USR -> APP: Leave call +APP -> API: leave the channel:\n \n useJoin + + +@enduml diff --git a/assets/images/video-sdk/video-call-logic-reactjs.svg b/assets/images/video-sdk/video-call-logic-reactjs.svg new file mode 100644 index 000000000..89d111cde --- /dev/null +++ b/assets/images/video-sdk/video-call-logic-reactjs.svg @@ -0,0 +1 @@ +Your appAgoraUserUserVideo SDKVideo SDKSD-RTN™SD-RTN™Open AppSetup app to handle local hardware and streaming.UserStart callCreate the agoraEngineconst agoraEngine = useRTCClient(AgoraRTC.createClientRetrieve authentication token to join channelJoin a channel:useJoinJoin acceptedCreate local media tracks :const { isLoading: isLoadingCam, localCameraTrack } = useLocalCameraTrack();const { isLoading: isLoadingMic, localMicrophoneTrack } = useLocalMicrophPush local media tracks to the channel:usePublish([localMicrophoneTrack, localCameraTrack]);Retrieve streaming from the other user:<RemoteUser user={remoteUser} playVideo={true} playAudio={true} />Receive and send data streamsLeave callleave the channel: useJoin \ No newline at end of file diff --git a/shared/common/project-setup/react-js.mdx b/shared/common/project-setup/react-js.mdx index 240230e42..a9e4c3f7e 100644 --- a/shared/common/project-setup/react-js.mdx +++ b/shared/common/project-setup/react-js.mdx @@ -1,8 +1,7 @@ -1. 
Clone the [ sample code repository](https://github.com/AgoraIO/video-sdk-samples-reactjs) to - `` in your development environment: +1. Clone the [ reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs) to your development environment: ```bash git clone https://github.com/AgoraIO/video-sdk-samples-reactjs diff --git a/shared/common/project-test/android.mdx b/shared/common/project-test/android.mdx new file mode 100644 index 000000000..d856c9aa3 --- /dev/null +++ b/shared/common/project-test/android.mdx @@ -0,0 +1,49 @@ + + + 1. In Android Studio, create a new **Phone and Tablet**, **Java** [Android project](https://developer.android.com/studio/projects/create-project) with an **Empty Activity**. + + After creating the project, Android Studio automatically starts gradle sync. Ensure that the sync succeeds before you continue. + + 2. Integrate the into your Android project: + + These steps are for package install, if you prefer to manually install, follow the [installation instructions](../reference/downloads#manual-installation). + + 1. In `/Gradle Scripts/build.gradle (Module: .app)`, add the following line under `dependencies`: + + ``` groovy + dependencies { + ... + implementation 'io.agora.rtc::' + ... + } + ``` + + 2. Replace `` and `` with appropriate values for the latest release. For example, `io.agora.rtc:full-sdk:4.0.1`. + + You can obtain the latest `` and `` information using [Maven Central Repository Search](https://search.maven.org/search?q=io.agora.rtc). + + 3. Add permissions for network and device access. + + In `/app/Manifests/AndroidManifest.xml`, add the following permissions after ``: + + ``` java + + + + + + + + + + + + + ``` + + 4. 
To prevent obfuscating the code in , add the following line to `/Gradle Scripts/proguard-rules.pro`: + + ``` java + -keep class io.agora.**{*;} + ``` + diff --git a/shared/common/project-test/electron.mdx b/shared/common/project-test/electron.mdx new file mode 100644 index 000000000..23cb0aff0 --- /dev/null +++ b/shared/common/project-test/electron.mdx @@ -0,0 +1,22 @@ + + + 1. Take the following steps to setup a new Electron project: + + 1. Open a terminal window and navigate to the directory where you want to create the project. + + 2. Execute the following command in the terminal: + + ``` bash + git clone https://github.com/electron/electron-quick-start + ``` + This command clones the Electron quick-start project that you use to implement . + + 2. Install the + + Open a terminal window in your project folder and execute the following command to download and install the . + + ``` bash + npm i agora-electron-sdk + ``` + Make sure the path to your project folder does not contain any spaces. This might cause error during the installation. + diff --git a/shared/common/project-test/flutter.mdx b/shared/common/project-test/flutter.mdx new file mode 100644 index 000000000..d91a3b496 --- /dev/null +++ b/shared/common/project-test/flutter.mdx @@ -0,0 +1,47 @@ + + +1. **Set up a Flutter environment for a project** + + In the terminal, run the following command: + + ```bash + flutter doctor + ``` + Flutter checks your development device and helps you [set up](https://docs.flutter.dev/get-started/editor) your local development environment. Make sure that your system passes all the checks. + +1. **Create a new Flutter ** + + In the IDE of your choice, create a [Flutter Application project](https://docs.flutter.dev/development/tools/android-studio#creating-a-new-project). + + You can also create a new project from the terminal using the command: + + ```bash + flutter create <--insert project name--> + ``` + +1. 
**Add to your project** + + Add the following lines to `pubspec.yaml` under dependencies. + + ```yaml + dependencies: + ... + # For x.y.z, fill in a specific SDK version number. For example, 6.0.0 + agora_rtc_engine: ^x.y.z + permission_handler: ^9.2.0 + ... + ``` + + You can get the latest version number from [pub.dev](https://pub.dev/packages/agora_rtc_engine). + +1. **Use the Flutter framework to download dependencies to your ** + + Open a terminal window and execute the following command in the project folder: + + ```bash + flutter pub get + ``` + + + + \ No newline at end of file diff --git a/shared/common/project-test/index.mdx b/shared/common/project-test/index.mdx new file mode 100644 index 000000000..2d8fac272 --- /dev/null +++ b/shared/common/project-test/index.mdx @@ -0,0 +1,21 @@ +import Android from './android.mdx'; +import Ios from './ios.mdx'; +import MacOS from './macos.mdx'; +import Web from './web.mdx'; +import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; +import Electron from './electron.mdx'; +import Flutter from './flutter.mdx'; +import Unity from './unity.mdx'; +import Windows from './windows.mdx'; + + + + + + + + + + + diff --git a/shared/common/project-test/ios.mdx b/shared/common/project-test/ios.mdx new file mode 100644 index 000000000..f21f0f8b6 --- /dev/null +++ b/shared/common/project-test/ios.mdx @@ -0,0 +1,7 @@ +import Source from './swift.mdx'; + + + + + + diff --git a/shared/common/project-test/macos.mdx b/shared/common/project-test/macos.mdx new file mode 100644 index 000000000..74e608cdd --- /dev/null +++ b/shared/common/project-test/macos.mdx @@ -0,0 +1,8 @@ +import Source from './swift.mdx'; + + + + + + + \ No newline at end of file diff --git a/shared/common/project-test/react-js.mdx b/shared/common/project-test/react-js.mdx new file mode 100644 index 000000000..6cf666c96 --- /dev/null +++ b/shared/common/project-test/react-js.mdx @@ -0,0 +1,32 @@ + + + +3. 
In the `video-sdk-samples-reactjs` reference app, open `src/config.json` and set `appID` to the [AppID](https://docs-beta.agora.io/en/video-calling/reference/manage-agora-account?platform=android#get-the-app-id) of your project. + +1. Set the authentication token: + - **Temporary token**: + 1. Set `rtcToken` with the values for your temporary token you use in the web app. + - **Authentication server**: + 1. Set `rtcToken` to an empty string. + 1. Set `serverUrl` to the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. + 1. Start a proxy server so this web app can make HTTP calls to fetch a token. In a Terminal instance in the reference app root, run the following command: + + ```bash + node ./utils/proxy.js + ``` +1. Start this reference app. + + In Terminal, run the following command: + + ``` bash + yarn dev + ``` + +1. Open the project in your browser. The default URL is http://localhost:5173/. + +1. In the dropdown, select this document and test . + + + + + \ No newline at end of file diff --git a/shared/common/project-test/react-native.mdx b/shared/common/project-test/react-native.mdx new file mode 100644 index 000000000..e45dcb562 --- /dev/null +++ b/shared/common/project-test/react-native.mdx @@ -0,0 +1,61 @@ + + +1. **Setup a React Native environment for project** + + In the terminal, run the following command: + + ```bash + npx react-native init ProjectName --template react-native-template-typescript + ``` + + npx creates a new boilerplate project in the `ProjectName` folder. + + For Android projects, enable the project to use Android SDK. In the `android` folder of your project, set the `sdk.dir` in the `local.properties` file. For example: + + + ```bash + sdk.dir=C:\\PATH\\TO\\ANDROID\\SDK + ``` + +1. **Test the setup** + + Launch your Android or iOS simulator and run your project by executing the following command: + + 1. 
Run `npx react-native start` in the root of your project to start Metro. + 1. Open another terminal in the root of your project and run `npx react-native run-android` to start the Android app, or run `npx react-native run-ios` to start the iOS app. + + You see your new app running in your Android or iOS simulator. You can also run your project on a physical Android or iOS device. For detailed instructions, see [Running on device](https://reactnative.dev/docs/running-on-device). + +1. **Integrate and configure ** + + To integrate on React Native 0.60.0 or later: + 1. Navigate to the root folder of your project in the terminal and integrate with either: + - npm + + ```bash + npm i --save react-native-agora + ``` + + - yarn + + ```bash + // Install yarn. + npm install -g yarn + // Download the Agora React Native SDK using yarn. + yarn add react-native-agora + ``` + + Do not link native modules manually, React Native 0.60.0 and later support [Autolinking](https://github.com/react-native-community/cli/blob/main/docs/autolinking.md). + + 1. If your target platform is iOS, use CocoaPods to install : + + ```bash + npx pod-install + ``` + + 1. uses Swift in native modules, your project must support compiling Swift. To create `File.swift`: + + 1. In Xcode, open `ios/ProjectName.xcworkspace`. + 1. Click **File > New > File, select iOS > Swift File**, then click **Next > Create** . + + \ No newline at end of file diff --git a/shared/common/project-test/swift.mdx b/shared/common/project-test/swift.mdx new file mode 100644 index 000000000..ad7e41278 --- /dev/null +++ b/shared/common/project-test/swift.mdx @@ -0,0 +1,56 @@ + +1. [Create a new project](https://help.apple.com/xcode/mac/current/#/dev07db0e578) for this using the **App** template. Select the **Storyboard** Interface and **Swift** Language. + + If you have not already added team information, click **Add account…**, input your Apple ID, then click **Next**. + +1. 
[Enable automatic signing](https://help.apple.com/xcode/mac/current/#/dev23aab79b4) for your project. + + [Set the target devices](https://help.apple.com/xcode/mac/current/#/deve69552ee5) to deploy your iOS to an iPhone or iPad. + +1. Add project permissions for microphone and camera usage: + + 1. Open **Info** in the project navigation panel, then add the following properties to the [Information Property List](https://help.apple.com/xcode/mac/current/#/dev3f399a2a6): + + | Key | Type | Value | + |------------------------------|--------|------------------------| + | NSMicrophoneUsageDescription | String | Access the microphone. | + | NSCameraUsageDescription | String | Access the camera. | + + + 2. Add sandbox and runtime capabilities to your project: + + Open the target for your project in the project navigation properties, then add the following capabilities in **Signing & Capabilities**. + - **App Sandbox**: + - Incoming Connections (Server) + - Outgoing Connections (Client) + - Camera + - Audio Input + - **Hardened Runtime**: + - Camera + - Audio Input + + +1. Integrate into your project: + + These steps are for package install, if you prefer to use **CocoaPods** or manually install, follow the [installation instructions](../reference/downloads#manual-installation). + + 1. In Xcode, click **File** > **Add Packages**, then paste the following link in the search: + + ``` + https://github.com/AgoraIO/AgoraRtcEngine_macOS.git + ``` + + + ``` + https://github.com/AgoraIO/AgoraRtcEngine_iOS.git + ``` + + You see the available packages. Add the **** package and any other functionality that you want to integrate into your app. For example, _AgoraAINoiseSuppressionExtension_. Choose a version later than 4.0.0. + + 1. Click **Add Package**. In the new window, click **Add Package**. + + You see **AgoraRtcKit** in **Package Dependencies** for your project. 
+ + + + diff --git a/shared/common/project-test/unity.mdx b/shared/common/project-test/unity.mdx new file mode 100644 index 000000000..ee489df11 --- /dev/null +++ b/shared/common/project-test/unity.mdx @@ -0,0 +1,21 @@ + +1. **Create a Unity project**: + + 1. In Unity Hub, select **Projects**, then click **New Project**. + + 1. In **All templates**, select **3D**. Set the **Project name** and **Location**, then click **Create Project**. + + 1. In **Projects**, double-click the project you created. Your project opens in Unity. + + +1. **Integrate **: + 1. Go to [SDKs](/sdks), download the latest version of the Agora , and unzip the downloaded SDK to a local folder. + + 1. In Unity, click **Assets > Import Package > Custom Package**. + + 1. Navigate to the package and click *Open*. + + 1. In **Import Unity Package**, click **Import**. + + + \ No newline at end of file diff --git a/shared/common/project-test/web.mdx b/shared/common/project-test/web.mdx new file mode 100644 index 000000000..515d82599 --- /dev/null +++ b/shared/common/project-test/web.mdx @@ -0,0 +1,37 @@ + + + 1. Create a new project using [Vite](https://vitejs.dev/) + + 1. Open a terminal window and navigate to the directory where you want to create the project. + + 2. Execute the following command in the terminal: + + ``` bash + npm create vite@latest agora_project --template vanilla + ``` + + When prompted to select a framework, choose `Vanilla` and when prompted to select a variant, choose `JavaScript`. + A directory named *agora\_project* is created which contains the project files. We will update the following files in the directory: + + - *index.html*: The visual interface with the user. + + - *main.js*: The programmable interface used to implement the logic. + + 2. Install the dependencies: + + In the terminal, navigate to the *agora\_project* directory, and execute the following command. + + ``` bash + npm install + ``` + + 3. 
Install the : + + Execute the following command in the terminal to download and install the . + + ``` bash + npm i agora-rtc-sdk-ng + ``` + + These steps are for package install. If you prefer to install manually, follow the [installation instructions](../reference/downloads#manual-installation). + diff --git a/shared/common/project-test/windows.mdx b/shared/common/project-test/windows.mdx new file mode 100644 index 000000000..45ca55387 --- /dev/null +++ b/shared/common/project-test/windows.mdx @@ -0,0 +1,59 @@ + + + +1. **Create an MFC dialog-based application** + + 1. From the main menu, choose **File** > **New** > **Project**. + + 1. Enter **MFC** into the search box and then choose **MFC App** from the result list. + + 1. In **Project Name**, input `AgoraImplementation` and press **Create** to open the **MFC Application Wizard**. + + 1. In **MFC Application**, under **Application type**, select **Dialog based**, then click **Finish**. Your project opens in Visual Studio. + + +1. **Integrate ** + + To integrate the into your project: + + 1. Unzip the latest version of [](/sdks) in a local directory. + + 1. Copy the `sdk` directory of the downloaded SDK package to the root of your project, ``. + + 1. Create a new directory `/Debug`. + + 1. Copy the files from `` to `/Debug`. + + +1. **Configure your project properties** + + Right-click the project name in **Solution Explorer**, then click **Properties** to configure the following project properties, and click **OK**. + + 1. In **AgoraImplementation Property pages**, select **Win32** from the **Platform** dropdown list. + + If your targeted platform is `x64`, then select `x64` from the **Platform** dropdown list. + + 1. Go to the **C/C++** > **General** > **Additional Include Directories** menu, click **Edit**, and input the following string in the pop-up window: + + ``` + ;$(SolutionDir)sdk\high_level_api\include;$(ProjectDir) + ``` + + 1.
Go to the **Linker** > **General** > **Additional Library Directories** menu, click **Edit**, and input the following string in the pop-up window: + + ``` + ;$(SolutionDir)sdk\x86; + ``` + The sample code uses the `x86` platform. If you are using `x64`, then add the following string to the **Additional Library Directories** field: + + ``` + ;$(SolutionDir)sdk\x86_64 + ``` + + 1. Go to the **C/C++** > **Preprocessor** > **Preprocessor Definitions** menu, click **Edit**, and input the following string in the pop-up window: + + ``` + ;_CRT_SECURE_NO_WARNINGS + ``` + + \ No newline at end of file diff --git a/shared/video-sdk/_authentication-workflow.mdx b/shared/video-sdk/_authentication-workflow.mdx index 203ecf440..ac13c73f6 100644 --- a/shared/video-sdk/_authentication-workflow.mdx +++ b/shared/video-sdk/_authentication-workflow.mdx @@ -156,6 +156,9 @@ To ensure that you have implemented token authentication work + + + Your magically connects to the same channel you used in the web demo. You don’t need to hardcode a token in your app; each channel is secured with a specific token, and each token is refreshed automatically. That’s pretty cool!
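The token endpoint used throughout this workflow follows a fixed path pattern, so the request URL and a sensible renewal point can be computed with two small helpers. This is a sketch: the helper names and the 30-second headroom are illustrative assumptions; only the `/rtc/<channel>/publisher/uid/<uid>/?expiry=<seconds>` path shape comes from the token-server examples in this guide.

```typescript
// Sketch: compute the token request URL and a refresh delay for the
// token server used in this guide. Helper names and the 30-second
// headroom are illustrative, not part of the SDK.
function buildTokenUrl(serverUrl: string, channel: string, uid: number, expirySeconds: number): string {
  return `${serverUrl}/rtc/${channel}/publisher/uid/${uid}/?expiry=${expirySeconds}`;
}

// Renew shortly before expiry so the fresh token arrives in time.
function refreshDelayMs(expirySeconds: number, headroomSeconds: number = 30): number {
  return Math.max(0, (expirySeconds - headroomSeconds) * 1000);
}
```

For example, `refreshDelayMs(300)` suggests renewing after 270 seconds, well before a 300-second token lapses.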
diff --git a/shared/video-sdk/_get-started-sdk.mdx b/shared/video-sdk/_get-started-sdk.mdx index 07679ddfc..d1eaf5359 100644 --- a/shared/video-sdk/_get-started-sdk.mdx +++ b/shared/video-sdk/_get-started-sdk.mdx @@ -4,6 +4,7 @@ import ProjectSetup from '@docs/shared/common/project-setup/index.mdx'; import ProjectImplement from '@docs/shared/video-sdk/get-started/get-started-sdk/project-implementation/index.mdx'; import ProjectTest from '@docs/shared/video-sdk/get-started/get-started-sdk/project-test/index.mdx'; import Reference from '@docs/shared/video-sdk/get-started/get-started-sdk/reference/index.mdx'; +import {PlatformWrapper} from "../../../src/mdx-components/PlatformWrapper"; @@ -65,8 +66,12 @@ In order to follow this procedure you must have: ## Project setup + +To install the reference on your development device: + + To integrate into your , do the following: - + You are ready to add features to your . diff --git a/shared/video-sdk/authentication-workflow/project-implementation/index.mdx b/shared/video-sdk/authentication-workflow/project-implementation/index.mdx index 48b167c2d..cde9904f0 100644 --- a/shared/video-sdk/authentication-workflow/project-implementation/index.mdx +++ b/shared/video-sdk/authentication-workflow/project-implementation/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx' +import ReactJS from './react-js.mdx' import Unity from './unity.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; @@ -15,6 +16,7 @@ import LinuxC from './linux-c.mdx'; + diff --git a/shared/video-sdk/authentication-workflow/project-implementation/react-js.mdx b/shared/video-sdk/authentication-workflow/project-implementation/react-js.mdx new file mode 100644 index 000000000..51b35edff --- /dev/null +++ b/shared/video-sdk/authentication-workflow/project-implementation/react-js.mdx @@ -0,0 +1,62 @@ + + + +1. 
Import the components and hooks you need to handle the authentication workflow: + + ``` typescript + import { useClientEvent, useRTCClient } from "agora-rtc-react"; + ``` + + +1. Retrieve a token from the authentication server + + ``` typescript + async function fetchRTCToken(channelName: string) { + if (config.serverUrl !== "") { + try { + const response = await fetch( + `${config.proxyUrl}${config.serverUrl}/rtc/${channelName}/publisher/uid/${config.uid}/?expiry=${config.tokenExpiryTime}` + ); + const data = await response.json(); + console.log("RTC token fetched from server: ", data.rtcToken); + return data.rtcToken; + } catch (error) { + console.error(error); + throw error; + } + } else { + return config.rtcToken; + } + } + ``` + +1. Handle the event triggered by when the token is about to expire + + A token expires after the `tokenExpiryTime` specified in the call to the token server or after 24 hours, if the + time is not specified. The `useTokenWillExpire` method receives a callback when the current token is about to + expire so that a fresh token may be retrieved and used. 
+ + + ``` typescript + const useTokenWillExpire = () => { + const agoraEngine = useRTCClient(); + useClientEvent(agoraEngine, "token-privilege-will-expire", () => { + if (config.serverUrl !== "") { + fetchRTCToken(config.channelName) + .then((token: string) => { + console.log("RTC token fetched from server: ", token); + return agoraEngine.renewToken(token); + }) + .catch((error) => { + console.error(error); + }); + } else { + console.log("Please make sure you specified the token server URL in the configuration file"); + } + }); + }; + ``` + + + + diff --git a/shared/video-sdk/authentication-workflow/project-test/index.mdx b/shared/video-sdk/authentication-workflow/project-test/index.mdx index 9d30bf9d3..d60fa7e53 100644 --- a/shared/video-sdk/authentication-workflow/project-test/index.mdx +++ b/shared/video-sdk/authentication-workflow/project-test/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx' +import ReactJS from './react-js.mdx'; import Unity from './unity.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; @@ -15,6 +16,7 @@ import LinuxC from './linux-c.mdx'; + diff --git a/shared/video-sdk/authentication-workflow/project-test/react-js.mdx b/shared/video-sdk/authentication-workflow/project-test/react-js.mdx new file mode 100644 index 000000000..4979ce945 --- /dev/null +++ b/shared/video-sdk/authentication-workflow/project-test/react-js.mdx @@ -0,0 +1,28 @@ + + +3. In the `video-sdk-samples-reactjs` reference app, open `src/config.json` and update the following: + + - `appID` - the [AppID](https://docs-beta.agora.io/en/video-calling/reference/manage-agora-account?platform=android#get-the-app-id) of your project. + - `rtcToken` - an empty string. + - `serverUrl` - the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. + +1. 
Start a proxy server so this web app can make HTTP calls to fetch a token. In a Terminal instance in the reference app root, run the following command: + + ```bash + node ./utils/proxy.js + ``` + +1. Start this reference app. + + In Terminal, run the following command: + + ``` bash + yarn dev + ``` + +1. Open the project in your browser. The default URL is http://localhost:5173/. + +1. In the dropdown, select this document and test . + + + \ No newline at end of file diff --git a/shared/video-sdk/authentication-workflow/reference/index.mdx b/shared/video-sdk/authentication-workflow/reference/index.mdx index 92cb5b55d..fa1e7d39b 100644 --- a/shared/video-sdk/authentication-workflow/reference/index.mdx +++ b/shared/video-sdk/authentication-workflow/reference/index.mdx @@ -3,12 +3,14 @@ import Ios from './ios.mdx'; import Macos from './macos.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx' +import ReactJS from './react-js.mdx'; import Unity from './unity.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import LinuxC from './linux-c.mdx'; import Windows from './windows.mdx' + diff --git a/shared/video-sdk/authentication-workflow/reference/react-js.mdx b/shared/video-sdk/authentication-workflow/reference/react-js.mdx new file mode 100644 index 000000000..0aa518dab --- /dev/null +++ b/shared/video-sdk/authentication-workflow/reference/react-js.mdx @@ -0,0 +1,7 @@ + +### API reference + +- Source code: https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/sdk_2.0_updates/src/authentication-workflow +- API reference: [Video SDK for React JS](/en/video-calling/reference/api-reference?platform=react-js) + + \ No newline at end of file diff --git a/shared/video-sdk/develop/audio-and-voice-effects/project-test/index.mdx b/shared/video-sdk/develop/audio-and-voice-effects/project-test/index.mdx index 20aad8e12..edeaf1556 100644 --- a/shared/video-sdk/develop/audio-and-voice-effects/project-test/index.mdx +++ 
b/shared/video-sdk/develop/audio-and-voice-effects/project-test/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from '@docs/shared/common/project-test/react-js.mdx'; import Unity from './unity.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; @@ -14,6 +15,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/cloud-proxy/project-test/index.mdx b/shared/video-sdk/develop/cloud-proxy/project-test/index.mdx index 7b1ffffef..b62a31688 100644 --- a/shared/video-sdk/develop/cloud-proxy/project-test/index.mdx +++ b/shared/video-sdk/develop/cloud-proxy/project-test/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from '@docs/shared/common/project-test/react-js.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import MacOS from './macos.mdx' @@ -14,6 +15,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/custom-video-and-audio/project-test/index.mdx b/shared/video-sdk/develop/custom-video-and-audio/project-test/index.mdx index f7cd6ae71..b86f3a15d 100644 --- a/shared/video-sdk/develop/custom-video-and-audio/project-test/index.mdx +++ b/shared/video-sdk/develop/custom-video-and-audio/project-test/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import Reactnative from './react-native.mdx'; +import ReactJS from '@docs/shared/common/project-test/react-js.mdx'; import Electron from './electron.mdx'; import Unity from './unity.mdx'; import MacOS from './macos.mdx'; @@ -13,6 +14,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/encrypt-media-streams/project-test/index.mdx 
b/shared/video-sdk/develop/encrypt-media-streams/project-test/index.mdx index 32b10774b..bf77b140c 100644 --- a/shared/video-sdk/develop/encrypt-media-streams/project-test/index.mdx +++ b/shared/video-sdk/develop/encrypt-media-streams/project-test/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import Electron from './electron.mdx'; +import ReactJS from '@docs/shared/common/project-test/react-js.mdx'; import Unity from './unity.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx' @@ -16,4 +17,5 @@ import Windows from './windows.mdx'; + \ No newline at end of file diff --git a/shared/video-sdk/develop/ensure-channel-quality/project-test/index.mdx b/shared/video-sdk/develop/ensure-channel-quality/project-test/index.mdx index 452d18042..d47bce197 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/project-test/index.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/project-test/index.mdx @@ -5,6 +5,7 @@ import Unity from './unity.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from '@docs/shared/common/project-test/react-js.mdx'; import MacOS from './macos.mdx' import Windows from './windows.mdx'; @@ -17,3 +18,4 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/geofencing/project-test/index.mdx b/shared/video-sdk/develop/geofencing/project-test/index.mdx index df6b4f50d..a8fd2281a 100644 --- a/shared/video-sdk/develop/geofencing/project-test/index.mdx +++ b/shared/video-sdk/develop/geofencing/project-test/index.mdx @@ -3,6 +3,7 @@ import Ios from './ios.mdx'; import Web from './web.mdx'; import Unity from './unity.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from '@docs/shared/common/project-test/react-js.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import MacOS 
from './macos.mdx'; @@ -14,6 +15,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/index.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/index.mdx index 49bccb086..5b2747dae 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/index.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-test/index.mdx @@ -7,6 +7,7 @@ import Unity from './unity.mdx'; import MacOS from './macos.mdx'; import Windows from './windows.mdx'; import Flutter from './flutter.mdx' +import ReactJS from '@docs/shared/common/project-test/react-js.mdx'; @@ -14,6 +15,7 @@ import Flutter from './flutter.mdx' + diff --git a/shared/video-sdk/develop/play-media/project-test/index.mdx b/shared/video-sdk/develop/play-media/project-test/index.mdx index 1d2425015..057a71a5b 100644 --- a/shared/video-sdk/develop/play-media/project-test/index.mdx +++ b/shared/video-sdk/develop/play-media/project-test/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from '@docs/shared/common/project-test/react-js.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import Unity from './unity.mdx'; @@ -14,6 +15,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/product-workflow/project-test/index.mdx b/shared/video-sdk/develop/product-workflow/project-test/index.mdx index 6a9076b7d..6e12e3c31 100644 --- a/shared/video-sdk/develop/product-workflow/project-test/index.mdx +++ b/shared/video-sdk/develop/product-workflow/project-test/index.mdx @@ -4,6 +4,7 @@ import Web from './web.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from 
'@docs/shared/common/project-test/react-js.mdx'; import MacOS from './macos.mdx'; import Unity from './unity.mdx'; import Windows from './windows.mdx'; @@ -16,5 +17,6 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/stream-raw-audio-and-video/project-test/index.mdx b/shared/video-sdk/develop/stream-raw-audio-and-video/project-test/index.mdx index 0d715ff95..ce8dd83a8 100644 --- a/shared/video-sdk/develop/stream-raw-audio-and-video/project-test/index.mdx +++ b/shared/video-sdk/develop/stream-raw-audio-and-video/project-test/index.mdx @@ -6,6 +6,7 @@ import Unity from './unity.mdx'; import MacOS from './macos.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from '@docs/shared/common/project-test/react-js.mdx'; import Windows from './windows.mdx' @@ -16,4 +17,5 @@ import Windows from './windows.mdx' + diff --git a/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-js.mdx b/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-js.mdx index 3fd445b3f..50b6fd6d6 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-js.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-implementation/react-js.mdx @@ -3,101 +3,90 @@ The following figure shows the API call sequence. -![Interface](/images/video-sdk/video-call-logic-web.png) +![Interface](/images/video-sdk/video-call-logic-reactjs.svg) ![Interface](/images/video-sdk/ils-call-logic-web.svg) -Best practice is to separate the workflows from your UI implementation. The - sample -project implements the business logic in the `AgoraManager` object. This class encapsulates the -`agoraEngine`, an instance of the`AgoraRTC`, and core functionality such as logging in to , +The reference app implements the business logic in the `agoraManager` component. 
+ This class encapsulates core functionality such as logging in to , joining a channel, listening for events from other users and logging out. The following code examples show how to implement these steps in your : -1. Create the variables you need to handle , local hardware and tracks, remote tracks and user events: +1. Import the components and hooks you need to manage a video call: - ```javascript - const [agoraEngine, setAgoraEngine] = useState(null); - const [microphoneAndCameraTracks, setMicrophoneAndCameraTracks] = useState(null); - const [localAudioTrack, setLocalAudioTrack] = useState(null); - const [localVideoTrack, setLocalVideoTrack] = useState(null); - const [remoteVideoTrack, setRemoteVideoTrack] = useState(null); - const [remoteUid, setRemoteUid] = useState(null); - const [joined, setJoined] = useState(false); - const [showVideo, setShowVideo] = useState(false); + ```typescript + import { + LocalVideoTrack, + RemoteUser, + useJoin, + useLocalCameraTrack, + useLocalMicrophoneTrack, + usePublish, + useRemoteUsers, + } from "agora-rtc-react"; ``` -1. Create an instance of the , handle local hardware and remote tracks: - - ```javascript - const setupVideoSDKEngine = async () => - { - const engine = AgoraRTC.createClient({ mode: "rtc", codec: "vp8" }); - const tracks = await AgoraRTC.createMicrophoneAndCameraTracks(); - if(engine && tracks) - { - setAgoraEngine(engine); - setMicrophoneAndCameraTracks(tracks); - engine.on("user-published", async (user, mediaType) => { - await engine.subscribe(user, mediaType); - if (mediaType === "video") { - setRemoteVideoTrack(user.videoTrack); - setRemoteUid(user.uid); - } - }); - - engine.on("user-unpublished", (user, mediaType) => { - if (mediaType === "video" && user.uid === remoteUid) { - setRemoteVideoTrack(null); - setRemoteUid(null); - } - }); - } - return engine; - }; +1. 
Handle the device hardware used to communicate with : + + ```typescript + const { isLoading: isLoadingCam, localCameraTrack } = useLocalCameraTrack(); + const { isLoading: isLoadingMic, localMicrophoneTrack } = useLocalMicrophoneTrack(); + const remoteUsers = useRemoteUsers(); + usePublish([localMicrophoneTrack, localCameraTrack]); + ``` -1. Join and leave a channel: +1. Use these objects to stream to and from : + ```typescript + const deviceLoading = isLoadingMic || isLoadingCam; + if (deviceLoading) return
    <div>Loading devices...</div>
    ; - ```javascript - const joinCall = async () => { - try { - await agoraEngine.join(appId, channelName, token, 0); - setLocalAudioTrack(microphoneAndCameraTracks[0]); - setLocalVideoTrack(microphoneAndCameraTracks[1]); - await agoraEngine.publish([microphoneAndCameraTracks[0], microphoneAndCameraTracks[1]]); - setJoined(true); - setShowVideo(true); - } catch (error) { - console.error("Failed to join or publish:", error); - } - }; - - const leaveCall = async () => { - try { - await agoraEngine.unpublish([localAudioTrack, localVideoTrack]); - await agoraEngine.leave(); - setJoined(false); - setShowVideo(false); - } catch (error) { - console.error("Failed to unpublish or leave:", error); - } - }; + return ( +
    + <div id="videos">
    + <div className="vid">
    + <LocalVideoTrack track={localCameraTrack} play={true} />
    + </div>
    + {remoteUsers.map((remoteUser) => (
    + <div className="vid" key={remoteUser.uid}>
    + <RemoteUser user={remoteUser} playVideo={true} playAudio={true} />
    + </div>
    + ))}
    + </div>
    + ); ``` - 1. Setup your `AgoraManager` instance with your security information: +1. Enable your user to join a channel: - ```javascript - const GetStartedComponent = (props) => { - const agoraManager = AgoraManager({ - appId: props.appId || appId, - channelName: props.channelName || channelName, - token: props.token || token + ```typescript + useJoin({ + appid: config.appId, + channel: config.channelName, + token: config.rtcToken, + uid: config.uid, }); - const [initialized, setInitialized] = useState(false); + ``` + +1. Create an instance of the : + + ```typescript + const client = useRTCClient(AgoraRTC.createClient({ codec: "vp8", mode: "rtc" })); ``` + +1. Join and leave a channel: + + ```javascript + {!joined ? + ( + + ) : + ( + + + + + )} + ``` +
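Before handing `config` to `useJoin`, it can help to sanity-check the values first. The sketch below assumes the `appId`, `channelName`, `rtcToken`, and `uid` fields shown above; the 64-byte channel-name limit reflects Agora's documented constraint, while the helper itself is illustrative and not part of the SDK.

```typescript
// Sketch: validate the values passed to useJoin. Illustrative helper, not SDK API.
interface JoinConfig {
  appId: string;
  channelName: string;
  rtcToken: string | null;
  uid: number;
}

function validateJoinConfig(config: JoinConfig): string[] {
  const problems: string[] = [];
  if (config.appId.trim() === "") problems.push("appId is empty");
  if (config.channelName.trim() === "") {
    problems.push("channelName is empty");
  } else if (new TextEncoder().encode(config.channelName).length > 64) {
    problems.push("channelName exceeds 64 bytes"); // Agora's documented limit
  }
  if (!Number.isInteger(config.uid) || config.uid < 0) {
    problems.push("uid must be a non-negative integer");
  }
  return problems;
}
```

Running the checks once before rendering the call component surfaces configuration mistakes as readable messages instead of SDK join errors.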
    \ No newline at end of file diff --git a/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx b/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx index b3fb23ee1..1156260e4 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx @@ -1,6 +1,7 @@ -3. In `src/config.json`, update the values of `appID`, `channelName`, and `token` with the values for your temporary token. +3. In the `video-sdk-samples-reactjs` reference app, open `src/config.json` and update the values of `appID`, + `channelName`, and `rtcToken` with the values for your temporary token. 1. In Terminal run the following command to start a proxy server: @@ -14,8 +15,8 @@ yarn dev ``` - The project opens in your default browser. +1. Open the project in your browser. The default URL is http://localhost:5173/. -1. In the dropdown, select a sample you want to run and test the code. +1. In the dropdown, select the Get Started app and test . 
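When you later replace the temporary token in `config.json` with one fetched from a token server, the server's JSON body carries the token in an `rtcToken` field, as shown in the authentication workflow above. A defensive parse might look like this; the function name is an illustrative assumption.

```typescript
// Sketch: validate the token server's JSON response before using it.
// Assumes the { "rtcToken": "..." } body shape shown in the authentication workflow.
function parseTokenResponse(body: string): string {
  const data = JSON.parse(body) as { rtcToken?: unknown };
  if (typeof data.rtcToken !== "string" || data.rtcToken.length === 0) {
    throw new Error("token server response did not contain an rtcToken string");
  }
  return data.rtcToken;
}
```

Failing fast here gives a clearer error than passing an undefined token to the join call.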
\ No newline at end of file diff --git a/shared/video-sdk/get-started/get-started-sdk/reference/index.mdx b/shared/video-sdk/get-started/get-started-sdk/reference/index.mdx index feb51890c..e283bba68 100644 --- a/shared/video-sdk/get-started/get-started-sdk/reference/index.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/reference/index.mdx @@ -3,6 +3,7 @@ import Ios from './ios.mdx'; import MacOs from './macos.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx' +import ReactJS from './react-js.mdx'; import Unity from './unity.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; @@ -14,6 +15,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx b/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx index b6728a09a..dd7c2bc00 100644 --- a/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx @@ -1,5 +1,6 @@ - +- Source code: https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main/src/sdk_quickstart +- API reference: [Video SDK for React JS](/en/video-calling/reference/api-reference?platform=react-js) \ No newline at end of file From 6b83aeeb7a19592f3ff00ccdb86badeb18f91137 Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Tue, 1 Aug 2023 15:42:01 +0200 Subject: [PATCH 005/184] call quality guide. 
--- shared/video-sdk/_authentication-workflow.mdx | 8 +- .../develop/_ensure-channel-quality.mdx | 16 ++- .../project-implementation/index.mdx | 2 + .../project-implementation/react-js.mdx | 127 ++++++++++++++++++ 4 files changed, 146 insertions(+), 7 deletions(-) create mode 100644 shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx diff --git a/shared/video-sdk/_authentication-workflow.mdx b/shared/video-sdk/_authentication-workflow.mdx index ecf5313b9..06b22fdc0 100644 --- a/shared/video-sdk/_authentication-workflow.mdx +++ b/shared/video-sdk/_authentication-workflow.mdx @@ -95,9 +95,9 @@ In the project you implemented, the use This section shows you how to deploy a token server on a cloud platform. -1. Start deploying the token server to your cloud platform: - The cloud platform retrieves the project code and necessary files from Github, then takes you to the - **Deployment** page. +1. Start deploying the token server to your cloud platform. + Click your cloud platform below. The cloud platform retrieves the project code and necessary files from Github, +then takes you to the **Deployment** page. - [Render](https://render.com/deploy?repo=https://github.com/AgoraIO-Community/agora-token-service) - [Railway](https://railway.app/new/template/NKYzQA?referralCode=waRWUT) - [Heroku](https://www.heroku.com/deploy/?template=https://github.com/AgoraIO-Community/agora-token-service) @@ -133,7 +133,7 @@ This section shows you how to deploy a token server on a cloud platform. 
/rtc/:channelName/:role/:tokentype/:uid/?expiry=expireTime ``` - For example: `https://agora-token-server-l2yj.onrender.com/rtc/MyChannel/1/uid/1/?expiry=300` + For example: `https://agora-token-server-.onrender.com/rtc/MyChannel/1/uid/1/?expiry=300` Your token server returns a JSON object containing an encrypted token: diff --git a/shared/video-sdk/develop/_ensure-channel-quality.mdx b/shared/video-sdk/develop/_ensure-channel-quality.mdx index fa3f91fb9..a21071126 100644 --- a/shared/video-sdk/develop/_ensure-channel-quality.mdx +++ b/shared/video-sdk/develop/_ensure-channel-quality.mdx @@ -4,6 +4,8 @@ import ProjectSetup from '@docs/shared/video-sdk/develop/ensure-channel-quality/ import ProjectImplement from '@docs/shared/video-sdk/develop/ensure-channel-quality/project-implementation/index.mdx'; import ProjectTest from '@docs/shared/video-sdk/develop/ensure-channel-quality/project-test/index.mdx'; import Reference from '@docs/shared/video-sdk/develop/ensure-channel-quality/reference/index.mdx'; +import {PlatformWrapper} from "../../../../src/mdx-components/PlatformWrapper"; + Customer satisfaction for your integrated depends on the quality of video and audio it provides. Quality of audiovisual communication through your is affected by the following factors: @@ -104,16 +106,24 @@ The following figure shows the workflow you need to implement to ensure channel ## Prerequisites + In order to follow this procedure you must have: - -* Implemented the [](../get-started/get-started-sdk) project for . - +* Implemented the [](../get-started/get-started-sdk#project-setup) project for . + + +In order to view and test the code used in this page: +* View or install the [ reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs) for . + + ## Project setup To create the environment necessary to implement call quality best practices into your , open the [](../get-started/get-started-sdk) project you created previously. 
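Agora reports network quality as a 0-6 score (0 = unknown, 1 = excellent, down to 6 = disconnected). A small lookup table keeps the UI mapping in one place; the label wording below is illustrative, so verify the exact score semantics against the `useNetworkQuality` reference for your SDK version.

```typescript
// Sketch: map Agora's 0-6 network quality score to a display label.
// Labels are illustrative; the 0-6 scale follows Agora's documentation.
const QUALITY_LABELS: readonly string[] = [
  "unknown", "excellent", "good", "poor", "bad", "very bad", "down",
];

function qualityLabel(score: number): string {
  return QUALITY_LABELS[score] ?? "unknown";
}
```

Out-of-range scores fall back to `"unknown"`, so a future SDK value cannot crash the quality indicator.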
+ + ## Implement best practice to optimize call quality diff --git a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/index.mdx b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/index.mdx index a1b4b72e9..4aa72dbc0 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/index.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/index.mdx @@ -5,6 +5,7 @@ import Unity from './unity.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; import MacOS from './macos.mdx' import Windows from './windows.mdx'; @@ -17,3 +18,4 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx new file mode 100644 index 000000000..e42e3b3a5 --- /dev/null +++ b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx @@ -0,0 +1,127 @@ + + + +To implement the call quality features, take the following steps: + +1. Import the components and hooks you need to manage a video call: + + ```typescript + import { + useRTCClient, + useRemoteUsers, + useNetworkQuality, + useLocalMicrophoneTrack, + useLocalCameraTrack, + useConnectionState, + useAutoPlayAudioTrack, + useJoin, + useVolumeLevel, + LocalVideoTrack, + } from "agora-rtc-react"; + import { ICameraVideoTrack, ILocalAudioTrack } from "agora-rtc-sdk-ng"; + ``` + +1. **Enable the user to test the network** + + ```ts + const networkQuality = useNetworkQuality(); + + const updateNetworkStatus = () => { + if (networkQuality.uplink === 0) { + return ; + } else if (networkQuality.uplink === 1) { + return ; + } else if (networkQuality.uplink === 2) { + return ; + } else { + return ; + } + } + ``` + +1. 
**Implement best practice for app initiation** + + When a user starts your , the is created and initialized in the `setupVideoSDKEngine` function. After initialization, your does the following: + + * _Enable dual stream mode_: Required for multi-user scenarios. + * _Set an audio profile and audio scenario_: Setting an audio profile is optional and only required if you have special requirements such as streaming music. + * _Set the video configuration_: Setting a video configuration is also optional. It is useful when you want to + change one or more of `mirrorMode`, `frameRate`, `bitrate`, `dimensions`, `orientationMode`, or `degradationPrefer` from the default setting to custom values. + For more information, see [video profile table](#video-profile-table). + + + ```ts + const callQualityEssentials = async () => { + try { + await agoraEngine.enableDualStream(); + } catch (error) { + console.log(error); + } + await localCameraTrack?.setEncoderConfiguration({ + width: 640, + height: { ideal: 480, min: 400, max: 500 }, + frameRate: 15, + bitrateMin: 600, + bitrateMax: 1000, + }); + }; + ``` + +1. **Listen to events to receive state change notifications and quality statistics** + + ```ts + const showStatistics = () => { + const localAudioStats = agoraEngine.getLocalAudioStats(); + console.log("Local audio stats:", localAudioStats); + + const localVideoStats = agoraEngine.getLocalVideoStats(); + console.log("Local video stats:", localVideoStats); + + const rtcStats = agoraEngine.getRTCStats(); + console.log("Channel statistics:", rtcStats); + }; + ``` + + + + 3. Add the `OnClientRoleChanged` callback to receive a state change notification when a client changes role. To specify the audience latency level, call the `updateChannelMediaOptions()` method with the specified latency. For choose `AudienceLatencyLevelUltraLowLatency`. Ultra-low latency is a feature of and its use is subject to special [pricing](../reference/pricing#unit-pricing). + + ```ts + + ``` + + + 3.
Add the `OnClientRoleChanged` callback to receive a state change notification when a client changes role. To specify the audience latency level, call the `updateChannelMediaOptions()` method with the specified latency. For choose `AudienceLatencyLevelLowLatency`. Low latency is a feature of and its use is subject to special [pricing](../reference/pricing#unit-pricing). + + ```ts + + ``` + + + Each event reports the statistics of the audio and video streams from each remote user and host. + +1. **Switch stream quality when the user taps the switch button** + + ```ts + const setRemoteVideoQuality = () => { + if (!remoteUser) { + console.log("No remote user in the channel"); + return; + } + + if (!isHighRemoteVideoQuality) { + agoraEngine + .setRemoteVideoStreamType(remoteUser.uid, 0) + .then(() => setVideoQualityState(true)) + .catch((error) => console.error(error)); + } else { + agoraEngine + .setRemoteVideoStreamType(remoteUser.uid, 1) + .then(() => setVideoQualityState(false)) + .catch((error) => console.error(error)); + } + }; + ``` + + + \ No newline at end of file From 67a19c85dcdfde61c9cb01ef92f1fbbc52c60d46 Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Tue, 1 Aug 2023 17:46:23 +0200 Subject: [PATCH 006/184] Update prerequisites --- shared/common/prerequities-get-started.mdx | 77 ++++++++++++++++++ shared/common/prerequities.mdx | 81 ++----------------- .../_use-an-extension.mdx | 2 - .../ai-noise-suppression.mdx | 6 +- shared/video-sdk/_authentication-workflow.mdx | 15 ++-- shared/video-sdk/_get-started-sdk.mdx | 2 +- .../develop/_audio-and-voice-effects.mdx | 8 -- shared/video-sdk/develop/_cloud-proxy.mdx | 4 +- .../develop/_custom-video-and-audio.mdx | 12 ++- .../develop/_ensure-channel-quality.mdx | 9 --- shared/video-sdk/develop/_geofencing.mdx | 8 +- .../develop/_integrate-token-generation.mdx | 15 ---- .../develop/_media-stream-encryption.mdx | 14 +--- shared/video-sdk/develop/_play-media.mdx | 9 --- .../video-sdk/develop/_product-workflow.mdx |
10 --- shared/video-sdk/develop/_spatial-audio.mdx | 12 +-- .../develop/_stream-raw-audio-and-video.mdx | 8 +- .../project-implementation/react-js.mdx | 35 ++++---- 18 files changed, 124 insertions(+), 203 deletions(-) create mode 100644 shared/common/prerequities-get-started.mdx diff --git a/shared/common/prerequities-get-started.mdx b/shared/common/prerequities-get-started.mdx new file mode 100644 index 000000000..bf681ea76 --- /dev/null +++ b/shared/common/prerequities-get-started.mdx @@ -0,0 +1,77 @@ + +- [Android Studio](https://developer.android.com/studio) 4.1 or higher. +- Android SDK API Level 24 or higher. +- A mobile device that runs Android 4.1 or higher. + + +- Xcode 12.0 or higher. +- A device running iOS 9.0 or higher. + + +- Xcode 12.0 or higher. +- A device running macOS 10.11 or higher. +- An Apple developer account. + + +- A device running Windows 7 or higher. +- Microsoft Visual Studio 2017 or higher with [Desktop development with C++](https://devblogs.microsoft.com/cppblog/windows-desktop-development-with-c-in-visual-studio/) support. + + +- A [supported browser](../reference/supported-platforms#browsers). +- Physical media input devices, such as a camera and a microphone. +- A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). + + +- [Flutter](https://docs.flutter.dev/get-started/install) 2.0.0 or higher +- Dart 2.15.1 or higher +- [Android Studio](https://developer.android.com/studio), IntelliJ, VS Code, or any other IDE that supports Flutter, see [Set up an editor](https://docs.flutter.dev/get-started/editor). + +- If your target platform is iOS: + - Xcode on macOS (latest version recommended) + - A physical iOS device + - iOS version 12.0 or later + +- If your target platform is Android: + - Android Studio on macOS or Windows (latest version recommended) + - An Android emulator or a physical Android device.
+ +- If you are developing a desktop application for Windows, macOS or Linux, make sure your development device meets the [Flutter desktop development requirements](https://docs.flutter.dev/development/platform-integration/desktop). + + + +- A [supported browser](../reference/supported-platforms#browsers). +- Physical media input devices, such as a camera and a microphone. +- A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). + + +- React Native 0.60 or later. For more information, see [Setting up the development environment](https://reactnative.dev/docs/environment-setup). +- Node 10 or later +- For iOS + - A machine running macOS + - Xcode 10 or later + - CocoaPods + - A physical or virtual mobile device running iOS 9.0 or later. If you use React Native 0.63 or later, ensure your iOS version is 10.0 or later. +- For Android + - A machine running macOS, Windows, or Linux + - [Java Development Kit (JDK) 11](https://openjdk.org/projects/jdk/11/) or later + - Android Studio + - A physical or virtual mobile device running Android 5.0 or later + + +- [Unity Hub](https://unity.com/download) +- [Unity Editor 2017.X LTS or higher](https://unity.com/releases/editor/archive) +- Microsoft Visual Studio 2017 or higher + + +- Physical media input devices, such as a camera and a microphone. +- A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). + + +* A device running Linux Ubuntu 14.04 or above; 18.04+ is recommended. +* At least 2 GB of memory. +* `cmake` 3.6.0 or above. + +- An [account](../reference/manage-agora-account#create-an-agora-account) and [project](../reference/manage-agora-account#create-an-agora-project). +- A computer with Internet access. + + Ensure that no firewall is blocking your network communication. 
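The prerequisites above state version minimums (Node 10 or later, React Native 0.60 or later, and so on). As an illustrative sketch only — the helper name `meetsMinimum` is our own, not part of any SDK — a setup script could compare an installed version string against a documented minimum like this:

```typescript
// Returns true when an installed version string (e.g. the output of
// `node --version`) meets a documented minimum version.
function meetsMinimum(installed: string, minimum: string): boolean {
  // Strip a leading "v" and split into numeric major.minor.patch parts.
  const parse = (v: string): number[] =>
    v.replace(/^v/, "").split(".").map((part) => parseInt(part, 10) || 0);
  const a = parse(installed);
  const b = parse(minimum);
  for (let i = 0; i < 3; i++) {
    if ((a[i] ?? 0) !== (b[i] ?? 0)) return (a[i] ?? 0) > (b[i] ?? 0);
  }
  return true; // equal versions satisfy the minimum
}
```

For example, `meetsMinimum(process.version, "10.0.0")` checks the Node requirement, and `meetsMinimum("0.59.0", "0.60.0")` correctly fails the React Native check.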
diff --git a/shared/common/prerequities.mdx b/shared/common/prerequities.mdx index bf681ea76..fce715742 100644 --- a/shared/common/prerequities.mdx +++ b/shared/common/prerequities.mdx @@ -1,77 +1,8 @@ - -- [Android Studio](https://developer.android.com/studio) 4.1 or higher. -- Android SDK API Level 24 or higher. -- A mobile device that runs Android 4.1 or higher. + +To follow this procedure you must have: +* Implemented the [](../get-started/get-started-sdk#prerequisites) project for . - -- Xcode 12.0 or higher. -- A device running iOS 9.0 or higher. + +To test the code used in this page you need to: +* Set up the [ reference app](../get-started/get-started-sdk#prerequisites) for . - -- Xcode 12.0 or higher. -- A device running macOs 10.11 or higher. -- An Apple developer account - - -- A device running Windows 7 or higher. -- Microsoft Visual Studio 2017 or higher with [Desktop development with C++](https://devblogs.microsoft.com/cppblog/windows-desktop-development-with-c-in-visual-studio/) support. - - -- A [supported browser](../reference/supported-platforms#browsers). -- Physical media input devices, such as a camera and a microphone. -- A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). - - -- [Flutter](https://docs.flutter.dev/get-started/install) 2.0.0 or higher -- Dart 2.15.1 or higher -- [Android Studio](https://developer.android.com/studio), IntelliJ, VS Code, or any other IDE that supports Flutter, see [Set up an editor](https://docs.flutter.dev/get-started/editor). - -- If your target platform is iOS: - - Xcode on macOS (latest version recommended) - - A physical iOS device - - iOS version 12.0 or later - -- If your target platform is Android: - - Android Studio on macOS or Windows (latest version recommended) - - An Android emulator or a physical Android device.
- -- If you are developing a desktop application for Windows, macOS or Linux, make sure your development device meets the [Flutter desktop development requirements](https://docs.flutter.dev/development/platform-integration/desktop). - - - -- A [supported browser](../reference/supported-platforms#browsers). -- Physical media input devices, such as a camera and a microphone. -- A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). - - -- React Native 0.60 or later. For more information, see [Setting up the development environment](https://reactnative.dev/docs/environment-setup). -- Node 10 or later -- For iOS - - A machine running macOS - - Xcode 10 or later - - CocoaPods - - A physical or virtual mobile device running iOS 9.0 or later. If you use React Native 0.63 or later, ensure your iOS version is 10.0 or later. -- For Android - - A machine running macOS, Windows, or Linux - - [Java Development Kit (JDK) 11](https://openjdk.org/projects/jdk/11/) or later - - Android Studio - - A physical or virtual mobile device running Android 5.0 or later - - -- [Unity Hub](https://unity.com/download) -- [Unity Editor 2017.X LTS or higher](https://unity.com/releases/editor/archive) -- Microsoft Visual Studio 2017 or higher - - -- Physical media input devices, such as a camera and a microphone. -- A JavaScript package manager such as [npm](https://www.npmjs.com/package/npm). - - -* A device running Linux Ubuntu 14.04 or above; 18.04+ is recommended. -* At least 2 GB of memory. -* `cmake` 3.6.0 or above. - -- An [account](../reference/manage-agora-account#create-an-agora-account) and [project](../reference/manage-agora-account#create-an-agora-project). -- A computer with Internet access. - - Ensure that no firewall is blocking your network communication. 
diff --git a/shared/extensions-marketplace/_use-an-extension.mdx b/shared/extensions-marketplace/_use-an-extension.mdx index 2c8836f33..0930c723c 100644 --- a/shared/extensions-marketplace/_use-an-extension.mdx +++ b/shared/extensions-marketplace/_use-an-extension.mdx @@ -27,11 +27,9 @@ A typical transmission pipeline consists of a chain of procedures, including cap ## Prerequisites -In order to follow this procedure you must have: -* Implemented the [](../get-started/get-started-sdk) project for . diff --git a/shared/extensions-marketplace/ai-noise-suppression.mdx b/shared/extensions-marketplace/ai-noise-suppression.mdx index fda5b873f..46ff038cc 100644 --- a/shared/extensions-marketplace/ai-noise-suppression.mdx +++ b/shared/extensions-marketplace/ai-noise-suppression.mdx @@ -1,5 +1,5 @@ -import { ProductWrapper } from '../../../src/mdx-components/ProductWrapper'; +import Prerequisites from '@docs/shared/common/prerequities.mdx'; import ProjectImplement from '@docs/shared/extensions-marketplace/ai-noise-suppression/project-implementation/index.mdx'; import Reference from '@docs/shared/extensions-marketplace/ai-noise-suppression/reference/index.mdx'; @@ -33,9 +33,7 @@ In the pre-processing stage, uses deep learning noise reductio ## Prerequisites -To use the extension, you must meet the following requirements: - -- Implemented the [](../get-started/get-started-sdk) project for . + - [Noise type](#type) matches your business scenario. For example, if you want the microphone to collect background music, the extension is not applicable because it categorizes such background music as noise. 
## Implementation diff --git a/shared/video-sdk/_authentication-workflow.mdx b/shared/video-sdk/_authentication-workflow.mdx index 06b22fdc0..4532c2fee 100644 --- a/shared/video-sdk/_authentication-workflow.mdx +++ b/shared/video-sdk/_authentication-workflow.mdx @@ -42,22 +42,18 @@ The following figure shows the call flow you need to implement to create step-up ## Prerequisites -To follow this procedure, you must have: - -- Implemented the [](../get-started/get-started-sdk) + - Created a cloud platform account that is verified through your GitHub account. The following platforms are currently supported: - - [Render](https://render.com/) - - [Railway](https://railway.app/) - - [Heroku](https://www.heroku.com/) + - [Render](https://render.com/) + - [Railway](https://railway.app/) + - [Heroku](https://www.heroku.com/) To integrate a token generator directly into your security infrastructure, see [Token generators](/video-calling/develop/integrate-token-generation). - - - + ## Project setup To integrate token authentication into your , do the following: @@ -81,6 +77,7 @@ To integrate token authentication into your , do the following 1. Open the project you created in the [](../get-started/get-started-sdk). 2. Log in to your cloud platform. 
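The authentication workflow above also implies renewing tokens on the client before they expire. The timing logic can be sketched as follows; the helper name `shouldRenewToken` and the 30-second margin are our own illustrative choices, not an Agora API:

```typescript
// Decide whether a channel token should be renewed now.
// expiresAtMs: expiry time reported by your token server (Unix ms).
// marginMs: renew this long before expiry to avoid a dropped session.
function shouldRenewToken(
  expiresAtMs: number,
  nowMs: number = Date.now(),
  marginMs: number = 30_000
): boolean {
  return nowMs >= expiresAtMs - marginMs;
}
```

Your app would call this periodically, and request a fresh token from the token server whenever it returns `true`.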
+ ## Implement the authentication workflow diff --git a/shared/video-sdk/_get-started-sdk.mdx b/shared/video-sdk/_get-started-sdk.mdx index d1eaf5359..a2321b216 100644 --- a/shared/video-sdk/_get-started-sdk.mdx +++ b/shared/video-sdk/_get-started-sdk.mdx @@ -1,5 +1,5 @@ import * as data from '@site/data/variables'; -import Prerequisites from '@docs/shared/common/prerequities.mdx'; +import Prerequisites from '@docs/shared/common/prerequities-get-started.mdx'; import ProjectSetup from '@docs/shared/common/project-setup/index.mdx'; import ProjectImplement from '@docs/shared/video-sdk/get-started/get-started-sdk/project-implementation/index.mdx'; import ProjectTest from '@docs/shared/video-sdk/get-started/get-started-sdk/project-test/index.mdx'; diff --git a/shared/video-sdk/develop/_audio-and-voice-effects.mdx b/shared/video-sdk/develop/_audio-and-voice-effects.mdx index 11c02c963..2d4e3eaed 100644 --- a/shared/video-sdk/develop/_audio-and-voice-effects.mdx +++ b/shared/video-sdk/develop/_audio-and-voice-effects.mdx @@ -57,16 +57,8 @@ The following figure shows the workflow you need to implement to add audio and v ## Prerequisites -In order to follow this procedure you must have: - -- Implemented the [](../get-started/get-started-sdk) project for . -## Project setup - -To create the environment necessary to implement audio and voice effects into your , open the [](../get-started/get-started-sdk) project you created previously. 
- - ## Implement audio and voice effects and set the audio route diff --git a/shared/video-sdk/develop/_cloud-proxy.mdx b/shared/video-sdk/develop/_cloud-proxy.mdx index 63b16edd2..01ced8658 100644 --- a/shared/video-sdk/develop/_cloud-proxy.mdx +++ b/shared/video-sdk/develop/_cloud-proxy.mdx @@ -1,5 +1,5 @@ import * as data from '@site/data/variables'; - +import Prerequisites from '@docs/shared/common/prerequities.mdx'; import ProjectImplement from '@docs/shared/video-sdk/develop/cloud-proxy/project-implementation/index.mdx'; import ProjectTest from '@docs/shared/video-sdk/develop/cloud-proxy/project-test/index.mdx'; import Reference from '@docs/shared/video-sdk/develop/cloud-proxy/reference/index.mdx'; @@ -54,7 +54,7 @@ The steps you need to implement in your are: ## Prerequisites -In order to follow this procedure you must have: + - Implemented the [](../get-started/get-started-sdk) project for . diff --git a/shared/video-sdk/develop/_custom-video-and-audio.mdx b/shared/video-sdk/develop/_custom-video-and-audio.mdx index d1a78f0de..c7d2f25d2 100644 --- a/shared/video-sdk/develop/_custom-video-and-audio.mdx +++ b/shared/video-sdk/develop/_custom-video-and-audio.mdx @@ -41,17 +41,15 @@ The following figure shows the workflow you need to implement to stream a custom ![Process custom audio](/images/voice-sdk/custom-source-audio.svg) -## Prerequisites - -To follow this procedure you must have implemented the [](../get-started/get-started-sdk) project for . -## Project setup +## Prerequisites -To create the environment necessary to implement custom audio and video into your , open the [](../get-started/get-started-sdk) project you created previously. + -Set up [OpenCV](https://docs.opencv.org/4.x/d3/d52/tutorial_windows_install.html) for playing custom video source in your project. +- Set up [OpenCV](https://docs.opencv.org/4.x/d3/d52/tutorial_windows_install.html) for playing custom video source + in your project. 
- + ## Integrate custom audio or video diff --git a/shared/video-sdk/develop/_ensure-channel-quality.mdx b/shared/video-sdk/develop/_ensure-channel-quality.mdx index a21071126..f7106f2bb 100644 --- a/shared/video-sdk/develop/_ensure-channel-quality.mdx +++ b/shared/video-sdk/develop/_ensure-channel-quality.mdx @@ -106,15 +106,6 @@ The following figure shows the workflow you need to implement to ensure channel ## Prerequisites - -In order to follow this procedure you must have: -* Implemented the [](../get-started/get-started-sdk#project-setup) project for . - - -In order to view and test the code used in this page: -* View or install the [ reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs) for . - diff --git a/shared/video-sdk/develop/_geofencing.mdx b/shared/video-sdk/develop/_geofencing.mdx index 0e295d282..72ac7158b 100644 --- a/shared/video-sdk/develop/_geofencing.mdx +++ b/shared/video-sdk/develop/_geofencing.mdx @@ -1,5 +1,5 @@ import * as data from '@site/data/variables'; - +import Prerequisites from '@docs/shared/common/prerequities.mdx'; import ProjectImplement from '@docs/shared/video-sdk/develop/geofencing/project-implementation/index.mdx'; import ProjectTest from '@docs/shared/video-sdk/develop/geofencing/project-test/index.mdx'; import Reference from '@docs/shared/video-sdk/develop/geofencing/reference/index.mdx'; @@ -15,11 +15,7 @@ This section shows you how to enable geofencing in your . The ## Prerequisites -In order to follow this procedure you must have implemented the [](../get-started/get-started-sdk) project. - -## Project setup - -In order to create the environment necessary to implement geofencing in your , open the [](../get-started/get-started-sdk) project you created previously. 
+ ## Implement geofencing in your diff --git a/shared/video-sdk/develop/_integrate-token-generation.mdx b/shared/video-sdk/develop/_integrate-token-generation.mdx index 858091f44..629d60e8b 100644 --- a/shared/video-sdk/develop/_integrate-token-generation.mdx +++ b/shared/video-sdk/develop/_integrate-token-generation.mdx @@ -16,21 +16,6 @@ When a user attempts to connect to an channel, your -To follow this procedure you must have created: - -* An [Agora developer account](https://sso2.agora.io/en/v4/signup/with-email) - -* A project in with an [App ID](../reference/manage-agora-account#get-the-app-id), and [App Certificate](../reference/manage-agora-account#get-the-app-certificate). - - - - - -To follow this procedure you must have an authentication server project in which you wish to integrate token generation. - ## Project setup diff --git a/shared/video-sdk/develop/_media-stream-encryption.mdx b/shared/video-sdk/develop/_media-stream-encryption.mdx index 400cf58b1..30d055546 100644 --- a/shared/video-sdk/develop/_media-stream-encryption.mdx +++ b/shared/video-sdk/develop/_media-stream-encryption.mdx @@ -22,21 +22,11 @@ All users in a channel must use the same encryption configuration to initiate `a ## Prerequisites -To follow this procedure you must have: - -* Implemented the [](../get-started/get-started-sdk) project for . - -* [OpenSSL](https://www.openssl.org/) latest version - - -## Project setup - -To encrypt the media streams in your , you need to: - -- Open the [](../get-started/get-started-sdk) project you created previously. + - Set up [OpenSSL](https://www.openssl.org/) in your development device. 
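Media-stream encryption setups like the one above generally need a key and salt generated server-side and distributed to clients over a secure channel. The following Node.js sketch assumes the common pattern of a 32-byte key and a Base64-encoded 32-byte salt — check your SDK reference for the exact format your version expects:

```typescript
import { randomBytes } from "crypto";

// Generate a random key and salt for AES-based media-stream encryption.
// Run this on your token server, never in the client.
function generateEncryptionConfig(): { key: string; salt: string } {
  return {
    key: randomBytes(32).toString("hex"),     // 64 hex characters
    salt: randomBytes(32).toString("base64"), // 44 Base64 characters
  };
}
```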
+ ## Implement media stream encryption To implement media stream encryption, do the following: diff --git a/shared/video-sdk/develop/_play-media.mdx b/shared/video-sdk/develop/_play-media.mdx index a0eb7a4f0..c2a3fcb00 100644 --- a/shared/video-sdk/develop/_play-media.mdx +++ b/shared/video-sdk/develop/_play-media.mdx @@ -18,17 +18,8 @@ The following figure shows the workflow you need to integrate media player funct ## Prerequisites -In order to follow this procedure you must have: - -* Implemented the [](../get-started/get-started-sdk) project for . -## Project setup - -To create the environment necessary to implement playing media files into your , open the [](../get-started/get-started-sdk) project you created previously. - - - ## Add a media player to your diff --git a/shared/video-sdk/develop/_product-workflow.mdx b/shared/video-sdk/develop/_product-workflow.mdx index b58bc53c9..965dcf7f0 100644 --- a/shared/video-sdk/develop/_product-workflow.mdx +++ b/shared/video-sdk/develop/_product-workflow.mdx @@ -65,18 +65,8 @@ This page shows you how to add the following features to your ## Prerequisites -In order to follow this procedure you must have: - -* Implemented the [](../get-started/get-started-sdk) project for . - -## Project setup - -To create the environment necessary to implement screen sharing and volume control features into your , open the [](../get-started/get-started-sdk) project you created previously. - - - ## Implement a client for This section shows how to use to implement screen sharing and volume control into your , step-by-step. 
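The volume-control part of the workflow described above usually passes a slider value to an SDK volume setter that accepts a bounded range. A small sketch of the clamping step — the 0–100 range is illustrative, so confirm the bounds in your SDK reference:

```typescript
// Clamp a requested volume into the range the SDK setter accepts.
function clampVolume(value: number, min = 0, max = 100): number {
  if (Number.isNaN(value)) return min; // treat bad input as muted
  return Math.min(max, Math.max(min, value));
}
```

For example, `clampVolume(150)` yields `100`, so out-of-range slider input can never reach the SDK call.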
diff --git a/shared/video-sdk/develop/_spatial-audio.mdx b/shared/video-sdk/develop/_spatial-audio.mdx index fa2eb09f9..dd0d89c58 100644 --- a/shared/video-sdk/develop/_spatial-audio.mdx +++ b/shared/video-sdk/develop/_spatial-audio.mdx @@ -39,19 +39,9 @@ The following figure shows the workflow you need to integrate spatial audio into ## Prerequisites -In order to follow this procedure you must have: - -* Implemented the [](../get-started/get-started-sdk) project for . +* To ensure a true spatial experience, best practice is to use an audio device that supports true binaural playback. -* To ensure a true spatial experience, recommends using an audio device that supports true binaural playback. - - -## Project setup - -To create the environment necessary to implement spatial audio into your , open the [](../get-started/get-started-sdk) project for you created previously. - - ## Add spatial audio to your diff --git a/shared/video-sdk/develop/_stream-raw-audio-and-video.mdx b/shared/video-sdk/develop/_stream-raw-audio-and-video.mdx index ea1c89c22..6f25f3759 100644 --- a/shared/video-sdk/develop/_stream-raw-audio-and-video.mdx +++ b/shared/video-sdk/develop/_stream-raw-audio-and-video.mdx @@ -1,5 +1,5 @@ import * as data from '@site/data/variables'; - +import Prerequites from '@docs/shared/common/prerequities.mdx'; import ProjectImplement from '@docs/shared/video-sdk/develop/stream-raw-audio-and-video/project-implementation/index.mdx'; import ProjectTest from '@docs/shared/video-sdk/develop/stream-raw-audio-and-video/project-test/index.mdx'; import Reference from '@docs/shared/video-sdk/develop/stream-raw-audio-and-video/reference/index.mdx'; @@ -29,13 +29,11 @@ The figure below shows the workflow you need to implement to process raw video a ![Process raw audio and video](/images/video-sdk/process-raw-video-audio.png) -## Prerequisites -To follow this procedure you must have implemented the [](../get-started/get-started-sdk) project for . 
-## Project setup +## Prerequisites -To create the environment necessary to integrate processing of raw audio and video data in your , open the [](../get-started/get-started-sdk) project you created previously. + ## Implement raw data processing diff --git a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx index e42e3b3a5..78f56a62e 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx @@ -49,25 +49,24 @@ To implement the call quality features, take the following steps: change one or more of `mirrorMode`, `frameRate`, `bitrate`, `dimensions`, `orientationMode` or `degradationPrefer` from the default setting to custom values. For more information, see [video profile table](#video-profile-table). + ```ts + const callQualityEssentials = async () => { + try { + await agoraEngine.enableDualStream(); + } catch (error) { + console.log(error); + } + await localCameraTrack?.setEncoderConfiguration({ + width: 640, + height: { ideal: 480, min: 400, max: 500 }, + frameRate: 15, + bitrateMin: 600, + bitrateMax: 1000, + }); + }; + ``` - ```ts - const callQualityEssentials = async () => { - try { - await agoraEngine.enableDualStream(); - } catch (error) { - console.log(error); - } - await localCameraTrack?.setEncoderConfiguration({ - width: 640, - height: { ideal: 480, min: 400, max: 500 }, - frameRate: 15, - bitrateMin: 600, - bitrateMax: 1000, - }); - }; - ``` - -1. **Listen to events to receive state change notifications and quality statistics** +1. **Show quality statistics** ```ts const showStatistics = () => { From 499aa7a2833366775ba137a4e9c6991626469d13 Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Tue, 1 Aug 2023 19:27:05 +0200 Subject: [PATCH 007/184] Filter out the docs without an example. 
--- .../_use-an-extension.mdx | 9 ++- .../ai-noise-suppression.mdx | 7 +++ .../virtual-background.mdx | 6 ++ shared/video-sdk/develop/_cloud-proxy.mdx | 2 - .../develop/_custom-video-and-audio.mdx | 9 ++- shared/video-sdk/develop/_migration-guide.mdx | 7 +++ shared/video-sdk/develop/_play-media.mdx | 9 ++- .../video-sdk/develop/_product-workflow.mdx | 7 ++- shared/video-sdk/develop/_spatial-audio.mdx | 6 ++ .../develop/_stream-raw-audio-and-video.mdx | 7 ++- .../project-implementation/index.mdx | 2 + .../project-implementation/react-js.mdx | 59 +++++++++++++++++ .../reference/index.mdx | 2 + .../reference/react-js.mdx | 6 ++ .../project-implementation/index.mdx | 2 + .../project-implementation/react-js.mdx | 29 +++++++++ .../develop/cloud-proxy/reference/index.mdx | 2 + .../cloud-proxy/reference/react-js.mdx | 6 ++ .../project-implementation/react-js.mdx | 3 + .../project-implementation/index.mdx | 2 + .../project-implementation/react-js.mdx | 63 +++++++++++++++++++ .../encrypt-media-streams/reference/index.mdx | 2 + .../reference/react-js.mdx | 6 ++ .../project-implementation/index.mdx | 2 + .../project-implementation/react-js.mdx | 21 +++++++ .../project-implementation/index.mdx | 2 + .../project-test/index.mdx | 2 + video-calling/develop/cloud-proxy.mdx | 4 +- 28 files changed, 275 insertions(+), 9 deletions(-) create mode 100644 shared/video-sdk/develop/audio-and-voice-effects/project-implementation/react-js.mdx create mode 100644 shared/video-sdk/develop/audio-and-voice-effects/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/cloud-proxy/project-implementation/react-js.mdx create mode 100644 shared/video-sdk/develop/cloud-proxy/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/custom-video-and-audio/project-implementation/react-js.mdx create mode 100644 shared/video-sdk/develop/encrypt-media-streams/project-implementation/react-js.mdx create mode 100644 shared/video-sdk/develop/encrypt-media-streams/reference/react-js.mdx 
create mode 100644 shared/video-sdk/develop/geofencing/project-implementation/react-js.mdx diff --git a/shared/extensions-marketplace/_use-an-extension.mdx b/shared/extensions-marketplace/_use-an-extension.mdx index 0930c723c..f915492f5 100644 --- a/shared/extensions-marketplace/_use-an-extension.mdx +++ b/shared/extensions-marketplace/_use-an-extension.mdx @@ -25,6 +25,11 @@ An extension accesses voice and video data when it is captured from the user's l A typical transmission pipeline consists of a chain of procedures, including capture, pre-processing, encoding, transmitting, decoding, post-processing, and play. Audio or video extensions are inserted into either the pre-processing or post-processing procedure, in order to modify the voice or video data in the transmission pipeline. + +**Coming soon for this beta program.** + + + ## Prerequisites @@ -80,4 +85,6 @@ To ensure that you have integrated the extension in your : This section contains information that completes the information in this page, or points you to documentation that explains other aspects to this product. 
- \ No newline at end of file + + + \ No newline at end of file diff --git a/shared/extensions-marketplace/ai-noise-suppression.mdx b/shared/extensions-marketplace/ai-noise-suppression.mdx index 46ff038cc..76e7ed9ac 100644 --- a/shared/extensions-marketplace/ai-noise-suppression.mdx +++ b/shared/extensions-marketplace/ai-noise-suppression.mdx @@ -1,3 +1,4 @@ +import {PlatformWrapper} from "../../../src/mdx-components/PlatformWrapper"; import Prerequisites from '@docs/shared/common/prerequities.mdx'; import ProjectImplement from '@docs/shared/extensions-marketplace/ai-noise-suppression/project-implementation/index.mdx'; @@ -30,6 +31,10 @@ In the pre-processing stage, uses deep learning noise reductio ![](/images/extensions-marketplace/ai-noise-suppression.png) + +**Coming soon for this beta program.** + + ## Prerequisites @@ -158,3 +163,5 @@ Currently, has the following limitations: - Although supports Safari v14.1 and greater, there are performance issues. Best practice is to not support Safari. - does not support browsers on mobile devices. + + \ No newline at end of file diff --git a/shared/extensions-marketplace/virtual-background.mdx b/shared/extensions-marketplace/virtual-background.mdx index 3b20282bb..42f78fb40 100644 --- a/shared/extensions-marketplace/virtual-background.mdx +++ b/shared/extensions-marketplace/virtual-background.mdx @@ -29,6 +29,10 @@ Virtual Background enables users to blur their background or replace it with a s Want to test ? Try the online demo. 
+ + **Coming soon for this beta program.** + + ## Understand the tech @@ -60,4 +64,6 @@ This section contains information that completes the information in this page, o + + \ No newline at end of file diff --git a/shared/video-sdk/develop/_cloud-proxy.mdx b/shared/video-sdk/develop/_cloud-proxy.mdx index 01ced8658..32959918d 100644 --- a/shared/video-sdk/develop/_cloud-proxy.mdx +++ b/shared/video-sdk/develop/_cloud-proxy.mdx @@ -56,8 +56,6 @@ The steps you need to implement in your are: -- Implemented the [](../get-started/get-started-sdk) project for . - - Configured your firewall to allow communication through the [allowed IP address](../reference/cloud-proxy-allowed-ips). diff --git a/shared/video-sdk/develop/_custom-video-and-audio.mdx b/shared/video-sdk/develop/_custom-video-and-audio.mdx index c7d2f25d2..7d98fe991 100644 --- a/shared/video-sdk/develop/_custom-video-and-audio.mdx +++ b/shared/video-sdk/develop/_custom-video-and-audio.mdx @@ -42,6 +42,11 @@ The following figure shows the workflow you need to implement to stream a custom ![Process custom audio](/images/voice-sdk/custom-source-audio.svg) + +**Coming soon for this beta program.** + + + ## Prerequisites @@ -73,4 +78,6 @@ To ensure that you have implemented streaming from a custom source into your \ No newline at end of file + + + \ No newline at end of file diff --git a/shared/video-sdk/develop/_migration-guide.mdx b/shared/video-sdk/develop/_migration-guide.mdx index a8274f9f8..d84b5d67c 100644 --- a/shared/video-sdk/develop/_migration-guide.mdx +++ b/shared/video-sdk/develop/_migration-guide.mdx @@ -10,6 +10,13 @@ import Windows from '@docs/shared/video-sdk/develop/migration-guide/windows.mdx' import Unity from '@docs/shared/video-sdk/develop/migration-guide/unity.mdx'; import ReactNative from '@docs/shared/video-sdk/develop/migration-guide/react.mdx'; + +ReactJS is a new platform for Video SDK v4.x. 
+ + + + + diff --git a/shared/video-sdk/develop/_play-media.mdx b/shared/video-sdk/develop/_play-media.mdx index c2a3fcb00..977ae8884 100644 --- a/shared/video-sdk/develop/_play-media.mdx +++ b/shared/video-sdk/develop/_play-media.mdx @@ -16,6 +16,11 @@ The following figure shows the workflow you need to integrate media player funct ![play media](/images/common/play-media.png) + +**Coming soon for this beta program.** + + + ## Prerequisites @@ -43,4 +48,6 @@ To ensure that you have implemented media player features into your + + **Coming soon for this beta program.** + + This page shows you how to add the following features to your : @@ -87,4 +91,5 @@ To ensure that you have implemented workflow features in you This section contains information that completes the information in this page, or points you to documentation that explains other aspects to this product. - \ No newline at end of file + + \ No newline at end of file diff --git a/shared/video-sdk/develop/_spatial-audio.mdx b/shared/video-sdk/develop/_spatial-audio.mdx index dd0d89c58..ba4c0113d 100644 --- a/shared/video-sdk/develop/_spatial-audio.mdx +++ b/shared/video-sdk/develop/_spatial-audio.mdx @@ -37,6 +37,11 @@ The following figure shows the workflow you need to integrate spatial audio into ![Spatial Audio](/images/video-sdk/spatial-audio-web.svg) + +**Coming soon for this beta program.** + + + ## Prerequisites @@ -64,3 +69,4 @@ To ensure that you have implemented features into your + \ No newline at end of file diff --git a/shared/video-sdk/develop/_stream-raw-audio-and-video.mdx b/shared/video-sdk/develop/_stream-raw-audio-and-video.mdx index 6f25f3759..05c44f3a0 100644 --- a/shared/video-sdk/develop/_stream-raw-audio-and-video.mdx +++ b/shared/video-sdk/develop/_stream-raw-audio-and-video.mdx @@ -29,7 +29,10 @@ The figure below shows the workflow you need to implement to process raw video a ![Process raw audio and video](/images/video-sdk/process-raw-video-audio.png) - + +**Coming soon for this 
beta program.** + + ## Prerequisites @@ -53,4 +56,6 @@ To ensure that you have implemented raw data processing into your +To add audio mixing and audio route changing logic to your , take the following steps: + +1. **Add the required variables** + + + ```ts + const [isAudioMixing, setAudioMixing] = useState(false); + const [audioFileTrack, setAudioFileTrack] = useState(null); + const [showDropdown, setShowDropdown] = useState(false); + const [playbackDevices, setPlaybackDevices] = useState([]); + const playoutDeviceRef = useRef(null); + const connectionState = useConnectionState(); + ``` + + +1. **Add an audio track to play to a channel** + + + ```ts + // Event handler for selecting an audio file + const handleFileChange = (event: React.ChangeEvent) => { + if (event.target.files && event.target.files.length > 0) { + const selectedFile = event.target.files[0]; + try + { + AgoraRTC.createBufferSourceAudioTrack({ source: selectedFile }) + .then((track) => {setAudioFileTrack(track)}) + .catch((error) => {console.error(error);}) + } catch (error) { + console.error("Error creating buffer source audio track:", error); + } + } + }; + ``` + +1. 
**Set the audio route** + + ```ts + // Event handler for changing the audio playback device + const handleAudioRouteChange = () => { + if (audioFileTrack) { + const deviceID = playoutDeviceRef.current?.value; + if (deviceID) { + console.log("The selected device id is: " + deviceID); + try { + audioFileTrack.setPlaybackDevice(deviceID) + .then(() => {console.log("Audio route changed")}) + .catch((error) => {console.error(error);}); + } catch (error) { + console.error("Error setting playback device:", error); + } + } + } + }; + ``` + + + \ No newline at end of file diff --git a/shared/video-sdk/develop/audio-and-voice-effects/reference/index.mdx b/shared/video-sdk/develop/audio-and-voice-effects/reference/index.mdx index be501f963..381c66782 100644 --- a/shared/video-sdk/develop/audio-and-voice-effects/reference/index.mdx +++ b/shared/video-sdk/develop/audio-and-voice-effects/reference/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import Unity from './unity.mdx'; @@ -13,6 +14,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/audio-and-voice-effects/reference/react-js.mdx b/shared/video-sdk/develop/audio-and-voice-effects/reference/react-js.mdx new file mode 100644 index 000000000..dd7c2bc00 --- /dev/null +++ b/shared/video-sdk/develop/audio-and-voice-effects/reference/react-js.mdx @@ -0,0 +1,6 @@ + + +- Source code: https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main/src/sdk_quickstart +- API reference: [Video SDK for React JS](/en/video-calling/reference/api-reference?platform=react-js) + + \ No newline at end of file diff --git a/shared/video-sdk/develop/cloud-proxy/project-implementation/index.mdx b/shared/video-sdk/develop/cloud-proxy/project-implementation/index.mdx index 
7b1ffffef..9465783e6 100644 --- a/shared/video-sdk/develop/cloud-proxy/project-implementation/index.mdx +++ b/shared/video-sdk/develop/cloud-proxy/project-implementation/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import MacOS from './macos.mdx' @@ -14,6 +15,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/cloud-proxy/project-implementation/react-js.mdx b/shared/video-sdk/develop/cloud-proxy/project-implementation/react-js.mdx new file mode 100644 index 000000000..2c07759f5 --- /dev/null +++ b/shared/video-sdk/develop/cloud-proxy/project-implementation/react-js.mdx @@ -0,0 +1,29 @@ + + +1. Import the components and hooks you need to manage a video call: + + ```typescript + import { useRTCClient, useClientEvent } from "agora-rtc-react"; + ``` + +1. **Enable the connection to ** + + ```typescript + const useCloudProxy = () => { + const agoraEngine = useRTCClient(); + useEffect(() => { + agoraEngine.startProxyServer(3); + }, []); + + useClientEvent(agoraEngine, "is-using-cloud-proxy", (isUsingProxy) => { + // Display the proxy server state based on the isUsingProxy Boolean variable. 
+ if (isUsingProxy == true) { + console.log("Cloud proxy service activated"); + } else { + console.log("Proxy service failed") + } + }); + }; + ``` + + \ No newline at end of file diff --git a/shared/video-sdk/develop/cloud-proxy/reference/index.mdx b/shared/video-sdk/develop/cloud-proxy/reference/index.mdx index 7ae892546..7ceabf3a0 100644 --- a/shared/video-sdk/develop/cloud-proxy/reference/index.mdx +++ b/shared/video-sdk/develop/cloud-proxy/reference/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import MacOS from './macos.mdx' @@ -13,6 +14,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/cloud-proxy/reference/react-js.mdx b/shared/video-sdk/develop/cloud-proxy/reference/react-js.mdx new file mode 100644 index 000000000..dd7c2bc00 --- /dev/null +++ b/shared/video-sdk/develop/cloud-proxy/reference/react-js.mdx @@ -0,0 +1,6 @@ + + +- Source code: https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main/src/sdk_quickstart +- API reference: [Video SDK for React JS](/en/video-calling/reference/api-reference?platform=react-js) + + \ No newline at end of file diff --git a/shared/video-sdk/develop/custom-video-and-audio/project-implementation/react-js.mdx b/shared/video-sdk/develop/custom-video-and-audio/project-implementation/react-js.mdx new file mode 100644 index 000000000..a77a88372 --- /dev/null +++ b/shared/video-sdk/develop/custom-video-and-audio/project-implementation/react-js.mdx @@ -0,0 +1,3 @@ + + + \ No newline at end of file diff --git a/shared/video-sdk/develop/encrypt-media-streams/project-implementation/index.mdx b/shared/video-sdk/develop/encrypt-media-streams/project-implementation/index.mdx index 9e8c5a5cf..4bc7c2110 100644 --- 
a/shared/video-sdk/develop/encrypt-media-streams/project-implementation/index.mdx +++ b/shared/video-sdk/develop/encrypt-media-streams/project-implementation/index.mdx @@ -5,6 +5,7 @@ import Electron from './electron.mdx'; import Unity from './unity.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; import MacOS from './macos.mdx'; import Windows from './windows.mdx'; @@ -16,4 +17,5 @@ import Windows from './windows.mdx'; + \ No newline at end of file diff --git a/shared/video-sdk/develop/encrypt-media-streams/project-implementation/react-js.mdx b/shared/video-sdk/develop/encrypt-media-streams/project-implementation/react-js.mdx new file mode 100644 index 000000000..0f21e2896 --- /dev/null +++ b/shared/video-sdk/develop/encrypt-media-streams/project-implementation/react-js.mdx @@ -0,0 +1,63 @@ + +1. Import the components and hooks you need to manage a video call: + + ```typescript + import { useRTCClient } from 'agora-rtc-react'; + ``` + +2. **Add a method to convert a string from `Base64` to `Uint8Array`** + + ```typescript + function base64ToUint8Array(props: { base64Str: string }) { + const { base64Str } = props; + const raw = window.atob(base64Str); + const result = new Uint8Array(new ArrayBuffer(raw.length)); + for (let i = 0; i < raw.length; i += 1) { + result[i] = raw.charCodeAt(i); + } + return result; + } + ``` + +3. **Add a method to convert a string from `Hex` to `ASCII`** + + ```typescript + function hex2ascii(props: { hexx: string }) { + const { hexx } = props; + let str = ''; + for (let i = 0; i < hexx.length; i += 2) { + str += String.fromCharCode(parseInt(hexx.substring(i, i + 2), 16)); + } + return str; + } + + ``` + +4. **Call the channel encryption methods to enable channel encryption** + + To enable channel encryption in your , you need to: + + 1. Convert `encryptionSaltBase64` using `base64ToUint8Array`. + + 2. Convert `encryptionKey` using `hex2ascii`. + + 3. 
Set the `encryptionMode` variable to an encryption mode. + + 4. Call `setEncryptionConfig` and pass `encryptionMode`, `encryptionKey`, and `encryptionSaltBase64` as parameters. + + ```typescript + const useMediaEncryption = () => { + const agoraEngine = useRTCClient(); + useEffect(() => { + // Convert the salt string using base64ToUint8Array. + const salt = base64ToUint8Array({ base64Str: config.salt }) || config.salt; + // Convert the cipherKey string using hex2ascii. + const cipherKey = hex2ascii({ hexx: config.cipherKey }) || config.cipherKey; + // Set an encryption mode. + const encryptionMode = config.encryptionMode || "aes-256-gcm2"; + // Start channel encryption. + agoraEngine.setEncryptionConfig(encryptionMode, cipherKey, salt); + }, []); // Empty dependency array ensures the effect runs only once when the component mounts + }; + ``` + \ No newline at end of file diff --git a/shared/video-sdk/develop/encrypt-media-streams/reference/index.mdx b/shared/video-sdk/develop/encrypt-media-streams/reference/index.mdx index 0be29d57a..720091e80 100644 --- a/shared/video-sdk/develop/encrypt-media-streams/reference/index.mdx +++ b/shared/video-sdk/develop/encrypt-media-streams/reference/index.mdx @@ -5,6 +5,7 @@ import Electron from './electron.mdx'; import Unity from './unity.mdx'; import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; import MacOS from './macos.mdx'; import Windows from './windows.mdx'; @@ -17,4 +18,5 @@ import Windows from './windows.mdx'; + \ No newline at end of file diff --git a/shared/video-sdk/develop/encrypt-media-streams/reference/react-js.mdx b/shared/video-sdk/develop/encrypt-media-streams/reference/react-js.mdx new file mode 100644 index 000000000..dd7c2bc00 --- /dev/null +++ b/shared/video-sdk/develop/encrypt-media-streams/reference/react-js.mdx @@ -0,0 +1,6 @@ + + +- Source code: https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main/src/sdk_quickstart +- API reference:
[Video SDK for React JS](/en/video-calling/reference/api-reference?platform=react-js) + + \ No newline at end of file diff --git a/shared/video-sdk/develop/geofencing/project-implementation/index.mdx b/shared/video-sdk/develop/geofencing/project-implementation/index.mdx index df6b4f50d..4765eae34 100644 --- a/shared/video-sdk/develop/geofencing/project-implementation/index.mdx +++ b/shared/video-sdk/develop/geofencing/project-implementation/index.mdx @@ -3,6 +3,7 @@ import Ios from './ios.mdx'; import Web from './web.mdx'; import Unity from './unity.mdx'; import ReactNative from './react-native.mdx'; +import ReactJS from './react-js.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; import MacOS from './macos.mdx'; @@ -14,6 +15,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/geofencing/project-implementation/react-js.mdx b/shared/video-sdk/develop/geofencing/project-implementation/react-js.mdx new file mode 100644 index 000000000..f93bf5cad --- /dev/null +++ b/shared/video-sdk/develop/geofencing/project-implementation/react-js.mdx @@ -0,0 +1,21 @@ + + +To enable geofencing in your , call `setArea` and pass in the region you want to exclude or include. + + ```ts + const useGeofencing = () => { + useEffect(() => { + AgoraRTC.setArea({ + areaCode: [AREAS.NORTH_AMERICA, AREAS.ASIA] + }) + }, []); + }; + ``` + +If your fails to connect to the specified region of , instead of connecting to another region, an error is thrown. If a firewall is deployed in your network environment, ensure that you: + +* Whitelist certain domains +* Allow all IP addresses +* Open the firewall ports defined in Use Cloud Proxy. 
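The `base64ToUint8Array` and `hex2ascii` helpers added for media-stream encryption above are pure functions, so they can be sanity-checked outside the SDK. The following standalone sketch repeats their core logic without the props wrapper; the sample Base64 and hex strings are made-up placeholders, not real keys or salts:

```typescript
// Decode a Base64 string into a Uint8Array, as needed for the encryption salt.
function base64ToUint8Array(base64Str: string): Uint8Array {
  const raw = atob(base64Str); // atob is available in browsers and modern Node
  const result = new Uint8Array(raw.length);
  for (let i = 0; i < raw.length; i += 1) {
    result[i] = raw.charCodeAt(i);
  }
  return result;
}

// Convert a hex string to its ASCII representation, as needed for the cipher key.
// The loop consumes two hex digits at a time; each pair decodes to one character.
function hex2ascii(hexx: string): string {
  let str = "";
  for (let i = 0; i < hexx.length; i += 2) {
    str += String.fromCharCode(parseInt(hexx.substring(i, i + 2), 16));
  }
  return str;
}

// Placeholder inputs: both decode to the ASCII string "Agora".
const salt = base64ToUint8Array("QWdvcmE=");
const key = hex2ascii("41676f7261");
console.log(salt.length, key); // prints: 5 Agora
```

Both conversions run before `setEncryptionConfig` is called, which expects the key as an ASCII string and the salt as a `Uint8Array`.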
+ + \ No newline at end of file diff --git a/shared/video-sdk/develop/integrate-token-generation/project-implementation/index.mdx b/shared/video-sdk/develop/integrate-token-generation/project-implementation/index.mdx index 872efb6cf..5a983e2c5 100644 --- a/shared/video-sdk/develop/integrate-token-generation/project-implementation/index.mdx +++ b/shared/video-sdk/develop/integrate-token-generation/project-implementation/index.mdx @@ -4,12 +4,14 @@ import Web from './web.mdx'; import Electron from './electron.mdx'; import Unity from './unity.mdx'; import ReactNative from './react-native.mdx' +import ReactJS from './web.mdx' import Flutter from './flutter.mdx'; import MacOS from './macos.mdx'; import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/integrate-token-generation/project-test/index.mdx b/shared/video-sdk/develop/integrate-token-generation/project-test/index.mdx index aec7934e6..c79378a60 100644 --- a/shared/video-sdk/develop/integrate-token-generation/project-test/index.mdx +++ b/shared/video-sdk/develop/integrate-token-generation/project-test/index.mdx @@ -4,6 +4,7 @@ import Web from './web.mdx'; import Electron from './electron.mdx'; import Unity from './unity.mdx'; import ReactNative from './react-native.mdx' +import ReactJS from './web.mdx'; import Flutter from './flutter.mdx'; import MacOS from './macos.mdx'; import Windows from './windows.mdx'; @@ -11,6 +12,7 @@ import Windows from './windows.mdx'; + diff --git a/video-calling/develop/cloud-proxy.mdx b/video-calling/develop/cloud-proxy.mdx index f34038806..8cd6b0cc7 100644 --- a/video-calling/develop/cloud-proxy.mdx +++ b/video-calling/develop/cloud-proxy.mdx @@ -5,8 +5,8 @@ description: > Implement Agora Cloud Proxy feature for reliable audio and video connectivity. 
--- -import AudioAndVoiceEffects from '@docs/shared/video-sdk/develop/_cloud-proxy.mdx'; +import CloudProxy from '@docs/shared/video-sdk/develop/_cloud-proxy.mdx'; export const toc = [{}]; - + From 30a291d0e5ffcadacccd4209a57ac8be4e81ec9f Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Tue, 1 Aug 2023 19:57:24 +0200 Subject: [PATCH 008/184] Updates on review. --- shared/common/project-test/react-js.mdx | 2 +- .../video-sdk/authentication-workflow/project-test/react-js.mdx | 2 +- .../ensure-channel-quality/project-implementation/react-js.mdx | 1 - .../get-started/get-started-sdk/project-test/react-js.mdx | 2 +- 4 files changed, 3 insertions(+), 4 deletions(-) diff --git a/shared/common/project-test/react-js.mdx b/shared/common/project-test/react-js.mdx index 6cf666c96..4cc1b8c6f 100644 --- a/shared/common/project-test/react-js.mdx +++ b/shared/common/project-test/react-js.mdx @@ -1,7 +1,7 @@ -3. In the `video-sdk-samples-reactjs` reference app, open `src/config.json` and set `appID` to the [AppID](https://docs-beta.agora.io/en/video-calling/reference/manage-agora-account?platform=android#get-the-app-id) of your project. +3. In the `video-sdk-samples-reactjs` reference app, open `src/config.json` and set `appId` to the [AppID](https://docs-beta.agora.io/en/video-calling/reference/manage-agora-account?platform=android#get-the-app-id) of your project. 1. Set the authentication token: - **Temporary token**: diff --git a/shared/video-sdk/authentication-workflow/project-test/react-js.mdx b/shared/video-sdk/authentication-workflow/project-test/react-js.mdx index 4979ce945..9c6617c20 100644 --- a/shared/video-sdk/authentication-workflow/project-test/react-js.mdx +++ b/shared/video-sdk/authentication-workflow/project-test/react-js.mdx @@ -2,7 +2,7 @@ 3. 
In the `video-sdk-samples-reactjs` reference app, open `src/config.json` and update the following: - - `appID` - the [AppID](https://docs-beta.agora.io/en/video-calling/reference/manage-agora-account?platform=android#get-the-app-id) of your project. + - `appId` - the [AppID](https://docs-beta.agora.io/en/video-calling/reference/manage-agora-account?platform=android#get-the-app-id) of your project. - `rtcToken` - an empty string. - `serverUrl` - the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. diff --git a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx index 78f56a62e..63f5b8b34 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx @@ -44,7 +44,6 @@ To implement the call quality features, take the following steps: When a user starts your , the is created and initialized in the `setupVideoSDKEngine` function. After initialization, your does the following: * _Enable dual stream mode_: Required for multi-user scenarios. - * _Set an audio profile and audio scenario_: Setting an audio profile is optional and only required if you have special requirements such as streaming music. * _Set the video configuration_: Setting a video configuration is also optional. It is useful when you want to change one or more of `mirrorMode`, `frameRate`, `bitrate`, `dimensions`, `orientationMode` or `degradationPrefer` from the default setting to custom values. For more information, see [video profile table](#video-profile-table). 
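The optional video configuration described above is a plain object of encoder settings. A minimal sketch of assembling one with overrides; the field names follow the options listed above, but the shape and default values here are illustrative assumptions, not Agora recommendations:

```typescript
// Illustrative subset of a video encoder configuration.
interface VideoConfig {
  dimensions: { width: number; height: number };
  frameRate: number;
  bitrateMin?: number;
  bitrateMax?: number;
  mirrorMode?: string;
}

// Build a configuration by merging caller overrides onto sensible defaults.
function makeVideoConfig(overrides: Partial<VideoConfig> = {}): VideoConfig {
  const defaults: VideoConfig = {
    dimensions: { width: 640, height: 360 },
    frameRate: 15,
  };
  return { ...defaults, ...overrides };
}

// Override only what differs from the defaults.
const hd = makeVideoConfig({ dimensions: { width: 1280, height: 720 }, frameRate: 30 });
console.log(hd.frameRate); // prints: 30
```

Keeping defaults in one helper means each call site states only the settings it actually changes.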
diff --git a/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx b/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx index 1156260e4..7913b3596 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-test/react-js.mdx @@ -1,6 +1,6 @@ -3. In the `video-sdk-samples-reactjs` reference app, open `src/config.json` and update the values of `appID`, +3. In the `video-sdk-samples-reactjs` reference app, open `src/config.json` and update the values of `appId`, `channelName`, and `rtcToken` with the values for your temporary token. 1. In Terminal run the following command to start a proxy server: From e1f9f6545798e9ffbc3bd687389fea5741e69674 Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Wed, 2 Aug 2023 17:53:23 +0200 Subject: [PATCH 009/184] Update API ref. --- shared/variables/global.js | 2 ++ .../authentication-workflow/reference/react-js.mdx | 4 ++-- .../develop/audio-and-voice-effects/reference/react-js.mdx | 4 ++-- shared/video-sdk/develop/cloud-proxy/reference/react-js.mdx | 4 ++-- .../develop/custom-video-and-audio/reference/index.mdx | 2 ++ .../develop/custom-video-and-audio/reference/react-js.mdx | 6 ++++++ .../develop/encrypt-media-streams/reference/react-js.mdx | 4 ++-- .../develop/ensure-channel-quality/reference/index.mdx | 2 ++ .../develop/ensure-channel-quality/reference/react-js.mdx | 6 ++++++ shared/video-sdk/develop/geofencing/reference/index.mdx | 2 ++ shared/video-sdk/develop/geofencing/reference/react-js.mdx | 6 ++++++ shared/video-sdk/develop/play-media/reference/index.mdx | 2 ++ shared/video-sdk/develop/play-media/reference/react-js.mdx | 6 ++++++ .../develop/product-workflow/reference/react-js.mdx | 6 ++++++ .../get-started/get-started-sdk/reference/react-js.mdx | 5 +++-- 15 files changed, 51 insertions(+), 10 deletions(-) create mode 100644 shared/video-sdk/develop/custom-video-and-audio/reference/react-js.mdx create 
mode 100644 shared/video-sdk/develop/ensure-channel-quality/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/geofencing/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/play-media/reference/react-js.mdx create mode 100644 shared/video-sdk/develop/product-workflow/reference/react-js.mdx diff --git a/shared/variables/global.js b/shared/variables/global.js index a132f86e3..e205843cf 100644 --- a/shared/variables/global.js +++ b/shared/variables/global.js @@ -29,6 +29,8 @@ export const API_REF_IOS_ROOT_RTC_ENGINE_KIT = `${API_REF_IOS_ROOT_RTC_KIT}/agor export const API_REF_IOS_ROOT_VOICE_SDK = `${API_REF_ROOT_VOICE_SDK}/ios/${VSDK_RELEASE_API}/documentation`; export const API_REF_IOS_ROOT_RTC_KIT_VOICE_SDK = `${API_REF_IOS_ROOT_VOICE_SDK}/agorartckit`; export const API_REF_IOS_ROOT_RTC_ENGINE_KIT_VOICE_SDK = `${API_REF_IOS_ROOT_RTC_KIT_VOICE_SDK}/agorartcenginekit`; +export const API_REF_RNJS_ROOT = `${API_REF_ROOT}/reactjs/2.x`; +export const API_REF_RNJS_ROOT_VOICE = `${API_REF_ROOT_VOICE_SDK}/reactjs/2.x`; export const API_REF_RN_ROOT = `${API_REF_ROOT}/react-native/${MAJOR_VERSION}/API`; export const API_REF_RN_PREVIOUS_ROOT = `${API_REF_ROOT}/react-native/${VSDK_PREVIOUS_RELEASE_API}`; export const API_REF_RN_ROOT_VOICE = `${API_REF_ROOT_VOICE_SDK}/react-native/${MAJOR_VERSION}/API`; diff --git a/shared/video-sdk/authentication-workflow/reference/react-js.mdx b/shared/video-sdk/authentication-workflow/reference/react-js.mdx index 0aa518dab..3c50e5094 100644 --- a/shared/video-sdk/authentication-workflow/reference/react-js.mdx +++ b/shared/video-sdk/authentication-workflow/reference/react-js.mdx @@ -1,7 +1,7 @@ ### API reference -- Source code: https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/sdk_2.0_updates/src/authentication-workflow -- API reference: [Video SDK for React JS](/en/video-calling/reference/api-reference?platform=react-js) +- [Reference 
app](https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main#samples) +- API reference \ No newline at end of file diff --git a/shared/video-sdk/develop/audio-and-voice-effects/reference/react-js.mdx b/shared/video-sdk/develop/audio-and-voice-effects/reference/react-js.mdx index dd7c2bc00..6959ce6c1 100644 --- a/shared/video-sdk/develop/audio-and-voice-effects/reference/react-js.mdx +++ b/shared/video-sdk/develop/audio-and-voice-effects/reference/react-js.mdx @@ -1,6 +1,6 @@ -- Source code: https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main/src/sdk_quickstart -- API reference: [Video SDK for React JS](/en/video-calling/reference/api-reference?platform=react-js) +- [Reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main#samples) +- API reference \ No newline at end of file diff --git a/shared/video-sdk/develop/cloud-proxy/reference/react-js.mdx b/shared/video-sdk/develop/cloud-proxy/reference/react-js.mdx index dd7c2bc00..6959ce6c1 100644 --- a/shared/video-sdk/develop/cloud-proxy/reference/react-js.mdx +++ b/shared/video-sdk/develop/cloud-proxy/reference/react-js.mdx @@ -1,6 +1,6 @@ -- Source code: https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main/src/sdk_quickstart -- API reference: [Video SDK for React JS](/en/video-calling/reference/api-reference?platform=react-js) +- [Reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main#samples) +- API reference \ No newline at end of file diff --git a/shared/video-sdk/develop/custom-video-and-audio/reference/index.mdx b/shared/video-sdk/develop/custom-video-and-audio/reference/index.mdx index 938ab3ecd..a6e857d72 100644 --- a/shared/video-sdk/develop/custom-video-and-audio/reference/index.mdx +++ b/shared/video-sdk/develop/custom-video-and-audio/reference/index.mdx @@ -7,12 +7,14 @@ import Unity from './unity.mdx'; import MacOS from './macos.mdx'; import Flutter from './flutter.mdx'; import Windows from './windows.mdx'; +import ReactJS from 
'./react-js.mdx'; + diff --git a/shared/video-sdk/develop/custom-video-and-audio/reference/react-js.mdx b/shared/video-sdk/develop/custom-video-and-audio/reference/react-js.mdx new file mode 100644 index 000000000..6959ce6c1 --- /dev/null +++ b/shared/video-sdk/develop/custom-video-and-audio/reference/react-js.mdx @@ -0,0 +1,6 @@ + + +- [Reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main#samples) +- API reference + + \ No newline at end of file diff --git a/shared/video-sdk/develop/encrypt-media-streams/reference/react-js.mdx b/shared/video-sdk/develop/encrypt-media-streams/reference/react-js.mdx index dd7c2bc00..6959ce6c1 100644 --- a/shared/video-sdk/develop/encrypt-media-streams/reference/react-js.mdx +++ b/shared/video-sdk/develop/encrypt-media-streams/reference/react-js.mdx @@ -1,6 +1,6 @@ -- Source code: https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main/src/sdk_quickstart -- API reference: [Video SDK for React JS](/en/video-calling/reference/api-reference?platform=react-js) +- [Reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main#samples) +- API reference \ No newline at end of file diff --git a/shared/video-sdk/develop/ensure-channel-quality/reference/index.mdx b/shared/video-sdk/develop/ensure-channel-quality/reference/index.mdx index eeb57a5a7..4b26b4bb9 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/reference/index.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/reference/index.mdx @@ -7,6 +7,7 @@ import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; import MacOS from './macos.mdx'; import Windows from './windows.mdx' +import ReactJS from './react-js.mdx'; @@ -16,5 +17,6 @@ import Windows from './windows.mdx' + diff --git a/shared/video-sdk/develop/ensure-channel-quality/reference/react-js.mdx b/shared/video-sdk/develop/ensure-channel-quality/reference/react-js.mdx new file mode 100644 index 000000000..6959ce6c1 --- /dev/null +++ 
b/shared/video-sdk/develop/ensure-channel-quality/reference/react-js.mdx @@ -0,0 +1,6 @@ + + +- [Reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main#samples) +- API reference + + \ No newline at end of file diff --git a/shared/video-sdk/develop/geofencing/reference/index.mdx b/shared/video-sdk/develop/geofencing/reference/index.mdx index df6b4f50d..f425918f9 100644 --- a/shared/video-sdk/develop/geofencing/reference/index.mdx +++ b/shared/video-sdk/develop/geofencing/reference/index.mdx @@ -2,6 +2,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; import Unity from './unity.mdx'; +import ReactJS from './react-js.mdx'; import ReactNative from './react-native.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; @@ -13,6 +14,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/geofencing/reference/react-js.mdx b/shared/video-sdk/develop/geofencing/reference/react-js.mdx new file mode 100644 index 000000000..6959ce6c1 --- /dev/null +++ b/shared/video-sdk/develop/geofencing/reference/react-js.mdx @@ -0,0 +1,6 @@ + + +- [Reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main#samples) +- API reference + + \ No newline at end of file diff --git a/shared/video-sdk/develop/play-media/reference/index.mdx b/shared/video-sdk/develop/play-media/reference/index.mdx index 95f43012c..a471fcc8b 100644 --- a/shared/video-sdk/develop/play-media/reference/index.mdx +++ b/shared/video-sdk/develop/play-media/reference/index.mdx @@ -1,6 +1,7 @@ import Android from './android.mdx'; import Ios from './ios.mdx'; import Web from './web.mdx'; +import ReactJS from './react-js.mdx'; import ReactNative from './react-native.mdx'; import Electron from './electron.mdx'; import Flutter from './flutter.mdx'; @@ -13,6 +14,7 @@ import Windows from './windows.mdx'; + diff --git a/shared/video-sdk/develop/play-media/reference/react-js.mdx 
b/shared/video-sdk/develop/play-media/reference/react-js.mdx new file mode 100644 index 000000000..6959ce6c1 --- /dev/null +++ b/shared/video-sdk/develop/play-media/reference/react-js.mdx @@ -0,0 +1,6 @@ + + +- [Reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main#samples) +- API reference + + \ No newline at end of file diff --git a/shared/video-sdk/develop/product-workflow/reference/react-js.mdx b/shared/video-sdk/develop/product-workflow/reference/react-js.mdx new file mode 100644 index 000000000..6959ce6c1 --- /dev/null +++ b/shared/video-sdk/develop/product-workflow/reference/react-js.mdx @@ -0,0 +1,6 @@ + + +- [Reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main#samples) +- API reference + + \ No newline at end of file diff --git a/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx b/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx index dd7c2bc00..d52ba66a2 100644 --- a/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/reference/react-js.mdx @@ -1,6 +1,7 @@ -- Source code: https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main/src/sdk_quickstart -- API reference: [Video SDK for React JS](/en/video-calling/reference/api-reference?platform=react-js) +- [Reference app](https://github.com/AgoraIO/video-sdk-samples-reactjs/tree/main#samples) +- API reference + \ No newline at end of file From 490b76353b717dd4bd51f652b0e85753991127b6 Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Wed, 2 Aug 2023 17:59:00 +0200 Subject: [PATCH 010/184] Update API ref. 
--- video-calling/reference/api-reference.mdx | 19 ------------------- 1 file changed, 19 deletions(-) delete mode 100644 video-calling/reference/api-reference.mdx diff --git a/video-calling/reference/api-reference.mdx b/video-calling/reference/api-reference.mdx deleted file mode 100644 index 34e246f60..000000000 --- a/video-calling/reference/api-reference.mdx +++ /dev/null @@ -1,19 +0,0 @@ ---- -title: 'Video SDK API reference' -sidebar_position: 3 -type: docs -description: > - Links to the API reference for your platform ---- - -import ReactJS from '@docs/shared/video-sdk/reference/api-reference/index.mdx'; - -export const toc = [{}]; - - - See [API reference](/en/api-reference) - - - - - \ No newline at end of file From 179a89fb408af76962530ae02a2121a5d09d2949 Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Fri, 4 Aug 2023 19:14:02 +0200 Subject: [PATCH 011/184] Update docs to match the examples supplied so far. --- shared/common/project-test/react-js.mdx | 21 ++++--- shared/video-sdk/develop/_play-media.mdx | 6 -- .../project-implementation/react-js.mdx | 25 +++++++- .../project-implementation/react-js.mdx | 30 +++++----- .../project-implementation/react-js.mdx | 8 +-- .../project-implementation/index.mdx | 3 +- .../project-implementation/react-js.mdx | 59 +++++++++++++++++++ shared/video-sdk/reference/_release-notes.mdx | 2 + .../reference/release-notes/react-js.mdx | 8 +++ 9 files changed, 126 insertions(+), 36 deletions(-) create mode 100644 shared/video-sdk/develop/play-media/project-implementation/react-js.mdx create mode 100644 shared/video-sdk/reference/release-notes/react-js.mdx diff --git a/shared/common/project-test/react-js.mdx b/shared/common/project-test/react-js.mdx index 4cc1b8c6f..b1df955ed 100644 --- a/shared/common/project-test/react-js.mdx +++ b/shared/common/project-test/react-js.mdx @@ -1,19 +1,24 @@ -3. 
In the `video-sdk-samples-reactjs` reference app, open `src/config.json` and set `appId` to the [AppID](https://docs-beta.agora.io/en/video-calling/reference/manage-agora-account?platform=android#get-the-app-id) of your project. +3. In the `video-sdk-samples-reactjs` reference app, open `src/agora-manager/config.json` and set `appId` to the + [AppID](https://docs-beta.agora.io/en/video-calling/reference/manage-agora-account?platform=android#get-the-app-id) of your project. 1. Set the authentication token: - **Temporary token**: - 1. Set `rtcToken` with the values for your temporary token you use in the web app. + 1. Set `rtcToken` with the value of your [temporary token](https://docs-beta.agora.io/en/video-calling/reference/manage-agora-account?platform=android#generate-a-temporary-token). - **Authentication server**: - 1. Set `rtcToken` to an empty string. - 1. Set `serverUrl` to the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. + 1. Set up an [Authentication server](https://docs-beta.agora.io/en/video-calling/get-started/authentication-workflow?platform#create-and-run-a-token-server). + 1. In `config.json`: + + 1. Set `rtcToken` to an empty string. + 1. Set `serverUrl` to the base URL of your authentication server. For example, `https://agora-token-service-production-1234.up.railway.app`. 1. Start a proxy server so this web app can make HTTP calls to fetch a token. In a Terminal instance in the reference app root, run the following command: - ```bash - node ./utils/proxy.js - ``` + ```bash + node ./utils/proxy.js + ``` + 1. Start this reference app. In Terminal, run the following command: @@ -27,6 +32,8 @@ 1. In the dropdown, select this document and test . 
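Because `config.json` can be filled in two ways (a temporary `rtcToken`, or an empty token plus a `serverUrl` pointing at a token server), a small validation helper catches misconfiguration before the app tries to join a channel. A sketch assuming the field names shown above; the check logic itself is illustrative:

```typescript
// Mirrors the fields used in src/agora-manager/config.json.
interface AgoraConfig {
  appId: string;
  channelName: string;
  rtcToken: string;
  serverUrl: string;
}

// Returns a list of problems; an empty list means the configuration is usable.
function validateConfig(cfg: AgoraConfig): string[] {
  const problems: string[] = [];
  if (!cfg.appId) problems.push("appId is required");
  if (!cfg.channelName) problems.push("channelName is required");
  if (!cfg.rtcToken && !cfg.serverUrl) {
    problems.push("set rtcToken (temporary token) or serverUrl (token server)");
  }
  return problems;
}

const issues = validateConfig({ appId: "", channelName: "demo", rtcToken: "", serverUrl: "" });
console.log(issues.length); // prints: 2
```

Running such a check at startup turns a silent join failure into an actionable error message.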
+ + \ No newline at end of file diff --git a/shared/video-sdk/develop/_play-media.mdx b/shared/video-sdk/develop/_play-media.mdx index 977ae8884..3cbf87a39 100644 --- a/shared/video-sdk/develop/_play-media.mdx +++ b/shared/video-sdk/develop/_play-media.mdx @@ -16,10 +16,6 @@ The following figure shows the workflow you need to integrate media player funct ![play media](/images/common/play-media.png) - -**Coming soon for this beta program.** - - ## Prerequisites @@ -49,5 +45,3 @@ To ensure that you have implemented media player features into your To add audio mixing and audio route changing logic to your , take the following steps: -1. **Add the required variables** +1. Handle audio mixing + + ```ts + const AudioMixing: React.FC<{ track: IBufferSourceAudioTrack }> = ({ track }) => { + usePublish([track]); + + useEffect(() => { + track.startProcessAudioBuffer(); + track.play(); // to play the track for the local user + return () => { + track.stopProcessAudioBuffer(); + track.stop(); + }; + }, [track]); + + return
    Audio mixing is in progress
    ; + }; + ``` + +1. Add the required variables ```ts @@ -14,7 +33,7 @@ To add audio mixing and audio route changing logic to your , t ``` -1. **Add an audio track to play to a channel** +1. Add an audio track to play to a channel ```ts @@ -34,7 +53,7 @@ To add audio mixing and audio route changing logic to your , t }; ``` -1. **Set the audio route** +1. Set the audio route ```ts // Event handler for changing the audio playback device diff --git a/shared/video-sdk/develop/cloud-proxy/project-implementation/react-js.mdx b/shared/video-sdk/develop/cloud-proxy/project-implementation/react-js.mdx index 2c07759f5..344c6eddb 100644 --- a/shared/video-sdk/develop/cloud-proxy/project-implementation/react-js.mdx +++ b/shared/video-sdk/develop/cloud-proxy/project-implementation/react-js.mdx @@ -6,24 +6,24 @@ import { useRTCClient, useClientEvent } from "agora-rtc-react"; ``` -1. **Enable the connection to ** +1. Enable the connection to ```typescript - const useCloudProxy = () => { - const agoraEngine = useRTCClient(); - useEffect(() => { - agoraEngine.startProxyServer(3); - }, []); + const useCloudProxy = () => { + const agoraEngine = useRTCClient(); + useEffect(() => { + agoraEngine.startProxyServer(3); + }, []); - useClientEvent(agoraEngine, "is-using-cloud-proxy", (isUsingProxy) => { - // Display the proxy server state based on the isUsingProxy Boolean variable. - if (isUsingProxy == true) { - console.log("Cloud proxy service activated"); - } else { - console.log("Proxy service failed") - } - }); - }; + useClientEvent(agoraEngine, "is-using-cloud-proxy", (isUsingProxy) => { + // Display the proxy server state based on the isUsingProxy Boolean variable. + if (isUsingProxy == true) { + console.log("Cloud proxy service activated"); + } else { + console.log("Proxy service failed") + } + }); + }; ```
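The audio-mixing effect shown above pairs `startProcessAudioBuffer`/`play` on mount with `stopProcessAudioBuffer`/`stop` in the cleanup function. That pairing can be exercised with a simplified stand-in for the SDK track; `FakeBufferTrack` below is a test double invented for this sketch, not part of `agora-rtc-sdk-ng`:

```typescript
// Test double that records calls in order; stands in for IBufferSourceAudioTrack.
class FakeBufferTrack {
  calls: string[] = [];
  startProcessAudioBuffer(): void { this.calls.push("startProcessAudioBuffer"); }
  play(): void { this.calls.push("play"); }
  stopProcessAudioBuffer(): void { this.calls.push("stopProcessAudioBuffer"); }
  stop(): void { this.calls.push("stop"); }
}

// Mirrors the useEffect body: run the setup calls, return the cleanup.
function mountAudioMixing(track: FakeBufferTrack): () => void {
  track.startProcessAudioBuffer();
  track.play();
  return () => {
    track.stopProcessAudioBuffer();
    track.stop();
  };
}
```

Keeping the stop calls in the effect's cleanup guarantees the buffer processing is released whenever the component unmounts or the `track` dependency changes.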
    \ No newline at end of file diff --git a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx index 63f5b8b34..a47770e8c 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/react-js.mdx @@ -21,7 +21,7 @@ To implement the call quality features, take the following steps: import { ICameraVideoTrack, ILocalAudioTrack } from "agora-rtc-sdk-ng"; ``` -1. **Enable the user to test the network** +1. Enable the user to test the network ```ts const networkQuality = useNetworkQuality(); @@ -39,7 +39,7 @@ To implement the call quality features, take the following steps: } ``` -1. **Implement best practice for app initiation** +1. Implement best practice for app initiation When a user starts your , the is created and initialized in the `setupVideoSDKEngine` function. After initialization, your does the following: @@ -65,7 +65,7 @@ To implement the call quality features, take the following steps: }; ``` -1. **Show quality statistics** +1. Show quality statistics ```ts const showStatistics = () => { @@ -98,7 +98,7 @@ To implement the call quality features, take the following steps: Each event reports the statistics of the audio video streams from each remote user and host. -1. **Switch stream quality when the user taps the switch button** +1. 
Switch stream quality when the user taps the switch button ```ts const setRemoteVideoQuality = () => { diff --git a/shared/video-sdk/develop/play-media/project-implementation/index.mdx b/shared/video-sdk/develop/play-media/project-implementation/index.mdx index 95f43012c..56e5539e8 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/index.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/index.mdx @@ -7,12 +7,13 @@ import Flutter from './flutter.mdx'; import Unity from './unity.mdx'; import MacOS from './macos.mdx'; import Windows from './windows.mdx'; - +import ReactJS from './react-js.mdx'; + diff --git a/shared/video-sdk/develop/play-media/project-implementation/react-js.mdx b/shared/video-sdk/develop/play-media/project-implementation/react-js.mdx new file mode 100644 index 000000000..5497269d1 --- /dev/null +++ b/shared/video-sdk/develop/play-media/project-implementation/react-js.mdx @@ -0,0 +1,59 @@ + + +1. Import the components and hooks you need to manage a video call: + + ```typescript + import { usePublish, useConnectionState } from "agora-rtc-react"; + import AgoraRTC, { IBufferSourceAudioTrack } from "agora-rtc-sdk-ng"; + ``` + +1. Process an audio file + + ```typescript + const PlayAudioFile: React.FC<{ track: IBufferSourceAudioTrack }> = ({ track }) => { + usePublish([track]); + + useEffect(() => { + track.startProcessAudioBuffer(); + track.play(); // to play the track for the local user + return () => { + track.stopProcessAudioBuffer(); + track.stop(); + }; + }, [track]); + + return
    Audio file playing
    ; + }; + ``` + +1. Play the audio to a channel + + ```typescript + try + { + AgoraRTC.createBufferSourceAudioTrack({ source: selectedFile }) + .then((track) => {setAudioFileTrack(track)}) + .catch((error) => {console.error(error);}) + } catch (error) { + console.error("Error creating buffer source audio track:", error); + } + ``` + +1. Put it all together in the UI + + ```typescript + +

    + +

    + {isMediaPlaying && audioFileTrack && } + ``` +
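The UI above only renders the playback component once a file has been selected and a track created, and the imports include `useConnectionState`, which suggests playback is also gated on channel connection. A hedged sketch of that guard logic follows; `canStartPlayback` is an illustrative helper, not part of the sample, and the connection-state gating is an assumption:

```typescript
// Hypothetical guard mirroring the UI logic: playback controls are useful
// only when a file has been chosen and the client is connected to a channel.
type ConnectionState =
  | "DISCONNECTED"
  | "CONNECTING"
  | "RECONNECTING"
  | "CONNECTED"
  | "DISCONNECTING";

function canStartPlayback(state: ConnectionState, hasFile: boolean): boolean {
  return state === "CONNECTED" && hasFile;
}
```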
    \ No newline at end of file diff --git a/shared/video-sdk/reference/_release-notes.mdx b/shared/video-sdk/reference/_release-notes.mdx index f3d0e0472..f9af8a083 100644 --- a/shared/video-sdk/reference/_release-notes.mdx +++ b/shared/video-sdk/reference/_release-notes.mdx @@ -5,6 +5,7 @@ import Android from '@docs/shared/video-sdk/reference/release-notes/android.mdx' import Ios from '@docs/shared/video-sdk/reference/release-notes/ios.mdx'; import Unity from '@docs/shared/video-sdk/reference/release-notes/unity.mdx'; import Flutter from '@docs/shared/video-sdk/reference/release-notes/flutter.mdx'; +import ReactJS from '@docs/shared/video-sdk/reference/release-notes/react-js.mdx'; import ReactNative from '@docs/shared/video-sdk/reference/release-notes/react-native.mdx'; import Electron from '@docs/shared/video-sdk/reference/release-notes/electron.mdx'; import Macos from '@docs/shared/video-sdk/reference/release-notes/macos.mdx'; @@ -28,6 +29,7 @@ This page provides the release notes for . + diff --git a/shared/video-sdk/reference/release-notes/react-js.mdx b/shared/video-sdk/reference/release-notes/react-js.mdx new file mode 100644 index 000000000..97ab97727 --- /dev/null +++ b/shared/video-sdk/reference/release-notes/react-js.mdx @@ -0,0 +1,8 @@ + + +### v2.0.0-alpha.0 + +This is the first alpha release of Video SDK for ReactJS. + + + \ No newline at end of file From 9da1f5e912ef283f265694f392faa758f1942d56 Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Sat, 5 Aug 2023 17:04:24 +0200 Subject: [PATCH 012/184] Add the custom audio, but not published as PR is not accepted yet. 
--- .../project-implementation/react-js.mdx | 67 +++++++++++++++++++ 1 file changed, 67 insertions(+) diff --git a/shared/video-sdk/develop/custom-video-and-audio/project-implementation/react-js.mdx b/shared/video-sdk/develop/custom-video-and-audio/project-implementation/react-js.mdx index a77a88372..53454c800 100644 --- a/shared/video-sdk/develop/custom-video-and-audio/project-implementation/react-js.mdx +++ b/shared/video-sdk/develop/custom-video-and-audio/project-implementation/react-js.mdx @@ -1,3 +1,70 @@ +1. Import the components and hooks you need to manage a video call: + + ```typescript + import AgoraRTC, { ILocalAudioTrack, ILocalVideoTrack } from "agora-rtc-sdk-ng"; + import { usePublish, useLocalCameraTrack, useConnectionState } from "agora-rtc-react"; + ``` + +1. Play a custom audio track: + ```typescript + const CustomAudioTrack: React.FC<{ customAudioTrack: ILocalAudioTrack | null }> = ({ customAudioTrack }) => { + usePublish([customAudioTrack]); + + useEffect(() => { + customAudioTrack?.play(); // to play the track for the local user + return () => { + customAudioTrack?.stop(); + }; + }, [customAudioTrack]); + + return null; + }; + ``` + +1. Play a custom video track: + ```typescript + const CustomVideoTrack: React.FC<{ customVideoTrack: ILocalVideoTrack | null }> = ({ customVideoTrack }) => { + const { localCameraTrack } = useLocalCameraTrack(); + useEffect(() => { + const mediaStreamTrack = customVideoTrack?.getMediaStreamTrack(); + if (mediaStreamTrack) { + localCameraTrack?.replaceTrack(mediaStreamTrack, true) + .then(() => console.log("Track replaced")) + .catch((error) => console.error(error)); + } + return () => { + // Stop the replaced local camera track when the component unmounts + localCameraTrack?.stop(); + }; + }, [customVideoTrack, localCameraTrack]); + return null; + }; + ``` + +1. 
Create the variables you need to handle the custom tracks: + + ``` ts + const [customAudioTrack, setCustomAudioTrack] = useState(null); + const [customVideoTrack, setCustomVideoTrack] = useState(null); + const connectionState = useConnectionState(); + const [customMediaState, enableCustomMedia] = useState(false); + ``` + +1. Put it all together and handle custom audio and video: + + ```ts + const createCustomAudioAndVideoTracks = () => { + navigator.mediaDevices.getUserMedia({ audio: true, video: true }) + .then((stream) => { + const audioMediaStreamTracks = stream.getAudioTracks(); + const videoMediaStreamTracks = stream.getVideoTracks(); + setCustomAudioTrack(AgoraRTC.createCustomAudioTrack({ mediaStreamTrack: audioMediaStreamTracks[0] })); + setCustomVideoTrack(AgoraRTC.createCustomVideoTrack({ mediaStreamTrack: videoMediaStreamTracks[0] })); + }) + .catch((error) => console.error(error)); + }; + + ``` \ No newline at end of file From 9bab009962309b55938896c8a610fbfd535c8d6e Mon Sep 17 00:00:00 2001 From: billy-the-fish Date: Sat, 5 Aug 2023 17:21:29 +0200 Subject: [PATCH 013/184] Last update. --- .../project-implementation/react-js.mdx | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/shared/video-sdk/develop/encrypt-media-streams/project-implementation/react-js.mdx b/shared/video-sdk/develop/encrypt-media-streams/project-implementation/react-js.mdx index 0f21e2896..5c7be1371 100644 --- a/shared/video-sdk/develop/encrypt-media-streams/project-implementation/react-js.mdx +++ b/shared/video-sdk/develop/encrypt-media-streams/project-implementation/react-js.mdx @@ -5,7 +5,7 @@ import { useRTCClient } from 'agora-rtc-react'; ``` -1. **Add a method to convert a string from `Base64` to `Uint8Array`** +1. Add a method to convert a string from `Base64` to `Uint8Array` ```typescript function base64ToUint8Array(props: { base64Str: string }) { @@ -19,7 +19,7 @@ } ``` -3. **Add a method to convert a string from `Hex` to `ASCII`** +3. 
Add a method to convert a string from `Hex` to `ASCII` ```typescript function hex2ascii(props: { hexx: string }) { @@ -33,7 +33,7 @@ ``` -4. **Call the channel encryption methods to enable channel encryption** +4. Call the channel encryption methods to enable channel encryption To enable channel encryption in your , you need to: From 7204360ffd329d7aff8c54dc50bd65ce9e3cf77e Mon Sep 17 00:00:00 2001 From: Dasun Nirmitha Date: Wed, 16 Aug 2023 15:31:13 +0530 Subject: [PATCH 014/184] POC3 iOS get-started changes. --- .../get-started-sdk/swift/handle-events.mdx | 99 ++++++++ .../get-started-sdk/swift/join-channel.mdx | 146 +++++++++++ .../get-started-sdk/swift/leave-channel.mdx | 45 ++++ shared/video-sdk/_get-started-sdk.mdx | 13 +- .../project-implementation/swift.mdx | 239 ++++-------------- .../get-started-sdk/project-setup/swift.mdx | 60 +---- .../get-started-sdk/project-test/swift.mdx | 9 +- 7 files changed, 365 insertions(+), 246 deletions(-) create mode 100644 assets/code/video-sdk/get-started-sdk/swift/handle-events.mdx create mode 100644 assets/code/video-sdk/get-started-sdk/swift/join-channel.mdx create mode 100644 assets/code/video-sdk/get-started-sdk/swift/leave-channel.mdx diff --git a/assets/code/video-sdk/get-started-sdk/swift/handle-events.mdx b/assets/code/video-sdk/get-started-sdk/swift/handle-events.mdx new file mode 100644 index 000000000..92500a11f --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/swift/handle-events.mdx @@ -0,0 +1,99 @@ + +``` swift +/** + The delegate is telling us that the local user has successfully joined the channel. + - Parameters: + - engine: The Agora RTC engine kit object. + - channel: The channel name. + - uid: The ID of the user joining the channel. + - elapsed: The time elapsed (ms) from the user calling `joinChannel` until this method is called. + + If the client's role is `.broadcaster`, this method also adds the broadcaster's + userId (``localUserId``) to the ``allUsers`` set. +*/
+open func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinChannel channel: String, withUid uid: UInt, elapsed: Int) { + self.localUserId = uid + if self.role == .broadcaster { + self.allUsers.insert(uid) + } +} + +/** + The delegate is telling us that a remote user has joined the channel. + + - Parameters: + - engine: The Agora RTC engine kit object. + - uid: The ID of the user joining the channel. + - elapsed: The time elapsed (ms) from the user calling `joinChannel` until this method is called. + +This method adds the remote user to the `allUsers` set. +*/ +open func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { + self.allUsers.insert(uid) +} + +/** + The delegate is telling us that a remote user has left the channel. + + - Parameters: + - engine: The Agora RTC engine kit object. + - uid: The ID of the user who left the channel. + - reason: The reason why the user left the channel. + +This method removes the remote user from the `allUsers` set. +*/ +open func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) { + self.allUsers.remove(uid) +} +``` + + + +``` swift +/** + The delegate is telling us that the local user has successfully joined the channel. + - Parameters: + - engine: The Agora RTC engine kit object. + - channel: The channel name. + - uid: The ID of the user joining the channel. + - elapsed: The time elapsed (ms) from the user calling `joinChannel` until this method is called. + + If the client's role is `.broadcaster`, this method also adds the broadcaster's + userId (``localUserId``) to the ``allUsers`` set. +*/ +open func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinChannel channel: String, withUid uid: UInt, elapsed: Int) { + self.localUserId = uid + if self.role == .broadcaster { + self.allUsers.insert(uid) + } +} + +/** + The delegate is telling us that a remote user has joined the channel. + + - Parameters: + - engine: The Agora RTC engine kit object.
+ - uid: The ID of the user joining the channel. + - elapsed: The time elapsed (ms) from the user calling `joinChannel` until this method is called. + +This method adds the remote user to the `allUsers` set. +*/ +open func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { + self.allUsers.insert(uid) +} + +/** + The delegate is telling us that a remote user has left the channel. + + - Parameters: + - engine: The Agora RTC engine kit object. + - uid: The ID of the user who left the channel. + - reason: The reason why the user left the channel. + +This method removes the remote user from the `allUsers` set. +*/ +open func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) { + self.allUsers.remove(uid) +} +``` + diff --git a/assets/code/video-sdk/get-started-sdk/swift/join-channel.mdx b/assets/code/video-sdk/get-started-sdk/swift/join-channel.mdx new file mode 100644 index 000000000..f886bad2c --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/swift/join-channel.mdx @@ -0,0 +1,146 @@ + +``` swift +/** +Joins a channel, starting the connection to an RTC session. +- Parameters: + - channel: Name of the channel to join. + - token: Token to join the channel, this can be nil for a weak security testing session. + - uid: User ID of the local user. This can be 0 to allow the engine to automatically assign an ID. + - info: Info is currently unused by RTC, it is reserved for future use. +- Returns: Error code, 0 = success, < 0 = failure. +*/ +@discardableResult +open func joinChannel( + _ channel: String, token: String? = nil, uid: UInt = 0, info: String? = nil +) -> Int32 { + self.agoraEngine.joinChannel( + byToken: token, channelId: channel, info: info, uid: uid + ) +} + +/** +This method is used by this app specifically. If there is a tokenURL, +it will attempt to retrieve a token from there. +Otherwise it will simply apply the provided token in config.json or nil.
+ +- Parameters: + - channel: Name of the channel to join. + - uid: User ID of the local user. This can be 0 to allow the engine to automatically assign an ID. +- Returns: Error code, 0 = success, < 0 = failure. +*/ +@discardableResult +internal func joinChannel(_ channel: String, uid: UInt? = nil) async -> Int32 { + let userId = uid ?? DocsAppConfig.shared.uid + var token = DocsAppConfig.shared.rtcToken + if !DocsAppConfig.shared.tokenUrl.isEmpty { + do { + token = try await self.fetchToken( + from: DocsAppConfig.shared.tokenUrl, channel: channel, + role: self.role, userId: userId + ) + } catch { + print("token server fetch failed: \(error.localizedDescription)") + } + } + return self.joinChannel(channel, token: token, uid: userId, info: nil) +} +``` + + + +``` swift +/** +Joins a channel, starting the connection to an RTC session. +- Parameters: + - channel: Name of the channel to join. + - token: Token to join the channel, this can be nil for an weak security testing session. + - uid: User ID of the local user. This can be 0 to allow the engine to automatically assign an ID. + - info: Info is currently unused by RTC, it is reserved for future use. +- Returns: Error code, 0 = success, < 0 = failure. +*/ +@discardableResult +open func joinChannel( + _ channel: String, token: String? = nil, uid: UInt = 0, info: String? = nil +) -> Int32 { + self.agoraEngine.joinChannel( + byToken: token, channelId: channel, info: info, uid: uid + ) +} + +/** +This method is used by this app specifically. If there is a tokenURL, +it will attempt to retrieve a token from there. +Otherwise it will simply apply the provided token in config.json or nil. + +- Parameters: + - channel: Name of the channel to join. + - uid: User ID of the local user. This can be 0 to allow the engine to automatically assign an ID. +- Returns: Error code, 0 = success, < 0 = failure. +*/ +@discardableResult +internal func joinChannel(_ channel: String, uid: UInt? = nil) async -> Int32 { + let userId = uid ?? 
DocsAppConfig.shared.uid + var token = DocsAppConfig.shared.rtcToken + if !DocsAppConfig.shared.tokenUrl.isEmpty { + do { + token = try await self.fetchToken( + from: DocsAppConfig.shared.tokenUrl, channel: channel, + role: self.role, userId: userId + ) + } catch { + print("token server fetch failed: \(error.localizedDescription)") + } + } + return self.joinChannel(channel, token: token, uid: userId, info: nil) +} +``` + + + +``` swift +/** +Joins a channel, starting the connection to an RTC session. +- Parameters: + - channel: Name of the channel to join. + - token: Token to join the channel, this can be nil for an weak security testing session. + - uid: User ID of the local user. This can be 0 to allow the engine to automatically assign an ID. + - info: Info is currently unused by RTC, it is reserved for future use. +- Returns: Error code, 0 = success, < 0 = failure. +*/ +@discardableResult +open func joinChannel( + _ channel: String, token: String? = nil, uid: UInt = 0, info: String? = nil +) -> Int32 { + self.agoraEngine.joinChannel( + byToken: token, channelId: channel, info: info, uid: uid + ) +} + +/** +This method is used by this app specifically. If there is a tokenURL, +it will attempt to retrieve a token from there. +Otherwise it will simply apply the provided token in config.json or nil. + +- Parameters: + - channel: Name of the channel to join. + - uid: User ID of the local user. This can be 0 to allow the engine to automatically assign an ID. +- Returns: Error code, 0 = success, < 0 = failure. +*/ +@discardableResult +internal func joinChannel(_ channel: String, uid: UInt? = nil) async -> Int32 { + let userId = uid ?? 
DocsAppConfig.shared.uid + var token = DocsAppConfig.shared.rtcToken + if !DocsAppConfig.shared.tokenUrl.isEmpty { + do { + token = try await self.fetchToken( + from: DocsAppConfig.shared.tokenUrl, channel: channel, + role: self.role, userId: userId + ) + } catch { + print("token server fetch failed: \(error.localizedDescription)") + } + } + return self.joinChannel(channel, token: token, uid: userId, info: nil) +} +``` + \ No newline at end of file diff --git a/assets/code/video-sdk/get-started-sdk/swift/leave-channel.mdx b/assets/code/video-sdk/get-started-sdk/swift/leave-channel.mdx new file mode 100644 index 000000000..758053dc0 --- /dev/null +++ b/assets/code/video-sdk/get-started-sdk/swift/leave-channel.mdx @@ -0,0 +1,45 @@ + +``` swift +/** +Leaves the channel and stops the preview for the session. + +- Parameter leaveChannelBlock: An optional closure that will be called when the client leaves the channel. + The closure takes an `AgoraChannelStats` object as its parameter. + +This method also empties all entries in ``allUsers``. +*/ +@discardableResult +open func leaveChannel( + leaveChannelBlock: ((AgoraChannelStats) -> Void)? = nil +) -> Int32 { + let leaveErr = self.agoraEngine.leaveChannel(leaveChannelBlock) + self.agoraEngine.stopPreview() + defer { AgoraRtcEngineKit.destroy() } + self.allUsers.removeAll() + return leaveErr +} +``` + + + +``` swift +/** +Leaves the channel and stops the preview for the session. + +- Parameter leaveChannelBlock: An optional closure that will be called when the client leaves the channel. + The closure takes an `AgoraChannelStats` object as its parameter. + +This method also empties all entries in ``allUsers``. +*/ +@discardableResult +open func leaveChannel( + leaveChannelBlock: ((AgoraChannelStats) -> Void)? 
= nil +) -> Int32 { + let leaveErr = self.agoraEngine.leaveChannel(leaveChannelBlock) + self.agoraEngine.stopPreview() + defer { AgoraRtcEngineKit.destroy() } + self.allUsers.removeAll() + return leaveErr +} +``` + \ No newline at end of file diff --git a/shared/video-sdk/_get-started-sdk.mdx b/shared/video-sdk/_get-started-sdk.mdx index e863a5e61..1f41cbd62 100644 --- a/shared/video-sdk/_get-started-sdk.mdx +++ b/shared/video-sdk/_get-started-sdk.mdx @@ -7,16 +7,16 @@ import Reference from '@docs/shared/video-sdk/get-started/get-started-sdk/refere - enables one-to-one or small-group video chat connections with smooth, jitter-free streaming video. ’s makes it easy to embed real-time video chat into web, mobile and native s. + enables one-to-one or small-group video chat connections with smooth, jitter-free streaming video. ’s makes it easy to embed real-time video chat into web, mobile, and native s. - enables you to host live audio and video streaming events with real-time interactivity. ’s makes it easy to embed real-time video interaction into web, mobile and native s. + enables you to host live audio and video streaming events with real-time interactivity. ’s makes it easy to embed real-time video interaction into web, mobile, and native s. offers for applications where ultra-low latency (400-800ms) is needed to keep audiences engaged, such as talk shows, sports broadcasts and live auctions. For further information on setting the audience latency level in your , see the [Call quality](../develop/ensure-channel-quality) and [Pricing](../reference/pricing#unit-pricing) guides. - enables you to host large-scale live audio and video streaming events with real-time interactivity. ’s makes it easy to embed real-time video interaction into web, mobile and native s. - offers for applications where low latency (1500-2000ms) is needed for limited audience engagement, such as podcasts, gaming streams and other virtual events. 
For further information on setting the audience latency level in your , see the [Call quality](../develop/ensure-channel-quality) and [Pricing](../reference/pricing#unit-pricing) guides. + enables you to host large-scale live audio and video streaming events with real-time interactivity. ’s makes it easy to embed real-time video interaction into web, mobile, and native s. + offers for applications where low latency (1500-2000ms) is needed for limited audience engagement, such as podcasts, gaming streams, and other virtual events. For further information on setting the audience latency level in your , see the [Call quality](../develop/ensure-channel-quality) and [Pricing](../reference/pricing#unit-pricing) guides. Thanks to ’s intelligent and global Software Defined Real-time Network ([](../overview/core-concepts#agora-sd-rtn)), you can rely on the highest available video and audio quality. @@ -34,7 +34,7 @@ This section explains how you can integrate features into yo ![Video Calling Web UIKit](/images/interactive-live-streaming/get-started-sdk-livestreaming.png) -In an event, hosts stream a video feed to an audience. For example, when a CEO is giving a speech to the company employees, the CEO does not need to see all members of the audience. To represent this in your , when you join as a host, the local feed is started and you see your own video feed. When you join as a member of the audience, you see the host's video feed. +In an event, hosts stream a video feed to an audience. For example, when a CEO is giving a speech to the company employees, the CEO does not need to see all members of the audience. To represent this in your , when you join as a host, the local feed is started and you see your own video feed. When you join as a member of the audience, you see the host's video feed. 
@@ -80,7 +80,6 @@ This section shows how to use the to implement - ## Test your implementation recommends you run this project on a physical mobile device, as some simulators may not support the full features of this project. To ensure that you have implemented in your : @@ -96,6 +95,6 @@ This section contains information that completes the information in this page, o - [Downloads](../reference/downloads) shows you how to install manually. -- To ensure communication security in a test or production environment, use a token server to generate token is recommended to ensure communication security, see [Implement the authentication workflow](../get-started/authentication-workflow). +- To ensure communication security in a test or production environment, using a token server to generate token is recommended to ensure communication security, see [Implement the authentication workflow](../get-started/authentication-workflow). diff --git a/shared/video-sdk/get-started/get-started-sdk/project-implementation/swift.mdx b/shared/video-sdk/get-started/get-started-sdk/project-implementation/swift.mdx index 0763363dd..d4e6979ae 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-implementation/swift.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-implementation/swift.mdx @@ -1,88 +1,9 @@ -import CreateUI from '@docs/assets/code/video-sdk/get-started-sdk/swift/create-ui.mdx'; -import ShowMessage from '@docs/assets/code/video-sdk/get-started-sdk/swift/show-message.mdx'; -import JoinAndLeave from '@docs/assets/code/video-sdk/get-started-sdk/swift/join-and-leave.mdx'; -import ViewDidDisappear from '@docs/assets/code/video-sdk/get-started-sdk/swift/view-did-disappear.mdx'; -import RoleAction from '@docs/assets/code/video-sdk/get-started-sdk/swift/role-action.mdx'; +import HandleEvents from '@docs/assets/code/video-sdk/get-started-sdk/swift/handle-events.mdx'; +import JoinChannel from '@docs/assets/code/video-sdk/get-started-sdk/swift/join-channel.mdx'; 
+import LeaveChannel from '@docs/assets/code/video-sdk/get-started-sdk/swift/leave-channel.mdx'; -### Implement the user interface - -To implement the user interface, create code with: - -- Views for local and remote video. - -- A button for the user to **Join** or **Leave** the channel. - - -- A selector so the user can join a channel as host or audience. - - -To create this UI, in `ViewController`, replace the contents of the file with the following: - - - -### Handle the system logic - -When your launches, ensure that the permissions necessary to insert feature into the are granted. If the permissions are not granted, use the built-in feature to request them; if they are, return `true`. - -1. **Import ** - - In `ViewController`, add the following line after the last `import` statement: - - ``` swift - import AgoraRtcKit - ``` - - If Xcode does not recognize this import, click **File** > **Packages** > **Reset Package Caches**. - -2. **Handle hardware permissions on the device** - - In `ViewController`, add the following lines after the `buttonAction(sender: UIButton!)` function: - - ``` swift - func checkForPermissions() async -> Bool { - var hasPermissions = await self.avAuthorization(mediaType: .video) - // Break out, because camera permissions have been denied or restricted. - if !hasPermissions { return false } - hasPermissions = await self.avAuthorization(mediaType: .audio) - return hasPermissions - } - - func avAuthorization(mediaType: AVMediaType) async -> Bool { - let mediaAuthorizationStatus = AVCaptureDevice.authorizationStatus(for: mediaType) - switch mediaAuthorizationStatus { - case .denied, .restricted: return false - case .authorized: return true - case .notDetermined: - return await withCheckedContinuation { continuation in - AVCaptureDevice.requestAccess(for: mediaType) { granted in - continuation.resume(returning: granted) - } - } - @unknown default: return false - } - } - ``` -3. 
**Show status updates to your users** - - In `ViewController`, add the following method to the `ViewController` class: - - - - -2. **Show status updates to your users** - - In `ViewController`, add the following method to the `ViewController` class: - - - - - -### Implement the channel logic - -When a user opens this , you initialize the . When the user taps a button, the joins or leaves a channel. - -The following figure shows the call sequence of implementing . +The following figure shows the API call sequence. ![video call logic ios](/images/video-sdk/video-call-logic-ios.svg) @@ -91,131 +12,81 @@ The following figure shows the call sequence of implementing < ![ils call logic ios](/images/video-sdk/ils-call-logic-ios.svg) -To implement this logic, take the following steps: +The reference app implements the business logic in the [`AgoraManager`](https://github.com/AgoraIO/video-sdk-samples-ios/blob/main/agora-manager/AgoraManager.swift) component. +This class encapsulates core functionality such as logging in to , joining a channel, listening for events from other users and logging out. -1. **Declare the variables that you use to integrate into your ** +The following code examples show how to implement these steps in your : - Add the following lines to the top of the `ViewController` class: +1. **Import classes** ``` swift - // The main entry point for Video SDK - var agoraEngine: AgoraRtcEngineKit! - // By default, set the current user role to broadcaster to both send and receive streams. - var userRole: AgoraClientRole = .broadcaster - - // Update with the App ID of your project generated on Agora Console. - let appID = "<#Your app ID#>" - // Update with the temporary token generated in Agora Console. - var token = "<#Your temp access token#>" - // Update with the channel name you used to generate the token in Agora Console. - var channelName = "<#Your channel name#>" + import AgoraRtcKit ``` -2. 
**Initialize the ** - - To implement , you use to create an instance. In `ViewController`, add the following lines after the `leaveChannel()` function: +2. **Declare variables to create an instance and join a channel** + ``` swift - func initializeAgoraEngine() { - let config = AgoraRtcEngineConfig() - // Pass in your App ID here. - config.appId = appID - // Use AgoraRtcEngineDelegate for the following delegate parameter. - agoraEngine = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self) + // The Agora App ID for the session. + public let appId: String + // The client's role in the session. + public var role: AgoraClientRole = .audience { + didSet { agoraEngine.setClientRole(role) } } - ``` - - Each `AgoraRtcEngineKit` object supports one profile only. If you want to switch to another profile, call `destroy` to release the current `AgoraRtcEngineKit` object and then create a new one by calling `sharedEngine(with: , delegate: )` again. + // Integer ID of the local user. + @Published public var localUserId: UInt = 0 - You see a compilation error. Worry not, you fix this now by coding `ViewController` to delegate `AgoraRtcEngineDelegate`. + // The set of all users in the channel. + @Published public var allUsers: Set = [] -3. **Enable your to display a remote video stream** - - In `ViewController`, add the following lines after the `ViewController` class: - - ``` swift - extension ViewController: AgoraRtcEngineDelegate { - // Callback called when a new host joins the channel - func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { - let videoCanvas = AgoraRtcVideoCanvas() - videoCanvas.uid = uid - videoCanvas.renderMode = .hidden - videoCanvas.view = remoteView - agoraEngine.setupRemoteVideo(videoCanvas) - } - } + private var engine: AgoraRtcEngineKit? ``` + - The compilation error disappears. Yay. - -4. 
**Enable your to display a local video stream** - - In `ViewController`, add the following lines after the `initializeAgoraEngine` function: - + ``` swift - func setupLocalVideo() { - // Enable the video module - agoraEngine.enableVideo() - // Start the local video preview - agoraEngine.startPreview() - let videoCanvas = AgoraRtcVideoCanvas() - videoCanvas.uid = 0 - videoCanvas.renderMode = .hidden - videoCanvas.view = localView - // Set the local video view - agoraEngine.setupLocalVideo(videoCanvas) + // The Agora App ID for the session. + public let appId: String + // The client's role in the session. + public var role: AgoraClientRole = .audience { + didSet { agoraEngine.setClientRole(role) } } - ``` - - -
    You can enable both cameras using `enableMultiCamera`.
    -
    + // Integer ID of the local user. + @Published public var localUserId: UInt = 0 -5. **Join and leave a channel** + // The set of all users in the channel. + @Published public var allUsers: Set = [] - - You assign all users in the channel the `.broadcaster` role. This role has rights to stream video and audio to a channel. For , set all users as `.broadcaster`. - - - You assign event hosts the `.broadcaster` role. This role has rights to stream video and audio to a channel, the audience views content streamed to the channel by the broadcaster, For , set the role chosen by the user. + private var engine: AgoraRtcEngineKit? + ``` - In `ViewController`, replace the existing `joinChannel()` and `leaveChannel()` functions with the following: - - - - -6. **Enable the user to join a channel as the host or the audience** - In `ViewController`, replace the existing `func roleAction(sender: UISegmentedControl!)` function with the following: - - - - - -### Start and stop your - -In this implementation, you initiate when you open the . The user joins and leaves a call using the `Join` button. - -To implement this feature: - -1. **Initialize and local video when the view is loaded** - - In `ViewController`, update `viewDidLoad` as follows: +3. **Configure an instance** ``` swift - override func viewDidLoad() { - super.viewDidLoad() - // Do any additional setup after loading the view. - // Initializes the video view - initViews() - // The following functions are used when calling Agora APIs - initializeAgoraEngine() + // The Agora RTC Engine Kit for the session. + public var agoraEngine: AgoraRtcEngineKit { + if let engine { return engine } + return setupEngine() + } + + open func setupEngine() -> AgoraRtcEngineKit { + let eng = AgoraRtcEngineKit.sharedEngine(withAppId: appId, delegate: self) + eng.enableVideo() + eng.setClientRole(role) + self.engine = eng + return eng } ``` -2. **Leave the channel and clean up all the resources used by your ** +4. 
**Handle and respond to events** + + + +5. **Join a channel to start ** - In `ViewController`, add the following lines after the `viewDidLoad` function: + - +6. **Leave the channel and clean up the resources used by the app when the local user ends the call** + diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/swift.mdx b/shared/video-sdk/get-started/get-started-sdk/project-setup/swift.mdx index 76f40cec4..28c5a1d96 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-setup/swift.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-setup/swift.mdx @@ -1,56 +1,14 @@ -1. [Create a new project](https://help.apple.com/xcode/mac/current/#/dev07db0e578) for this using the **App** template. Select the **Storyboard** Interface and **Swift** Language. - - If you have not already added team information, click **Add account…**, input your Apple ID, then click **Next**. - -1. [Enable automatic signing](https://help.apple.com/xcode/mac/current/#/dev23aab79b4) for your project. - - [Set the target devices](https://help.apple.com/xcode/mac/current/#/deve69552ee5) to deploy your iOS to an iPhone or iPad. - -1. Add project permissions for microphone and camera usage: - - 1. Open **Info** in the project navigation panel, then add the following properties to the [Information Property List](https://help.apple.com/xcode/mac/current/#/dev3f399a2a6): - - | Key | Type | Value | - |------------------------------|--------|------------------------| - | NSMicrophoneUsageDescription | String | Access the microphone. | - | NSCameraUsageDescription | String | Access the camera. | - - - 2. Add sandbox and runtime capabilities to your project: - - Open the target for your project in the project navigation properties, then add the following capabilities in **Signing & Capabilities**. 
- - **App Sandbox**: - - Incoming Connections (Server) - - Outgoing Connections (Client) - - Camera - - Audio Input - - **Hardened Runtime**: - - Camera - - Audio Input - - -1. Integrate into your project: - - These steps are for package install, if you prefer to use **CocoaPods** or manually install, follow the [installation instructions](../reference/downloads#manual-installation). - - 1. In Xcode, click **File** > **Add Packages**, then paste the following link in the search: - - ``` - https://github.com/AgoraIO/AgoraRtcEngine_macOS.git - ``` - - - ``` - https://github.com/AgoraIO/AgoraRtcEngine_iOS.git - ``` - - You see the available packages. Add the **** package and any other functionality that you want to integrate into your app. For example, _AgoraAINoiseSuppressionExtension_. Choose a version later than 4.0.0. - - 1. Click **Add Package**. In the new window, make sure to choose **** and click **Add Package**. - - You see **AgoraRtcKit** in **Package Dependencies** for your project. +1. Clone the [ sample project](https://github.com/AgoraIO/video-sdk-samples-ios) to `` on + your + development machine: + ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-ios.git + ``` +1. Open the sample project in Xcode. + Select **File** > **Open...** then navigate to `/video-sdk-samples-ios/Docs-Examples.xcodeproj` and click **Open**. Xcode loads the project. +1. Connect a physical or virtual device to your development environment. diff --git a/shared/video-sdk/get-started/get-started-sdk/project-test/swift.mdx b/shared/video-sdk/get-started/get-started-sdk/project-test/swift.mdx index c0ad41998..8969ab4d9 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-test/swift.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-test/swift.mdx @@ -1,7 +1,8 @@ -3. In Xcode, in `ViewController`, update `appID`, `channelName` and `token` with the values for your temporary token. +3. 
In the `video-sdk-samples-ios` reference app, open the `DocsAppConfig.swift` file and update the values of `appId`, + `channel`, and `rtcToken` with the values for your temporary token. -2. Run your , then wait a few seconds until the installation is complete. +4. Run your , then wait a few seconds until the installation is complete. If this is the first time you run the project, grant microphone and camera access to your . @@ -10,9 +11,9 @@
    - 5. Select an option and click **Join** to start a session. When you join as a **Host**, the local video is published and played in the . When you join as **Audience**, the remote stream is subscribed and played. +5. Select an option and click **Join** to start a session. When you join as a **Host**, the local video is published and played in the . When you join as **Audience**, the remote stream is subscribed and played. - 5. Click **Join** to start a call. Now, you can see yourself on the test device and talk to the web demo app using your . +5. Click **Join** to start a call. Now, you can see yourself on the test device and talk to the web demo app using your . \ No newline at end of file From 18260285f646b86e702b30acb365c88ef16315fd Mon Sep 17 00:00:00 2001 From: Dasun Nirmitha Date: Fri, 18 Aug 2023 00:00:46 +0530 Subject: [PATCH 015/184] POC3 iOS call-quality changes. --- .../develop/_ensure-channel-quality.mdx | 14 +- .../project-implementation/swift.mdx | 303 ++++++------------ .../project-setup/index.mdx | 2 - .../project-setup/ios.mdx | 3 - .../project-test/swift.mdx | 40 +-- .../ensure-channel-quality/reference/ios.mdx | 2 - .../reference/swift.mdx | 44 +-- 7 files changed, 125 insertions(+), 283 deletions(-) delete mode 100644 shared/video-sdk/develop/ensure-channel-quality/project-setup/ios.mdx diff --git a/shared/video-sdk/develop/_ensure-channel-quality.mdx b/shared/video-sdk/develop/_ensure-channel-quality.mdx index fa3f91fb9..19feaabd3 100644 --- a/shared/video-sdk/develop/_ensure-channel-quality.mdx +++ b/shared/video-sdk/develop/_ensure-channel-quality.mdx @@ -4,6 +4,8 @@ import ProjectSetup from '@docs/shared/video-sdk/develop/ensure-channel-quality/ import ProjectImplement from '@docs/shared/video-sdk/develop/ensure-channel-quality/project-implementation/index.mdx'; import ProjectTest from '@docs/shared/video-sdk/develop/ensure-channel-quality/project-test/index.mdx'; import Reference from 
'@docs/shared/video-sdk/develop/ensure-channel-quality/reference/index.mdx'; +import {PlatformWrapper} from "../../../../src/mdx-components/PlatformWrapper"; + Customer satisfaction for your integrated depends on the quality of video and audio it provides. Quality of audiovisual communication through your is affected by the following factors: @@ -104,16 +106,16 @@ The following figure shows the workflow you need to implement to ensure channel ## Prerequisites -In order to follow this procedure you must have: - -* Implemented the [](../get-started/get-started-sdk) project for . - - +To follow this procedure you must have: +* Implemented the [](../get-started/get-started-sdk#prerequisites) project for . + ## Project setup To create the environment necessary to implement call quality best practices into your , open the [](../get-started/get-started-sdk) project you created previously. + + ## Implement best practice to optimize call quality @@ -126,7 +128,7 @@ This section shows you how to integrate call quality optimization features of : -1. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in . +1. [Generate a temporary token](../reference/manage-agora-account#generate-a-temporary-token) in . 2. In your browser, navigate to the dual stream web demo and update _App ID_, _Channel_, and _Token_ with the values for your temporary token, then click **Join**. 
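The last mile probe test described above reports quality levels that the UI must translate for users. A minimal standalone sketch of that mapping, with no Agora dependency — the enum below is a hypothetical stand-in for the SDK's network-quality levels, not the SDK type itself:

``` swift
// Hypothetical mirror of the SDK's network-quality levels (0 = unknown ... 6 = down).
enum NetworkQuality: Int {
    case unknown = 0, excellent, good, poor, bad, veryBad, down
}

// Map a quality level to a short label for a status indicator.
func qualityLabel(for quality: NetworkQuality) -> String {
    switch quality {
    case .excellent: return "Excellent"
    case .good: return "Good"
    case .poor: return "Poor"
    case .bad, .veryBad: return "Bad"
    case .down: return "No connection"
    case .unknown: return "Unknown"
    }
}
```

In a real app you would call such a helper from the SDK's quality callbacks and bind the result to your status view.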
diff --git a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/swift.mdx b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/swift.mdx index bffe01a06..065753263 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/project-implementation/swift.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/project-implementation/swift.mdx @@ -1,52 +1,17 @@ -import ImplementDeclarations from '@docs/assets/code/video-sdk/ensure-channel-quality/swift/implement-declarations.mdx'; -import ImplementLabels from '@docs/assets/code/video-sdk/ensure-channel-quality/swift/implement-labels.mdx'; -import NetworkStatus from '@docs/assets/code/video-sdk/ensure-channel-quality/swift/implement-network-status.mdx'; - -### Implement the user interface - -This section guides you through the necessary UI changes in the project interface to implement call quality features. - -To implement this functionality for a UIKit based app, retrieve an Agora Engine instance with a call to `agoraView.agkit`. - - -To enable users to see the network status: - -1. Add the following declarations to the top of the `ViewController` class: - - - -1. Paste the following lines inside the `initViews` function: - - - -### Handle the system logic - -In order to display the network status visually: +To implement the call quality features, take the following steps: -1. **Define variables to manage test state and workflow** +1. **Import classes** - Add the following declarations to the `ViewController`: + The following lines import the necessary classes and interfaces into the project: ```swift - var counter1 = 0 // Controls the frequency of messages - var counter2 = 0 // Controls the frequency of messages - var remoteUid: UInt = 0 // Uid of the remote user - var highQuality = true // Quality of the remote video stream being played + import SwiftUI + import AgoraRtcKit ``` -1. 
**Update the network status indication in the UI** - - To show the network quality result visually to the user, add the following method to the `ViewController`: - - - -### Implement features to ensure quality - -To implement the call quality features, take the following steps: - 1. **Enable the user to test the network** - In the `ViewController`, add the following function: + The following method starts the last mile probe test to check network conditions: ```swift func startProbeTest() { @@ -61,190 +26,116 @@ To implement the call quality features, take the following steps: // The expected downlink bitrate (bps). The value range is [100000,5000000]. config.expectedDownlinkBitrate = 100000 - agoraEngine.startLastmileProbeTest(config); - - showMessage(title:"Probe Test", text:"Running the last mile probe test ...") + agoraEngine.startLastmileProbeTest(config) } ``` -1. **Set the audio scenario** - - Inside the `initializeAgoraEngine` function, replace the `agoraEngine = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self)` line with the following: - - ```swift - let engineConfig = AgoraRtcEngineConfig() - engineConfig.appId = appID - engineConfig.audioScenario = AgoraAudioScenario.gameStreaming - agoraEngine = AgoraRtcEngineKit.sharedEngine(with: engineConfig, delegate: self) - ``` - 1. **Implement best practice for app initiation** - When starting your , the is initialized in `initializeAgoraEngine`. After initialization, do the following: - - * _Enable dual stream mode_: Required for multi-user scenarios. - * _Set an audio profile and audio scenario_: Setting an audio profile is optional and only required if you have special requirements such as streaming music. - * _Set the video profile_: Setting a video profile is also optional. It is useful when you want to change one or more of `mirrorMode`, `frameRate`, `bitrate`, `dimensions`, `orientationMode`, or `degradationPreference` from the default setting to custom values. 
- For more information, see [video profile table](#video-profile-table). - * _Start the network probe test_: A quick test at startup to gauge the network quality. - - To implement these features, add the following code inside the end of `initializeAgoraEngine`: + The `CallQualityView` struct displays the video feeds of all participants in a channel, along with their call quality statistics such as the following: + + * _receivedBitrate_: The bitrate (Kbps) of the remote video received since the last count. + * _width_: The width (pixels) of the video. + * _height_: The height (pixels) of the video. + * _receivedFrameRate_: The frame rate (Kbps) received since the last count. + * _frameLossRate_: The packet loss rate (%) of the remote video. + * _packetLossRate_: The packet loss rate (%) of the remote video after using the anti-packet-loss technology. + * _captureFrameWidth_: The width (px) for capturing the local video stream. + * _captureFrameHeight_: The height (px) for capturing the local video stream. + * _captureFrameRate_: The frame rate (fps) for capturing the local video stream. + * _encodedFrameWidth_: The width of the encoded video (px). + * _encodedFrameHeight_: The height of the encoded video (px). + * _encoderOutputFrameRate_: The output frame rate (fps) of the local video encoder. + * _sentFrameRate_: The actual frame rate (fps) while sending the local video stream. This value does not include the frame rate for resending the video after packet loss. + * _sentBitrate_: The actual bitrate (Kbps) while sending the local video stream. This value does not include the bitrate for resending the video after packet loss. + * _txPacketLossRate_: The video packet loss rate (%) from the local client to the Agora server before applying the anti-packet loss strategies. + + ``` swift + struct CallQualityView: View { + /// The Agora SDK manager for call quality. 
+ @ObservedObject var agoraManager = CallQualityManager( + appId: DocsAppConfig.shared.appId, role: .broadcaster + ) + + var body: some View { + ScrollView { + VStack { + ForEach(Array(agoraManager.allUsers), id: \.self) { uid in + AgoraVideoCanvasView(manager: agoraManager, uid: uid) + .aspectRatio(contentMode: .fit).cornerRadius(10) + .overlay(alignment: .topLeading) { + Text(agoraManager.callQualities[uid] ?? "no data").padding(4) + .background { + #if os(iOS) + VisualEffectView(effect: UIBlurEffect(style: .systemMaterial)) + .cornerRadius(10).blur(radius: 1).opacity(0.75) + #endif + }.padding(4) + } + } + }.padding(20) + }.onAppear { + await agoraManager.joinChannel(DocsAppConfig.shared.channel) + }.onDisappear { + agoraManager.leaveChannel() + } + } - ```swift - // Enable the dual stream mode - agoraEngine.enableDualStreamMode(true) - // Set audio profile - agoraEngine.setAudioProfile(AgoraAudioProfile.default) - // Set the video profile - let videoConfig = AgoraVideoEncoderConfiguration() - // Set Mirror mode - videoConfig.mirrorMode = AgoraVideoMirrorMode.auto - // Set Framerate - videoConfig.frameRate = AgoraVideoFrameRate.fps10 - // Set Bitrate - videoConfig.bitrate = AgoraVideoBitrateStandard - // Set Dimensions - videoConfig.dimensions = AgoraVideoDimension640x360 - // Set orientation mode - videoConfig.orientationMode = AgoraVideoOutputOrientationMode.adaptative - // Set degradation preference - videoConfig.degradationPreference = AgoraDegradationPreference.balanced - // Apply the configuration - agoraEngine.setVideoEncoderConfiguration(videoConfig) - - // Start the probe test - startProbeTest() + init(channelId: String) { + DocsAppConfig.shared.channel = channelId + } + static let docPath = getFolderName(from: #file) + static let docTitle = LocalizedStringKey("ensure-channel-quality-title") + } ``` -1. **Listen to events to receive state change notifications and quality statistics.** +1. 
**Listen to events to receive network quality statistics.** - Add the following event handlers to receive state change notifications and quality statistics: + The `CallQualityManager` class holds the following event handlers to get quality statistics of local and remote users: - * `onLastmileQuality`: Receives the network quality result. * `onLastmileProbeResult`: Receives detailed probe test results. - * `onNetworkQuality`: Receives statistics on network quality. - * `onRtcStats`: Receives the stats. - * `onRemoteVideoStateChanged`: Receives notification regarding any change in the state of the remote video. * `onLocalVideoStats`: Receives stats about the local video. - - In the `ViewController.swift` file, add the following event handlers inside `extension ViewController: AgoraRtcEngineDelegate` along with the existing event handlers: + * `remoteVideoStats`: Receives stats about the remote video. ```swift - func rtcEngine(_ engine: AgoraRtcEngineKit, lastmileQuality quality: AgoraNetworkQuality) { - self.updateNetworkStatus(quality: Int(quality.rawValue)) - } - - func rtcEngine(_ engine: AgoraRtcEngineKit, lastmileProbeTest result: AgoraLastmileProbeResult) { - agoraEngine.stopLastmileProbeTest() + public func rtcEngine(_ engine: AgoraRtcEngineKit, lastmileProbeTest result: AgoraLastmileProbeResult) { + engine.stopLastmileProbeTest() // The result object contains the detailed test results that help you // manage call quality. For example, the downlink jitter" - showMessage(title: "Downlink jitter", text: String(result.downlinkReport.jitter)) + print("downlink jitter: \(result.downlinkReport.jitter)") } - func rtcEngine(_ engine: AgoraRtcEngineKit, networkQuality: UInt, txQuality: AgoraNetworkQuality, rxQuality: AgoraNetworkQuality) { - // Use DownLink NetQuality to update the network status - self.updateNetworkStatus(quality: Int(rxQuality.rawValue)) + /** Updates the call quality statistics for a remote user. + + - Parameters: + - engine: The Agora SDK engine. 
+ - stats: The remote video statistics. + */ + public func rtcEngine(_ engine: AgoraRtcEngineKit, remoteVideoStats stats: AgoraRtcRemoteVideoStats) { + self.callQualities[stats.uid] = """ + Received Bitrate = \(stats.receivedBitrate) + Frame = \(stats.width)x\(stats.height), \(stats.receivedFrameRate)fps + Frame Loss Rate = \(stats.frameLossRate) + Packet Loss Rate = \(stats.packetLossRate) + """ } - func rtcEngine(_ engine: AgoraRtcEngineKit, reportRtcStats: AgoraChannelStats) { - counter1 += 1 - var msg = "" - - if (counter1 == 5) { - msg = "\(String(reportRtcStats.userCount)) user(s)" - } else if (counter1 == 10 ) { - msg = "Packet loss rate: \(String(reportRtcStats.rxPacketLossRate))" - counter1 = 0 - } - - if (msg.count > 0) { showMessage(title: "Video SDK Stats", text: msg) } - } - - func rtcEngine(_ engine: AgoraRtcEngineKit, remoteVideoStateChangedOfUid: UInt, state: AgoraVideoRemoteState, reason: AgoraVideoRemoteReason, elapsed: Int) { - let stateChangeReport = ["Uid = \(remoteVideoStateChangedOfUid)", "NewState = \(state):", "Reason = \(reason):", - "Elapsed = \(elapsed)"].joined(separator: "\n") - - showMessage(title: "Remote video state changed:", text: stateChangeReport, delay: 8) - } - - func rtcEngine(_ engine: AgoraRtcEngineKit, localVideoStats: AgoraRtcLocalVideoStats, sourceType: AgoraVideoSourceType) { - counter2 += 1 - - if (counter2 == 5) { - let localVideoStatsReport = ["SentBitrate = \(localVideoStats.sentBitrate)", "codecType = \(localVideoStats.codecType)"].joined(separator: "\n") - counter2 = 0; - showMessage(title: "Local Video Stats:", text: localVideoStatsReport) - } - } - ``` - - Each event reports the statistics of the audio video streams from each remote user and host. - -1. 
**Switch stream quality** - - To take advantage of dual-stream mode and switch remote video quality to high or low, add the following to the `ViewController` class: - - ```swift - func setStreamQuality() { - highQuality = !highQuality - - if (highQuality) { - agoraEngine.setRemoteVideoStream(remoteUid, type: AgoraVideoStreamType.high) - showMessage(title: "Stream Quality", text: "Switching to high-quality video") - } else { - agoraEngine.setRemoteVideoStream(remoteUid, type: AgoraVideoStreamType.low) - showMessage(title: "Stream Quality", text: "Switching to low-quality video") - } + /** Updates the call quality statistics for the local user. + + - Parameters: + - engine: The Agora SDK engine. + - stats: The local video statistics. + - sourceType: The type of video source. + */ + public func rtcEngine( + _ engine: AgoraRtcEngineKit, localVideoStats stats: AgoraRtcLocalVideoStats, + sourceType: AgoraVideoSourceType + ) { + self.callQualities[self.localUserId] = """ + Captured Frame = \(stats.captureFrameWidth)x\(stats.captureFrameHeight), \(stats.captureFrameRate)fps + Encoded Frame = \(stats.encodedFrameWidth)x\(stats.encodedFrameHeight), \(stats.encoderOutputFrameRate)fps + Sent Data = \(stats.sentFrameRate)fps, bitrate: \(stats.sentBitrate) + Packet Loss Rate = \(stats.txPacketLossRate) + """ } ``` - - To change the quality, add the quality action before `@objc func buttonAction(sender: NSButton!) {` - - ```swift - @objc func qualityAction(sender: NSButton!) { - setStreamQuality() - } - ``` - - - To fire this method when the user taps the remote view panel: - - a. Add the following code inside the `initViews` function: - - ```swift - // Create a gesture recognizer (tap gesture) - let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap(sender:))) - - // Add the gesture recognizer to a view - remoteView.addGestureRecognizer(tapGesture) - ``` - - b. 
Add a `handleTap` function inside the `ViewController`: - - ```swift - @objc func handleTap(sender: UITapGestureRecognizer) { - setStreamQuality() - } - ``` - -1. **Obtain the user id of the remote user that joins the channel** - - In the `extension ViewController: AgoraRtcEngineDelegate {` add the following line to the `func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) {` callback: - - ```swift - remoteUid = uid - ``` - -1. **Configure the log file** - - To customize the location, content and size of log files, add the following code to `initializeAgoraEngine` before `agoraEngine = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self)`: - - ```swift - let logConfig = AgoraLogConfig() - logConfig.filePath = "AppSandbox/Library/caches/agorasdk1.log" // Default path AppSandbox/Library/caches/agorasdk.log - logConfig.fileSizeInKB = 256 // Range 128-1024 Kb - logConfig.level = .warn - config.logConfig = logConfig - ``` - - If you want to upload the log file automatically to a CDN, use the method `setLocalAccessPoint(withConfig: AgoraLocalAccessPointConfiguration)` to specify the local access point and assign the native access module to the SDK. 
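The statistics overlays built in this implementation are plain interpolated strings, so the formatting can be checked without the SDK. A standalone sketch of the same formatting idea — the struct is a hypothetical stand-in for `AgoraRtcRemoteVideoStats`, not the SDK type:

``` swift
// Hypothetical stand-in for the SDK's remote video statistics object.
struct RemoteStats {
    let receivedBitrate: Int   // Kbps
    let width: Int, height: Int
    let receivedFrameRate: Int // fps
    let frameLossRate: Int     // %
}

// Build the overlay text the same way the reference app interpolates its stats.
func formatRemoteStats(_ stats: RemoteStats) -> String {
    """
    Received Bitrate = \(stats.receivedBitrate)
    Frame = \(stats.width)x\(stats.height), \(stats.receivedFrameRate)fps
    Frame Loss Rate = \(stats.frameLossRate)
    """
}
```

Keeping the formatting in a pure function like this makes it easy to unit-test the overlay text separately from the engine callbacks.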
diff --git a/shared/video-sdk/develop/ensure-channel-quality/project-setup/index.mdx b/shared/video-sdk/develop/ensure-channel-quality/project-setup/index.mdx index 1ef8811fc..20cd16e0e 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/project-setup/index.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/project-setup/index.mdx @@ -1,5 +1,4 @@ import Android from './android.mdx'; -import Ios from './ios.mdx'; import Web from './web.mdx'; import Unity from './unity.mdx'; import Electron from './electron.mdx'; @@ -7,7 +6,6 @@ import Flutter from './flutter.mdx'; import ReactNative from './react-native.mdx'; - diff --git a/shared/video-sdk/develop/ensure-channel-quality/project-setup/ios.mdx b/shared/video-sdk/develop/ensure-channel-quality/project-setup/ios.mdx deleted file mode 100644 index 52ad025f4..000000000 --- a/shared/video-sdk/develop/ensure-channel-quality/project-setup/ios.mdx +++ /dev/null @@ -1,3 +0,0 @@ - - - diff --git a/shared/video-sdk/develop/ensure-channel-quality/project-test/swift.mdx b/shared/video-sdk/develop/ensure-channel-quality/project-test/swift.mdx index c65c53379..d291fa86a 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/project-test/swift.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/project-test/swift.mdx @@ -1,42 +1,26 @@ -3. In your project, open `ViewController`, and update `appID`, `channelName` and `token` with the values for your temporary token. +3. In the `video-sdk-samples-ios` reference app, open `DocsAppConfig.swift` file and update the values of `appId`, + `channel`, and `rtcToken` with the values for your temporary token. -4. Run your . +4. Run your , then wait a few seconds until the installation is complete. If this is the first time you run the project, grant microphone and camera access to your . + - If you use an iOS simulator, you see the remote video only. 
You cannot see the local video stream because of [Apple simulator hardware restrictions](https://help.apple.com/simulator/mac/current/#/devb0244142d). + If you use an iOS simulator, you see the remote video only. You cannot see the local video stream because of [Apple simulator hardware restrictions](https://help.apple.com/simulator/mac/current/#/devb0244142d). -5. When the starts, it does the following: - - * Sets the log file location, size, and logging level according to your preference. - * Enables the dual-stream mode. - * Sets the audio profile. - * Sets the video profile. - * Starts the network probe test. - - You see the result of the network probe test displayed in the network status icon. - - 6. Select an option and click **Join** to start a session. When you join as a *Host*, the local video is published and played in the . When you join as *Audience*, the remote stream is subscribed and played. +5. Select an option and click **Join** to start a session. When you join as a **Host**, the local video is published and played in the . When you join as **Audience**, the remote stream is subscribed and played. - 6. Click **Join** to start a call. +5. Click **Join** to start a call. Now, you can see yourself on the test device and talk to the web demo app using your . -7. Install the on a second iOS device and join the same channel. - -8. After joining a channel, you receive toast messages informing you of some selected call statistics, including: - - * The number of users in the channel - * Packet loss rate - * Local video stats - * Remote video state changes - -9. You see the network status indicator updated periodically based on the result of the `onNetworkQuality` callback. - -10. Tap the remote video panel. You see the remote video switches from high-quality to low-quality. Tapping the remote video again switches back to hight-quality video. - +6. 
After joining a channel, you receive toast messages informing you of some selected call statistics, including: + * Received bitrate (Kbps) + * Remote video frame loss rate (%) + * Local video capturing frame rate (fps) + * Sent video packet loss rate (%) diff --git a/shared/video-sdk/develop/ensure-channel-quality/reference/ios.mdx b/shared/video-sdk/develop/ensure-channel-quality/reference/ios.mdx index 51328f282..d44ba0a5f 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/reference/ios.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/reference/ios.mdx @@ -2,8 +2,6 @@ import Source from './swift.mdx'; -For more detailed information, about the connection state and reasons, see connectionChangedTo. - \ No newline at end of file diff --git a/shared/video-sdk/develop/ensure-channel-quality/reference/swift.mdx b/shared/video-sdk/develop/ensure-channel-quality/reference/swift.mdx index 075dde9e6..5b659637e 100644 --- a/shared/video-sdk/develop/ensure-channel-quality/reference/swift.mdx +++ b/shared/video-sdk/develop/ensure-channel-quality/reference/swift.mdx @@ -98,48 +98,20 @@ The way video is displayed on the playing device depends on `orientationMode` us ### API reference -- AgoraVideoEncoderConfiguration +- AgoraLastmileProbeResult -- AgoraRtcEngineConfig +- AgoraRtcRemoteVideoStats -- AgoraAudioScenario +- AgoraRtcLocalVideoStats -- enableDualStreamMode - -- setAudioProfile - -- setVideoEncoderConfiguration - -- AgoraLastmileProbeConfig - -- startLastmileProbeTest - -- setRemoteVideoStream - -- AgoraLogConfig - -- setLocalAccessPoint +- AgoraVideoSourceType -- AgoraVideoEncoderConfiguration - -- AgoraRtcEngineConfig - -- AgoraAudioScenario - -- enableDualStreamMode - -- setAudioProfile - -- setVideoEncoderConfiguration - -- AgoraLastmileProbeConfig - -- startLastmileProbeTest +- AgoraLastmileProbeResult -- setRemoteVideoStream +- AgoraRtcRemoteVideoStats -- AgoraLogConfig +- AgoraRtcLocalVideoStats -- setLocalAccessPoint +- AgoraVideoSourceType 
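The probe configuration used in this guide notes an expected-bitrate range of [100000, 5000000] bps. A standalone helper that keeps caller-supplied values inside that range — hypothetical, not an SDK API:

``` swift
// Documented value range for expectedUplinkBitrate / expectedDownlinkBitrate (bps).
let probeBitrateRange = 100_000...5_000_000

// Clamp a requested bitrate into the range accepted by the probe test configuration.
func clampedProbeBitrate(_ requested: Int) -> Int {
    min(max(requested, probeBitrateRange.lowerBound), probeBitrateRange.upperBound)
}
```

Validating values before assigning them to the probe config avoids relying on the SDK's behavior for out-of-range input.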
From eeef267dddb6d091b970adf703e484aa7fbfa333 Mon Sep 17 00:00:00 2001 From: saudsami Date: Fri, 18 Aug 2023 16:39:26 +0500 Subject: [PATCH 016/184] Updated doc structure --- .../get-started/get-started-sdk.mdx | 2 +- .../get-started/get-started-sdk.mdx | 2 +- shared/common/project-setup/android.mdx | 15 + .../project-setup/electron.mdx | 0 .../project-setup/flutter.mdx | 0 .../project-setup/index.mdx | 0 .../project-setup/ios.mdx | 0 .../project-setup/macos.mdx | 0 .../project-setup/react-native.mdx | 0 .../project-setup/swift.mdx | 0 .../project-setup/unity.mdx | 0 .../project-setup/web.mdx | 0 .../project-setup/windows.mdx | 0 .../get-started-sdk/index.mdx} | 6 +- .../project-implementation/android.mdx | 695 +++++------------- .../get-started-sdk/project-setup/android.mdx | 49 -- .../get-started-sdk/project-test/android.mdx | 14 +- video-calling/get-started/get-started-sdk.mdx | 2 +- 18 files changed, 196 insertions(+), 589 deletions(-) create mode 100644 shared/common/project-setup/android.mdx rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/electron.mdx (100%) rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/flutter.mdx (100%) rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/index.mdx (100%) rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/ios.mdx (100%) rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/macos.mdx (100%) rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/react-native.mdx (100%) rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/swift.mdx (100%) rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/unity.mdx (100%) rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/web.mdx (100%) rename shared/{video-sdk/get-started/get-started-sdk => common}/project-setup/windows.mdx (100%) rename 
shared/video-sdk/{_get-started-sdk.mdx => get-started/get-started-sdk/index.mdx} (97%) delete mode 100644 shared/video-sdk/get-started/get-started-sdk/project-setup/android.mdx diff --git a/broadcast-streaming/get-started/get-started-sdk.mdx b/broadcast-streaming/get-started/get-started-sdk.mdx index 1ca6d22cf..930fd966a 100644 --- a/broadcast-streaming/get-started/get-started-sdk.mdx +++ b/broadcast-streaming/get-started/get-started-sdk.mdx @@ -6,7 +6,7 @@ description: > Rapidly develop and easily enhance your social, work, education and IoT apps with face-to-face interaction. --- -import GetStartedSDK from '@docs/shared/video-sdk/_get-started-sdk.mdx'; +import GetStartedSDK from '@docs/shared/video-sdk/get-started/get-started-sdk/index.mdx'; export const toc = [{}]; diff --git a/interactive-live-streaming/get-started/get-started-sdk.mdx b/interactive-live-streaming/get-started/get-started-sdk.mdx index 3f17c340e..4132c5ba2 100644 --- a/interactive-live-streaming/get-started/get-started-sdk.mdx +++ b/interactive-live-streaming/get-started/get-started-sdk.mdx @@ -6,7 +6,7 @@ description: > Rapidly develop and easily enhance your social, work, education and IoT apps with face-to-face interaction. --- -import GetStartedSDK from '@docs/shared/video-sdk/_get-started-sdk.mdx'; +import GetStartedSDK from '@docs/shared/video-sdk/get-started/get-started-sdk/index.mdx'; export const toc = [{}]; diff --git a/shared/common/project-setup/android.mdx b/shared/common/project-setup/android.mdx new file mode 100644 index 000000000..6654d77bc --- /dev/null +++ b/shared/common/project-setup/android.mdx @@ -0,0 +1,15 @@ + + +1. Clone the [ sample project](https://github.com/AgoraIO/video-sdk-samples-android) to `` on your development machine: + + ```bash + git clone https://github.com/AgoraIO/video-sdk-samples-android.git + ``` + +1. Open the sample project in Android Studio. 
+ + From the **File** menu select **Open...** then navigate to `/video-sdk-samples-android/android-reference-app` and click **OK**. Android Studio loads the project and Gradle sync downloads the dependencies. + +1. Connect a physical or virtual Android device to your development environment. + + diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/electron.mdx b/shared/common/project-setup/electron.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/electron.mdx rename to shared/common/project-setup/electron.mdx diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/flutter.mdx b/shared/common/project-setup/flutter.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/flutter.mdx rename to shared/common/project-setup/flutter.mdx diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/index.mdx b/shared/common/project-setup/index.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/index.mdx rename to shared/common/project-setup/index.mdx diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/ios.mdx b/shared/common/project-setup/ios.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/ios.mdx rename to shared/common/project-setup/ios.mdx diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/macos.mdx b/shared/common/project-setup/macos.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/macos.mdx rename to shared/common/project-setup/macos.mdx diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/react-native.mdx b/shared/common/project-setup/react-native.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/react-native.mdx rename to shared/common/project-setup/react-native.mdx diff --git 
a/shared/video-sdk/get-started/get-started-sdk/project-setup/swift.mdx b/shared/common/project-setup/swift.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/swift.mdx rename to shared/common/project-setup/swift.mdx diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/unity.mdx b/shared/common/project-setup/unity.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/unity.mdx rename to shared/common/project-setup/unity.mdx diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/web.mdx b/shared/common/project-setup/web.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/web.mdx rename to shared/common/project-setup/web.mdx diff --git a/shared/video-sdk/get-started/get-started-sdk/project-setup/windows.mdx b/shared/common/project-setup/windows.mdx similarity index 100% rename from shared/video-sdk/get-started/get-started-sdk/project-setup/windows.mdx rename to shared/common/project-setup/windows.mdx diff --git a/shared/video-sdk/_get-started-sdk.mdx b/shared/video-sdk/get-started/get-started-sdk/index.mdx similarity index 97% rename from shared/video-sdk/_get-started-sdk.mdx rename to shared/video-sdk/get-started/get-started-sdk/index.mdx index e863a5e61..23d336dbf 100644 --- a/shared/video-sdk/_get-started-sdk.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/index.mdx @@ -1,6 +1,6 @@ import * as data from '@site/data/variables'; import Prerequisites from '@docs/shared/common/prerequities.mdx'; -import ProjectSetup from '@docs/shared/video-sdk/get-started/get-started-sdk/project-setup/index.mdx'; +import Setup from '@docs/shared/common/project-setup/index.mdx'; import ProjectImplement from '@docs/shared/video-sdk/get-started/get-started-sdk/project-implementation/index.mdx'; import ProjectTest from '@docs/shared/video-sdk/get-started/get-started-sdk/project-test/index.mdx'; import Reference 
from '@docs/shared/video-sdk/get-started/get-started-sdk/reference/index.mdx'; @@ -67,9 +67,9 @@ In order to follow this procedure you must have: To integrate into your , do the following: - + -You are ready to add features to your . +You are ready to implement features using the reference app. ## Implement a client for diff --git a/shared/video-sdk/get-started/get-started-sdk/project-implementation/android.mdx b/shared/video-sdk/get-started/get-started-sdk/project-implementation/android.mdx index c49b83036..30464a171 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-implementation/android.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-implementation/android.mdx @@ -1,219 +1,6 @@ -### Implement the user interface - -In the interface, create frames for local and remote video, and the UI elements to join and leave a channel. In `/app/res/layout/activity_main.xml`, replace the contents of the file with the following: - - - - ``` xml - - - - - - - - - - - - - - - ``` - - - - ``` html - - - - - Get started with interactive live streaming - - - -

    - Get started with interactive live streaming
    - ```
    - ``` html
    - Get started with broadcast streaming
    - ```
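Both pages drive the same controls: Join and Leave buttons plus a host/audience role choice that must be made before joining. An SDK-free sketch of that state handling (the `createCallState` and `reduce` helpers are illustrative, not from the sample):

```javascript
// Illustrative state for the UI above: a user picks a role, joins, and leaves.
function createCallState() {
  return { joined: false, role: "" };
}

function reduce(state, action) {
  switch (action.type) {
    case "join":
      // The sample refuses to join until a role is selected.
      if (state.role === "") throw new Error("Select a user role first!");
      return { ...state, joined: true };
    case "leave":
      return { ...state, joined: false };
    case "setRole":
      return { ...state, role: action.role };
    default:
      return state;
  }
}

let state = createCallState();
state = reduce(state, { type: "setRole", role: "host" });
state = reduce(state, { type: "join" });
console.log(state.joined, state.role); // → true host
```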
    +
    -This UI looks like:
    -
    -![Interface](/images/video-sdk/quickstart_interface_web_VIDEO.png)
    -
    -![Interface](/images/video-sdk/quickstart_interface_ILS.png)
    -
    +Best practice is to separate the  workflows from your UI implementation. The
    + sample project implements  logic in the
    +[AgoraManager](https://github.com/AgoraIO/video-sdk-samples-js/blob/vsdk-v1/src/agora_manager/agora_manager.js) component.
    +This encapsulates the `RTCEngine` instance and core functionality as illustrated by the excerpts below:

    ### Implement the channel logic

    -The following figure shows the API call sequence.
    +The following figure shows the API call sequence.

    ![Interface](/images/video-sdk/video-call-logic-web.png)

    @@ -100,552 +19,121 @@ The following figure shows the API call sequence.

    To implement this logic, you need to take the following steps:

    -1. Create an instance of the . Call AgoraRTC.createClient.
    -
    -2. To connect to a channel, call join and pass the App ID, user ID, token, and channel name.
    -
    -3. Create media tracks:
    +1. **Import the classes and interfaces**:

    -    1. Call createMicrophoneAudioTrack to create an audio track.
    +    ``` javascript
    +    import AgoraRTC from "agora-rtc-sdk-ng";
    +    import config from "./config.json";
    +    ```

    -    2. Call createCameraVideoTrack to create a video track.
    +1. Create an instance of the . Call AgoraRTC.createClient.

    -    3. Publish audio and video tracks in the channel by calling publish.
    +    ``` javascript
    +    const AgoraRTCManager = async (eventsCallback) => {
    +      let agoraEngine = null;

    -4. When a remote user joins the channel, this :
    +      // Set up the video engine with the provided App ID, UID, and configuration
    +      const setupAgoraEngine = async () => {
    +        agoraEngine = AgoraRTC.createClient({ mode: "rtc", codec: "vp9" });
    +      };

    -    1. Listens for a `client.on("user-published")` event which returns an IAgoraRTCRemoteUser object and a string. The string is used to indicate the type of track which can be either `audio` or `video`.
    +      await setupAgoraEngine();

    -    2. Retrieves the remote audio or audio/video track using the IAgoraRTCRemoteUser object.
    +      const getAgoraEngine = () => {
    +        return agoraEngine;
    +      };
    +    };
    +    ```

    -    3. Play the retrieved tracks by calling play.
    +1. To connect to a channel:

    -To implement this logic in your , replace the contents of `main.js` with the following:
    +    1. Call join and pass the App ID, user ID, token, and channel name.
    +    1. Call createMicrophoneAudioTrack to create an audio track.
    +    1. Call createCameraVideoTrack to create a video track.
    +    1. Publish audio and video tracks in the channel by calling publish.

    -    ``` javascript
    -    import AgoraRTC from "agora-rtc-sdk-ng"
    -
    -    let options =
    -    {
    -        // Pass your App ID here.
    -        appId: '',
    -        // Set the channel name.
    -        channel: '',
    -        // Pass your temp token here.
    -        token: '',
    -        // Set the user ID.
    -        uid: 0,
    -    };
    -
    -    let channelParameters =
    -    {
    -        // A variable to hold a local audio track.
    -        localAudioTrack: null,
    -        // A variable to hold a local video track.
    -        localVideoTrack: null,
    -        // A variable to hold a remote audio track.
    -        remoteAudioTrack: null,
    -        // A variable to hold a remote video track.
    -        remoteVideoTrack: null,
    -        // A variable to hold the remote user id.
    -        remoteUid: null,
    -    };
    -    async function startBasicCall()
    -    {
    -        // Create an instance of the Agora Engine
    -
    -        const agoraEngine = AgoraRTC.createClient({ mode: "rtc", codec: "vp9" });
    -        // Dynamically create a container in the form of a DIV element to play the remote video track.
    -        const remotePlayerContainer = document.createElement("div");
    -        // Dynamically create a container in the form of a DIV element to play the local video track.
    -        const localPlayerContainer = document.createElement('div');
    -        // Specify the ID of the DIV container. You can use the uid of the local user.
    -        localPlayerContainer.id = options.uid;
    -        // Set the textContent property of the local video container to the local user id.
- localPlayerContainer.textContent = "Local user " + options.uid; - // Set the local video container size. - localPlayerContainer.style.width = "640px"; - localPlayerContainer.style.height = "480px"; - localPlayerContainer.style.padding = "15px 5px 5px 5px"; - // Set the remote video container size. - remotePlayerContainer.style.width = "640px"; - remotePlayerContainer.style.height = "480px"; - remotePlayerContainer.style.padding = "15px 5px 5px 5px"; - // Listen for the "user-published" event to retrieve a AgoraRTCRemoteUser object. - agoraEngine.on("user-published", async (user, mediaType) => -{ - // Subscribe to the remote user when the SDK triggers the "user-published" event. - await agoraEngine.subscribe(user, mediaType); - console.log("subscribe success"); - // Subscribe and play the remote video in the container If the remote user publishes a video track. - if (mediaType == "video") - { - // Retrieve the remote video track. - channelParameters.remoteVideoTrack = user.videoTrack; - // Retrieve the remote audio track. - channelParameters.remoteAudioTrack = user.audioTrack; - // Save the remote user id for reuse. - channelParameters.remoteUid = user.uid.toString(); - // Specify the ID of the DIV container. You can use the uid of the remote user. - remotePlayerContainer.id = user.uid.toString(); - channelParameters.remoteUid = user.uid.toString(); - remotePlayerContainer.textContent = "Remote user " + user.uid.toString(); - // Append the remote container to the page body. - document.body.append(remotePlayerContainer); - // Play the remote video track. - channelParameters.remoteVideoTrack.play(remotePlayerContainer); - } - // Subscribe and play the remote audio track If the remote user publishes the audio track only. - if (mediaType == "audio") - { - // Get the RemoteAudioTrack object in the AgoraRTCRemoteUser object. - channelParameters.remoteAudioTrack = user.audioTrack; - // Play the remote audio track. No need to pass any DOM element. 
- channelParameters.remoteAudioTrack.play(); - } - // Listen for the "user-unpublished" event. - agoraEngine.on("user-unpublished", user => - { - console.log(user.uid+ "has left the channel"); - }); - }); - window.onload = function () - { - // Listen to the Join button click event. - document.getElementById("join").onclick = async function () - { - // Join a channel. - await agoraEngine.join(options.appId, options.channel, options.token, options.uid); - // Create a local audio track from the audio sampled by a microphone. - channelParameters.localAudioTrack = await AgoraRTC.createMicrophoneAudioTrack(); - // Create a local video track from the video captured by a camera. - channelParameters.localVideoTrack = await AgoraRTC.createCameraVideoTrack(); - // Append the local video container to the page body. - document.body.append(localPlayerContainer); - // Publish the local audio and video tracks in the channel. - await agoraEngine.publish([channelParameters.localAudioTrack, channelParameters.localVideoTrack]); - // Play the local video track. - channelParameters.localVideoTrack.play(localPlayerContainer); - console.log("publish success!"); - } - // Listen to the Leave button click event. - document.getElementById('leave').onclick = async function () - { - // Destroy the local audio and video tracks. - channelParameters.localAudioTrack.close(); - channelParameters.localVideoTrack.close(); - // Remove the containers you created for the local video and remote video. - removeVideoDiv(remotePlayerContainer.id); - removeVideoDiv(localPlayerContainer.id); - // Leave the channel - await agoraEngine.leave(); - console.log("You left the channel"); - // Refresh the page for reuse - window.location.reload(); - } - } - } - startBasicCall(); - // Remove the video stream from the container. 
- function removeVideoDiv(elementId) - { - console.log("Removing "+ elementId+"Div"); - let Div = document.getElementById(elementId); - if (Div) - { - Div.remove(); - } + const join = async (localPlayerContainer, channelParameters) => { + await agoraEngine.join( + config.appId, + config.channelName, + config.token, + config.uid + ); + // Create a local audio track from the audio sampled by a microphone. + channelParameters.localAudioTrack = + await AgoraRTC.createMicrophoneAudioTrack(); + // Create a local video track from the video captured by a camera. + channelParameters.localVideoTrack = await AgoraRTC.createCameraVideoTrack(); + // Append the local video container to the page body. + document.body.append(localPlayerContainer); + // Publish the local audio and video tracks in the channel. + await getAgoraEngine().publish([ + channelParameters.localAudioTrack, + channelParameters.localVideoTrack, + ]); + // Play the local video track. + channelParameters.localVideoTrack.play(localPlayerContainer); }; ``` - - - ``` javascript - import AgoraRTC from "agora-rtc-sdk-ng" +1. When a remote user joins the channel, this : - let options = - { - // Pass your App ID here. - appId: '', - // Set the channel name. - channel: '', - // Pass your temp token here. - token: '', - // Set the user ID. - uid: 0, - // Set the user role - role: '' - }; + 1. Listens for a `client.on("user-published")` event which returns a IAgoraRTCRemoteUser object and a string. The string is used to indicate the type of track which can be either `audio` or `video`. + 1. Retrieves the remote audio or audio/video track using the IAgoraRTCRemoteUser object. + 1. Play the retrieved tracks by calling play. - let channelParameters = - { - // A variable to hold a local audio track. - localAudioTrack: null, - // A variable to hold a local video track. - localVideoTrack: null, - // A variable to hold a remote audio track. - remoteAudioTrack: null, - // A variable to hold a remote video track. 
- remoteVideoTrack: null, - // A variable to hold the remote user id.s - remoteUid: null, - }; - async function startBasicCall() -{ - // Create an instance of the Agora Engine - const agoraEngine = AgoraRTC.createClient({ mode: "live", codec: "vp9" }); - // Dynamically create a container in the form of a DIV element to play the remote video track. - const remotePlayerContainer = document.createElement("div"); - // Dynamically create a container in the form of a DIV element to play the local video track. - const localPlayerContainer = document.createElement('div'); - // Specify the ID of the DIV container. You can use the uid of the local user. - localPlayerContainer.id = options.uid; - // Set the textContent property of the local video container to the local user id. - localPlayerContainer.textContent = "Local user " + options.uid; - // Set the local video container size. - localPlayerContainer.style.width = "640px"; - localPlayerContainer.style.height = "480px"; - localPlayerContainer.style.padding = "15px 5px 5px 5px"; - // Set the remote video container size. - remotePlayerContainer.style.width = "640px"; - remotePlayerContainer.style.height = "480px"; - remotePlayerContainer.style.padding = "15px 5px 5px 5px"; - // Listen for the "user-published" event to retrieve a AgoraRTCRemoteUser object. - agoraEngine.on("user-published", async (user, mediaType) => - { +To implement this logic in your , add the following code after `await setupAgoraEngine();`: + + + ``` javascript + // Event Listeners + agoraEngine.on("user-published", async (user, mediaType) => { // Subscribe to the remote user when the SDK triggers the "user-published" event. await agoraEngine.subscribe(user, mediaType); console.log("subscribe success"); - // Subscribe and play the remote video in the container If the remote user publishes a video track. - if (mediaType == "video") - { - // Retrieve the remote video track. 
- channelParameters.remoteVideoTrack = user.videoTrack; - // Retrieve the remote audio track. - channelParameters.remoteAudioTrack = user.audioTrack; - // Save the remote user id for reuse. - channelParameters.remoteUid = user.uid.toString(); - // Specify the ID of the DIV container. You can use the uid of the remote user. - remotePlayerContainer.id = user.uid.toString(); - channelParameters.remoteUid = user.uid.toString(); - remotePlayerContainer.textContent = "Remote user " + user.uid.toString(); - // Append the remote container to the page body. - document.body.append(remotePlayerContainer); - if(options.role != 'host') - { - // Play the remote video track. - channelParameters.remoteVideoTrack.play(remotePlayerContainer); - } - } - // Subscribe and play the remote audio track If the remote user publishes the audio track only. - if (mediaType == "audio") - { - // Get the RemoteAudioTrack object in the AgoraRTCRemoteUser object. - channelParameters.remoteAudioTrack = user.audioTrack; - // Play the remote audio track. No need to pass any DOM element. - channelParameters.remoteAudioTrack.play(); - } - // Listen for the "user-unpublished" event. - agoraEngine.on("user-unpublished", user => - { - console.log(user.uid+ "has left the channel"); - }); + eventsCallback("user-published", user, mediaType) }); - window.onload = function () - { - // Listen to the Join button click event. - document.getElementById("join").onclick = async function () - { - if(options.role == '') - { - window.alert("Select a user role first!"); - return; - } - - // Join a channel. - await agoraEngine.join(options.appId, options.channel, options.token, options.uid); - // Create a local audio track from the audio sampled by a microphone. - channelParameters.localAudioTrack = await AgoraRTC.createMicrophoneAudioTrack(); - // Create a local video track from the video captured by a camera. 
- channelParameters.localVideoTrack = await AgoraRTC.createCameraVideoTrack(); - // Append the local video container to the page body. - document.body.append(localPlayerContainer); - - // Publish the local audio and video track if the user joins as a host. - if(options.role == 'host') - { - // Publish the local audio and video tracks in the channel. - await agoraEngine.publish([channelParameters.localAudioTrack, channelParameters.localVideoTrack]); - // Play the local video track. - channelParameters.localVideoTrack.play(localPlayerContainer); - console.log("publish success!"); - } - } - // Listen to the Leave button click event. - document.getElementById('leave').onclick = async function () - { - // Destroy the local audio and video tracks. - channelParameters.localAudioTrack.close(); - channelParameters.localVideoTrack.close(); - // Remove the containers you created for the local video and remote video. - removeVideoDiv(remotePlayerContainer.id); - removeVideoDiv(localPlayerContainer.id); - // Leave the channel - await agoraEngine.leave(); - console.log("You left the channel"); - // Refresh the page for reuse - window.location.reload(); - } - document.getElementById('host').onclick = async function () - { - if (document.getElementById('host').checked) - { - // Save the selected role in a variable for reuse. - options.role = 'host'; - // Call the method to set the role as Host. - await agoraEngine.setClientRole(options.role); - if(channelParameters.localVideoTrack != null) - { - // Publish the local audio and video track in the channel. - await agoraEngine.publish([channelParameters.localAudioTrack,channelParameters.localVideoTrack]); - // Stop playing the remote video. - channelParameters.remoteVideoTrack.stop(); - // Start playing the local video. 
- channelParameters.localVideoTrack.play(localPlayerContainer); - } - } - } - document.getElementById('Audience').onclick = async function () - { - if (document.getElementById('Audience').checked) - { - // Save the selected role in a variable for reuse. - options.role = 'audience'; - if(channelParameters.localAudioTrack != null && channelParameters.localVideoTrack != null) - { - if(channelParameters.remoteVideoTrack!=null) - { - // Replace the current video track with remote video track - await channelParamaters.localVideoTrack.replaceTrack(channelParamaters.remoteVideoTrack, true); - } - } - // Call the method to set the role as Audience. - await agoraEngine.setClientRole(options.role); - } - } - } - } - startBasicCall(); - // Remove the video stream from the container. - function removeVideoDiv(elementId) - { - console.log("Removing "+ elementId+"Div"); - let Div = document.getElementById(elementId); - if (Div) - { - Div.remove(); - } - }; + // Listen for the "user-unpublished" event. + agoraEngine.on("user-unpublished", (user) => { + console.log(user.uid + "has left the channel"); + }); ``` - - - ``` javascript - import AgoraRTC from "agora-rtc-sdk-ng" - let options = - { - // Pass your App ID here. - appId: '', - // Set the channel name. - channel: '', - // Pass your temp token here. - token: '', - // Set the user ID. - uid: 0, - // Set the user role - role: '' - }; +The `eventsCallback` callback can be used by the UI to handle all events. The sample project uses the following callback: - let channelParameters = - { - // A variable to hold a local audio track. - localAudioTrack: null, - // A variable to hold a local video track. - localVideoTrack: null, - // A variable to hold a remote audio track. - remoteAudioTrack: null, - // A variable to hold a remote video track. 
- remoteVideoTrack: null, - // A variable to hold the remote user id.s - remoteUid: null, - }; - async function startBasicCall() -{ - // Create an instance of the Agora Engine - const agoraEngine = AgoraRTC.createClient({ mode: "live", codec: "vp8" }); - // Dynamically create a container in the form of a DIV element to play the remote video track. - const remotePlayerContainer = document.createElement("div"); - // Dynamically create a container in the form of a DIV element to play the local video track. - const localPlayerContainer = document.createElement('div'); - // Specify the ID of the DIV container. You can use the uid of the local user. - localPlayerContainer.id = options.uid; - // Set the textContent property of the local video container to the local user id. - localPlayerContainer.textContent = "Local user " + options.uid; - // Set the local video container size. - localPlayerContainer.style.width = "640px"; - localPlayerContainer.style.height = "480px"; - localPlayerContainer.style.padding = "15px 5px 5px 5px"; - // Set the remote video container size. - remotePlayerContainer.style.width = "640px"; - remotePlayerContainer.style.height = "480px"; - remotePlayerContainer.style.padding = "15px 5px 5px 5px"; - // Listen for the "user-published" event to retrieve a AgoraRTCRemoteUser object. - agoraEngine.on("user-published", async (user, mediaType) => - { - // Subscribe to the remote user when the SDK triggers the "user-published" event. - await agoraEngine.subscribe(user, mediaType); - console.log("subscribe success"); - // Subscribe and play the remote video in the container If the remote user publishes a video track. - if (mediaType == "video") - { + ``` javascript + const handleVSDKEvents = (eventName, ...args) => { + switch (eventName) { + case "user-published": + if (args[1] == "video") { // Retrieve the remote video track. 
- channelParameters.remoteVideoTrack = user.videoTrack; + channelParameters.remoteVideoTrack = args[0].videoTrack; // Retrieve the remote audio track. - channelParameters.remoteAudioTrack = user.audioTrack; + channelParameters.remoteAudioTrack = args[0].audioTrack; // Save the remote user id for reuse. - channelParameters.remoteUid = user.uid.toString(); + channelParameters.remoteUid = args[0].uid.toString(); // Specify the ID of the DIV container. You can use the uid of the remote user. - remotePlayerContainer.id = user.uid.toString(); - channelParameters.remoteUid = user.uid.toString(); - remotePlayerContainer.textContent = "Remote user " + user.uid.toString(); + remotePlayerContainer.id = args[0].uid.toString(); + channelParameters.remoteUid = args[0].uid.toString(); + remotePlayerContainer.textContent = + "Remote user " + args[0].uid.toString(); // Append the remote container to the page body. document.body.append(remotePlayerContainer); - if(options.role != 'host') - { - // Play the remote video track. - channelParameters.remoteVideoTrack.play(remotePlayerContainer); + // Play the remote video track. + channelParameters.remoteVideoTrack.play(remotePlayerContainer); } - } - // Subscribe and play the remote audio track If the remote user publishes the audio track only. - if (mediaType == "audio") - { + // Subscribe and play the remote audio track If the remote user publishes the audio track only. + if (args[1] == "audio") { // Get the RemoteAudioTrack object in the AgoraRTCRemoteUser object. - channelParameters.remoteAudioTrack = user.audioTrack; + channelParameters.remoteAudioTrack = args[0].audioTrack; // Play the remote audio track. No need to pass any DOM element. channelParameters.remoteAudioTrack.play(); - } - // Listen for the "user-unpublished" event. - agoraEngine.on("user-unpublished", user => - { - console.log(user.uid+ "has left the channel"); - }); - }); - window.onload = function () - { - // Listen to the Join button click event. 
- document.getElementById("join").onclick = async function () - { - - if(options.role == '') - { - window.alert("Select a user role first!"); - return; - } - - // Join a channel. - await agoraEngine.join(options.appId, options.channel, options.token, options.uid); - // Create a local audio track from the audio sampled by a microphone. - channelParameters.localAudioTrack = await AgoraRTC.createMicrophoneAudioTrack(); - // Create a local video track from the video captured by a camera. - channelParameters.localVideoTrack = await AgoraRTC.createCameraVideoTrack(); - // Append the local video container to the page body. - document.body.append(localPlayerContainer); - - // Publish the local audio and video track if the user joins as a host. - if(options.role == 'host') - { - // Publish the local audio and video tracks in the channel. - await agoraEngine.publish([channelParameters.localAudioTrack, channelParameters.localVideoTrack]); - // Play the local video track. - channelParameters.localVideoTrack.play(localPlayerContainer); - console.log("publish success!"); - } } - // Listen to the Leave button click event. - document.getElementById('leave').onclick = async function () - { - // Destroy the local audio and video tracks. - channelParameters.localAudioTrack.close(); - channelParameters.localVideoTrack.close(); - // Remove the containers you created for the local video and remote video. - removeVideoDiv(remotePlayerContainer.id); - removeVideoDiv(localPlayerContainer.id); - // Leave the channel - await agoraEngine.leave(); - console.log("You left the channel"); - // Refresh the page for reuse - window.location.reload(); - } - document.getElementById('host').onclick = async function () - { - if (document.getElementById('host').checked) - { - // Save the selected role in a variable for reuse. - options.role = 'host'; - // Call the method to set the role as Host. 
- await agoraEngine.setClientRole(options.role); - if(channelParameters.localVideoTrack != null) - { - // Publish the local audio and video track in the channel. - await agoraEngine.publish([channelParameters.localAudioTrack,channelParameters.localVideoTrack]); - // Stop playing the remote video. - channelParameters.remoteVideoTrack.stop(); - // Start playing the local video. - channelParameters.localVideoTrack.play(localPlayerContainer); - } - } - } - document.getElementById('Audience').onclick = async function () - { - if (document.getElementById('Audience').checked) - { - // Save the selected role in a variable for reuse. - options.role = 'audience'; - var clientRoleOptions = { level: 1 }; // Use Low latency - if(channelParameters.localAudioTrack != null && channelParameters.localVideoTrack != null) - { - // Unpublish local tracks to set the user role as audience. - await agoraEngine.unpublish([channelParameters.localAudioTrack,channelParameters.localVideoTrack]); - // Stop playing the local video track - channelParameters.localVideoTrack.stop(); - if(channelParameters.remoteVideoTrack!=null) - { - // Play the remote video stream, if the remote user has joined the channel. - channelParameters.remoteVideoTrack.play(remotePlayerContainer); - } - } - // Call the method to set the role as Audience. - await agoraEngine.setClientRole(options.role, clientRoleOptions); - } - } - } - } - startBasicCall(); - // Remove the video stream from the container. - function removeVideoDiv(elementId) - { - console.log("Removing "+ elementId+"Div"); - let Div = document.getElementById(elementId); - if (Div) - { - Div.remove(); } }; ``` - - For choose `Level: 1` for audience roles. This ensures low latency, which is a feature of and its use is subject to special [pricing](../reference/pricing#unit-pricing). 
- + diff --git a/shared/video-sdk/get-started/get-started-sdk/project-test/web.mdx b/shared/video-sdk/get-started/get-started-sdk/project-test/web.mdx index 13240443a..5c97da312 100644 --- a/shared/video-sdk/get-started/get-started-sdk/project-test/web.mdx +++ b/shared/video-sdk/get-started/get-started-sdk/project-test/web.mdx @@ -1,13 +1,13 @@ -3. In **main.js**, update `appID`, `channel` and `token` with your values. +3. In **src/agora_manager/config.json**, update `appID`, `channelName` and `token` with your values. 4. Start the dev server Execute the following command in the terminal: ```bash - npm run dev + pnpm dev ``` Use the URL displayed in the terminal to open the in your browser. From 381fc70354e2f96ce63dc762d2e8887a5ab1780a Mon Sep 17 00:00:00 2001 From: Kishan Dhakan Date: Tue, 5 Sep 2023 13:43:51 +0100 Subject: [PATCH 041/184] update vsdk secure auth docs --- .../project-implementation/web.mdx | 246 +++++------------- .../project-test/web.mdx | 50 ++-- 2 files changed, 94 insertions(+), 202 deletions(-) diff --git a/shared/video-sdk/authentication-workflow/project-implementation/web.mdx b/shared/video-sdk/authentication-workflow/project-implementation/web.mdx index aeae25f10..c8acbc92a 100644 --- a/shared/video-sdk/authentication-workflow/project-implementation/web.mdx +++ b/shared/video-sdk/authentication-workflow/project-implementation/web.mdx @@ -1,209 +1,85 @@ - - -1. **Add the necessary dependencies** - - In order to make HTTPS calls to a token server and interpret the JSON return parameters, integrate a HTTP client into your . In *agora\_project*, open a command prompt, then run the following command. - - ``` bash - npm install axios - ``` - -2. **Import the HTTP client library** - - To access the axios HTTP client from your project, add the following line at the beginning of the file `main.js`: - - ``` javascript - import axios from "axios" - ``` - -3. **Enable the user to specify a channel** - - Add a text box to the user interface. 
Open `index.html` and add the following code before ``: - -```html - -
    - -
    - - -
    - -
    - - -``` - -### Handle the system logic - -**Declare the required variable** - - In `main.js`, add the following declaration after `async function startBasicCall(){`: - - ```javascript - // A variable to track the state of remote video quality. - var isHighRemoteVideoQuality = false; - // A variable to track the state of device test. - var isDeviceTestRunning = false; - // Variables to hold the Audio/Video tracks for device testing. - var videoTrack; - var audioTrack; - // A variable to reference the audio devices dropdown. - var audioDevicesDropDown; - // A variable to reference the video devices dropdown. - var videoDevicesDropDown; + ``` javascript + import AgoraRTCManager from "../agora_manager/agora_manager.js"; + import AgoraRTC from "agora-rtc-sdk-ng"; ``` ### Implement features to ensure quality @@ -67,39 +30,48 @@ To do this, open `index.html` and add the following lines after `` -``` html +```html ``` @@ -29,7 +29,7 @@ In your project, import the relevant libraries and declare the required variable To implement media relay, import the corresponding modules. In `preload.js`, add the following before `createAgoraRtcEngine,`: - ``` javascript + ```javascript ChannelMediaRelayEvent, ChannelMediaRelayState, LogLevel @@ -39,7 +39,7 @@ In your project, import the relevant libraries and declare the required variable To store source and destination channel settings and manage channel relay, in `preload.js`, add the following variables to declarations: - ``` javascript + ```javascript var destChannelName = ""; var destChannelToken = ""; var destUid = 100; // User ID that the user uses in the destination channel. @@ -54,7 +54,7 @@ To enable users to relay channel media to a destination chann When a user presses the button, the starts relaying media from the source channel to the destination channel. If channel media relay is already running, the stops it. 
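The start/stop behavior just described is a plain toggle. As an illustrative sketch in plain JavaScript — with the actual engine calls replaced by placeholder action strings — the transition looks like this:

```javascript
// Sketch: toggle between starting and stopping channel media relay.
// "start-relay" / "stop-relay" stand in for the real engine calls.
function toggleMediaRelay(mediaRelaying) {
  return mediaRelaying
    ? { mediaRelaying: false, action: "stop-relay" }
    : { mediaRelaying: true, action: "start-relay" };
}
```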
To integrate this workflow, in `preload.js`add the following method before `document.getElementById("join").onclick = async function ()`: - ``` javascript + ```javascript document.getElementById("coHost").onclick = async function () { if (mediaRelaying) @@ -95,7 +95,7 @@ To enable users to relay channel media to a destination chann To receive the state change notifications sent during media relay, you add a callback to `EventHandles`. Your responds to connection and failure events in the `onChannelMediaRelayStateChanged` event handler. In `preload.js`, add the following method after `const EventHandles = {`: - ``` javascript + ```javascript onChannelMediaRelayStateChanged: (state, code) => { // This example shows toast messages when the relay state changes, @@ -122,7 +122,7 @@ To enable users to relay channel media to a destination chann To receive notifications of important channel relay events such as network disconnection, reconnection, and users joining channels, you add the `onChannelMediaRelayEvent` callback to `EventHandles`. In `preload.js`, add the following method after `const EventHandles = {`: - ``` javascript + ```javascript onChannelMediaRelayEvent: (code) => { switch (code) { @@ -146,7 +146,7 @@ The alternate approach to multi-channel live streaming is joining multiple chann In this example, you need a button to join and leave a second channel. To add a button, in `preload.js`, add the following code after ``: -``` html +```html ``` @@ -159,7 +159,7 @@ In your project, import the relevant libraries and declare the required variable To connect to multiple channels, import the corresponding module. 
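The state handling shown above maps each relay state to a user-facing message. A minimal sketch of that mapping follows; the numeric state values (0 idle, 1 connecting, 2 running, 3 failure) are assumptions and should be checked against the SDK's `ChannelMediaRelayState` constants.

```javascript
// Sketch: map a relay state to a user-facing message, as the
// onChannelMediaRelayStateChanged handler above does with toast messages.
// The numeric values are assumptions; verify them against the SDK constants.
function relayStateMessage(state, code) {
  switch (state) {
    case 0: return "Channel media relay is idle.";
    case 1: return "Connecting to the destination channel...";
    case 2: return "Relaying media to the destination channel.";
    case 3: return "Channel media relay failed with error code " + code + ".";
    default: return "Unknown relay state: " + state;
  }
}
```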
In `preload.js`, add the following before `createAgoraRtcEngine,`: - ``` javascript + ```javascript ChannelMediaOptions, IRtcEngineEx, ChannelMediaInfo, @@ -169,7 +169,7 @@ In your project, import the relevant libraries and declare the required variable To join and manage a second channel, in `preload.js`, add the following to declarations: - ``` javascript + ```javascript var secondChannelName = ""; var secondChannelUid = 100; // Uid for the second channel var secondChannelToken = ""; @@ -187,7 +187,7 @@ To add multi-channel functionality to your , take the followin 1. In `preload.js`, add the following line before `agoraEngine = createAgoraRtcEngine();`: - ``` javascript + ```javascript agoraEngine = new IRtcEngineEx(); ``` @@ -195,7 +195,7 @@ To add multi-channel functionality to your , take the followin When a user presses **Join Second Channel**, the joins a second channel. If the is already connected to a second channel, it leaves the channel. To do this, add the following code before `document.getElementById("join").onclick = async function ()`: - ``` javascript + ```javascript document.getElementById("multiple-channels").onclick = async function () { if (isSecondChannelJoined) @@ -237,7 +237,7 @@ To add multi-channel functionality to your , take the followin The `onUserJoined` callback returns a parameter called `channelId`. Use the said parameter to setup remote view for the second channel remote user. 
To implement this logic, update the `onUserJoined` callback with the following code: - ``` javascript + ```javascript onUserJoined :(connection, remoteUid, elapsed) => { diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/flutter.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/flutter.mdx index d4d64a6ca..1199893fb 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/flutter.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/flutter.mdx @@ -15,7 +15,7 @@ In this example, you use a single `Button` to start and stop channel media relay #### Implement the user interface To enable your users to start and stop relaying to another channel, add a `Button` to the user interface. Open `/lib/main.dart` and add the following lines after `ListView(...children: [` in the `build` method: -``` dart +```dart ElevatedButton( child: relayState == ChannelMediaRelayState.relayStateRunning ? const Text("Stop Channel media relay") @@ -34,7 +34,7 @@ To enable users to relay channel media to a destination chann To store source and destination channel settings and manage channel relay, in `/lib/main.dart`, add the following variable declarations to the `_MyAppState` class: - ``` dart + ```dart String destChannelName = ""; String destChannelToken = ""; int destUid = 0; // Uid to identify the relay stream in the destination channel @@ -47,7 +47,7 @@ To enable users to relay channel media to a destination chann When a user presses the button, the starts relaying media from the source channel to the destination channel. If channel media relay is already running, the stops it. To integrate this workflow, add the following method to the `_MyAppState` class. 
- ``` dart + ```dart void channelRelay() async { if (mediaRelaying) { agoraEngine.stopChannelMediaRelay(); @@ -79,7 +79,7 @@ To enable users to relay channel media to a destination chann To receive notification of connection state changes during channel media relay, you add an event handler. Your responds to connection and failure events in the `onChannelMediaRelayStateChanged` event. In `setupVideoSDKEngine()`, add the following code after `RtcEngineEventHandler(`: - ``` dart + ```dart onChannelMediaRelayStateChanged: (ChannelMediaRelayState state, ChannelMediaRelayError error) { setState(() { relayState = state; @@ -99,7 +99,7 @@ To enable users to relay channel media to a destination chann To receive notifications of important channel relay events such as network disconnection, reconnection, and users joining channels, you add the `onChannelMediaRelayEvent` method to the event handler. In `setupVideoSDKEngine()`, add the following code after `RtcEngineEventHandler(`: - ``` dart + ```dart onChannelMediaRelayEvent: (ChannelMediaRelayEvent mediaRelayEvent) { // This example shows messages when relay events occur. // A production level app needs to handle these events properly. @@ -115,7 +115,7 @@ The alternate approach to multi-channel live streaming is joining multiple chann 1. In this example, you use a single button to join and leave a second channel. To add a button to your UI, in `/lib/main.dart`, add the following lines after `ListView(...children: [` in the `build` method: - ``` dart + ```dart ElevatedButton( child: !isSecondChannelJoined ? 
const Text("Join second channel") @@ -142,7 +142,7 @@ To add multi-channel functionality to your , take the followin To join and manage a second channel, in `/lib/main.dart`, add the following declarations to the `_MyAppState` class: - ``` dart + ```dart late RtcConnection rtcSecondConnection; // Connection object for the second channel String secondChannelName = ""; int secondChannelUid = 100; // User Id for the second channel @@ -157,13 +157,13 @@ To add multi-channel functionality to your , take the followin 1. In the `_MyAppState` class, replace the declaration `late RtcEngine agoraEngine;` with: - ``` dart + ```dart late RtcEngineEx agoraEngine; // Agora multi-channel engine instance ``` 1. In the `setupVideoSDKEngine` method, replace the line `agoraEngine = createAgoraRtcEngine();` with the following: - ``` dart + ```dart agoraEngine = createAgoraRtcEngineEx(); ``` @@ -172,7 +172,7 @@ To add multi-channel functionality to your , take the followin When a user presses the button, the joins a second channel. If the is already connected to a second channel, it leaves the channel. To do this, add the following method to the `_MyAppState` class. - ``` dart + ```dart void joinSecondChannel() async { if (isSecondChannelJoined) { agoraEngine.leaveChannelEx(rtcSecondConnection); @@ -216,7 +216,7 @@ To add multi-channel functionality to your , take the followin The `RtcEngineEventHandler` registered with your instance of receives events from all channels that a user is connected to. To identify the channel from which a callback originates, you look at the `channelId` of the `connection` object in the callback. 
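The `channelId`-based dispatch just described can be sketched as a small pure function. JavaScript is used here for brevity; the view names are hypothetical placeholders, not SDK identifiers.

```javascript
// Sketch: route a remote-user event to a view based on the channelId
// carried by the connection object. View names are placeholders.
function viewForRemoteUser(connection, primaryChannel, secondChannel) {
  if (connection.channelId === primaryChannel) {
    return "primary-remote-view";
  }
  if (connection.channelId === secondChannel) {
    return "second-remote-view";
  }
  return null; // Event from a channel this client does not render.
}
```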
To enable your app to handle callbacks from both channels, in `setupVideoSDKEngine` **replace** the code block `agoraEngine.registerEventHandler(RtcEngineEventHandler(...))` with the following - ``` dart + ```dart agoraEngine.registerEventHandler( RtcEngineEventHandler( onJoinChannelSuccess: (RtcConnection connection, int elapsed) { @@ -266,7 +266,7 @@ To add multi-channel functionality to your , take the followin In this example, you display two remote videos from two different channels. To create the widget for displaying the second video, add the following method to the `_MyAppState` class: - ``` dart + ```dart Widget _secondVideoPanel() { if (!isSecondChannelJoined) { return const Text( diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/react-native.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/react-native.mdx index 7994f2702..7bb4e4125 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/react-native.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/react-native.mdx @@ -16,7 +16,7 @@ In this example, you use a single `Button` to start and stop channel media relay To enable your users to start and stop relaying to another channel, add a button to the user interface. In ``, add the following code after `Leave`: -``` html +```html Start Channel Media Relay @@ -30,7 +30,7 @@ In your project, import the relevant libraries and declare the required variable To implement media relay, import the corresponding modules. 
In `App.tsx`, add the following imports before `createAgoraRtcEngine,`: - ``` ts + ```ts ChannelMediaRelayEvent, ChannelMediaRelayState, ``` @@ -39,7 +39,7 @@ In your project, import the relevant libraries and declare the required variable To store source and destination channel settings and manage channel relay, in `App.tsx`, add the following variables to the declarations: - ``` ts + ```ts const destChannelName = ''; const destChannelToken = ''; const destUid = 100; // User ID that the user uses in the destination channel. @@ -54,7 +54,7 @@ To enable users to relay channel media to a destination chann When a user presses the button, the starts relaying media from the source channel to the destination channel. If channel media relay is already running, the stops it. To integrate this workflow, in `App.tsx`add the following method after `const leave = () => {}`: - ``` ts + ```ts const coHost = () => { if (mediaRelaying) { agoraEngineRef.current?.stopChannelMediaRelay(); @@ -87,7 +87,7 @@ To enable users to relay channel media to a destination chann To receive the state change notifications sent during media relay, you add a callback to `EventHandles`. Your responds to connection and failure events in the `onChannelMediaRelayStateChanged` event handler. In `App.tsx`, add the following method after `onUserOffline: (_connection, remoteUid) => {}`: - ``` ts + ```ts onChannelMediaRelayStateChanged: (state, code) => { // This example shows toast messages when the relay state changes, // a production level app needs to handle state change properly. @@ -110,7 +110,7 @@ To enable users to relay channel media to a destination chann To receive notifications of important channel relay events such as network disconnection, reconnection, and users joining channels, you add the `onChannelMediaRelayEvent` callback to `EventHandles`. 
In `App.tsx`, add the following method after `onChannelMediaRelayStateChanged: (state, code) => {}`: - ``` ts + ```ts onChannelMediaRelayEvent: code => { switch (code) { case ChannelMediaRelayEvent.RelayEventNetworkDisconnected: // RELAY_EVENT_NETWORK_DISCONNECTED @@ -134,7 +134,7 @@ The alternate approach to multi-channel live streaming is joining multiple chann #### Implement the user interface In this example, you need a button to join and leave a second channel. In ``, add the following code after `Leave`: -``` html +```html Join Second Channel @@ -148,7 +148,7 @@ In your project, import the relevant libraries and declare the required variable 1. To implement media relay, in `App.tsx`, add the following imports before `createAgoraRtcEngine,`: - ``` ts + ```ts ChannelMediaOptions, IRtcEngineEx, RtcConnection, @@ -160,7 +160,7 @@ In your project, import the relevant libraries and declare the required variable To join and manage a second channel, in `App.tsx`, add the following variables to the declarations: - ``` ts + ```ts var rtcSecondConnection: RtcConnection; const secondChannelName = '<-----Insert second channel name------>'; const secondChannelUid = 100; // Uid for the second channel @@ -179,7 +179,7 @@ To add multi-channel functionality to your , take the followin 1. Replace `const agoraEngineRef = useRef(); // Agora engine instance` with: - ``` ts + ```ts const agoraEngineRef = useRef(); // Agora engine instance ``` @@ -193,7 +193,7 @@ To add multi-channel functionality to your , take the followin When a user presses the button, the joins a second channel. If the is already connected to a second channel, it leaves the channel. 
To do this, add the following code after the `const leave = () => {}` function: - ``` ts + ```ts const multipleChannels = () => { if (isSecondChannelJoined) { agoraEngineRef.current?.leaveChannelEx({ @@ -234,7 +234,7 @@ To add multi-channel functionality to your , take the followin To see if you have successfully joined the second channel, update the `onJoinChannelSuccess` callback to print the channel ID for the connection. - ``` ts + ```ts onJoinChannelSuccess: (connection, _Uid) => { showMessage( 'Successfully joined the channel ' + connection.channelId, diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/swift.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/swift.mdx index 030315b33..e2e8c54be 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/swift.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/swift.mdx @@ -49,7 +49,7 @@ In your project, declare the required variables. relay, add the following declaration to the top of the `ViewController` class: - ``` swift + ```swift var destChannelName = "<#name of the destination channel#>" var destChannelToken = "<#access token for the destination channel#>" var destUid: UInt = 100 // User ID that the user uses in the destination channel. @@ -100,7 +100,7 @@ To enable users to relay channel media to a destination chann To receive notifications of important channel relay events such as network disconnection, reconnection, and users joining channels, you add a `didReceiveChannelMediaRelayEvent` function to the event handler. 
Add the following function inside `extension ViewController: AgoraRtcEngineDelegate {`: - ``` swift + ```swift func rtcEngine(_ engine: AgoraRtcEngineKit, didReceive event: AgoraChannelMediaRelayEvent) { switch event { case .disconnect: @@ -156,7 +156,7 @@ In your project, declare the necessary variables, and setup access to the UI ele To join and manage a second channel, add the following declarations before `class ViewController: UIViewController {`: - ``` swift + ```swift var secondChannelName = "<#name of the second channel#>" var secondChannelUid: UInt = 100 // Uid for the second channel var secondChannelToken = "<#access token for the second channel#>" @@ -169,7 +169,7 @@ In your project, declare the necessary variables, and setup access to the UI ele `agoraEngine = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self)` in the `initializeAgoraEngine` function: - ``` swift + ```swift secondChannelBtn.isEnabled = false ``` @@ -177,7 +177,7 @@ In your project, declare the necessary variables, and setup access to the UI ele the following line to the `joinChannel` function after `joined = true`: - ``` swift + ```swift secondChannelBtn.isEnabled = true ``` @@ -199,7 +199,7 @@ the following steps: 1. In the `ViewController`, add the following line along with the other declarations at the top: - ``` swift + ```swift var secondChannelDelegate: SecondChannelDelegate = SecondChannelDelegate() ``` @@ -213,13 +213,13 @@ the following steps: In this example, you need both `ViewController` and `SecondChannelDelegate` to be able to set `remoteView` as its `UIView` when calling `setupRemoteVideo` or `setupRemoteVideoEx`, you also need to see the second connection instance. To do this, remove the `var remoteView: UIView!` declaration from `ViewController` and add the following before `class ViewController: UIViewController {`: - ``` swift + ```swift var rtcSecondConnection: AgoraRtcConnection! var remoteView: UIView! 
``` - ``` swift + ```swift var rtcSecondConnection: AgoraRtcConnection! var remoteView: NSView! ``` diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/unity.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/unity.mdx index 74d7f7467..dde48abc5 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/unity.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/unity.mdx @@ -30,7 +30,7 @@ In your project, import the relevant libraries and declare the required variable 1. **Declare the variables you need** To store source and destination channel settings and manage channel relay, in your script file, add the following declarations to `NewBehaviourScript`: - ``` csharp + ```csharp private string destChannelName = ""; private string destChannelToken = ""; private uint destUid = 100; // User ID that the user uses in the destination channel. @@ -43,7 +43,7 @@ In your project, import the relevant libraries and declare the required variable To import the required UI namespace, in your script file, add the following to the list of namespace declarations: - ``` csharp + ```csharp using TMPro; ``` @@ -51,7 +51,7 @@ In your project, import the relevant libraries and declare the required variable To programmatically access the channel media relay button, add the following at the end of `SetupUI`: - ``` csharp + ```csharp // Access the channel relay button. go = GameObject.Find("StartChannelMediaRelay"); go.GetComponent` -``` html +```html ``` @@ -30,7 +30,7 @@ channels your users are hosting or joining. In `main.js`, add the following variables to the declarations: - ``` javascript + ```javascript // A variable to track the co-hosting state. var isCoHost = false; // The destination channel name you want to join. 
@@ -58,7 +58,7 @@ To enable users to relay channel media to a destination chann To implement this logic, add the following code before `document.getElementById('leave').onclick = async function () {`: - ``` javascript + ```javascript document.getElementById('coHost').onclick = async function () { //Keep the same UID for this user. @@ -127,7 +127,7 @@ To enable users to relay channel media to a destination chann The supplies `channel-media-relay-state` callback that you use to learn about the current state of channel media relay. To implement this callback in your , in `main.js`, add the following code before `window.onload = function ()`: - ``` javascript + ```javascript agoraEngine.on("channel-media-relay-state", state => { console.log("The current state is : "+ state); @@ -144,7 +144,7 @@ In this example, you use a button to join and leave the second channel. To add a button, in `main.js`, add the following code after ``: -``` html +```html ``` @@ -156,7 +156,7 @@ To join a second channel, add the required variables in your code. In `main.js`, add the following to the declarations: - ``` javascript + ```javascript // A variable to create a second instance of Agora engine. var agoraEngineSubscriber; var isMultipleChannel = false; @@ -183,7 +183,7 @@ When the user presses **Leave Second Channel**, leave the new channel with a cal To implement this logic, in `main.js`, add the following code before `document.getElementById('leave').onclick = async function () {`: -``` javascript +```javascript document.getElementById('multiple-channels').onclick = async function () { // Check to see if the user has already joined a channel. 
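The join/leave decision behind the second-channel button above can be sketched by tracking joined channels in a set, independent of the Agora engine. The channel names here are examples only.

```javascript
// Sketch: track which channels this client has joined so the button
// handler can decide whether to join or leave the second channel.
function planSecondChannelClick(joinedChannels, secondChannel) {
  const next = new Set(joinedChannels);
  if (joinedChannels.has(secondChannel)) {
    next.delete(secondChannel);
    return { action: "leave", joinedChannels: next };
  }
  next.add(secondChannel);
  return { action: "join", joinedChannels: next };
}
```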
diff --git a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/windows.mdx b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/windows.mdx index 801585b69..c8a53d7a9 100644 --- a/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/windows.mdx +++ b/shared/video-sdk/develop/live-streaming-over-multiple-channels/project-implementation/windows.mdx @@ -38,7 +38,7 @@ In your project, declare the required variables and reference the channel media To store source and destination channel settings and manage channel relay, in `AgoraImplementationDlg.cpp`, add the following declarations after the list of header includes: - ``` cpp + ```cpp CHAR destChannelName[] = ""; CHAR destSourceChannelToken[] = ""; int destUid = 100; // User ID that the user uses in the destination channel. @@ -50,7 +50,7 @@ In your project, declare the required variables and reference the channel media To access the channel relay button, in `AgoraImplementationDlg.cpp`, add the following to `OnInitDialog` before `return true;`: - ``` cpp + ```cpp channelMediaButton = (CButton*)GetDlgItem(IDC_BUTTON3); ``` @@ -62,7 +62,7 @@ To enable users to relay channel media to a destination chann When a user presses the button, the starts relaying media from the source channel to the destination channel. To setup an event list on the channel relay button, in **Dialog Editor**, double-click **Start Channel Media Relay**. **Dialog Editor** automatically creates and opens an event listener for you. 
To start and stop channel media relay on the button click event, add the following code to the event listener method you just created: - ``` cpp + ```cpp if (mediaRelaying) { m_rtcEngine->stopChannelMediaRelay(); @@ -160,7 +160,7 @@ In your project, import the relevant libraries and declare the required variable To join and manage a second channel, in `AgoraImplementationDlg.cpp`, add the following declarations after the list of header includes: - ``` cpp + ```cpp RtcConnection rtcSecondConnection; CButton secondChannelButton; CHAR secondChannelName[] = ""; @@ -179,18 +179,18 @@ To add multi-channel functionality to your , take the followin 1. In `AgoraImplementationDlg.h`, add the following to the list of includes: - ``` cpp + ```cpp #include ``` 2. In `AgoraImplementationDlg.h`, replace the line `IRtcEngine* m_rtcEngine = nullptr;` with the following: - ``` cpp + ```cpp IRtcEngineEx* agoraEngine = nullptr; ``` 3. In `AgoraImplementationDlg.cpp`, replace the line `agoraEngine = createAgoraRtcEngine();` with the following code: - ``` cpp + ```cpp agoraEngine = static_cast(createAgoraRtcEngine()); ``` @@ -198,7 +198,7 @@ To add multi-channel functionality to your , take the followin When a user presses the button, the joins a second channel. If the is already connected to a second channel, it leaves the channel. To implement this workflow, in **Dialog Editor**, double-click **Start Channel Media Relay**. **Dialog Editor** automatically creates and opens an event listener for you. To start and stop channel media relay on the button click event, add the following code to the event listener you just created: - ``` cpp + ```cpp if (isSecondChannelJoined) { agoraEngine->leaveChannelEx(rtcSecondConnection); } @@ -231,7 +231,7 @@ To add multi-channel functionality to your , take the followin 1. Setup an event handler class and declare the required callbacks. 
In `AgoraImplementation.h`, add the following code before the `AgoraEventHandler` class: - ``` cpp + ```cpp // Callbacks for the second channel. class SecondChannelEventHandler : public IRtcEngineEventHandler { @@ -249,7 +249,7 @@ To add multi-channel functionality to your , take the followin 2. Provide a definition for each callback. In `AgoraImplementation.cpp`, add the following callback definition at the end of the file: - ``` cpp + ```cpp void SecondChannelEventHandler::onJoinChannelSuccess(const char* channel, uid_t uid, int elapsed) { AfxMessageBox(L"You joined the second channel"); diff --git a/shared/video-sdk/develop/migration-guide/web.mdx b/shared/video-sdk/develop/migration-guide/web.mdx index d6adb4b7e..5a73f8c69 100644 --- a/shared/video-sdk/develop/migration-guide/web.mdx +++ b/shared/video-sdk/develop/migration-guide/web.mdx @@ -18,7 +18,7 @@ First, create a `Client` object and join a specified channel. - Use the v - ``` js + ```js const client = AgoraRTC.createClient({ mode: "live", codec: "vp9" }); client.init("APPID", () => { client.join("Token", "Channel", null, (uid) => { @@ -30,7 +30,7 @@ First, create a `Client` object and join a specified channel. 
``` - Use the v - ``` js + ```js const client = AgoraRTC.createClient({ mode: "live", codec: "vp9" }); try { @@ -54,7 +54,7 @@ Second, create an audio track object from the audio sampled by a microphone and - Use the v - ``` js + ```js const localStream = AgoraRTC.createStream({ audio: true, video: true }); localStream.init(() => { console.log("init stream success"); @@ -66,7 +66,7 @@ Second, create an audio track object from the audio sampled by a microphone and - Use the v - ``` js + ```js const localAudio = await AgoraRTC.createMicrophoneAudioTrack(); const localVideo = await AgoraRTC.createCameraVideoTrack(); console.log("create local audio/video track success"); @@ -88,7 +88,7 @@ After creating the local audio and video tracks, publish these tracks to the cha - Use the v - ``` js + ```js client.publish(localStream, err => { console.log("publish failed", err); }); @@ -98,7 +98,7 @@ After creating the local audio and video tracks, publish these tracks to the cha ``` - Use the v - ``` js + ```js try { // Remove this line if the channel profile is not live broadcast. 
await client.setClientRole("host"); @@ -122,7 +122,7 @@ When a remote user in the channel publishes media tracks, we need to automatical - Use the v - ``` js + ```js client.on("stream-added", e => { client.subscribe(e.stream, { audio: true, video: true }, err => { console.log("subscribe failed", err); @@ -136,7 +136,7 @@ When a remote user in the channel publishes media tracks, we need to automatical ``` - Use the v - ``` js + ```js client.on("user-published", async (remoteUser, mediaType) => { await client.subscribe(remoteUser, mediaType); if (mediaType == "video") { @@ -191,14 +191,14 @@ The improved events are: - Use the v - ``` js + ```js client.on("connection-state-change", e => { console.log("current", e.curState, "prev", e.prevState); }); ``` - Use the v - ``` js + ```js client.on("connection-state-change", (curState, prevState) => { console.log("current", curState, "prev", prevState); }); diff --git a/shared/video-sdk/develop/play-media/project-implementation/android.mdx b/shared/video-sdk/develop/play-media/project-implementation/android.mdx index 9bbefa426..e9ca12bde 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/android.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/android.mdx @@ -2,7 +2,7 @@ 1. **Import the required Agora libraries** - ``` kotlin + ```kotlin import io.agora.rtc2.ChannelMediaOptions import io.agora.rtc2.Constants import io.agora.rtc2.video.VideoCanvas @@ -15,7 +15,7 @@ 1. **Declare the variables you need** - ``` kotlin + ```kotlin private var mediaPlayer: IMediaPlayer? = null // Instance of the media player private var mediaDuration: Long = 0 // Duration of the opened media file private lateinit var mediaPlayerListener: MediaPlayerListener @@ -44,7 +44,7 @@ 1. 
**Open, play, pause, and resume media files** - ``` kotlin + ```kotlin fun openMediaFile(mediaLocation: String) { // Opens the media file at mediaLocation url // Supports URI files starting with 'content://' @@ -73,7 +73,7 @@ The `IMediaPlayerObserver` implements media player callbacks. You create an instance of `IMediaPlayerObserver` and register it with the media player instance. - ``` kotlin + ```kotlin private val mediaPlayerObserver: IMediaPlayerObserver = object : IMediaPlayerObserver { override fun onPlayerStateChanged(state: MediaPlayerState, error: MediaPlayerError) { // Reports changes in playback state @@ -133,7 +133,7 @@ You use `ChannelMediaOptions` and the `updateChannelMediaOptions` method to specify the tracks to publish. By setting appropriate parameters, you can publish the media player output, the user's local microphone and camera tracks, or both. - ``` kotlin + ```kotlin fun updateChannelPublishOptions(publishMediaPlayer: Boolean) { val channelOptions = ChannelMediaOptions() // Start or stop publishing the media player tracks @@ -153,7 +153,7 @@ To display the media player output locally, you create a `SurfaceView`, set it up using a `VideoCanvas` and call the `setupLocalVideo` method of the . - ``` kotlin + ```kotlin fun mediaPlayerSurfaceView(): SurfaceView { // Sets up and returns a SurfaceView to display the media player output // Instantiate a SurfaceView @@ -185,7 +185,7 @@ 1. 
**Clean up when you close the ** - ``` kotlin + ```kotlin fun destroyMediaPlayer(){ // Destroy the media player instance and clean up if (mediaPlayer != null) { diff --git a/shared/video-sdk/develop/play-media/project-implementation/electron.mdx b/shared/video-sdk/develop/play-media/project-implementation/electron.mdx index 097ebbab4..4f23d5e33 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/electron.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/electron.mdx @@ -4,7 +4,7 @@ In a real-word application, you provide several buttons to enable a user to open, play, pause and stop playing files in the media player. In this page, you use a single `Button` to demonstrate the basic media player functions. You also add two `div` to display the play progress to the user. To implement this UI, in `index.html`, add the following code after ``: -``` html +```html
    @@ -19,7 +19,7 @@ To setup your project to use the media player APIs, take the following steps: To import the required libraries, in `preload.js`, add the following before ` createAgoraRtcEngine,` statement: - ``` javascript + ```javascript MediaPlayerState, ChannelMediaOptions, ``` @@ -28,7 +28,7 @@ To setup your project to use the media player APIs, take the following steps: To create and manage an instance of the media player and access the UI elements, in `preload.js`, add the following to the list of declarations: - ``` javascript + ```javascript var mediaPlayer; // To hold an instance of the media player var isMediaPlaying = false; var mediaDuration = 0; @@ -49,7 +49,7 @@ To implement playing and publishing media files in your , take When a user presses the button for the first time, you create an instance of the media player, set its `mediaPlayerObserver` to receive the callbacks, and open the media file. When the user presses the button again, you play the media file. On subsequent button presses, you pause or resume playing the media file alternately. To do this, add the following method to `preload.js` before `document.getElementById("leave").onclick = async function ()`: - ``` javascript + ```javascript document.getElementById("mediaPlayer").onclick = async function () { mediaButton = document.getElementById("mediaPlayer"); @@ -106,7 +106,7 @@ To implement playing and publishing media files in your , take The `IMediaPlayerSourceObserver` implements media player callbacks. You create an instance of `onPlayerSourceStateChanged` and register it with the media player instance. When the player state changes, you take appropriate actions to update the UI in `onPlayerStateChanged`. You use the `onPositionChanged` callback to update the progress bar. 
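The `onPositionChanged` callback reports the playback position in milliseconds, and updating the progress bar is plain arithmetic over position and duration. A minimal SDK-independent sketch of that conversion in JavaScript (the helper name and the 0 to 100 slider range are assumptions for illustration):

```javascript
// Convert a playback position into a 0-100 progress-bar value.
// position and duration are both in milliseconds, as reported by the player.
function progressPercent(position, duration) {
  if (duration <= 0) return 0; // nothing to show before the file is opened
  const percent = Math.round((position * 100) / duration);
  // Clamp against rounding error or callbacks that arrive after completion.
  return Math.min(100, Math.max(0, percent));
}

console.log(progressPercent(30000, 120000)); // 25
```

The same calculation drives the Slider in the Flutter section and the slider control in the Windows section, whatever the widget type.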
To implement these callbacks, add the following code before `const EventHandles = `: - ``` javascript + ```javascript const mediaPlayerObserver = { onPlayerSourceStateChanged: (state,error) => @@ -148,7 +148,7 @@ To implement playing and publishing media files in your , take You use `ChannelMediaOptions` and the `updateChannelMediaOptions` method to specify the type of stream to publish. To switch between publishing media player and camera and microphone streams, in `preload.js`, add the following method before `window.onload = () => `: - ``` javascript + ```javascript function updateChannelPublishOptions(publishMediaPlayer) { let channelOptions = new ChannelMediaOptions(); @@ -165,7 +165,7 @@ To implement playing and publishing media files in your , take Set up a `VideoCanvas` and use it in the `setupLocalVideo` method of the to show the media player output locally. To switch between displaying media player output and the camera stream, in `preload.js`, add the following function before `window.onload = () => `: - ``` javascript + ```javascript function setupLocalVideo(forMediaPlayer) { if (forMediaPlayer) @@ -196,7 +196,7 @@ To implement playing and publishing media files in your , take To free up resources when you exit the , add the following lines to the `document.getElementById("leave").onclick` method before `window.location.reload();`: - ``` javascript + ```javascript // Destroy the media player mediaPlayer.stop(); mediaPlayer.unregisterPlayerSourceObserver(mediaPlayerObserver); diff --git a/shared/video-sdk/develop/play-media/project-implementation/flutter.mdx b/shared/video-sdk/develop/play-media/project-implementation/flutter.mdx index 52c9b6d86..416271ebf 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/flutter.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/flutter.mdx @@ -8,7 +8,7 @@ To add the UI elements, in `/lib/main.dart`. 1.
Add the following code to the `build` method after `ListView(...children: [`: - ``` dart + ```dart _mediaPLayerButton(), Slider( value: _seekPos.toDouble(), @@ -25,7 +25,7 @@ To add the UI elements, in `/lib/main.dart`. 1. The `_mediaPLayerButton` widget displays a suitable caption on a button depending on the state of the media player. To define this widget, add the following method to the `_MyAppState` class: - ``` dart + ```dart Widget _mediaPLayerButton() { String caption = ""; @@ -54,7 +54,7 @@ To setup your project to use the media player APIs, take the following steps: To create and manage an instance of the `MediaPlayerController` and configure the UI elements, add the following declarations to the `_MyAppState` class after `late RtcEngine agoraEngine;`: - ``` dart + ```dart late final MediaPlayerController _mediaPlayerController; String mediaLocation = "https://www.appsloveworld.com/wp-content/uploads/2018/10/640.mp4"; @@ -75,7 +75,7 @@ To implement playing and publishing media files in your , take When a user presses the button for the first time, you create an instance of the media player and open the media file. When the user presses the button again, you play the media file. On subsequent button presses, you pause or resume playing the media file alternately. To do this, add the following method to the `_MyAppState` class: - ``` dart + ```dart void playMedia() async { if (!_isUrlOpened) { await initializeMediaPlayer(); @@ -108,7 +108,7 @@ To implement playing and publishing media files in your , take During initialization, you create an instance of `MediaPlayerController`, initialize it using the `initialize()` method and register a `MediaPlayerSourceObserver` to receive media player callbacks. When the player state changes, you take appropriate actions to update the UI in `onPlayerStateChanged`. You use the `onPositionChanged` callback to update the progress on the Slider. 
To do this, add the following code to the `_MyAppState` class: - ``` dart + ```dart Future initializeMediaPlayer() async { _mediaPlayerController= MediaPlayerController( rtcEngine: agoraEngine, @@ -163,7 +163,7 @@ To implement playing and publishing media files in your , take You use `ChannelMediaOptions` and the `updateChannelMediaOptions` method to specify the type of stream to publish. To switch between publishing media player and camera and microphone streams, add the following method to the `_MyAppState` class: - ``` dart + ```dart void updateChannelPublishOptions(bool publishMediaPlayer) { ChannelMediaOptions channelOptions = ChannelMediaOptions( publishMediaPlayerAudioTrack: publishMediaPlayer, @@ -180,7 +180,7 @@ To implement playing and publishing media files in your , take To show the media player output locally, you create an `AgoraVideoView` and set its `controller` to `_mediaPlayerController`. To enable switching between displaying media player output and the camera stream, **replace** the `_localPreview()` method in the `_MyAppState` class with the following: - ``` dart + ```dart Widget _localPreview() { if (_isJoined) { if (_isPlaying) { @@ -208,7 +208,7 @@ To implement playing and publishing media files in your , take To free up resources when you exit the channel, add the following lines to the `leave` method after `agoraEngine.leaveChannel();`: - ``` dart + ```dart // Dispose the media player _mediaPlayerController.dispose(); diff --git a/shared/video-sdk/develop/play-media/project-implementation/react-native.mdx b/shared/video-sdk/develop/play-media/project-implementation/react-native.mdx index 15d36446f..419329de7 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/react-native.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/react-native.mdx @@ -7,13 +7,13 @@ In a real-world application, you provide several buttons to enable a user to open To add the button, in `App.tsx`: 1.
Add the following component to the `return` statement of the `App` component before ` ``` 1. The `MediaPlayerButton` component displays a suitable caption depending on the state of the media player. To define this component, add the following code to the `App` component: - ``` typescript + ```typescript const MediaPlayerButton = () => { var caption = ''; @@ -43,7 +43,7 @@ To setup your project to use the media player APIs, take the following steps: Add the following to the list of `import` statements in `App.tsx`: - ``` typescript + ```typescript import { IMediaPlayer, VideoSourceType, @@ -58,13 +58,13 @@ To setup your project to use the media player APIs, take the following steps: 1. To specify the path to the media file, add the following declaration to `App.tsx` after `const uid = 0;`: - ``` typescript + ```typescript const mediaLocation = 'https://webdemo.agora.io/agora-web-showcase/examples/Agora-Custom-VideoSource-Web/assets/sample.mp4'; ``` 1. To create and manage an instance of `IMediaPlayer` and configure the UI elements, add the following declarations to the `App` component after `const [message, setMessage] = useState('');`: - ``` typescript + ```typescript const mediaPlayerRef = useRef(); // Media player instance const [isUrlOpened, setIsUrlOpened] = useState(false); // Media file has been opened const [isPlaying, setIsPlaying] = useState(false); // Media file is playing @@ -80,7 +80,7 @@ To implement playing and publishing media files in your , take When a user presses the button for the first time, you create an instance of the media player and open the media file. When the user presses the button again, you play the media file. On subsequent button presses, you pause or resume playing the media file alternately. 
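This one-button cycle, where the first press opens the file, the second starts playback, and later presses alternate between pause and resume, is identical on every platform in this patch. A small SDK-independent JavaScript sketch of the state logic (the function and flag names are illustrative, not part of any Agora API):

```javascript
// Decide what the media player button should do next.
// state tracks the same flags the platform samples use:
// isUrlOpened, isPlaying, and isPaused.
function nextMediaAction(state) {
  if (!state.isUrlOpened) return "open";      // first press: open the media file
  if (!state.isPlaying) return "play";        // second press: start playback
  return state.isPaused ? "resume" : "pause"; // afterwards: alternate pause/resume
}

console.log(nextMediaAction({ isUrlOpened: false, isPlaying: false, isPaused: false })); // "open"
```

The button caption follows the same branches, which is why each platform's button helper switches on these flags to choose its text.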
To do this, add the following method to the `App` component: - ``` typescript + ```typescript const playMedia = () => { if (!isJoined) { return; @@ -114,7 +114,7 @@ To implement playing and publishing media files in your , take During initialization, you create an instance of `IMediaPlayer` and register a `IMediaPlayerSourceObserver` to receive media player callbacks. When the player state changes, you take appropriate actions to update the UI in `onPlayerStateChanged`. You use the `onPositionChanged` callback to update the play progress in the UI. To do this, add the following code to the `App` component: - ``` typescript + ```typescript const initializeMediaPlayer = () => { mediaPlayerRef.current = agoraEngineRef.current?.createMediaPlayer(); @@ -148,7 +148,7 @@ To implement playing and publishing media files in your , take You use `ChannelMediaOptions` and the `updateChannelMediaOptions` method to specify the type of stream to publish. To switch between publishing media player and camera and microphone streams, add the following method to the `App` component: - ``` typescript + ```typescript const updateChannelPublishOptions = (publishMediaPlayer: boolean) => { var channelOptions = new ChannelMediaOptions(); channelOptions.publishMediaPlayerAudioTrack = publishMediaPlayer; @@ -170,12 +170,12 @@ To implement playing and publishing media files in your , take 1. **Replace** the first `RtcSurfaceView` component block in the `return` statement of `App` with the following: - ``` typescript + ```typescript ``` 1. 
To define the `LocalPreview` component, add the following method to `App`: - ``` typescript + ```typescript const LocalPreview = () => { if (!isPlaying) { return ; diff --git a/shared/video-sdk/develop/play-media/project-implementation/swift.mdx b/shared/video-sdk/develop/play-media/project-implementation/swift.mdx index 4387e1a1e..226566a41 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/swift.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/swift.mdx @@ -28,7 +28,7 @@ To setup your project to use the media player APIs and access the UI elements, t To create and manage an instance of the media player and access the UI elements, add the following declarations to `ViewController`: - ``` swift + ```swift var mediaPlayer: AgoraRtcMediaPlayerProtocol? // Instance of the media player var isMediaPlaying: Bool = false var mediaDuration: Int = 0 @@ -52,7 +52,7 @@ To implement playing and publishing media files in your , take The `AgoraRtcMediaPlayerDelegate` implements media player callbacks. When the player state changes, update the UI using the `didChangedToState`, `didChangedToPosition` callbacks. To set up the `AgoraRtcMediaPlayerDelegate`, add the following extension to the `ViewController`: - ``` swift + ```swift extension ViewController: AgoraRtcMediaPlayerDelegate { func agoraRtcMediaPlayer(_ playerKit: AgoraRtcMediaPlayerProtocol, didChangedTo state: AgoraMediaPlayerState, error: AgoraMediaPlayerError) { if (state == .openCompleted) { @@ -101,7 +101,7 @@ To implement playing and publishing media files in your , take You use the `AgoraRtcChannelMediaOptions` class and the `updateChannelWithMediaOptions` method to specify the type of stream to publish.
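Every platform in this patch builds the same channel media options toggle: when the media player tracks are published, the microphone and camera tracks are not, and vice versa. As a plain-object sketch in JavaScript (the field names mirror the snippets in this patch; the helper itself is illustrative):

```javascript
// Map the publishMediaPlayer flag onto channel media options:
// media player tracks on means microphone and camera tracks off, and vice versa.
function channelPublishOptions(publishMediaPlayer) {
  return {
    publishMediaPlayerAudioTrack: publishMediaPlayer,
    publishMediaPlayerVideoTrack: publishMediaPlayer,
    publishMicrophoneTrack: !publishMediaPlayer,
    publishCameraTrack: !publishMediaPlayer,
  };
}
```

Passing the resulting options object to the platform's update-channel-media-options call is what each `updateChannelPublishOptions` helper in this patch does.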
To switch between publishing media player and camera and microphone streams, add the following function to the `ViewController`: - ``` swift + ```swift func updateChannelPublishOptions(_ publishMediaPlayer: Bool) { let channelOptions: AgoraRtcChannelMediaOptions = AgoraRtcChannelMediaOptions() @@ -119,7 +119,7 @@ To implement playing and publishing media files in your , take Create an `AgoraRtcVideoCanvas` and use it in the `setupLocalVideo` method of the to show the media player output locally. To switch between displaying media player output and the camera stream, replace the `setupLocalVideo()` method in the `ViewController` with the following: - ``` swift + ```swift func setupLocalVideo(_ forMediaPlayer: Bool) { // Enable the video module agoraEngine.enableVideo() @@ -147,7 +147,7 @@ To implement playing and publishing media files in your , take When you join a channel, you set up the local video panel to initially display the camera output. In the `joinChannel()` function, replace `setupLocalVideo()` with a call to the updated function: - ``` swift + ```swift setupLocalVideo(false) ``` @@ -155,7 +155,7 @@ To implement playing and publishing media files in your , take To free up resources when you exit the channel, add the following lines at the end of the `leaveChannel` function: - ``` swift + ```swift // Destroy the media player agoraEngine.destroyMediaPlayer(mediaPlayer) mediaPlayer = nil diff --git a/shared/video-sdk/develop/play-media/project-implementation/unity.mdx b/shared/video-sdk/develop/play-media/project-implementation/unity.mdx index 087738fca..e757b5af9 100644 --- a/shared/video-sdk/develop/play-media/project-implementation/unity.mdx +++ b/shared/video-sdk/develop/play-media/project-implementation/unity.mdx @@ -26,7 +26,7 @@ To declare the required variables and access the UI elements, take the following To import the required Unity UI library, in your script file, add the following to the list of namespace declarations: - ``` csharp + ```csharp
using TMPro; ``` @@ -34,7 +34,7 @@ To declare the required variables and access the UI elements, take the following To create and manage an instance of the media player and access the UI elements, in your script file, add the following declarations to `NewBehaviourScript`: - ``` csharp + ```csharp private IMediaPlayer mediaPlayer; // Instance of the media player private bool isMediaPlaying = false; private long mediaDuration = 0; @@ -69,7 +69,7 @@ To implement playing and publishing media files in your , take When a user presses the button for the first time, you create an instance of the media player, set its `mediaPlayerObserver` to receive the callbacks, and open the media file. When the user presses the button again, you play the media file. On subsequent button presses, you pause or resume playing the media file alternately. To implement this workflow, in your script file, add the following method to `NewBehaviourScript`: - ``` csharp + ```csharp public void playMedia() { // Initialize the mediaPlayer and open a media file if (mediaPlayer == null) { @@ -115,7 +115,7 @@ To implement playing and publishing media files in your , take The `IMediaPlayerSourceObserver` class implements media player callbacks. To set up the media player callback, derive a new class from `IMediaPlayerSourceObserver` and specify the required callback in the class. When the player state changes, you take appropriate actions to update the UI in `OnPlayerSourceStateChanged`. You use `OnPositionChanged` to update the progress bar. To implement the media player callbacks in your , in your script file, add the following code to `NewBehaviourScript`: - ``` csharp + ```csharp internal class MediaPlayerObserver : IMediaPlayerSourceObserver { private readonly NewBehaviourScript _videoSample; @@ -176,7 +176,7 @@ To implement playing and publishing media files in your , take You use `ChannelMediaOptions` and the `UpdateChannelMediaOptions` method to specify the type of stream to publish.
To switch between publishing media player and camera and microphone streams, in your script file, add the following method to `NewBehaviourScript`: - ``` csharp + ```csharp private void updateChannelPublishOptions(bool publishMediaPlayer){ ChannelMediaOptions channelOptions = new ChannelMediaOptions(); channelOptions.publishMediaPlayerAudioTrack.SetValue(publishMediaPlayer); @@ -192,7 +192,7 @@ To implement playing and publishing media files in your , take Access the `LocalView` game object from the UI and update its video surface each time you play or pause the media player to switch between displaying media player output and the camera stream. To implement this logic, in your script file, add the following method in `NewBehaviourScript`: - ``` csharp + ```csharp private void setupLocalVideo(bool forMediaPlayer) { if (forMediaPlayer) { GameObject go = GameObject.Find("LocalView");
access the UI elements, in `AgoraImplementationDlg.h`, add the following to `CAgoraImplementationDlg`: - ``` cpp + ```cpp IMediaPlayer *mediaPlayer; // Instance of the media player BOOL isMediaPlaying = false; long mediaDuration = 0; @@ -66,7 +66,7 @@ To setup your project to use the media player APIs and access the UI elements, t In `AgoraImplementationDlg.h`, add the following at the end of `OnInitDialog`: - ``` cpp + ```cpp mediaButton = (CButton*)GetDlgItem(IDC_BUTTON3); mediaProgressBar = (CSliderCtrl*)GetDlgItem(IDC_SLIDER1); ``` @@ -79,7 +79,7 @@ To implement playing and publishing media files in your , take When a user presses the button for the first time, you create an instance of the media player, set its `mediaPlayerObserver` to receive the callbacks, and open the media file. When the user presses the button again, you play the media file. On subsequent button presses, you pause or resume playing the media file alternately. To implement this workflow, in **Dialog Editor**, double-click **Open Media File**. **Dialog Editor** automatically creates and opens an event listener for you. Add the following code to the event listener you just created: - ``` cpp + ```cpp // Initialize the mediaPlayer and open a media file if (mediaPlayer == NULL) { // Create an instance of the media player @@ -124,7 +124,7 @@ To implement playing and publishing media files in your , take 1. In `AgoraImplementationDlg.h`, add the following class before `AgoraEventHandler`: - ``` cpp + ```cpp class MediaPlayerSourceObserver : public IMediaPlayerSourceObserver { public: @@ -183,28 +183,28 @@ To implement playing and publishing media files in your , take 1. In `AgoraImplementationDlg.h`, add the following to `CAgoraImplementationDlg`: - ``` cpp + ```cpp afx_msg LRESULT OnEIDMediaPlayerStateChanged(WPARAM wParam, LPARAM lParam); afx_msg LRESULT OnEIDMediaPlayerPositionChanged(WPARAM wParam, LPARAM lParam); ``` 1. 
In `AgoraImplementationDlg.cpp`, add the following after the list of header files: - ``` cpp + ```cpp #define PLAYER_POSITION_CHANGED 0x00000005 #define PLAYER_MEDIA_COMPLETED 0x00000006 ``` 1. In `AgoraImplementationDlg.cpp`, add the following code: - ``` cpp + ```cpp ON_MESSAGE(WM_MSGID(PLAYER_POSITION_CHANGED), &CAgoraImplementationDlg::OnEIDMediaPlayerPositionChanged) ON_MESSAGE(WM_MSGID(PLAYER_STATE_CHANGED), &CAgoraImplementationDlg::OnEIDMediaPlayerStateChanged) ``` 1. In `AgoraImplementationDlg.cpp`, add the following methods before `OnInitDialog`: - ``` cpp + ```cpp LRESULT CAgoraImplementationDlg::OnEIDMediaPlayerStateChanged(WPARAM wParam, LPARAM lParam) { if (wParam == PLAYER_STATE_OPEN_COMPLETED) @@ -253,14 +253,14 @@ To implement playing and publishing media files in your , take 1. In `AgoraImplementationDlg.h`, add the following to the `CAgoraImplementationDlg` class before `afx_msg void OnClose();` - ``` cpp + ```cpp // Declare a method to publish and unpublish the local and media file streams. void updateChannelPublishOptions(BOOL value); ``` 2. In `AgoraImplementationDlg.cpp`, add the method after `OnInitDialog`: - ``` cpp + ```cpp void CAgoraImplementationDlg::updateChannelPublishOptions(BOOL publishMediaPlayer) { // You use ChannelMediaOptions to change channel media options. @@ -280,14 +280,14 @@ To implement playing and publishing media files in your , take 1. In `AgoraImplementationDlg.h`, add the following to the `CAgoraImplementationDlg` class before `afx_msg void OnClose();` - ``` cpp + ```cpp // Declare a method to switch between the local video and media file output. void setupLocalVideo(BOOL value); ``` 2. In `AgoraImplementationDlg.cpp`, add the method after `OnInitDialog`: - ``` cpp + ```cpp void CAgoraImplementationDlg::setupLocalVideo(BOOL forMediaPlayer) { // Pass the window handle to the engine so that it renders the local video.
@@ -310,7 +310,7 @@ To implement playing and publishing media files in your , take To free up resources when you exit the , add the following lines to the `onDestroy` method after `super.onDestroy();`: - ``` java + ```java // Destroy the media player mediaPlayer.stop(); mediaPlayer.unRegisterPlayerObserver(mediaPlayerObserver); diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/android.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/android.mdx index 05d5c363d..8eaea454e 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/android.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/android.mdx @@ -187,7 +187,7 @@ To implement these features in your , take the following steps To show the screen sharing preview in the local video container, add the following method to the `MainActivity` class: - ``` java + ```java private void startScreenSharePreview() { // Create render view by RtcEngine FrameLayout container = findViewById(R.id.local_video_view_container); @@ -211,7 +211,7 @@ To implement these features in your , take the following steps Add the following method to the `MainActivity` class to update publishing options when starting or stopping screen sharing: - ``` java + ```java void updateMediaPublishOptions(boolean publishScreen) { ChannelMediaOptions mediaOptions = new ChannelMediaOptions(); mediaOptions.publishCameraTrack = !publishScreen; diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/electron.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/electron.mdx index c9d9d6ec8..e9db70d3e 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/electron.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/electron.mdx @@ -82,7 +82,7 @@ To implement volume control and screen sharing in your , take Set up a `VideoCanvas` and use it in the `setupScreenSharing` method to show the 
screen track locally. The new `VideoCanvas` enables you to publish your screen track along with your local video. In `preload.js`, add the following method before `window.onload = () => `: - ``` javascript + ```javascript function setupScreenSharing(doScreenShare) { if (doScreenShare) diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/flutter.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/flutter.mdx index 93ee46b89..311f6d993 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/flutter.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/flutter.mdx @@ -152,7 +152,7 @@ To implement these features in your , take the following steps dimensions: VideoDimensions(height: 1280, width: 720), frameRate: 15, bitrate: 600))); - ``` + ``` 1. **Show screen sharing preview** diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/macos.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/macos.mdx index df8813b97..24837442a 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/macos.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/macos.mdx @@ -144,7 +144,7 @@ To implement these features in your , take the following steps In `ViewController`, add the following function after `buttonAction`: - ``` swift + ```swift @objc func screenShareAction(sender: NSButton!) 
{ if !joined { // Check if successfully joined the channel and set button title accordingly diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/react-native.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/react-native.mdx index 41ff2f2f9..e21bcadf7 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/react-native.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/react-native.mdx @@ -219,5 +219,5 @@ To implement volume control in your , take the following steps } } }, - ``` + ``` \ No newline at end of file diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/swift.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/swift.mdx index dfea602ab..638835412 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/swift.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/swift.mdx @@ -31,7 +31,7 @@ To create this user interface, in the `ViewController` class: In `ViewController`, add the following line after the last `import` statement: - ``` swift + ```swift import ReplayKit ``` diff --git a/shared/video-sdk/develop/product-workflow/project-implementation/unity.mdx b/shared/video-sdk/develop/product-workflow/project-implementation/unity.mdx index 02e5cc35b..06c17dc30 100644 --- a/shared/video-sdk/develop/product-workflow/project-implementation/unity.mdx +++ b/shared/video-sdk/develop/product-workflow/project-implementation/unity.mdx @@ -28,7 +28,7 @@ In a real-world application, for each volume setting you want to control, you ty To import the required Unity UI library, in your script file, add the following to the namespace declarations: - ``` csharp + ```csharp using TMPro; ``` diff --git a/shared/video-sdk/develop/product-workflow/project-setup/android.mdx b/shared/video-sdk/develop/product-workflow/project-setup/android.mdx index e1c2aba91..e96308b37 100644 --- 
a/shared/video-sdk/develop/product-workflow/project-setup/android.mdx +++ b/shared/video-sdk/develop/product-workflow/project-setup/android.mdx @@ -10,7 +10,7 @@ 1. In `/Gradle Scripts/build.gradle (Module: .app)`, add the following line under `dependencies`: - ``` text + ```text implementation files('libs/AgoraScreenShareExtension.aar') ``` diff --git a/shared/video-sdk/develop/product-workflow/project-setup/flutter.mdx b/shared/video-sdk/develop/product-workflow/project-setup/flutter.mdx index dbbad1cc4..0700cfc73 100644 --- a/shared/video-sdk/develop/product-workflow/project-setup/flutter.mdx +++ b/shared/video-sdk/develop/product-workflow/project-setup/flutter.mdx @@ -136,7 +136,7 @@ Depending on your target platform, follow these steps: } @end - ``` + ``` 1. Start screen sharing diff --git a/shared/video-sdk/develop/stream-raw-audio-and-video/project-implementation/android.mdx b/shared/video-sdk/develop/stream-raw-audio-and-video/project-implementation/android.mdx index 0f6d790e3..b113ae559 100644 --- a/shared/video-sdk/develop/stream-raw-audio-and-video/project-implementation/android.mdx +++ b/shared/video-sdk/develop/stream-raw-audio-and-video/project-implementation/android.mdx @@ -5,7 +5,7 @@ To enable or disable processing of captured raw video data, add a button to the user interface. In `/app/res/layout/activity_main.xml` add the following lines before ``: -``` xml +```xml `: - -```html - -
    -
    - - -``` - -### Implement the workflow - -To implement the workflow logic in your , take the following steps: - -1. **Bypass autoplay blocking** - - Web browsers use an autoplay policy in order to improve the user experience and reduce data consumption on expensive networks. The autoplay policy may block the audio or video playback in your . To deal with autoplay blocking, supplies the `onAutoplayFailed` callback. Your listens for `onAutoplayFailed` and prompts the user to interact with the webpage to resume the playback. - - To bypass autoplay blocking, in `main.js`, add the following code before `agoraEngine.on("user-published", async (user, mediaType) =>`: - - ```javascript - AgoraRTC.onAutoplayFailed = () => { - // Create button for the user interaction. - const btn = document.createElement("button"); - // Set the button text. - btn.innerText = "Click me to resume playback"; - // Remove the button when onClick event occurs. - btn.onClick = () => { - btn.remove(); - }; - // Append the button to the UI. - document.body.append(btn); - } - ``` - - Safari and WebView in iOS have a stricter autoplay policy. They only allow playback with sound that is triggered by user intervention and do not remove the autoplay block even after user intervention. To bypass this strict policy, provide a button in the UI to stop and play the remote audio track. When the user stops the audio track and plays it again, the browser automatically removes the autoplay block. - -1. **Add the required variable** - - In `main.js`, add the following variables to the declarations: - - ```javascript - var isMuteAudio = false; - ``` - -1. **Add the volume control logic** - - When the user moves a range slider, adjust the volume for the local or remote audio track. In `main.js`, add the following code before `agoraEngine.on("user-published", async (user, mediaType) =>`: - - ```javascript - // Set an event listener on the range slider. 
- document.getElementById("localAudioVolume").addEventListener("change", function(evt) { - console.log("Volume of local audio :" + evt.target.value); - // Set the local audio volume. - channelParameters.localAudioTrack.setVolume(parseInt(evt.target.value)); - }); - // Set an event listener on the range slider. - document.getElementById("remoteAudioVolume").addEventListener("change", function(evt) { - console.log("Volume of remote audio :" + evt.target.value); - // Set the remote audio volume. - channelParameters.remoteAudioTrack.setVolume(parseInt(evt.target.value)); - }); - ``` - - When using a device to capture audio, sets a default global volume value of 85 (range [0, 255]). automatically increases a capture device volume that's too low. You can adjust the capture volume as per your needs by adjusting the microphone or sound card's signal capture volume. - -1. **Add the logic to mute and unmute the local audio track** - - To mute or unmute the local audio track, call `setEnabled` and pass a `Boolean` value. In `main.js`, add the following code before `document.getElementById('leave').onclick = async function ()`: - - ```javascript - document.getElementById('muteAudio').onclick = async function () - { - if(isMuteAudio == false) - { - // Mute the local audio. - channelParameters.localAudioTrack.setEnabled(false); - // Update the button text. - document.getElementById(`muteAudio`).innerHTML = "Unmute Audio"; - isMuteAudio = true; - } - else - { - // Unmute the local audio. - channelParameters.localAudioTrack.setEnabled(true); - // Update the button text. 
- document.getElementById(`muteAudio`).innerHTML = "Mute Audio"; - isMuteAudio = false; - } - } - ``` - - diff --git a/shared/voice-sdk/get-started/get-started-sdk/project-implementation/poc3.mdx b/shared/voice-sdk/get-started/get-started-sdk/project-implementation/poc3.mdx index c7ea415cf..d6c36cabc 100644 --- a/shared/voice-sdk/get-started/get-started-sdk/project-implementation/poc3.mdx +++ b/shared/voice-sdk/get-started/get-started-sdk/project-implementation/poc3.mdx @@ -72,7 +72,7 @@ The following code examples show how to implement these steps in your + ### Request microphone permission @@ -103,4 +103,4 @@ The following code examples show how to implement these steps in your