From e75a6cdf739aafa5143de066d7aa59b4acb66f17 Mon Sep 17 00:00:00 2001 From: atovpeko Date: Fri, 26 Jul 2024 10:51:48 +0300 Subject: [PATCH] review --- .../ai-noise-suppression/reference/web.mdx | 2 +- .../beauty-effect/reference/web.mdx | 34 +++++++++++-------- .../project-implementation/web.mdx | 10 +++--- .../super-clarity/reference/web.mdx | 6 ++-- .../project-implementation/web.mdx | 20 +++++------ .../video-compositor/reference/web.mdx | 12 +++---- .../virtual-background/reference/web.mdx | 2 +- 7 files changed, 45 insertions(+), 41 deletions(-) diff --git a/shared/extensions-marketplace/ai-noise-suppression/reference/web.mdx b/shared/extensions-marketplace/ai-noise-suppression/reference/web.mdx index b9c89c0cb..dd9e3d3af 100644 --- a/shared/extensions-marketplace/ai-noise-suppression/reference/web.mdx +++ b/shared/extensions-marketplace/ai-noise-suppression/reference/web.mdx @@ -1,6 +1,6 @@ -- For a working example, check out the [AI Denoiser web demo](https://webdemo.agora.io/aiDenoiser/index.html) and the associated [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/aiDenoiser). +- For a working example, check out the [AI Noise Suppression web demo](https://webdemo.agora.io/aiDenoiser/index.html) and the associated [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/aiDenoiser). ### API reference diff --git a/shared/extensions-marketplace/beauty-effect/reference/web.mdx b/shared/extensions-marketplace/beauty-effect/reference/web.mdx index 46bd1e158..8b4c1fcc0 100644 --- a/shared/extensions-marketplace/beauty-effect/reference/web.mdx +++ b/shared/extensions-marketplace/beauty-effect/reference/web.mdx @@ -1,33 +1,37 @@ -- For a working example, check out the [beauty effect demo](https://webdemo.agora.io/beauty-extension/index.html). +- For a working example, check out the [Beauty Effect demo](https://webdemo.agora.io/beauty-extension/index.html). ### Considerations -- **Browser Support**: - - The beauty extension supports the latest versions of Chrome, Firefox, and Safari. +- **Browser support**: + - The extension supports the latest versions of Chrome, Firefox, and Safari. - For the best beautification experience, recommends using the latest version of Chrome on desktop. - Safari versions below 15.4 are not supported due to a [known WebKit issue](https://bugs.webkit.org/show_bug.cgi?id=181663) that causes a black screen. - Enabling beauty mode on mobile devices is not recommended. -- **Device Requirements**: - - The beauty extension has high performance requirements. Agora recommends the following: +- **Device requirements**: + + The extension has high performance requirements. recommends the following: - Intel Core i5 2-core processor or above. - 8GB of RAM or more. - 64-bit operating system. -- **Browser Settings**: - - Ensure that browser hardware acceleration is enabled when using the beauty extension. +- **Browser settings**: + + Ensure that browser hardware acceleration is enabled when using the extension. + +- ** extension and SDK**: + + The extension encapsulates the beauty function built into Web SDK 4.x (enabled by `setBeautyEffect`) and upgrades the beauty algorithm. If you use the beauty function built into the SDK, Agora recommends upgrading to v4.12.0 or above and using the extension implementation. The built-in beauty function will be gradually discontinued. 
-- **Beauty extension and SDK**: - - The beauty extension encapsulates the beauty function built into Web SDK 4.x (enabled by `setBeautyEffect`) and upgrades the beauty algorithm. If you use the beauty function built into the SDK, Agora recommends upgrading to v4.12.0 or above and using the beauty extension implementation. The built-in beauty function will be gradually discontinued. +- **Using multiple extensions**: -- **Using Multiple extensions**: - - If you need to use multiple media processing extensions simultaneously, Agora recommends an Intel Core i5 4-core or higher processor. When multiple extensions are enabled, other running programs that occupy significant system resources may cause audio and video freezes in your app. + If you need to use multiple media processing extensions simultaneously, recommends an Intel Core i5 4-core or higher processor. When multiple extensions are enabled, other running programs that occupy significant system resources may cause audio and video freezes in your app. -### API Reference +### API reference -This section provides the API reference for the local image merging extension. +This section provides the API reference for the extension. #### `IBeautyExtension` @@ -93,11 +97,11 @@ onoverload?: () => void; recommends calling `disable` within this event callback function to cease beautification and provide a UI prompt. -#### Type Definition +#### Type definition #### `BeautyEffectOptions` -* Beauty parameters used in the [setOptions](#setoptions) method. +Beauty parameters used in the [setOptions](#setoptions) method. ```typescript export type BeautyEffectOptions = { diff --git a/shared/extensions-marketplace/super-clarity/project-implementation/web.mdx b/shared/extensions-marketplace/super-clarity/project-implementation/web.mdx index 905424502..ad36a79a9 100644 --- a/shared/extensions-marketplace/super-clarity/project-implementation/web.mdx +++ b/shared/extensions-marketplace/super-clarity/project-implementation/web.mdx @@ -14,7 +14,7 @@ To install the [ extension](https://www.npmjs.com/package 1. To import the extension, use one of the following methods: - - **Method 1**: Add the following code to your JavaScript file. + - **Method 1**: Add the following code to your JavaScript file: ```javascript import { @@ -64,8 +64,8 @@ Before joining a channel, create a `SuperClarityExtension` object and call the S When subscribing to a remote video track, follow these steps to enable the extension: 1. Call the extension's `createProcessor` method to create the extension's processor and listen to related events. -2. Call the SDK's `IRemoteVideoTrack.pipe` method to connect the pipeline between the extension and the video track. -3. Call the `enable` method of the extension to activate it. After enabling, play the video track to see the effect of the extension. +1. Call the SDK's `IRemoteVideoTrack.pipe` method to connect the pipeline between the extension and the video track. +1. Call the `enable` method of the extension to activate it. After enabling, play the video track to see the effect of the extension. To avoid performance issues caused by multiple processors working simultaneously, you can create a maximum of two processors. As a result, the extension can only be applied to a maximum of two video tracks at the same time. @@ -127,8 +127,8 @@ track2.play(ele); To stop using the extension, follow these steps to ensure proper cleanup and avoid potential issues: 1. 
Call the SDK's `IRemoteVideoTrack.unpipe` method to disconnect the pipeline between the `processor` and the current video track. -2. Call the extension's `release` method to destroy the `processor`. -3. Call the SDK's `stop` method to stop the video track, set it to `null`, and destroy it. +1. Call the extension's `release` method to destroy the `processor`. +1. Call the SDK's `stop` method to stop the video track, set it to `null`, and destroy it. ```typescript async function onUserUnpublished(user, mediaType) { diff --git a/shared/extensions-marketplace/super-clarity/reference/web.mdx b/shared/extensions-marketplace/super-clarity/reference/web.mdx index 7adf0cc3b..33de482b1 100644 --- a/shared/extensions-marketplace/super-clarity/reference/web.mdx +++ b/shared/extensions-marketplace/super-clarity/reference/web.mdx @@ -89,9 +89,9 @@ var destroySCProcessor = async (processor) => { -### API Reference +### API reference -This section provides the API reference for the plugin. +This section provides the API reference for the extension. #### `ISuperClarityExtension` @@ -123,7 +123,7 @@ disable(): void | Promise; ##### `release` -Releases all resources used by the plugin. +Releases all resources used by the extension. ```typescript release(): Promise; diff --git a/shared/extensions-marketplace/video-compositor/project-implementation/web.mdx b/shared/extensions-marketplace/video-compositor/project-implementation/web.mdx index 69105f398..bcdff935a 100644 --- a/shared/extensions-marketplace/video-compositor/project-implementation/web.mdx +++ b/shared/extensions-marketplace/video-compositor/project-implementation/web.mdx @@ -10,24 +10,24 @@ In this scenario, you need to composite the following content into a single vide - A screen sharing video track showing the presentation. - Two local images. -- Source video track 1: Created from the video stream captured by the camera, with the background removed using the [virtual background extension](/video-calling/advanced-features/virtual-background). +- Source video track 1: Created from the video stream captured by the camera, with the background removed using the [Virtual Background extension](/video-calling/advanced-features/virtual-background). - Source video track 2: Created from a local video file. This guide uses the sample scenario to introduce steps required to create a composite video track. ### Integrate the extension -For this example, integrate both the virtual background extension and the video compositor extension. +For this example, integrate both the Virtual Background extension and the extension. -1. Integrate the [Virtual background](/video-calling/advanced-features/virtual-background#integrate-the-virtual-background-extension) extension. Ensure that you understand the [considerations](/video-calling/advanced-features/virtual-background#considerations). +1. Integrate the [Virtual Background](/video-calling/advanced-features/virtual-background#integrate-the-virtual-background-extension) extension. Ensure that you understand the [considerations](/video-calling/advanced-features/virtual-background#considerations). -2. Run the following command to integrate the [ extension](https://www.npmjs.com/package/agora-extension-video-compositor) into your project using npm: +1. Run the following command to integrate the [ extension](https://www.npmjs.com/package/agora-extension-video-compositor) into your project using npm: ```bash npm install agora-extension-video-compositor ``` -3. Import the extension in either of the following ways: +1. 
Import the extension in either of the following ways:
 
    - **Method 1:** Add the following code to the JavaScript file:
 
@@ -33,7 +33,7 @@ const client = AgoraRTC.createClient({ mode: "rtc", codec: "vp8" });
 // Create VideoCompositingExtension and VirtualBackgroundExtension objects
 const extension = new VideoCompositingExtension();
 const vbExtension = new VirtualBackgroundExtension();
-// Register plugins
+// Register extensions
 AgoraRTC.registerExtensions([extension, vbExtension]);
 // Create a VideoTrackCompositor object
 let compositor = extension.createProcessor();
@@ -91,7 +91,7 @@ Follow these steps to inject images into the local video stream:
    });
    ```
 
-2. Create the input layers of the image and video tracks in order, from bottom to top. The layers created later overlay the earlier ones. In the following code, the screen-sharing image layer is at the bottom, and the image of source video track 2 is at the top layer:
+1. Create the input layers of the image and video tracks in order, from bottom to top. The layers created later overlay the earlier ones. In the following code, the screen-sharing image layer is at the bottom, and the image of source video track 2 is at the top layer:
 
    ```typescript
    // Create the input layer for the screen sharing video track
@@ -132,7 +132,7 @@ Follow these steps to inject images into the local video stream:
      height: 180,
      fit: "cover",
    });
-   // Set the virtual background of source video track 1
+   // Set the virtual background of the source video track 1
    if (!vbProcessor) {
      vbProcessor = vbExtension.createProcessor();
      await vbProcessor.init("./assets/wasms");
@@ -152,7 +152,7 @@ Follow these steps to inject images into the local video stream:
      .pipe(sourceVideoTrack2.processorDestination);
    ```
 
-3. Merge all input layers and inject the output into the local video track:
+1. Merge all input layers and inject the output into the local video track:
 
    ```javascript
    const canvas = document.createElement("canvas");
@@ -172,7 +172,7 @@ Follow these steps to inject images into the local video stream:
    .pipe(localTracks.videoTrack.processorDestination);
    ```
 
-4. Play and publish the local video track:
+1. Play and publish the local video track:
 
    ```javascript
    // Play the local video track
diff --git a/shared/extensions-marketplace/video-compositor/reference/web.mdx b/shared/extensions-marketplace/video-compositor/reference/web.mdx
index 5d567101b..38250ae1d 100644
--- a/shared/extensions-marketplace/video-compositor/reference/web.mdx
+++ b/shared/extensions-marketplace/video-compositor/reference/web.mdx
@@ -1,16 +1,16 @@
 
-- For a working example, check out the [image compositor extension](https://webdemo.agora.io/example/plugin/videoCompositor/index.html?_gl=1*1437x8t*_gcl_au*MTM4NzU5ODkyNy4xNzE5NTgxMTUy*_ga*MjA2MzYxMjY4Mi4xNzAzMDczMjA1*_ga_BFVGG7E02W*MTcxOTY1MDQ3NS4zNTQuMC4xNzE5NjUwNDc1LjAuMC4w) extension.
+- For a working example, check out the [Video Compositor extension demo](https://webdemo.agora.io/example/plugin/videoCompositor/index.html?_gl=1*1437x8t*_gcl_au*MTM4NzU5ODkyNy4xNzE5NTgxMTUy*_ga*MjA2MzYxMjY4Mi4xNzAzMDczMjA1*_ga_BFVGG7E02W*MTcxOTY1MDQ3NS4zNTQuMC4xNzE5NjUwNDc1LjAuMC4w).
 
 ### Considerations
 
-- **Browser Support**:
-  - The image compositor extension supports Chrome 91 and above, Edge 91 and above, and the latest version of Firefox. For the best experience, use Chrome or Edge 94 and above.
+- **Browser support**:
+  - The extension supports Chrome 91 and above, Edge 91 and above, and the latest version of Firefox. 
For the best experience, use Chrome or Edge 94 and above. - Due to a [bug](https://bugs.webkit.org/show_bug.cgi?id=181663&from_wecom=1) in certain versions of Safari, only iOS Safari 15.4 and above and macOS Safari 13 and above are supported. -- **Performance Considerations**: +- **Performance considerations**: - - The image compositor extension can combine up to two video streams (from cameras or local video files), one screen sharing stream, and two images. Combining more image sources can affect performance and user experience. + - The extension can combine up to two video streams (from cameras or local video files), one screen sharing stream, and two images. Combining more image sources can affect performance and user experience. - If you need to use multiple media processing extensions simultaneously, Agora recommends using an Intel Core i5 4-core or higher processor. When multiple extensions are enabled, other programs running with high resource usage may cause your app to experience audio and video freezes. ### API reference @@ -114,7 +114,7 @@ Stops merging images. stop(): Promise; ``` -#### Type Definition +#### Type definition ##### `LayerOption` diff --git a/shared/extensions-marketplace/virtual-background/reference/web.mdx b/shared/extensions-marketplace/virtual-background/reference/web.mdx index f54fd0d7f..6b81eddc5 100644 --- a/shared/extensions-marketplace/virtual-background/reference/web.mdx +++ b/shared/extensions-marketplace/virtual-background/reference/web.mdx @@ -1,6 +1,6 @@ -- For a working example, check out the [Virtual background web demo](https://webdemo.agora.io/virtualBackground/index.html) and the associated [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/virtualBackground). +- For a working example, check out the [Virtual Background web demo](https://webdemo.agora.io/virtualBackground/index.html) and the associated [source code](https://github.com/AgoraIO/API-Examples-Web/tree/main/Demo/virtualBackground). ### Considerations
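
The reference and implementation pages touched in this patch all document the same basic workflow: register the extension with `AgoraRTC.registerExtensions`, create a processor, pipe it into a video track, and enable it. The following is a minimal sketch of that pattern for the Virtual Background extension referenced in the final hunk. It is illustrative only and not part of the patch; the `agora-extension-virtual-background` package name, its default export, and the `blur` option shape are assumptions rather than details confirmed by the hunks above.

```typescript
import AgoraRTC, { ICameraVideoTrack } from "agora-rtc-sdk-ng";
// Assumed package name and default export for the Virtual Background extension.
import VirtualBackgroundExtension from "agora-extension-virtual-background";

// Register the extension once, before creating any processor.
const vbExtension = new VirtualBackgroundExtension();
AgoraRTC.registerExtensions([vbExtension]);

async function enableVirtualBackground(videoTrack: ICameraVideoTrack) {
  // Create a processor and load its Wasm dependency
  // (the "./assets/wasms" path is deployment-specific).
  const processor = vbExtension.createProcessor();
  await processor.init("./assets/wasms");

  // Connect the pipeline: camera track -> processor -> destination.
  videoTrack.pipe(processor).pipe(videoTrack.processorDestination);

  // Apply a blurred background, then switch the processor on.
  // NOTE: the { type: "blur", blurDegree } option shape is assumed here.
  processor.setOptions({ type: "blur", blurDegree: 2 });
  await processor.enable();

  return processor;
}
```

The processor is returned so the caller can later disable it and unpipe the track, mirroring the cleanup flow (unpipe, then release) described in the super-clarity hunks.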